Multisite TYPO3 v9, distinct robots.txt for multiple domains on one rootpage
Question
For marketing purposes I maintain one identical website with two different domains. In TYPO3 v8 I would simply add a domain record on the root page and create a personalised robots.txt with TypoScript for each site through realurl ...
With v9 I cannot find a way to do this. I tried to enter various annotations in config.yaml manually, but nothing works (i.e. I tried to replicate the annotation for the URL) ...
routes:
  -
    route: robots.txt
    type: staticText
    content: "User-agent: *\r\nDisallow: /"
    contentVariants:
      -
        content: "User-agent: *\r\nAllow: /"
        condition: 'getenv("HTTP_HOST") == "2dn-domain.com"'
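For reference, the static routes that TYPO3 v9.5 ships only document a flat `staticText` entry; a per-host `contentVariants` key is not part of the shipped schema as far as I can tell, which would explain why the attempt above has no effect. A minimal documented static route looks like this (the site identifier path is an assumption):

```yaml
# config/sites/<identifier>/config.yaml — documented v9.5 form, no per-host variants
routes:
  -
    route: robots.txt
    type: staticText
    content: "User-agent: *\r\nDisallow: /"
```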
Does anyone know a working annotation, or a different approach ...
Answer
I like to stay within the 'regular' solutions, so I found a middle ground:
In the backend, enter Route Type = Page, File or URL [uri] with the value t3://page?type=201, so as to address a page type for robots.
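The same mapping can also be declared directly as a static route in the site's config.yaml, using the documented v9 `uri` route type (a sketch; the file path is an assumption):

```yaml
# config/sites/<identifier>/config.yaml — maps /robots.txt to page type 201
routes:
  -
    route: robots.txt
    type: uri
    source: 't3://page?type=201'
```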
With TypoScript you then define your conditional robots file:
# Theme robots.txt
robots = PAGE
robots {
  typeNum = 201
  config {
    disableAllHeaderCode = 1
    additionalHeaders.10.header = Content-Type:text/plain;charset=utf-8
    xhtml_cleaning = 0
    admPanel = 0
    debug = 0
    index_enable = 0
    removeDefaultJS = 1
    removeDefaultCss = 1
    removePageCss = 1
    INTincScript_ext.pagerender = 1
    sourceopt.enabled = 0
  }
  10 = TEXT
  10.value (
User-Agent: *
Allow: /
# indexed search
User-agent: googlebot
Disallow: /*?tx_indexedsearch
# folders
Disallow: /typo3/
Disallow: /typo3conf/
Allow: /typo3conf/ext/
Allow: /typo3temp/
# parameters
Disallow: /*?id=* # non speaking URLs
Disallow: /*&id=* # non speaking URLs
Disallow: /*cHash # no cHash
Disallow: /*tx_powermail_pi1 # no powermail thanks pages
Disallow: /*tx_form_formframework # no forms
# sitemap
Sitemap: {$theme.configuration.sitemap}
  )
}
# Adwords Site closed
[globalString = ENV:HTTP_HOST=adw-domain.com]
robots.10.value (
User-Agent: *
Disallow: /
)
[global]
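Note that `[globalString = ENV:HTTP_HOST=...]` is the legacy condition syntax, which is deprecated in v9. The equivalent condition in the v9 Symfony expression language would be (a sketch, assuming the same Adwords host):

```typoscript
# Adwords site closed — v9 expression-language condition
[request.getNormalizedParams().getHttpHost() == 'adw-domain.com']
robots.10.value (
User-Agent: *
Disallow: /
)
[global]
```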
I also set a constant in constants.typoscript for the SEO site:
theme.configuration {
  sitemap = /?eID=dd_googlesitemap
  # the later assignment overrides the one above
  sitemap = http://seo-domain.com/sitemap/seo-domain.xml
}