Do related subfolders need to be disallowed separately in robots.txt?
Question
Will disallowing a certain folder in robots.txt also disallow its subfolders? Example:
Disallow: /folder/
Will it match:
/folder/page
/folder/subfolder/page
Or will it only match:
/folder/page
So if the second case is true, do I need to disallow the second and subsequent subfolders separately?
Disallow: /folder/
Disallow: /folder/subfolder/
Disallow: /folder/subfolder/onemorefolder
Answer
Robots.txt has no concept of "folders"; it's just strings. Whatever you specify in Disallow is the beginning of the URL path.
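To make the prefix idea concrete, here is a minimal, hypothetical Python sketch of that matching logic (the function name is made up, and it ignores Allow rules, wildcards, and the longest-match precedence real crawlers apply):

def is_disallowed(path, rule):
    # A Disallow value is matched as a plain prefix of the URL path.
    return rule != "" and path.startswith(rule)

print(is_disallowed("/folder/subfolder/page", "/folder/"))  # True
print(is_disallowed("/foobar", "/foo"))                     # True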
Disallow: /
blocks any URL whose path starts with / (= all pages).
Disallow: /foo
blocks any URL whose path starts with /foo:
/foo
/foobar
/foo.html
/foo/bar
/foo/bar/doe
Disallow: /foo/
blocks any URL whose path starts with /foo/:
/foo/
/foo/bar.html
/foo/bar
/foo/bar/doe
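So a single Disallow: /folder/ rule already covers /folder/subfolder/page, and the extra rules from the question are unnecessary. As a quick check, here is a minimal sketch using Python's standard urllib.robotparser; the example.com URLs and the wildcard user agent are illustrative assumptions:

from urllib import robotparser

# Parse the single rule from the question; matching is by URL-path prefix.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /folder/",
])

# Both paths start with "/folder/", so one rule covers them.
print(rp.can_fetch("*", "https://example.com/folder/page"))            # False
print(rp.can_fetch("*", "https://example.com/folder/subfolder/page"))  # False

Both calls print False, confirming that the subfolder pages are blocked by the one Disallow line.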