Disallow directory contents, but Allow directory page in robots.txt
Question
Will this work to disallow pages under a directory, but still allow the page at the directory URL itself?
Allow: /special-offers/$
Disallow: /special-offers/
Allows:
www.mysite.com/special-offers/
But blocks:
www.mysite.com/special-offers/page1
www.mysite.com/special-offers/page2.html
etc.
Answer
Having looked at Google's own robots.txt file, they are doing exactly what I was asking about.
At lines 136-137 they have:
Disallow: /places/
Allow: /places/$
So they are blocking anything under /places/, but allowing the root /places/ URL. The only difference from my syntax is the order: the Disallow comes first. For Google's crawler this order does not matter, because the most specific (longest) matching rule wins, not the first one listed.
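To see why the trailing `$` makes this work, here is a minimal sketch of Google-style rule matching, where `$` anchors the end of the URL path and the longest matching pattern takes precedence. The `RULES` list and helper names are illustrative, not part of any robots.txt library:

```python
import re

# Hypothetical rules taken from the question; order is irrelevant because
# the most specific (longest) matching pattern wins.
RULES = [
    ("Allow", "/special-offers/$"),
    ("Disallow", "/special-offers/"),
]

def rule_matches(pattern, path):
    # Translate a robots.txt pattern into a regex:
    # "*" matches any run of characters, a trailing "$" anchors the end.
    regex = "".join(
        ".*" if ch == "*" else ("$" if ch == "$" else re.escape(ch))
        for ch in pattern
    )
    return re.match(regex, path) is not None

def is_allowed(path):
    # Pick the longest matching rule; with no match, crawling is allowed.
    best = None
    for kind, pattern in RULES:
        if rule_matches(pattern, path):
            if best is None or len(pattern) > len(best[1]):
                best = (kind, pattern)
    return best is None or best[0] == "Allow"

print(is_allowed("/special-offers/"))       # True  - "$" rule matches exactly
print(is_allowed("/special-offers/page1"))  # False - only the Disallow matches
```

For `/special-offers/` both rules match, but `Allow: /special-offers/$` is the longer pattern, so it wins; for `/special-offers/page1` the anchored Allow no longer matches, leaving only the Disallow.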