Robots: excluding URLs with a special character
Question
We're looking for a recommendation regarding robots.txt; we don't want to make a mistake and end up excluding the entire site.
Is the directive below appropriate for excluding all URLs that have a backslash encoded in them?
Disallow: /*\
Will it only exclude URLs that have a backslash or %22 in the site's URL path? Some pages have been indexed with a backslash and are showing up as duplicates in Google Webmaster Tools.
Does the above directive hinder or block the site from search engines in any way, other than for URLs containing a backslash?
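To see what `Disallow: /*\` would actually match under Google-style wildcard rules, here is a minimal Python sketch of that matching logic (`robots_pattern_matches` is a hypothetical helper written for this illustration; real crawlers also normalize percent-encoding such as `%5C`, which this sketch does not do):

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Simplified sketch of Google-style robots.txt path matching.

    '*' matches any sequence of characters; a trailing '$' anchors
    the end of the URL path. Matching is a prefix match otherwise.
    NOTE: real crawlers also compare percent-encoded forms (e.g.
    '%5C' for a backslash), which this sketch deliberately skips.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = ""
    for ch in pattern:
        regex += ".*" if ch == "*" else re.escape(ch)
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, path) is not None

# The rule from the question: Disallow: /*\
rule = "/*\\"
print(robots_pattern_matches(rule, "/page\\dup"))  # path contains a backslash -> blocked
print(robots_pattern_matches(rule, "/page"))       # ordinary path -> not blocked
```

Under these rules the pattern only fires on paths that actually contain a backslash, so ordinary URLs remain crawlable.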
Answer
To follow up: we resolved this by applying a 301 redirect through .htaccess rather than blocking through robots.txt.
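The answer doesn't show the actual rules used, but a minimal .htaccess sketch of that approach might look like the following (the redirect target `/` is an assumption; in practice it should point at the canonical version of each duplicated page):

```apache
RewriteEngine On
# Hypothetical sketch: mod_rewrite matches against the decoded path,
# so a URL requested with %5C arrives here as a literal backslash.
# Permanently redirect any path containing a backslash.
RewriteRule \\ / [R=301,L]
```

A 301 tells search engines the duplicate URL has moved permanently, so link signals consolidate on the target instead of the duplicates merely being hidden, which is why it beats a robots.txt block for this problem.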