Excluding URLs with a special character in robots.txt

Question

Looking for a recommendation regarding robots.txt; we don't want to go wrong and end up excluding the entire site.

Is the rule below appropriate for excluding all URLs that have a backslash encoded in the URL?

 Disallow: /*\

Will it exclude only URLs that have a backslash (or %22) in the URL path? Some pages have been indexed with a backslash and are showing up as duplicates in Google Webmaster Tools.

And does the rule above hinder or block the site from search engines in any other way, beyond URLs containing a backslash?
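For reference, a minimal robots.txt sketch along these lines. The second Disallow line is an assumption on my part: in Google's robots.txt matching only * and $ are special (a \ is treated literally), and rules are compared against the request path, where a backslash may appear in its percent-encoded form %5C rather than as a literal character.

 User-agent: *
 # Block any URL path containing a literal backslash
 Disallow: /*\
 # Also block the percent-encoded form, in case the crawler sees
 # the path with the backslash still encoded as %5C (assumption)
 Disallow: /*%5C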

Answer

To update: we resolved this by applying a 301 redirect through .htaccess rather than blocking through robots.txt.
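A minimal sketch of what such a redirect could look like with Apache mod_rewrite; this is hypothetical, since the thread does not show the actual .htaccess rules. It matches the raw request line, where an encoded backslash appears as %5C, and issues a 301 to the same path with the backslash stripped:

 # Hypothetical sketch, not the asker's actual configuration.
 RewriteEngine On
 # THE_REQUEST holds the raw request line (e.g. "GET /a%5Cb HTTP/1.1"),
 # so an encoded backslash is still visible as %5C before decoding.
 RewriteCond %{THE_REQUEST} \s/([^?\s]*)%5C([^?\s]*)\s [NC]
 # Permanently redirect to the same path with the %5C removed.
 RewriteRule ^ /%1%2 [R=301,L]

A 301 redirect consolidates the duplicate URLs onto the canonical one, whereas a robots.txt Disallow only stops further crawling and leaves the already indexed duplicates in place.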
