Tuning .htaccess to avoid duplicated content and problems with Google Panda & co?


Problem description


Hello all,

Introduction and code sample:
A friend of mine told me that I should update the .htaccess of our company site to avoid serving duplicate content to Google. His recommendation was to add these lines to the .htaccess file:

RewriteCond %{HTTP_HOST} ^codeproject.com [NC]
RewriteRule ^/?(.*) http://www.codeproject.com/$1 [R=permanent,L]

RewriteCond %{HTTP_HOST} ^codeproject.es [NC]
RewriteRule ^/?(.*) http://www.codeproject.com/$1 [R=permanent,L]








The idea here would be to stop the indexing spiders from treating all the domains we own as if they hosted different content, by redirecting the spiders to the main domain.

Of course I tried changing the .htaccess to see what the result would be, but nothing seemed to change, so I went back to the original one...

I'm worried about the real impact of having multiple domains that seem to point to the same content; if he is right, then we have a problem.

I can understand the main idea, and I can even understand the lines I've put into the .htaccess file, but how can I be sure that the hosting company didn't already take care of this themselves when the domains were bought?

Note:
In a recent comment it was recommended that I use "robots.txt" to avoid this. As far as I know, that is not what I need: "robots.txt" is used to tell the spiders which files to crawl and which to avoid crawling. In my case, I have multiple domains pointing to the same web pages, so it is as if I had different web pages with the same content, and that could be punished by Google, since it can be read as a black-hat SEO technique. This is why my friend recommended the 301 redirect.
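As the note says, robots.txt only restricts crawling; it cannot merge two domains into one. A small sketch with Python's standard `urllib.robotparser` (the paths are made up) illustrates what robots.txt can and cannot express:

```python
from urllib import robotparser

# A robots.txt can only say "do not crawl these paths" -- it cannot
# consolidate duplicate domains or issue redirects.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Crawling /private/ is forbidden, everything else is allowed...
print(rp.can_fetch("*", "http://www.codeproject.com/private/x.html"))  # False
print(rp.can_fetch("*", "http://www.codeproject.com/index.html"))      # True
# ...but nothing here tells a spider that codeproject.es is the same
# site as www.codeproject.com; only a redirect (or a canonical URL) says that.
```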

Questions:
Is there any way to know if we are being punished by Google?
Any hint?
A good explanatory link?

If I want to avoid Google punishment and keep my different domains pointing to my main domain, which would be the correct way of doing it?

As always thank you in advance. :thumbsup:

Recommended answer











That's how it worked for me in the end:

RewriteCond %{HTTP_HOST} ^codeproject.com [NC]
RewriteRule ^/?(.*) http://www.codeproject.com/$1 [R=permanent,L]

RewriteCond %{HTTP_HOST} ^codeproject.es [NC]
RewriteRule ^/?(.*) http://www.codeproject.com/$1 [R=permanent,L]

