Blocking spam spiders with .htaccess on Linux hosting

Author: 稳网互联 · Source: Official · Updated: 2018-05-22

Large numbers of unwanted spam spiders crawl site content and waste bandwidth. Users of our Linux hosting can block them by creating a .htaccess file in the site's root directory.


The rules are as follows:

# Enable the rewrite engine, match the user agents of known spam spiders
# (case-insensitive), and redirect their requests to their own loopback address.
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} "Bingbot|MSNbot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule ^(.*)$ http://127.0.0.1/$1 [R=301,L]
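
The 301 redirect above sends matched crawlers to their own loopback address, so your server does not serve them any page content. If you would rather refuse the request outright, mod_rewrite's [F] flag returns a 403 Forbidden instead of a redirect. The sketch below is an alternative, not part of the rules above, and uses a shortened user-agent list purely for illustration:

RewriteEngine on
# Same style of user-agent match; return 403 Forbidden instead of redirecting.
RewriteCond %{HTTP_USER_AGENT} "Bingbot|MSNbot|AhrefsBot|MJ12bot|YandexBot|curl|Wget" [NC]
RewriteRule ^ - [F,L]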


For customers who already have a .htaccess file, simply add the following lines to it (this assumes RewriteEngine on is already present in the existing file):

RewriteCond %{HTTP_USER_AGENT} "Bingbot|MSNbot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
RewriteRule ^(.*)$ http://127.0.0.1/$1 [R=301,L]
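
Once the rules are in place, you can spot-check them by sending requests with one of the listed user agents; example.com below is a placeholder for your own domain:

# A listed user agent should receive the 301 redirect to 127.0.0.1.
curl -I -A "MJ12bot" http://example.com/

# A normal user agent should still receive the page as usual (e.g. 200 OK).
curl -I -A "Mozilla/5.0" http://example.com/

Note that plain curl is itself in the blocked list, so the second check sets an ordinary browser-style user agent with -A.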