
Blocking spider crawls on Apache, IIS6 and IIS7 hosts with dedicated IPs (for VPS and cloud servers)

Posted by yun on 2023-05-30

Recently the server load has been constantly in the red even though the site gets little traffic. It turns out some junk spiders are to blame: they waste a huge amount of server resources, hammer the server as hard as they can with no regard for its capacity, crawl non-stop, and ignore the robots.txt protocol entirely.

They can be blocked with the following methods.

Linux (Apache): rule file .htaccess (create an .htaccess file by hand in the site's root directory)

<IfModule mod_rewrite.c>
RewriteEngine On
# Block spiders by User-Agent; [NC] makes the match case-insensitive
RewriteCond %{HTTP_USER_AGENT} "SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]
# Deny everything except robots.txt with 403 Forbidden
RewriteRule !(^robots\.txt$) - [F]
</IfModule>
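
To check that the rule is live, send a request with a blocked User-Agent and confirm the server answers 403 while robots.txt stays reachable. A minimal sketch in Python, assuming the rules are deployed at the hypothetical host example.com:

import urllib.request
import urllib.error

def probe(url, user_agent):
    # Return the HTTP status the server gives this User-Agent
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

# A blocked spider should get 403 everywhere except robots.txt
print(probe("http://example.com/", "Mozilla/5.0 (compatible; MJ12bot/v1.4.8)"))            # expect 403
print(probe("http://example.com/robots.txt", "Mozilla/5.0 (compatible; MJ12bot/v1.4.8)"))  # expect 200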

Windows Server 2003 (IIS6, via ISAPI_Rewrite): rule file httpd.conf

# Block spiders by User-Agent; [NC] makes the match case-insensitive
RewriteCond %{HTTP_USER_AGENT} (SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu) [NC]
# Deny everything except robots.txt with 403 Forbidden
RewriteRule !(^/robots\.txt$) - [F]

Windows Server 2008 (IIS7, requires the URL Rewrite module): web.config

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Block spider">
          <match url="(^robots.txt$)" ignoreCase="false" negate="true" />
          <conditions>
            <add input="{HTTP_USER_AGENT}" pattern="SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />
          </conditions>
          <action type="AbortRequest" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Nginx equivalent blocking rule: add the code inside the server block of the corresponding site's configuration file

# Case-insensitive User-Agent match (~*, mirroring [NC] above); "^$" also blocks empty User-Agents
if ($http_user_agent ~* "Bytespider|Java|PhantomJS|SemrushBot|Scrapy|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|perl|Python|Wget|Xenu|ZmEu|^$")
{
  # 444 is nginx-specific: close the connection without sending any response
  return 444;
}
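
Note that unlike the 403 from the rules above, nginx's 444 sends no response at all, so a blocked client sees a dropped connection rather than an HTTP error. A quick way to observe the difference, again assuming the hypothetical host example.com:

import urllib.request
import urllib.error

req = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; SemrushBot/7)"},  # a blocked UA
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("allowed, status", resp.status)
except urllib.error.URLError as e:
    # return 444 closes the connection without a response, so this
    # surfaces as a connection error rather than an HTTP status code
    print("blocked:", e.reason)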

Note: the line containing {HTTP_USER_AGENT} holds the names of the unwanted spiders; add more as needed, separated by "|".

For reference, the user-agent names of the major spiders:


Google spider: googlebot

Baidu spider: baiduspider

Baidu mobile spider: baiduboxapp

Yahoo spider: slurp

Alexa spider: ia_archiver

MSN spider: msnbot

Bing spider: bingbot

AltaVista spider: scooter

Lycos spider: lycos_spider_(t-rex)

AllTheWeb spider: fast-webcrawler

Inktomi spider: slurp

Youdao spiders: YodaoBot and OutfoxBot

热土 spider: Adminrtspider

Sogou spider: sogou spider

SOSO spider: sosospider

360 Search spider: 360spider


More obscure spiders can be looked up on Baidu as needed.
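
Before adding a new name to the rules, it can help to test candidate User-Agent strings (for example, ones pulled from the access log) against the blocklist offline. A small sketch using the same pattern as the .htaccess rule above; the search is case-insensitive, mirroring the [NC] and ignoreCase flags:

import re

# Same alternation as the .htaccess rule above
BLOCK_PATTERN = re.compile(
    r"SemrushBot|Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|"
    r"jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|"
    r"Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail\.RU|curl|"
    r"perl|Python|Wget|Xenu|ZmEu",
    re.IGNORECASE,  # mirrors [NC] / ignoreCase="true"
)

for ua in [
    "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
]:
    print("blocked" if BLOCK_PATTERN.search(ua) else "allowed", "-", ua)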
