左右不逢缘 posted on 2025-2-27 21:41:43

Sharing some code! I see a lot of people still don't know how to block junk spiders!!!

If you're using the Baota (BT) panel:

Baota → Website → Settings → Config File

Paste the code below into that file (it belongs inside the server {} block).
It takes care of 90%+ of the junk spiders!
The server quiets down instantly!
# Block scraping by Scrapy and similar tools
if ($http_user_agent ~* (Scrapy|HttpClient|crawl|curb|git|Wtrace)) {
    return 403;
}

# Block the listed UAs as well as requests with an empty UA (the trailing ^$)
if ($http_user_agent ~* "CheckMarkNetwork|Synapse|Nimbostratus-Bot|Dark|scraper|LMAO|Hakai|Gemini|Wappalyzer|masscan|crawler4j|Mappy|Center|eright|aiohttp|MauiBot|Crawler|researchscan|Dispatch|AlphaBot|Census|ips-agent|NetcraftSurveyAgent|ToutiaoSpider|EasyHttp|Iframely|sysscan|fasthttp|muhstik|DeuSu|mstshash|HTTP_Request|ExtLinksBot|package|SafeDNSBot|CPython|SiteExplorer|SSH|MegaIndex|BUbiNG|CCBot|NetTrack|Digincore|aiHitBot|SurdotlyBot|null|SemrushBot|Test|Copied|ltx71|Nmap|DotBot|AdsBot|InetURL|Pcore-HTTP|PocketParser|Wotbox|newspaper|DnyzBot|redback|PiplBot|SMTBot|WinHTTP|Auto Spider 1.0|GrabNet|TurnitinBot|Go-Ahead-Got-It|Download Demon|Go!Zilla|GetWeb!|GetRight|libwww-perl|Cliqzbot|MailChimp|SMTBot|Dataprovider|XoviBot|linkdexbot|SeznamBot|Qwantify|spbot|evc-batch|zgrab|Go-http-client|FeedDemon|Jullo|Feedly|YandexBot|oBot|FlightDeckReports|Linguee Bot|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|EasouSpider|LinkpadBot|Ezooms|^$") {
    return 403;
}

# Block request methods other than GET/HEAD/POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}

# Block junk spiders
# Drop all HEAD requests without a response
# (note: this overrides the GET/HEAD/POST allowance above)
if ($request_method ~ ^(HEAD)$) {
    return 444;
}

# Drop requests whose Range header carries absurdly large values (9+ digits)
if ($http_range ~ "\d{9,}") {
    return 444;
}

if ($http_user_agent ~* (Amazonbot|SemrushBot|python|Linespider|crawler|DingTalkBot|simplecrawler|ZoominfoBot|zoombot|Neevabot|coccocbot|Facebot|YandexBot|Adsbot|DotBot|Applebot|DataForSeoBot|MJ12bot|BLEXBot|trendictionbot0|trendictionbot|AhrefsBot|hubspot|opensiteexplorer|leiki|webmeup|TinyTestBot|Symfony|PetalBot|proximic|GrapeshotCrawler|YaoSouBot|serpstatbot|Scrapy|Go-http-client|CCBot|CensysInspect|facebookexternalhit|GPTBot|ClaudeBot|Python-urllib|meta-externalagent|Yisouspider)) {
    return 444;
}

# Deny access to sensitive files and directories
location ~ ^/(\.user\.ini|\.htaccess|\.git|\.env|\.svn|\.project|LICENSE|README\.md) {
    return 404;
}
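After saving and reloading nginx, a quick way to confirm the rules took effect is to hit the site with a blocked UA and a normal one and compare the results. A minimal sketch, assuming your site is reachable at the placeholder URL below (the UA strings are just examples picked from the lists above):

# test_block.py - minimal sketch to check the UA rules; swap in your own domain
import urllib.request
import urllib.error

SITE = "https://example.com/"  # placeholder, replace with your own site

# One UA from the 403 list, one from the 444 list, plus a normal browser UA
user_agents = {
    "MJ12bot (expect 403)": "MJ12bot",
    "GPTBot (expect dropped connection)": "GPTBot",
    "normal browser (expect 200)": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for label, ua in user_agents.items():
    req = urllib.request.Request(SITE, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{label}: HTTP {resp.status}")
    except urllib.error.HTTPError as e:
        print(f"{label}: HTTP {e.code}")
    except urllib.error.URLError as e:
        # return 444 closes the connection without a response,
        # so those requests surface here as a connection error
        print(f"{label}: connection dropped ({e.reason})")

A 403 means the rule matched; for the 444 rules nginx simply drops the connection, so the request errors out instead of returning a status code.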

婷姐 posted on 2025-2-27 21:42:32

Bookmarking this first.

浅生 posted on 2025-2-27 21:42:45

Learned something new.

浅生 posted on 2025-2-27 21:43:04

Bookmarked.

独家记忆 posted on 2025-2-27 21:43:38

What's the difference between simply putting this in robots.txt:
User-agent: *
Disallow: /
and your code?
Of course, robots.txt would still keep access open for the common spiders.
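For instance, a minimal sketch of such a robots.txt (the whitelisted bots here are just an example):

# Allow mainstream search engine spiders
User-agent: Baiduspider
Disallow:

User-agent: Googlebot
Disallow:

# Tell everyone else to stay out
User-agent: *
Disallow: /

Compliant crawlers pick the most specific group that matches their UA, so the named spiders get full access while everyone else is told to keep out. Whether the junk spiders actually obey it is, I suppose, the real question.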

Crystαl posted on 2025-2-27 21:44:08

Learned, learned, learned.

独家记忆 posted on 2025-2-27 21:44:19

Wouldn't robots.txt do the job?

IT618发布 posted on 2025-2-27 21:44:52

Here to show my support!

TyCoding posted on 2025-2-27 21:45:23

Thanks for sharing.

apwl posted on 2025-2-27 22:38:25

Learned something, bookmarking it. Thanks, boss!