Write a rule that only lets search-engine spiders and specified browsers through. That's how I do it — when an attack hits, I just enable this rule:

    set $spider '0';
    if ($http_referer ~* (enbs.cn|baidu.com)) { set $spider '${spider}1'; }
    if ($http_referer ~* (google.com|g.com)) { set $spider '0'; }
    if (-e $request_filename) { set $spider '${spider}1'; }
    if ($http_user_agent ~* 'spider|bingbot|360Spider') { set $spider '${spider}1'; }
    if ($http_user_agent ~* 'baiduboxapp|Android|iPhone|nokia|Windows NT 10.0; Win64; x64') { set $spider '${spider}1'; }
    if ($spider = '0') { rewrite ^/(.*)$ https://enbs.cn/juzi/show-322.html last; }
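The trick here is that nginx `if` blocks can't be nested, so each whitelist match appends a `'1'` to `$spider`, and only a request that matched nothing (still exactly `'0'`) gets rewritten to the block page. A minimal Python sketch of that accumulation logic, evaluated in the same order as the nginx rule (the function name, patterns, and `file_exists` parameter are illustrative, not part of nginx):

```python
import re

# Patterns mirroring the nginx rule above (illustrative, not exhaustive).
REFERER_ALLOW = re.compile(r'enbs\.cn|baidu\.com', re.I)
REFERER_BLOCK = re.compile(r'google\.com|g\.com', re.I)
UA_SPIDER = re.compile(r'spider|bingbot|360Spider', re.I)
UA_BROWSER = re.compile(r'baiduboxapp|Android|iPhone|nokia|Windows NT 10\.0; Win64; x64', re.I)

def should_redirect(referer: str, user_agent: str, file_exists: bool = False) -> bool:
    """Return True when the request would be rewritten to the block page."""
    spider = '0'                       # set $spider '0';
    if REFERER_ALLOW.search(referer):
        spider += '1'                  # whitelisted referer
    if REFERER_BLOCK.search(referer):
        spider = '0'                   # blocked referer resets the flag
    if file_exists:
        spider += '1'                  # file exists on disk (-e $request_filename)
    if UA_SPIDER.search(user_agent):
        spider += '1'                  # known crawler UA
    if UA_BROWSER.search(user_agent):
        spider += '1'                  # whitelisted browser/mobile UA
    return spider == '0'               # matched nothing -> rewrite to block page
```

For example, `should_redirect('', 'curl/8.0')` is True (redirected), while an iPhone UA or a baidu.com referer passes through. Note one quirk faithfully reproduced from the original: the google.com reset runs before the UA checks, so a google referer with a whitelisted UA still gets through.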