Solved Any way to stop these attacks which are eating bandwidth?

Yes, but they are on their own VPS or dedi, not on shared hosting. Cloudflare is not an option for shared hosting.
And I've already checked dozens of IPs, which unfortunately came from all over the world, not mainly from Brazil; some were from US datacenters, for example.

However, I managed to find a good block myself. I guess these are all infected PCs, and all were using old Chrome versions as I mentioned before. So I now block everything below Chrome 119 and their fun is over.
It rained 403s for them, while normal users with up-to-date browsers had no problem logging in. The bandwidth eating is over now.
Hi Richard,
you should take care when blocking versions above Chrome/99.x.
Some Googlebot requests are still using Chrome/99.0.4844.84, so check your logs first.

We still see a lot of Chrome/99.0.4844.84 entries from Googlebot in our logs,

like this

66.249.73.164 - - [21/Oct/2025:04:43:20 +0300] "GET /en HTTP/1.1" 200 8357 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.84 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
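
To check this quickly, something along these lines works (the log path is just an example, adjust it to your server):
Code:
# count requests from a Googlebot UA that still report Chrome/99
grep "Googlebot" /var/log/apache2/access.log | grep -c "Chrome/99\."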

You can do it with ModSecurity.

Add these two rules to your rule set in the .conf file
and change the id:123456 numbers (each rule needs its own unique ID).

This will return error 406 for two-digit Chrome/xx versions (the trailing \. keeps it from also matching three-digit versions like Chrome/119):
SecRule REQUEST_HEADERS:User-Agent "(Chrome/[0-9][0-9])" "deny,status:406,id:123456,nolog,msg:'Bad Chrome'"


This will allow the request if the User-Agent exists in your whitelist.txt (place this rule before the deny rule, so Googlebot is whitelisted before the deny can trigger):

SecRule REQUEST_HEADERS:User-Agent "@pmFromFile whitelist.txt" "allow,id:123456,nolog,ctl:ruleEngine=Off"


Create a file whitelist.txt in the same directory where you edit your rules .conf file
and add these lines to whitelist.txt:

Googlebot
Google-InspectionTool
Googlebot-Image


and restart Apache
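
For example (assuming a systemd server where the service is called apache2; on RHEL-type systems it is usually httpd):
Code:
# test the config first, then restart
apachectl -t && systemctl restart apache2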



Regards
 
Yes it does. In the community version you can choose 3 free blocklists, IIRC, or buy commercial ones of course.
You can opt out of sharing the banned IPs, but if you take no part in the community, you don't get the complete blocklists.

You can maintain your own blocklists if you want. On the cs engine a simple command can add a decision to block an IP, subnet or ASN (like the cloud customers' ASNs, which brings instant peace) for any amount of time; see the example below.
It's like the cluster block function in csf, but with more of a client/server architecture.
It's customisable in the sense that you can change config files or add your own ideas for a specific check, or make a check more or less strict.
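
For example, manual decisions with cscli look roughly like this (the IPs and durations are made-up values; exact flags can differ between CrowdSec versions):
Code:
# ban a single IP for 4 hours
cscli decisions add --ip 203.0.113.10 --duration 4h --reason "request flood"
# ban a whole range for a week
cscli decisions add --range 203.0.113.0/24 --duration 168h --reason "noisy cloud range"
# review and remove decisions
cscli decisions list
cscli decisions delete --ip 203.0.113.10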
@sysdev interesting setup! I had not heard of crowdsecurity before.

I found out crowdsecurity is French, so you being in NL might make your GDPR worries less? 😬

On my VPS I get regular "flood request attacks" from cloud consumer IPs which I sometimes add to csf.
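
For reference, adding one of those IPs to csf's deny list from the shell is just (example IP):
Code:
# add a permanent deny entry with a comment
csf -d 203.0.113.20 "request flood from cloud IP"
# remove it again later
csf -dr 203.0.113.20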

Do you have some list of cloud consumer IPs/ASNs, or do you just add them "on the go"?
 
you should take care when blocking versions above Chrome/99.x
I know, the highest number which I've encountered in the logs was version 100. I'm blocking everything below version 115.
But I don't use mod_security.

@johannes I use this line in my .htaccess, but it might move to my httpd-includes.conf with the other bad bots if other sites get attacked too.
Code:
BrowserMatchNoCase "Chrome\/\b(0*(?:[1-9][0-9]?|115))\b" bad_bot
Order deny,allow
Deny from env=bad_bot
This was meant to block every Chrome version below 115.
It seems this is not quite correct and it actually blocks everything below 100, no matter whether you have 115 in there or not. But 100 is not used by bots as far as I could see and -is- used by Bingbot, so it should still be fine (a corrected pattern sketch follows below).
I still don't quite understand how to correctly test these things with the regexp tester.
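
For anyone who wants a pattern that really only matches versions below 115, something like this should do it (a sketch, not exhaustively tested; it matches Chrome major versions 0-114 followed by a dot, so Chrome/115 and Chrome/141 pass through):
Code:
# matches Chrome/0. up to Chrome/114. but not Chrome/115 or higher
BrowserMatchNoCase "Chrome/(\d{1,2}|10\d|11[0-4])\." bad_bot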

I thought this might also block Googlebot, but it seems Google is now using a Chrome/141 version.
Code:
Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/141.0.7390.107 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
So it seems we're fine with Google.

Edit: adjusted the regexp explanation.
 