Hi Richard, yes, but they are on their own VPS or dedi, not on shared hosting. Cloudflare is not an option for shared hosting.
And I've already checked dozens of IPs, which unfortunately came from all over the world, not mainly from Brazil, but also for example from US datacenters.
However, I managed to find a good block myself. I guess these are all infected PCs, and all were using old Chrome versions as I mentioned before. So I now block everything below Chrome 119 and their fun is over.
It rained 403s for them, while normal users with up-to-date browsers had no problems logging in. The bandwidth eating is over now.
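For anyone who wants to do something similar on plain Apache, here is a rough sketch with mod_rewrite in .htaccess or the vhost. It is just the idea, not my exact rule, and the version ranges are an assumption; adjust the cut-off to whatever your own logs show:
# Sketch: return 403 for Chrome major versions below 119
RewriteEngine On
# Matches Chrome/1 .. Chrome/99, Chrome/100 .. Chrome/109 and Chrome/110 .. Chrome/118
RewriteCond %{HTTP_USER_AGENT} "Chrome/([1-9][0-9]?|10[0-9]|11[0-8])\." [NC]
RewriteRule ^ - [F]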
You should take care before blocking Chrome/99.x versions.
Some Googlebot requests still use Chrome/99.0.4844.84, so check your logs first.
We still see a lot of Chrome/99.0.4844.84 entries from Googlebot in our logs,
like this:
66.249.73.164 - - [21/Oct/2025:04:43:20 +0300] "GET /en HTTP/1.1" 200 8357 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.84 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
You can do it with ModSecurity.
Add these two rules to your rule set in the .conf file
and change the id:123456 numbers (each rule needs its own unique ID).
This rule returns a 406 error for two-digit Chrome/xx versions (Chrome 10-99):
SecRule REQUEST_HEADERS:User-Agent "(Chrome/[0-9][0-9])" "deny,status:406,id:123456,nolog,msg:'Bad Chrome'"
This rule allows the request if the user agent exists in your whitelist.txt; place it before the deny rule so whitelisted bots are checked first:
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile whitelist.txt" "allow,id:123457,nolog,ctl:ruleEngine=Off"
Create a file whitelist.txt in the same directory as the .conf file with your rules
and add these lines to whitelist.txt:
Googlebot
Google-InspectionTool
Googlebot-Image
and restart Apache
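Putting the two rules together, a minimal sketch of how that part of the .conf file could look (the IDs 123456/123457 are just placeholders; the whitelist rule comes first so Googlebot never reaches the deny rule):
# Allow anything whose User-Agent matches an entry in whitelist.txt and stop further rule processing
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile whitelist.txt" "allow,id:123457,nolog,ctl:ruleEngine=Off"
# Deny everything that still identifies as a two-digit Chrome version
SecRule REQUEST_HEADERS:User-Agent "(Chrome/[0-9][0-9]\.)" "deny,status:406,id:123456,nolog,msg:'Bad Chrome'"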
Regards