Any way to stop these attacks which are eating bandwidth?

It seems some sites on our servers are under attack in some way. I don't know exactly what they are doing, but it's eating up the users' bandwidth, while it looks as if they are just requesting content.
Recently I added a bad-bot blocker to the httpd-includes.conf file and things got fairly quiet (something along the lines of the sketch below).
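(A blocker like that is usually just a list of BrowserMatchNoCase lines plus a deny rule. A minimal sketch of the idea; the bot names here are placeholders, not necessarily what is actually blocked:)
Code:
# flag known-bad user agents (placeholder names) and refuse them;
# at server scope the access directives need a <Location> container
BrowserMatchNoCase "SemrushBot" bad_bot
BrowserMatchNoCase "AhrefsBot" bad_bot
<Location "/">
    Order deny,allow
    Deny from env=bad_bot
</Location>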

Now they are using loads of IPs with old Chrome versions to make a lot of requests.

There are a couple of things that catch the eye.
First of all, almost all of them report Macintosh OS X (all kinds of 10.x versions) in the log line, and old Chrome versions are used, like this:

Code:
189.6.12.85 - - [19/Oct/2025:00:47:34 +0200] "GET http://www.some-website.nl/forum/viewtopic.php?p=964374&sid=d6db3f0bd6a3ff92e823afeebaffb819 HTTP/1.1" 200 8328 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Safari/537.36"
200.181.149.72 - - [19/Oct/2025:00:47:34 +0200] "GET http://www.some-website.nl/forum/viewforum.php?f=157&sid=deb313a2da7fdd7c0cacea1c13669ba5 HTTP/1.1" 301 311 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/537.36 (KHTML, like Gecko, Mediapartners-Google) Chrome/87.0.4280.90 Safari/537.36"

Blocking these IPs is of no use, as there are dozens of different IPs doing this. And it's causing high load too now.
Is there a simple way to block everything below, for example, version 110 of Chrome, or maybe a better way to block this behaviour?

It's just plain crazy; look at this, the phpBB forum stats:
In total there are 17082 users online :: 4 registered, 1 hidden and 17077 guests (based on users active over the past 15 minutes)
 
phpBB is really annoying when it comes to combating bots, because of the session id in the URL. Because of that, dumb bots think every pageview is a new page; topics about this issue on the phpBB forum go back more than 10 years. I remember this was discussed a few months ago, but the only fix I found was to put the Cloudflare proxy in front with bot protection and block the Asia region. I saw in the stats that more than 50,000 requests were blocked every hour. This went on for a few hours until it finally calmed down.
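(One partial mitigation at the web-server level is to redirect sid-carrying URLs back to their canonical form, so crawlers collapse onto one URL per page. A rough sketch only, assuming the sid is the last query parameter as in the logs above; test carefully, since cookie-less visitors rely on the sid:)
Code:
RewriteEngine On
# if the query string ends in a phpBB session id, 301 to the same URL
# without it (assumes sid is the last parameter, as in the logged requests)
RewriteCond %{QUERY_STRING} ^(.*?)&?sid=[0-9a-f]{32}$ [NC]
RewriteRule ^ %{REQUEST_URI}?%1 [R=301,L]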
 
but the only fix I found was to put the Cloudflare proxy in front with bot protection and block the Asia region.
Well... unfortunately there are a few problems here. To be able to use the Cloudflare proxy we would have to move the domain nameservers to Cloudflare, which I don't like.
The other thing is that the IPs are not only Asian.

But there might be another way.
At the moment I'm blocking old Chrome clients so they continuously receive a 403, but I'm doing it in the .htaccess and am considering moving it to the httpd-includes.conf, which I also use for other bad bots (see the sketch after the code below).
I don't know whether that is better than .htaccess or not.
But I have to make it better. For now it's about 7K less, but some Chrome 6x and other Chrome 8x versions are still able to pass.

This is the code I used; probably not optimal, I found it on the internet:
Code:
BrowserMatchNoCase "Chrome/(8[789]|9\d)\." bad_bot
Order deny,allow
Deny from env=bad_bot
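(For the httpd-includes.conf variant mentioned above, roughly the same block should work at server scope; only the access directives need a container there. An untested sketch, using the same 2.2-style directives as above:)
Code:
# in httpd-includes.conf: BrowserMatchNoCase is fine at server scope,
# but the Order/Deny directives need a <Location> or <Directory>
# container outside of .htaccess
BrowserMatchNoCase "Chrome/(8[789]|9\d)\." bad_bot
<Location "/">
    Order deny,allow
    Deny from env=bad_bot
</Location>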

So most of them are getting a 403 now, but a lot still pass, like these:
Chrome/82.0.4078.2
Chrome/79.0.3945.130
Chrome/62.0.3202.94

So the 62 I understand, but the 82 and 79 I don't, as I thought this blocked every 7x and 8x version. And the odd thing is that 88 is in fact blocked, but 82 is not.

I would like to block every Chrome version below 120 if possible.
Or do it the other way around: block all Chrome, but only allow 120 and above.

I could use this (if that is correct) to block all Chrome versions:
Code:
BrowserMatchNoCase "Chrome/" bad_bots
but then I don't know how to only allow versions 120 and higher.
 
I use CrowdSec to analyse web logs, and this blocks about 15k+ IPs from botnets based on community blocklists.

And when our monitoring still sees too high a load, it runs a script that counts all the crap in the logfiles of the last 15 minutes (403/404/50x responses per IP, number of requests, hits on sensitive URLs like phpmyadmin/wp-admin, etc.) and simply blocks the top 20 with csf for 5 minutes, if they get above some threshold.

A bit blunt but it works.
 
That might be a good idea too, indeed.
But if somebody could give the correct line to block everything below 120, I should be fine for the moment, and then I'm going to have a look at CrowdSec this week. Seems interesting.
 
Code:
<RequireAll>
    Require all granted
    # matches Chrome major versions 0..119; the trailing \. keeps
    # Chrome/120+ from matching on its first digit alone
    Require not expr "%{HTTP_USER_AGENT} =~ /Chrome\/(?:\d|[1-9]\d|1[01]\d)\./i"
</RequireAll>

or

Code:
RewriteEngine On
# matches Chrome major versions 0..119
RewriteCond %{HTTP_USER_AGENT} "(?:^| )Chrome/(?:\d|[1-9]\d|1[01]\d)\." [NC]
RewriteRule ^ - [F]

Same, but allowing Brave, Opera and that Windows thing (Edge):

Code:
<RequireAll>
    Require all granted
    Require not expr "%{HTTP_USER_AGENT} =~ /Chrome\/(?:\d|[1-9]\d|1[01]\d)\./i && %{HTTP_USER_AGENT} !~ /(Edg|OPR|OPiOS|Brave)/i"
</RequireAll>

Code:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "(?:^| )Chrome/(?:\d|[1-9]\d|1[01]\d)\." [NC]
RewriteCond %{HTTP_USER_AGENT} !(Edg|OPR|OPiOS|Brave) [NC]
RewriteRule ^ - [F]

Fiddle around with it, some might work.
 
Ehmz... thank you, but can I also use that in my system somehow?
BrowserMatchNoCase "Chrome/(8[789]|9\d)\." bad_bot

So could I use this then?
BrowserMatchNoCase "(?:^| )Chrome/(?:\d|[1-9]\d|1[01]\d)\." bad_bot
or does it need to be like this?
BrowserMatchNoCase "Chrome/(?:\d|[1-9]\d|1[01]\d)\." bad_bot

Ah well... I will just test it, thank you!

Edit: doesn't work. I will try some more regexp testing (really hate that stuff... LoL).

Edit2: I used the second paragraph (the rewrite); it seems to work... I'll test it further.
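(A guess at what the BrowserMatch variant may still be missing: the env var by itself does nothing, it only takes effect together with a deny rule acting on it. An untested sketch, same 2.2-style directives as before:)
Code:
# flag Chrome major versions 0-119...
BrowserMatchNoCase "Chrome/(?:\d|[1-9]\d|1[01]\d)\." bad_bot
# ...and actually refuse them; the env var alone has no effect
Order deny,allow
Deny from env=bad_bot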
 
Could you elaborate on this, please? What is it exactly? Is it this https://github.com/crowdsecurity/crowdsec project?
Yups,

I ship web logs from 'difficult' servers via rsyslog to a VPS running only the security engine. The engine uses an IP's history and a bunch of rules to decide whether an IP should be banned. This is the gather-and-decide part.

On multiple webservers (even if they do not send their logs to my CrowdSec VPS) I install the crowdsec-firewall-bouncers. (Well, actually my AI decides that, but that is another story.)
A firewall bouncer connects to the engine VPS and gets block updates every 10 seconds or so. It then updates its own iptables/nftables rules to reflect the ban, or the expiry of a ban.

Sharing IPs with the community is somewhat questionable in the Netherlands due to privacy regulations. But it only shares banned IPs, and you usually get banned for good reasons. So I take my chances on that one.
 
Thank you for the details. It looks very interesting.

Sharing IPs with the community is somewhat questionable in the Netherlands due to privacy regulations. But it only shares banned IPs, and you usually get banned for good reasons. So I take my chances on that one.

Does the software not offer an option to stop sharing blocked IPs?
 
Edit: doesn't work. I will try some more regexp testing (really hate that stuff... LoL).

Edit2: I used the second paragraph (the rewrite); it seems to work... I'll test it further.
Yeah, it's usually flaky too. You could simply check whether the version starts with '12' for now (a sketch follows below). But bots rotate user agents every few minutes, sometimes every few seconds, switching between old and new versions. Botnets are pretty smart and assign their IPs according to how they can be used. An IP that got banned for using the wrong user agent is still useful against your server: maybe for brute-forcing, DDoS, IP-to-IP, spam.
Or even against your next server.
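(Roughly like this, untested; it treats Chrome major versions 120-199 as current enough and 403s every other Chrome UA:)
Code:
RewriteEngine On
# any Chrome user agent...
RewriteCond %{HTTP_USER_AGENT} "Chrome/" [NC]
# ...that is not major version 120-199 gets a 403
RewriteCond %{HTTP_USER_AGENT} "!Chrome/1[2-9]\d\." [NC]
RewriteRule ^ - [F]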
 
Does the software not offer an option to stop sharing blocked IPs?
Yes, it does. In the community version you can choose 3 free blocklists, iirc. Or buy commercial ones, of course.
You can opt out of sharing the banned IPs, but if you take no part in the community, you don't get the complete blocklists.

You can maintain your own blocklists if you want. On the CrowdSec engine a simple command can add a decision to block an IP, subnet or ASN (like the cloud customers' ASNs, which brings instant peace) for any amount of time.
It's like the cluster block function in csf, but with more of a client/server architecture.
It's customisable in the sense that you can change config files or add your own ideas for a specific check, or make a check more or less strict.
 
This is a hack, and only works with LiteSpeed (and maybe OLS?), but for some problematic sites I've been pushing everything through a captcha if the UA contains `Mozilla`. The clients for whom I've employed this love it.

It's a similar pattern to the one employed by Anubis: https://anubis.techaro.lol/

Code:
# match practically every browser-like client (nearly all UAs contain "Mozilla")...
RewriteCond %{HTTP_USER_AGENT} Mozilla [NC]
# ...and set LiteSpeed's verifycaptcha env var, which pushes the request
# through its captcha verification before content is served
RewriteRule .* - [E=verifycaptcha]
 