One of the websites gets extreme bandwidth

Vanja · Verified User · Joined Feb 2, 2024 · Messages: 14
Hey guys,
I have a website that gets 100 GB of traffic, but it isn't a busy website. It's a WordPress site, completely clean and protected, with the login page totally blocked (except for my IP). I have manually checked every single file on that website (images too), hoping to find anything suspicious, but found nothing.
There is a firewall installed and it's capturing a few blocks a day. Still, the website uses a lot of bandwidth.
Is there a way in DirectAdmin to see exact file activity for the website?
 
I had something like that, and found out it was a couple of bots eating all the bandwidth.
You could go into the user account and check the website traffic with AWStats, for example, to see which IPs are eating the most bandwidth.
Also check the /var/log/httpd/domains/domain.com.log file to see if there are long runs of the same IP in a row.

In the case of my customer it was the Facebook bot, which for some reason ate everything, while he didn't even have a FB page.
If it's the Facebook bot, don't bother using robots.txt: the FB bot decides for itself whether it obeys it, and often does not, so in that case you have to block it in .htaccess, for example.
I also had an Amazon bot on my personal hobby site which did the same.
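To spot those runs quickly, a one-liner like this can summarize the busiest client IPs in the log. This is just a sketch: the log path follows the DirectAdmin layout mentioned above and may differ on your server.

```shell
# Count requests per client IP (first field of each log line)
# and show the 20 busiest ones. Path is an assumption.
awk '{print $1}' /var/log/httpd/domains/domain.com.log \
  | sort | uniq -c | sort -rn | head -20
```

Any IP with tens of thousands of hits at the top of that list is a good candidate for a closer look in AWStats.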
 
Thanks Richard,
I did block that FB bot IP, as it was indeed very active and was hitting various files constantly.
I also blocked some additional Amazon bot IPs, but there are too many of those.
Do you know when the bandwidth counter resets in DirectAdmin? I think it's been building for maybe longer than a month.
To my understanding it should reset once a month? Or maybe I am wrong?
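When there are too many IPs to block one by one, counting requests per user-agent string instead of per IP can show whether they all belong to the same bot. A sketch, assuming the Apache combined log format and the log path mentioned earlier in the thread:

```shell
# In the combined log format, the user-agent is the 6th field
# when splitting a line on double quotes. Show the top 20 agents.
awk -F'"' '{print $6}' /var/log/httpd/domains/domain.com.log \
  | sort | uniq -c | sort -rn | head -20
```

If one agent string dominates, blocking by user-agent (as in the .htaccess example below in this thread) catches it regardless of which IP it comes from.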
 
I did block that FB bot IP
That might be of little use, as it will turn up again: these FB bots use a lot of IPs.
So blocking them with the Wordfence plugin might be a better solution, or use the .htaccess method.

Code:
BrowserMatchNoCase "LieBaoFast" bad_bot
BrowserMatchNoCase "Mb2345Browser" bad_bot
BrowserMatchNoCase "zh-CN" bad_bot
BrowserMatchNoCase "Baiduspider" bad_bot
BrowserMatchNoCase "facebook" bad_bot
BrowserMatchNoCase "facebookexternalhit/1.1" bad_bot
BrowserMatchNoCase "facebookcatalog/1.0" bad_bot
Order deny,allow
Deny from env=bad_bot

There are some more in there, but this is just an example of how it can be done. You can also block all Amazon bots this way, as long as they identify themselves as such. ;) No matter which IP they use.

As for the reset of the counters: yes, that's once a month. There is a command to reset the counters, if I'm not mistaken, if you really need it.
You can read about it here:
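One caveat: the Order/Deny directives in the example above are Apache 2.2 syntax, which Apache 2.4 only honors when mod_access_compat is loaded. On a plain 2.4 setup, a sketch of the equivalent (the "Amazonbot" agent string is an assumption; check your logs for the exact names) would be:

```apacheconf
BrowserMatchNoCase "facebook" bad_bot
BrowserMatchNoCase "Amazonbot" bad_bot
# Allow everyone except requests flagged with the bad_bot env var.
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```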
 
I know this won't help you, as it could be a client site...

I use Blocked on some of my private domains - depending on configuration, it can block Tor, proxies, VPNs, hosting IP blocks, bad bots, etc..... It's not free (it has a trial), but.....
 
I would avoid installing plugins like Wordfence, primarily due to their high resource consumption.
Instead, I recommend the following steps:
  1. Block known bots: identify and block malicious bots by their user-agent and Autonomous System Numbers (ASNs) - there are publicly available lists.
  2. Throttle requests: implement rate limiting to restrict the number of requests a single client can make within a specific time frame (e.g. 10 requests per minute).
  3. Prevent static-content hotlinking: disable hotlinking to stop unauthorized websites from linking directly to your static assets, such as images or videos.
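For step 3, hotlink protection can be done in .htaccess with mod_rewrite. A minimal sketch, where example.com is a placeholder for your own domain and the file extensions are just examples:

```apacheconf
RewriteEngine On
# Allow requests with an empty referer (direct visits, privacy tools)...
RewriteCond %{HTTP_REFERER} !^$
# ...and requests referred from your own site.
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everything else gets a 403 for static media files.
RewriteRule \.(jpe?g|png|gif|webp|mp4)$ - [F,NC]
```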
 
The problem with WordStress and the above is that if you do not know how websites work (using WP, so, I guess, not much), you'd need to install plugins to do the above, which will in turn increase CPU and resource usage because of those plugins.
#justsaying
 