Excessive awstats usage?

Thank you, Ditto.
It would be a nice feature if this selectable option were present by default for all users when awstats=1 is set in directadmin.conf.
 
The feature is available by default for all users in the newest DirectAdmin version, at least when using the Enhanced skin. I also have awstats=1 in directadmin.conf, and as said, the feature is available. Just log in and see for yourself.
 
Ah lol, that's what I misunderstood. I thought awstats=1 had to be present in user.conf to have that option, and I have no awstats=0 or 1 there.
So awstats=0 is only needed there to completely disable this option for a specific user account.
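For reference, a minimal sketch of the two settings, assuming the stock DirectAdmin file locations (the paths and the example username are assumptions, not something stated in this thread):
Code:
# /usr/local/directadmin/conf/directadmin.conf - enables the AWstats option server-wide
awstats=1

# /usr/local/directadmin/data/users/exampleuser/user.conf - switches it off
# for this single account ("exampleuser" is a hypothetical username)
awstats=0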

Clear now, thank you!
 
Hi Richard, I'm seeing the same issue, and I'm sure it's a bot as you suggested years ago, since I have been getting very high bot traffic. Did you come up with a solution to this issue?
 
Did you come up with a solution to this issue?
Yep, I had another more recent thread about it here (click).

But I also discovered, especially on phpBB forums, a widespread attack by bots which did not identify as such, but used older Chrome versions (older than v100) as their user agent.

So I added these to my bad bot list and blocked all of them server-wide. But you can also block them in a .htaccess file like this.

This is the list I'm using at the moment.
Code:
BrowserMatchNoCase "adscanner" bad_bot
BrowserMatchNoCase "Amazonbot" bad_bot
BrowserMatchNoCase "Applebot" bad_bot
BrowserMatchNoCase "aiohttp" bad_bot
BrowserMatchNoCase "Amazonbot" bad_bot
BrowserMatchNoCase "Amazonbot/0.1" bad_bot
BrowserMatchNoCase "anthropic-ai" bad_bot
BrowserMatchNoCase "Applebot" bad_bot
BrowserMatchNoCase "AspiegelBot" bad_bot
BrowserMatchNoCase "Baiduspider" bad_bot
BrowserMatchNoCase "Barkrowler" bad_bot
BrowserMatchNoCase "BLEXBot" bad_bot
BrowserMatchNoCase "BoardReader" bad_bot
BrowserMatchNoCase "Bytespider" bad_bot
BrowserMatchNoCase "ChatGPT-User" bad_bot
BrowserMatchNoCase "ClaudeBot" bad_bot
BrowserMatchNoCase "Datanyze" bad_bot
BrowserMatchNoCase "Dotbot" bad_bot
BrowserMatchNoCase "Go-http-client" bad_bot
BrowserMatchNoCase "GPTBot" bad_bot
BrowserMatchNoCase "ImagesiftBot" bad_bot
BrowserMatchNoCase "Kinza" bad_bot
BrowserMatchNoCase "LieBaoFast" bad_bot
BrowserMatchNoCase "MauiBot" bad_bot
BrowserMatchNoCase "Mb2345Browser" bad_bot
BrowserMatchNoCase "meta-externalagent" bad_bot
BrowserMatchNoCase "MicroMessenger" bad_bot
BrowserMatchNoCase "MJ12bot" bad_bot
BrowserMatchNoCase "msnbot-media" bad_bot
BrowserMatchNoCase "msnbot-MM" bad_bot
BrowserMatchNoCase "nbot" bad_bot
BrowserMatchNoCase "Petalbot" bad_bot
BrowserMatchNoCase "Scrapy" bad_bot
BrowserMatchNoCase "Scrapy *\([a-zA-Z]+\) *(.+)" bad_bot
BrowserMatchNoCase "SemrushBot" bad_bot
BrowserMatchNoCase "SemrushBot-BA" bad_bot
BrowserMatchNoCase "SemrushBot-BM" bad_bot
BrowserMatchNoCase "SemrushBot-COUB" bad_bot
BrowserMatchNoCase "SemrushBot-CT" bad_bot
BrowserMatchNoCase "SemrushBot-SI" bad_bot
BrowserMatchNoCase "SemrushBot-SWA" bad_bot
BrowserMatchNoCase "serpstatbot" bad_bot
BrowserMatchNoCase "SiteAuditBot" bad_bot
BrowserMatchNoCase "Sogou" bad_bot
BrowserMatchNoCase "spaziodat" bad_bot
BrowserMatchNoCase "SplitSignalBot" bad_bot
BrowserMatchNoCase "YaK" bad_bot
BrowserMatchNoCase "YandexBot" bad_bot
BrowserMatchNoCase "YandexImages" bad_bot
BrowserMatchNoCase "Chrome\/\b(0*(?:[1-9][0-9]?|119))\b" bad_bot

# Also flag requests probing for exposed .env files or xmlrpc.php
SetEnvIfNoCase Request_URI "\.env" bad_bot
SetEnvIfNoCase Request_URI "xmlrpc\.php" bad_bot

# Deny everything flagged above (Apache 2.4 expression syntax)
<If "%{ENV:bad_bot} == '1'">
    Require all denied
</If>

Or you can use the same list but with Apache 2.2 style directives at the end instead, even without the <If>/</If> block.
Code:
Order deny,allow
Deny from env=bad_bot
Be sure you never use Require all and Deny from together in one file; that will most likely not work.
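If you prefer to stay on Apache 2.4 authorization directives without the <If>/</If> block, a minimal sketch of the equivalent (not taken from the original post) would be:
Code:
# Apache 2.4 only - do not combine this with "Order" / "Deny from" in the same context
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>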

Feel free to use it, and of course adjust the bot blocking to your needs.
 
Richard, thank you so much, I will try this and see how it goes. But don't truly bad bots just not report their user agent?
 
But don't truly bad bots just not report their user agent?
You're welcome.
Well... before this year I hadn't seen bad bots that don't state their user agent. They are probably not single bots but rather a botnet which uses these older Chrome versions. Those Chrome attacks came from tens of thousands of different unique IPs, even more than 100K as some vlogger reported.

So yes, we have 2 kinds of bad bots now:
1.) The ones not obeying robots.txt or other limitations and doing what they want, but identified by their name.
2.) The really bad ones (probably botnets) eating bandwidth and not stating any name.
 
Agreed, I think it's #2 that is causing the problems.
 
A DDoS bot hitting a website behind Cloudflare (not configured under the DDoS option) is hell.

Rate limiting via nginx (with the realip module) can stop this. Not a firewall, not filtering by user agent.

Because they make something like 30-50 requests per IP and fake the user agent to look like a normal request.
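For anyone wanting to try that route, here is a minimal sketch of the idea, not the poster's actual configuration: the Cloudflare IP ranges are abbreviated, and the zone name and the 30 requests per minute threshold are assumptions to tune for your own traffic.
Code:
# Sketch of an nginx.conf fragment (events {} and other boilerplate omitted)
http {
    # ngx_http_realip_module: trust Cloudflare's proxies and take the client IP
    # from the CF-Connecting-IP header, so the rate limit keys on the real visitor.
    set_real_ip_from 173.245.48.0/20;
    set_real_ip_from 103.21.244.0/22;
    # ... add the rest of Cloudflare's published IP ranges here ...
    real_ip_header CF-Connecting-IP;

    # ngx_http_limit_req_module: one shared zone keyed on the (restored) client address.
    limit_req_zone $binary_remote_addr zone=perip:10m rate=30r/m;

    server {
        listen 80;

        location / {
            # Allow a short burst, reject the rest with 429 Too Many Requests.
            limit_req zone=perip burst=20 nodelay;
            limit_req_status 429;
            # proxy_pass / root / fastcgi_pass etc. go here as usual
        }
    }
}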
 
It seems like everyone going to Cloudflare as a solution is some unforeseen disaster waiting to happen, but it does seem like there is no choice.
 