Block bad robots with ModSecurity and CSF in DirectAdmin

xerox

Hello,

I'm experiencing issues with some bad robots, especially Yandex bots, which are making DoS-like requests to the server.

I tried to set up ModSecurity to block the robots, but Apache gives me an error after restarting:

Code:
Starting The Apache HTTP Server...
AH00526: Syntax error on line 1 of /usr/local/directadmin/custombuild/custom/modsecurity/conf/block_user_agents.conf:
ModSecurity: Found another rule with the same id
httpd.service: main process exited, code=exited, status=1/FAILURE
kill: cannot find process ""
httpd.service: control process exited, code=exited status=1
Failed to start The Apache HTTP Server.
Unit httpd.service entered failed state.
httpd.service failed.

I am not sure where the issue is. Any help is much appreciated! :)
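
For reference, the same syntax check can be run without restarting the service, which makes it easier to iterate on the conf file:
Code:
httpd -t
# or: apachectl configtest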

Here are the instructions for the setup...

First, create the new directories and file for the custom rules:
Code:
mkdir -p /usr/local/directadmin/custombuild/custom/modsecurity/conf
cd /usr/local/directadmin/custombuild/custom/modsecurity/conf
nano block_user_agents.conf

Then put this rule into the conf file (deny and status:403 make the rule actually block the request instead of only logging it, which is what the msg claims):
Code:
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile badbots.txt" "id:350001,phase:1,rev:1,severity:2,log,deny,status:403,msg:'BAD BOT - Detected and Blocked'"

Now, create a new file called badbots.txt in the same directory:
Code:
nano badbots.txt

Let's put some bad bots inside, one pattern per line. Note that @pm matching is a case-insensitive substring match, so each bot only needs to appear once, in one casing:
Code:
AhrefsBot
Anonymizer
Attributor
Baidu
Bandit
BatchFTP
Bigfoot
Black.Hole
Bork-edition
DataCha0s
Deepnet Explorer
desktopsmiley
DigExt
feedfinder
gamingharbor
heritrix
ia_archiver
Indy Library
Jakarta
Java
juicyaccess
larbin
linkdex
Missigua
MRSPUTNIK
Nutch
panscient
plaNETWORK
Snapbot
Sogou
TinEye
TwengaBot
Twitturly
User-Agent
Viewzi
WebCapture
XX
Yandex
YebolBot
MJ12bot
masscan
RSSingBot
Scanbot
betaBot
DotBot
SemrushBot
FeedFetcher
seoscanners.net
Moreover
ltx71
inboundlinks.win
sitebot

The files only need to be readable by the web server; they do not need to be executable:
Code:
chmod 644 block_user_agents.conf badbots.txt

Before restarting the server, include the custom conf in httpd:
Code:
nano /etc/httpd/conf/extra/httpd-includes.conf

Put this line inside the file:
Code:
Include /usr/local/directadmin/custombuild/custom/modsecurity/conf/*.conf

Finally, rewrite the configs, which will also restart Apache:
Code:
cd /usr/local/directadmin/custombuild
./build rewrite_confs
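
Once Apache is back up, a quick way to verify the rule is to request a page with one of the listed user agents (yourdomain.example is a placeholder for a domain hosted on the server); a 403 response means the block works:
Code:
curl -I -A "AhrefsBot" http://yourdomain.example/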
 
I found the issue: the file had already been copied to /etc/modsecurity.d/block_user_agents.conf, so the same rule id was loaded twice, which caused the error on restart.
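
For anyone hitting the same error, a quick grep shows every file that defines the duplicated rule id:
Code:
grep -rn 'id:350001' /etc/modsecurity.d /usr/local/directadmin/custombuild/custom/modsecurity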

I am now facing another issue with this setup.

ModSecurity is now logging all the bad bots to /var/log/httpd/modsec_audit.log, but CSF will not permanently block the IPs that have caused 403 errors.

Current setup in csf.conf:
Code:
LF_MODSEC = 5
LF_MODSEC_PERM = 1
LF_APACHE_403 = 200
LF_APACHE_403_PERM = 1
MODSEC_LOG = /var/log/httpd/error_log

I think I have to change the MODSEC_LOG path to /var/log/httpd/modsec_audit.log, or is it something else?

If anyone has experience in this area, let me know.

Thanks.
 
UPDATE: I switched to the Comodo rules/plugin, which has an option to block user agents, but CSF still does not block anything from the Comodo triggers.

If anyone has had the same experience, please let me know.
 
Hello,

Does it make any difference if you change MODSEC_LOG to /var/log/httpd/modsec_audit.log?

Do you see anything related to blocked bots in /var/log/httpd/modsec_audit.log?

P.S. I've never used mod_sec + csf to ban bots, so I'm not even sure it will work, but still...
 
UPDATE: I switched to the Comodo rules/plugin, which has an option to block user agents, but CSF still does not block anything from the Comodo triggers.

If anyone has had the same experience, please let me know.

Did you have any luck with this? I'm having the same issue: nothing gets blocked, even though the Comodo rules have been set up with the user agent.
 
You need to include the error logs from your domains as well. If you don't use Nginx, replace nginx with httpd in the line below.
Code:
MODSEC_LOG = "/var/log/nginx/error_log /var/log/nginx/domains/*.error.log"
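
After editing MODSEC_LOG in /etc/csf/csf.conf, restart csf and lfd so the new log paths are picked up:
Code:
csf -ra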

If that does not work, you need to write a custom regex for CSF; see the sketch below.
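
Custom regexes live in /usr/local/csf/bin/regex.custom.pm. Below is a rough sketch, not a tested rule: the rule id (1100000) comes from the ModSec rule later in this thread, while the identifier, trigger count, ports, and 3600-second temp block are placeholder choices, and the return-value format follows the commented examples shipped inside that file, so verify it against your copy:
Code:
# Add inside sub custom_line in /usr/local/csf/bin/regex.custom.pm.
# Matches an Apache 2.4 error_log line like:
#   [...] [client 1.2.3.4:56789] ModSecurity: Access denied with code 406 ... [id "1100000"] ...
# IPv4 only. Return values: (text, ip, identifier, trigger count, ports, temp block seconds).
if ($line =~ /\[client (\d+\.\d+\.\d+\.\d+)[:\]].*ModSecurity: Access denied.*\[id "1100000"\]/) {
    return ("ModSecurity bad bot",$1,"badbots","5","80,443","3600");
}

Note that lfd only feeds custom_line lines from logs it watches, so make sure the error log(s) are listed in csf.conf (for example via one of the CUSTOM1_LOG..CUSTOM9_LOG settings).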
 
I know it's an old thread, just adding info in case anyone else comes looking for this. I followed this KB and it's working: no errors, and the ModSecurity link under Admin -> Server is showing bad bots being blocked.

 
Just a follow-up on this: as there are a number of bad bots crawling around, domain.com.error.log is filling up with bad-bot block messages. Is there a way to write the ModSecurity logs to a different location, or to edit this ModSecurity rule so that it's not logging?
Not logging at all could be an issue if, in the future, a bot cannot crawl a site and you're not sure why...

Code:
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile bad_bot_list.txt" "phase:2,t:none,t:lowercase,log,deny,severity:2,status:406,id:1100000,msg:'Custom WAF Rules: WEB CRAWLER/BAD BOT'"

Thanks
 
Just a follow-up on this: as there are a number of bad bots crawling around, domain.com.error.log is filling up with bad-bot block messages. Is there a way to write the ModSecurity logs to a different location, or to edit this ModSecurity rule so that it's not logging?
Not logging at all could be an issue if, in the future, a bot cannot crawl a site and you're not sure why...

Code:
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile bad_bot_list.txt" "phase:2,t:none,t:lowercase,log,deny,severity:2,status:406,id:1100000,msg:'Custom WAF Rules: WEB CRAWLER/BAD BOT'"

Thanks

You could try changing the ModSec rule from log to nolog, which should prevent rule matches from appearing in both the error and audit logs.

Code:
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile bad_bot_list.txt" "phase:2,t:none,t:lowercase,nolog,deny,severity:2,status:406,id:1100000,msg:'Custom WAF Rules: WEB CRAWLER/BAD BOT'"
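
If you want to keep an audit trail without filling the domain error logs, another option is nolog,auditlog: nolog suppresses the error-log message (and normally the audit entry too), while the explicit auditlog turns the audit entry back on, so blocks should be recorded only in modsec_audit.log:
Code:
SecRule REQUEST_HEADERS:User-Agent "@pmFromFile bad_bot_list.txt" "phase:2,t:none,t:lowercase,nolog,auditlog,deny,severity:2,status:406,id:1100000,msg:'Custom WAF Rules: WEB CRAWLER/BAD BOT'"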
 