Block IP addresses which overload server

like create a new subdomain to test the filter.
I tested this server with another domain and had the same problem.
Tested on another server with another domain, and there it works as expected.
Another server, same thing.

But testing with a subdomain without .htaccess is a good idea. I will try that.
 
Well... it's not the .htaccess; even without a .htaccess they are still not blocked. Only on this server, it seems. Only 301s, on multiple domains.
They have not visited the subdomain yet.

I'm just wondering whether rules in the httpd-includes.conf will only ever give a 301. When testing on the other server I thought it worked, but there one domain had the rules in its .htaccess, and when they are in the .htaccess you get a 403.

@stefantriep Do you have it in the httpd-includes.conf or in the .htaccess?
 
@Richard G
301? Maybe it's from the http-to-https rewrite feature of DA?

And you don't need to wait; you can simulate a request using plain curl or PHP cURL.
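A minimal sketch of such a simulated request (the hostname and User-Agent below are placeholders, not from this thread); curl's -w '%{http_code}' prints only the status code, and the small helper interprets it:

```shell
#!/bin/sh
# Interpret the HTTP status code returned by a bot-blocking test.
check_block() {
  case "$1" in
    403)     echo "blocked" ;;
    301|302) echo "redirected (likely the http->https rewrite; retest over https)" ;;
    200)     echo "NOT blocked" ;;
    *)       echo "unexpected status: $1" ;;
  esac
}

# On a real server (example.com is a placeholder):
#   code=$(curl -s -o /dev/null -w '%{http_code}' \
#          -A 'Bytespider' 'https://example.com/xmlrpc.php')
#   check_block "$code"

check_block 403   # prints "blocked"
```

Testing over plain http will usually show the 301 from the https redirect before any blocking rule is reached, so test over https directly.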
 
Hi Richard;

1. I tested your rules in Apache 2.4.63 and 2.4.65; none of them blocked anything for me.
2. If you want to log rewrites from the .htaccess in the domain:
take a backup of the file first, then open /etc/httpd/conf/httpd.conf

find the line LogLevel

change it to LogLevel alert rewrite:trace8, save, and restart Apache.

Now you will see lines tagged [rewrite:trace8] in your domainname.error.log;
these are the logs from the .htaccess in the domain.
Check if you can see something there.
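For reference, the change in httpd.conf is just this one directive (trace8 is extremely verbose, so set LogLevel back to something like "LogLevel warn" once you are done debugging):

```apache
# Temporary, for debugging only: logs every mod_rewrite step
LogLevel alert rewrite:trace8
```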


3. Your rules start with <IfModule mod_rewrite.c>.

According to the docs, https://httpd.apache.org/docs/current/mod/mod_setenvif.html

BrowserMatchNoCase and SetEnvIfNoCase need mod_setenvif.

In my opinion the rules should start with
<IfModule mod_setenvif.c>
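So the wrapper would look something like this; the bad_bot rules inside are just the examples from earlier in the thread, and only the IfModule line is the suggested change:

```apache
<IfModule mod_setenvif.c>
    BrowserMatchNoCase "aiohttp" bad_bot
    SetEnvIfNoCase Request_URI "xmlrpc\.php" bad_bot
</IfModule>
```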



Also check with the apachectl -M command that the following modules are loaded:


setenvif_module (static)
rewrite_module (static)
env_module (static)
cache_module (static)
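A quick way to check just those modules; the sample output below is hardcoded for illustration, on a real box you would pipe the live apachectl -M output into the same grep:

```shell
#!/bin/sh
# On a real server:  apachectl -M | grep -E 'setenvif|rewrite'
# Hardcoded sample of apachectl -M output, for illustration only:
modules='setenvif_module (static)
rewrite_module (static)
env_module (static)
cache_module (static)'
# Prints the setenvif_module and rewrite_module lines if present:
printf '%s\n' "$modules" | grep -E 'setenvif|rewrite'
```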


Sorry about your issue; I hope you will find a solution for it.
I prefer to use ModSecurity.

By the way, SetEnvIfNoCase Request_URI "xlmrpc\.php" bad_bot is wrong: "xlmrpc" should be "xmlrpc".

Best Regards
 
Hello Hostmavi.

1.) Exactly: I tried everything and nothing blocks; the only thing that blocks is putting rules (like in post #39) in the .htaccess. But that's not server-wide. Thank you for confirming that they don't work for you either; I was already thinking I was going crazy.

2.) Thank you, but this was not really what I was looking for. I just wanted an option to block all these bots server-wide, not only for one domain. As for the redirect, that is most likely caused by this piece of required .htaccess from XenForo itself:
Code:
        RewriteCond %{REQUEST_FILENAME} -f [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]

        RewriteRule ^(data|js|styles|install) - [NC,L]
        RewriteRule ^.*$ index.php [NC,L]

3.) I've found various examples on the internet; several started with <IfModule mod_rewrite.c>, but I also tested with mod_setenvif.c in the httpd-includes.conf and that did not work either. I will be testing some more later tonight.

All mentioned Apache modules are loaded. If I'm correct, they are installed by DA by default.

By the way, SetEnvIfNoCase Request_URI "xlmrpc\.php" bad_bot is wrong: "xlmrpc" should be "xmlrpc".
I took that over from post #40. For regexp things and backslashes I'm a copy-and-paste guy. :)

I prefer to use ModSecurity.
I would like that, but when I first enabled it, it blocked way too much and I got customer complaints. Also, I've seen on the forums that several rules need to be adjusted to prevent this. I'm 61 and don't know regexp stuff, so I like to keep things easy for the last few years that I'm doing this, without my customers complaining. :)

It seems a lot of things do not work in the httpd-includes.conf even though they are correctly formatted. Makes me wonder if that file does anything at all. Some things are checked, though: try using a "deny from all" in there, LoL... Apache won't restart. So what -can- one use in there? It's not doing what the docs say it should.

For now I'll just use the .htaccess and see if I can find another server-wide option.
 
301 Maybe it from http to https rewrite from DA feature ?
Looks like it. I tested with the empty new subdomain, and when accessing via http I got the 301 in the log.
When accessing directly via https, I got a 200.

But it seems Hostmavi and I discovered that the complete httpd-includes.conf isn't doing anything for blocking at all.
I also used the example in #39, which works in .htaccess but does nothing in the httpd-includes.conf, so I wonder what can and cannot be used in there.

If you only need to add extra Apache config code to the system, but don't need to remove any existing code, the best way is to add your changes to /etc/httpd/conf/extra/httpd-includes.conf as this file will not be touched by CustomBuild or DirectAdmin.
With this explanation from the docs, one would expect things to work.
Which it does when used like in post #18, although that causes a security risk with the "require all granted", so I'm going to test this another way to see if I can fix it with httpd-includes like that.
 
I might want to try this. But how should this be done, in a VirtualHost or a Directory block, so that it's server-wide?

I listed the CUSTOM files that you can use for it in post #57

And depending on which of the two files you pick, directives from it will be added in the corresponding context. Choose the one that fits your needs; both will work. Just rewrite the configs after you add your directives to the files.

Related:

- https://docs.directadmin.com/webservices/apache/customizing.html
- https://docs.directadmin.com/webservices/apache/customizing.html#custom-httpd-templates-read-order
 
@Richard G
Hi, I didn't have time to test until now. The issue is just that you need to wrap it in a "Location /" block, otherwise any "RewriteRule", with conditions or not, won't work at all.

I just updated the fix in my reply #40.

The original of my rules has something like this:
Code:
<Location /phpMyAdmin>
    <IfModule mod_rewrite.c>

    </IfModule>
</Location>
 
directives from it will be added in the corresponding context.
Ah okay, so I don't need to put the VirtualHost things in there myself? Because that was the part I was concerned about; I'm not familiar with that.

@Hostmavi I will try that one later on too. Yesterday I tried one with <Location /> and then got an error notice that this was not allowed or something; I don't remember exactly, but I will try again later this evening, to prevent Apache going down again due to my testing. :)
 
Thank you! Both in /etc/httpd/conf/extra/httpd-includes.conf, and giving a 403 and not a 301?


Both of them give only a 403,
with and without your .htaccess:
Code:
# Redirect HTTP to HTTPS on the same host
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

# Redirect non-www to www (HTTPS only)
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
 
For me, my post #18.
But that still contains the security risk that when using it, .htaccess files with .htpasswd no longer ask for a password, like reported in post #37.

Unfortunately post #40 has the same issue. The result shows that the httpd-includes.conf is not only -adding- things but also -overruling- things; at least it overrules local .htaccess files.

So in fact the way in post #40 seems to work, but it overrules the .htaccess, so for example with XenForo, when checking new posts (and some other things), you will see the text 404.html or 404.shtml instead of the new post page.

To me that proves that the httpd-includes.conf is not usable for this purpose with just Apache, no matter which method.
 
@Richard G

Ok, looks like something surprised me, since my phpMyAdmin and Roundcube still work well... I did not expect this to happen.

Let's remove the "RewriteRule"s and use something basic.
Code:
BrowserMatchNoCase "aiohttp" bad_bot
BrowserMatchNoCase "zh_CN" bad_bot

SetEnvIfNoCase  Request_URI "\.env" bad_bot
SetEnvIfNoCase  Request_URI "xmlrpc\.php" bad_bot

<If "%{ENV:BAD_BOT} == 1">
    Require all denied
</If>

And just test by opening the web page "/a1/a2/a3/a4/xmlrpc.php".
It should return "403 Forbidden", whether the file exists or not.


This is only "denied", with no "granted" anywhere, so it should keep working with ".htpasswd".
 
Ok, looks like something surprised me, since my phpMyAdmin and Roundcube still work well... I did not expect this to happen.
There are 2 different things.
a.) When using the "require all granted" option, it overrules .htpasswd files for some reason.
b.) When using the rewrite option, it overrules the rewrite for (for example) XenForo in the .htaccess, causing problems.

Let remove "RewriteRules" and use basic thing.
You just suggested the one I wanted to try today, by only using the deny. But yours is better, I think.

I tested by visiting https://www.myforum.com/.env and got a 403. I couldn't test with xmlrpc.php as that is already forbidden via an Apache custom template. The .env got me a 403 Forbidden page.

Tested .htaccess with .htpasswd and that is also working well.

I haven't seen bots yet.
I'm going to fill it with the bots and then see if that also generates 403s. If yes, then we have the solution. I will report back.
 
YES!!! That looks like the solution, thank you very much!!

Code:
47.128.110.22 - - [15/Aug/2025:18:53:56 +0200] "GET /robots.txt HTTP/1.1" 403 3044 "-" "Mozilla/5.0 (Linux; Android 5.0) AppleWebKit/537.36 (KHTML, like Gecko) Mobile Safari/537.36 (compatible; Bytespider; [email protected])"
 