Solved: httpd + php-fpm74 take a lot of resources and site unavailable

cDGo

Hello,

As of this morning, on one of our servers the services httpd and php-fpm74 use 4 GB and over 6 GB of memory respectively.
I cannot seem to find where this is coming from.
I can see that the server's IP holds over 70 connections.
How do I start to find out why so many resources are in use?
Any other advice is welcome too.

The control panel itself works flawlessly.
phpMyAdmin is not reachable either.
If I restart both services, the website comes up again and is as fast as ever.
Within 15 seconds, php-fpm goes just over 4 GB and then httpd follows.

When I limit access to the site to my IP address only, everything stays fine.
 
It's a DDoS attack. Don't ask me how to solve the issue; search the old threads or hire someone to configure the firewall.
 
In the end, it was the php_fpm_max_children_default setting in directadmin.conf.
It was missing, so following this instruction I set it to the default value of 100.
That wasn't enough, so I raised it to 200.
Then the solution dawned on me: although the site was not as fast as normal, it stayed up!
Raising the value to 400 improved things further, but did not solve it entirely.
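For reference, the change itself is a single line in directadmin.conf followed by a config rewrite. This is roughly what I did; the paths and service name are the usual DirectAdmin ones, so double-check them on your own setup:

# add to /usr/local/directadmin/conf/directadmin.conf
php_fpm_max_children_default=200

# then rebuild the per-user PHP-FPM configs and restart the service
cd /usr/local/directadmin/custombuild
./build rewrite_confs
systemctl restart php-fpm74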

Now php-fpm74 uses 7.5 GB and httpd 1.45 GB.
But what bothers me is that the server's IP is now holding 330 connections.
What could cause this? I've never seen the server's own IP in that listing before.
 
I think 400 is way too much. Normally 100, and most certainly 200, should be enough.
We had a lot of scrapers from Google Cloud and Alibaba which were continuously causing high load, plus some bad bots.
Since they kept going, we decided to block Google Cloud completely for the time being.

You might have a look in your server-status and logs to check whether something similar is causing the issue in your case too.
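If it helps, a few quick checks for this (the log path is the typical DirectAdmin location; adjust the domain to your own):

# top memory consumers
ps aux --sort=-rss | head -15

# count established connections per remote IPv4 address
ss -tn | awk 'NR>1 {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head

# top requesting IPs in a domain's access log
awk '{print $1}' /var/log/httpd/domains/yourdomain.com.log | sort | uniq -c | sort -rn | head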

In CSF we allow a maximum of 300 connections with the CT_LIMIT setting; everything above that gets blocked by csf for a certain time.
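For reference, these are the csf.conf settings involved; the values below are only examples, not recommendations:

# maximum connections per source IP before a temporary block
CT_LIMIT = "300"
# how often, in seconds, the connection-tracking scan runs
CT_INTERVAL = "30"
# only track the web ports
CT_PORTS = "80,443"
# how long, in seconds, the temporary block lasts
CT_BLOCK_TIME = "1800"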
 
Hi Richard,

It was Alibaba Cloud on Thursday (the 29th) when all this started for me too.
Could you please explain how you blocked Google Cloud?

I did set some values like this:
CT_LIMIT = 150
CT_PORTS = 80,443
SYNFLOOD_RATE = 75/s
SYNFLOOD_BURST = 25

And also tried these:
SYNFLOOD_RATE = 30/s
SYNFLOOD_BURST = 30
CONNLIMIT = 80;20,443;15

But that did not solve it; then I found out that setting php_fpm_max_children_default got things running again (although slower).
I'm still trying to find out how to get this server back to its previous performance.
 
Could you please explain how you blocked Google Cloud?
Sure. I created a text file with all their IP ranges.
Then I put it in my public_html and added it to csf.blocklists to fetch, like this:
MYBLOCK|86400|0|https://www.mydomain.com/cloudblocklist.txt
Be aware that you need ipset to be installed and a bit of RAM to use this. Probably not a good idea for people on a VPS with 2 or 4 GB RAM, I guess.
We also use a DENY_IP_LIMIT of 15K.
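In case it's useful, this is roughly how such a list can be generated from Google's published ranges (needs curl and jq; the output path is just an example) and what the fields in that csf.blocklists line mean:

# write one prefix per line from Google Cloud's published ranges
curl -s https://www.gstatic.com/ipranges/cloud.json \
  | jq -r '.prefixes[] | .ipv4Prefix // .ipv6Prefix' \
  > /path/to/public_html/cloudblocklist.txt

# csf.blocklists fields: NAME|refresh interval in seconds|max IPs to load (0 = all)|URL
MYBLOCK|86400|0|https://www.mydomain.com/cloudblocklist.txt

# ipset support is enabled in csf.conf with
LF_IPSET = "1"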

However, those are a lot of IPs. We also have a Hetzner server, which has its own kind of firewall, so I added two of those ranges plus an Alibaba Cloud range in their firewall; that way the traffic is already blocked before it reaches our server.
Load calmed down instantly.

Next to that, I use Apache's httpd-includes.conf to block bad bots.
I set it up based on a post here:
The rest of that thread might also be interesting for you.
 
OK, thanks. This is all new to me, so I've dived into it already.

Up till now I've used the .htaccess file with "deny from" entries, and that list keeps growing, so it probably slows down traffic too.
I also use the 8G firewall and could always add problem IPs to the list, which worked just fine.
Although all of that was already in place before this incident.

Using the same approach this time, I just couldn't get it under control.
It is as if the attack or overuse was done differently.

So as a precaution for "next time", I will try your method too.

The first question is whether the external cloudblocklist.txt method is as fast and puts the same load on the server as the .htaccess method.
In the .htaccess there are also IP ranges like 213.230.0.0/16; can those be added to the cloudblocklist too?
Most likely they can, but the one example I could find of such a list only had full IPs listed.

ipset is installed, but do I need to do anything further?
In my csf.blocklists I uncommented ABUSEIPDB (with my key), but I'm not sure whether that actually works, kind of like your first response in the thread you mentioned.

You also shared your robots.txt, but isn't that a gentleman's agreement method, while today's bots simply ignore it?
That's why I used the 8G firewall in my .htaccess, which worked very well for me when I first tested it.

In the same thread you ask for an explanation of how to add the bad bots line by line, which is then shown in the improved example from Ohm J.
Actually, those are only a few of the bots that are used in the 8G firewall.
So the same question pops up: will httpd-includes.conf give better performance than .htaccess (per site)?
I think that if you host (significantly) more than one site on a server, it probably will.
 
The first question is whether the external cloudblocklist.txt method is as fast and puts the same load on the server as the .htaccess method.
I think it's faster, as the IPs don't reach the webserver anymore, if I'm correct. They are already refused at the firewall, so it's an early refusal.
I'm not 100% sure, though.

As for the cloudblocklist.txt, you can add single IPs or CIDR ranges, one line per IP or per range, so the same as in the .htaccess file.
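So a list mixing both forms just looks like this (the addresses below are made-up examples):

# one entry per line, single IPs and CIDR ranges both work
203.0.113.45
213.230.0.0/16
198.51.100.0/24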

I also use ABUSEIPDB with my key, both to report and to use their blocklist. At the moment I've limited that one to 10K IPs.
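For reference, the ABUSEIPDB entry in csf.blocklists looks roughly like the stock template below once uncommented; the third field is what caps it at 10K IPs (replace the key placeholder with your own):

ABUSEIPDB|86400|10000|https://api.abuseipdb.com/api/v2/blacklist?key=YOUR_API_KEY&plaintext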

httpd-includes.conf is a different system than a .htaccess file. I don't know about performance; probably about the same, I guess, but again I'm not sure.
However, with .htaccess you have to look up the IP of the bot and block it. With httpd-includes that is not necessary, and if IPs change for whatever reason the bots will still be blocked, while in .htaccess you would have to change the IP.
My complete list is in the first post of that thread. You just have to convert the lines to the format used in the httpd-includes.conf example.
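As a rough idea of what a converted entry can look like in httpd-includes.conf (the bot names and the /home path are just placeholders, and depending on how your virtual hosts merge their own Require rules it may need adapting):

# tag requests whose User-Agent matches any of these patterns, then refuse them
SetEnvIfNoCase User-Agent "(SemrushBot|AhrefsBot|MJ12bot|DotBot)" bad_bot
<Directory "/home">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Directory>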

So in my case I limit .htaccess blocks to bots which don't identify themselves as a bot; those can't really be blocked by httpd-includes anyway.

We do not host a lot of sites per server; there are around 70 customers, I guess, and more domains than that on the server where the attacks are currently heavy.
 
Hi Richard, thank you again for sharing the information.
As we're up and running again now, I'll do the testing later tonight.

Have you looked at the 8G firewall? They have a lot of bad bots in their listing, which could also be added to httpd-includes.conf.
I will update here with my findings.
 
Have you looked at the 8G firewall?
Oh sorry, no I haven't. Because I spoke of RAM before, I thought you meant you had 8 GB for the firewall, lol... wrong interpretation on my behalf.
I will have a look at it later this evening and see what I can add. Thanks!
 
OK, so I've added the 8G firewall entries into httpd-includes.conf and ran into one problem.
The first section starts with this:
# 8G:[CORE]
ServerSignature Off
Options -Indexes
RewriteEngine On
RewriteBase /

When I restarted the httpd service, it was broken, so I removed this part and it started working again.
The next thing I did was add RewriteEngine On in each section of the 8G firewall rule set.
I just do not know whether that was necessary or not (see the sketch below).
After 30 minutes it seems like the memory usage of httpd, mysql and php-fpm74 is slightly lower than just before this change.
I will monitor this tomorrow when the server is more active.
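My guess (not verified) is that RewriteBase was the culprit: it is only valid in per-directory context such as .htaccess or a <Directory> block, not in a server-level include like httpd-includes.conf. Something along these lines keeps the rest of the CORE section; the /home path is just an assumption about where the docroots live:

# 8G:[CORE], adapted for a server-level include; RewriteBase dropped because
# it is only allowed in per-directory context
ServerSignature Off
<Directory "/home">
    Options -Indexes
    RewriteEngine On
</Directory>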

I've reduced the value of php_fpm_max_children_default from 400 to 200 for now.

Now the next part would be the https://www.mydomain.com/cloudblocklist.txt.
My question with this is that the link points to a specific user's domain and not to a global setting on the webserver.
Is that intentional, or is it also possible to do this globally/server-wide?
 
Now the next part would be the https://www.mydomain.com/cloudblocklist.txt.
My question with this is that the link points to a specific user's domain and not to a global setting on the webserver.
Is that intentional, or is it also possible to do this globally/server-wide?
As far as I know it doesn't matter where the blocklist is located. We have it under our hostname's public dir. As the list gets imported into CSF, CSF blocks server-wide.
 
Is that intentional, or is it also possible to do this globally/server-wide?
It does work globally, because that is not a user domain; I'm using my company domain for it. :)
It's not used by Apache but by the firewall, so it does not matter which domain you put it under; it just needs to be somewhere CSF can fetch it.
 