[HOWTO] Limit DirectAdmin users' max processes on Linux

scsi

This guide shows how to set a limit on the number of processes DirectAdmin users can run at a single time. It uses /etc/security/limits.conf.

This example limits all DirectAdmin users to a maximum of 20 processes.

If you want to use a different limit, change the 20 in user_create_post.sh to a different number.
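For reference, each user ends up as one line in /etc/security/limits.conf. For a hypothetical user named "joesmith", the appended entry would look like this:

Code:
joesmith hard nproc 20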

Hopefully it's useful to someone. I had to create this because of some abusive users.

1. Create the following file if it doesn't already exist: /usr/local/directadmin/scripts/custom/user_create_post.sh

Insert the following code:

Code:
#!/bin/sh

# DirectAdmin exports $username to the custom user_*_post.sh hooks.
# Append a hard limit of 20 processes for the new user to
# /etc/security/limits.conf.
echo "$username hard nproc 20" >> /etc/security/limits.conf

Save the file and run the following commands:

Code:
chown diradmin:diradmin /usr/local/directadmin/scripts/custom/user_create_post.sh
chmod 750 /usr/local/directadmin/scripts/custom/user_create_post.sh

2. Create the following file if it doesn't already exist: /usr/local/directadmin/scripts/custom/user_destroy_post.sh

Insert the following code:

Code:
#!/bin/sh

# Remove the user's entry from /etc/security/limits.conf
sed -i -e "s/${username}.*//g" /etc/security/limits.conf

Save the file and run the following commands:

Code:
chown diradmin:diradmin /usr/local/directadmin/scripts/custom/user_destroy_post.sh
chmod 750 /usr/local/directadmin/scripts/custom/user_destroy_post.sh
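
To confirm the limit actually applies, open a login session as a user created after the hooks were in place and check the per-user process cap ("joesmith" is a hypothetical account name; pam_limits applies limits.conf when the login session opens):

Code:
su - joesmith -c 'ulimit -u'
# should print 20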

That's it. Enjoy.
 
What happens when a user needs more processes at a certain moment?
Will the website stop working, or are there other risks?
 
Let me answer the question.

It mostly affects CRON and SHELL tasks, and you'll see something like this in the logs:

Code:
Mar  2 12:00:03 server crond[5268]: CRON (username) ERROR: failed to open PAM security session: Resource temporarily unavailable

Whether it will affect browsing your web site depends on your configuration. If you use mod_ruid2/suPHP and run a file-hosting site where downloads are served by a PHP script, then you would certainly hit the limit with 20 concurrent downloads from a single user account. In other (normal) situations, Apache processes live for so short a time that it's rather hard to get close to the limit.
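
If you want to see how close an account is to the cap at any given moment, counting its processes is straightforward ("joesmith" is a hypothetical account name):

Code:
ps --no-headers -u joesmith | wc -l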
 
Alex,

I think you're saying that it's not a good idea to use 20 as a limit with mod_ruid2. Am I right? If so, what might be a good limit? We've got a client who could use limits, so I'd like some suggestions.

Jeff
 
Jeff,

No, I'm not. I was pointing out how it would or would not affect Apache and browsing a single site. As a hosting company we nevertheless have the same limit of 20 processes per user, and it is stated in our TOS along with some other limits. Moreover, in our experience the limit stays invisible to almost all of our customers, and I can hardly remember any complaint about it.
 
Thanks, zEitEr, for the clarification. Do you use some method for early warning or notification? I suppose this must be the method (or similar to the method) used by companies that offer unlimited bandwidth or disk space but need some way to keep their clients from taking over the whole server.

Jeff
 
Do you use some method for early warning or notification?

Nothing of that kind, if I understand you correctly. But we do send warnings about disk space usage.

I suppose this must be the method (or similar to the method) used by companies that offer unlimited bandwidth or disk space but need some way to keep their clients from taking over the whole server.

I don't believe in "unlimited" resources; everything is limited in one sense or another. For example, while we don't count our customers' traffic and don't limit bandwidth directly, we do limit the number of concurrent connections to a VirtualHost from a single IP with various methods, and concurrent connections to a VirtualHost from different IPs either directly or with other limits. If you try to buy "unlimited" disk space from some company (not us, we do limit disk space), you might run into a limited number of disk inodes, or a limited size for a single file, and who knows what else you might meet there: a limited number of MySQL queries, limited CPU, RAM, etc.
 
Thanks, zEitEr, for the clarification. Do you use some method for early warning or notification? I suppose this must be the method (or similar to the method) used by companies that offer unlimited bandwidth or disk space but need some way to keep their clients from taking over the whole server.

Jeff
The only way I can think of to warn/notify the user would be to track the log file and count how many times the limit was exceeded, or to count each user's processes periodically from cron, something like the sketch below.
I don't think there's a Linux way of knowing someone has reached 80% of the limit (20 processes in this thread's example), since it's not even logged.

Am I right?
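
A rough sketch of such a cron check follows; the 80% threshold, reading usernames back out of limits.conf, and the use of logger(1) are all assumptions on my part, not a tested DirectAdmin feature:

Code:
#!/bin/sh
# Early-warning sketch: log a notice for any limited user running at
# or above 80% of the 20-process cap used in this thread.
LIMIT=20
WARN=$((LIMIT * 80 / 100))
awk '$2 == "hard" && $3 == "nproc" {print $1}' /etc/security/limits.conf |
while read -r u; do
    count=$(ps --no-headers -u "$u" 2>/dev/null | wc -l)
    [ "$count" -ge "$WARN" ] && logger "nproc warning: $u running $count processes"
done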
 
Though I've never thought of this, you seem to be right. I've never worried much about it, since users usually can't overuse it.
 
And with this you've limited all users on your server, including system users (apache, root, mail, etc.).
 
Yes, and that's what you said to do! And I removed what you told me to do...

I want to add all client users, NOT service users.

Anyway, I found how to just add the list of all users who are in /home.
 
Yes, and that's what you said to do! And I removed what you told me to do...

I want to add all client users, NOT service users.

Anyway, I found how to just add the list of all users who are in /home.

Do you mind sharing how you achieved this?
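
In case it helps anyone in the meantime, here's my guess at one way it could be done; purely a sketch, assuming every directory under /home belongs to a DirectAdmin client account:

Code:
#!/bin/sh
# Sketch: append a limits.conf entry for each existing /home account,
# skipping any name that already has one.
for dir in /home/*/; do
    name=$(basename "$dir")
    grep -q "^${name} " /etc/security/limits.conf ||
        echo "${name} hard nproc 20" >> /etc/security/limits.conf
done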
 
I think this method is for PHP CGI or mod_ruid2, not for CLI (run by the apache user). Am I right?
 
Hi,

Thanks for this.

I was using this and found the destroy script left a blank line, which over time would make a messy file as lots of users were added and removed (just in the interest of keeping a tidy file).

Also I was advised the current remove script would remove "joesmith2" when removing "joesmith".

So we updated /usr/local/directadmin/scripts/custom/user_destroy_post.sh to:

Code:
sed -i "/^${username} /d" /etc/security/limits.conf

I've tested it and it seems to work well.
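
The leading ^ anchor and the trailing space in the pattern are what keep "joesmith2" intact when removing "joesmith". A throwaway demo on a temp copy (hypothetical names):

Code:
printf 'joesmith hard nproc 20\njoesmith2 hard nproc 20\n' > /tmp/limits.test
username=joesmith
sed -i "/^${username} /d" /tmp/limits.test
cat /tmp/limits.test   # only the joesmith2 line remains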

Thanks again.
 
@scsi,

I just revisited this today because of a link to it in this recent thread. Did you set the limit of 20 processes arbitrarily? Does that limit continue to work for you, or have you adjusted it?

Thanks.

Jeff
 