dolphi
Verified User
<EDIT>
It should be a main customer OS user plus a per-domain OS user. For example, if a customer has 3 domains, he will have 4 OS users: username, domain1-www, domain2-www, domain3-www. This will also isolate the 3 different web sites that the customer has.
</EDIT>
By default DirectAdmin installs PHP with mod_php and mod_ruid2.
With this configuration the public_html folder is owned by the account's user (let's call it username):
FTP access uses that user, static files are served as that user,
and PHP also runs as that user.
So we have a situation where the PHP scripts of the web site run as the
owner of the files, and we have no way to prevent a malicious script from
writing to the customer's public_html folder. We could set file permissions to 0400, but then
our customer would not be able to modify files, and a malicious script could simply change the file permissions back and then corrupt them.
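A quick sketch of why 0400 does not help (the filename here is made up for illustration): the owner of a file can always chmod it back, and the malicious script runs as that very owner.

```shell
# Demo: a script running as the file's owner can undo restrictive modes.
cd "$(mktemp -d)"
echo '<?php /* site code */' > index.php
chmod 0400 index.php             # "protect" the file: owner read-only
stat -c '%a' index.php           # 400
chmod 0644 index.php             # the malicious script simply reverts it
stat -c '%a' index.php           # 644, writable again
```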
In a different setup, PHP is executed as the user apache or nobody. This
has a flaw: when someone uploads a file via the web site, the owner of that
file is nobody/apache, and the customer using FTP (impersonated as username) cannot delete these files (unless
chmod has been used to fix the permissions immediately after the web upload).
The first problem (that all these services: FTP, PHP and Apache use the user that owns the files)
can be solved by creating two unix users for each customer,
for example the two following users:
username
and
username-www
The first one (username) is the owner of the files, and it is the customer's main account.
The second one (username-www) is used by Apache and PHP (in the mod_ruid2 setup).
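A minimal sketch of creating such a pair (the exact useradd flags and nologin shell path vary by distribution, and DirectAdmin normally creates username itself; run as root):

```shell
# Hypothetical: username already exists as the DirectAdmin account.
# Create the web user with no home of its own and no login shell.
useradd -M -d /home/username -s /bin/false username-www
id username-www   # verify the user was created
```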
We will then set the files in the customer's account with these permissions:
owner and group of the files will be username:username-www.
(This can be done with the following command:)
Code:
chown -R username:username-www public_html
File permissions will be 0750 for folders and 0640 for files:
Code:
find public_html -type d -exec chmod 0750 {} \;
find public_html -type f -exec chmod 0640 {} \;
For WordPress we might also add write permissions for the uploads folder:
Code:
find public_html/wp-content/uploads -type d -exec chmod 0770 {} \;
find public_html/wp-content/uploads -type f -exec chmod 0660 {} \;
(In the above commands we assume that the current folder is the one directly above public_html.)
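The effect of the commands above can be checked with stat; a self-contained sketch using a throwaway tree:

```shell
# Build a tiny mock public_html tree and apply the scheme above.
cd "$(mktemp -d)"
mkdir -p public_html/wp-content/uploads
touch public_html/index.php public_html/wp-content/uploads/img.jpg

find public_html -type d -exec chmod 0750 {} \;
find public_html -type f -exec chmod 0640 {} \;
find public_html/wp-content/uploads -type d -exec chmod 0770 {} \;
find public_html/wp-content/uploads -type f -exec chmod 0660 {} \;

stat -c '%a %n' public_html                             # 750
stat -c '%a %n' public_html/index.php                   # 640
stat -c '%a %n' public_html/wp-content/uploads          # 770
stat -c '%a %n' public_html/wp-content/uploads/img.jpg  # 660
```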
And to harden the setup for WordPress, we will add the following in DirectAdmin's Custom HTTPD configuration for the customer's domain:
Code:
<Directory "|DOCROOT|/wp-content/uploads">
AllowOverride None
<FilesMatch "\.(inc|php|phtml|phps|php56|php70)$">
Order Deny,Allow
Deny from All
</FilesMatch>
</Directory>
The custom httpd configuration above prevents PHP files from executing in the WordPress
uploads folder. (We could do that with an .htaccess file residing in the uploads folder, but then we have a flaw: a malicious script can delete that file.)
Now PHP scripts can upload files only to the uploads folder, and if someone finds a way to upload a PHP script to that folder, he will not be
able to execute it.
But we have a problem with this setup. If a customer uploads an image to the wp-content/uploads folder, the owner of that file
is username-www and not username, so our customer will not be able to delete that uploaded file via FTP, SSH, or the file manager in DirectAdmin.
(He might be able to use the "reset owner" button in DirectAdmin's file manager on the uploaded file.)
We can solve this ownership problem (username-www instead of username owning web uploads) using ACLs.
Execute the command:
Code:
find public_html -type d -exec setfacl -m d:u:username:rwX,d:g:username-www:r-X {} \;
The above command does something marvelous: every file or folder created under public_html
(no matter by whom: root, username, or username-www) gets an additional ACL entry
granting the user username rw permissions and the group username-www read permissions.
Now, with the two-user setup (username as the main account and username-www for PHP and Apache),
files uploaded via the web will be owned by username-www, but username will also
be able to delete or edit them.
We also need to change the default httpd.conf so that httpd processes impersonate username-www.
In custom/virtual-host2.conf change the mod_ruid2 configuration
so that the user running the web site is username-www instead of username:
Code:
<IfModule mod_ruid2.c>
RMode config
RUidGid |USERWEB| |GROUPWEB|
#RGroups apache |SECURE_ACCESS_GROUP|
RGroups @none
</IfModule>
At the beginning of the file you can add definitions for USERWEB and GROUPWEB so that
they resolve to username-www (you must also create the username-www user).
For example, add at the beginning of custom/virtual-host2.conf:
Code:
|?USERWEB=`USER`-www|
|?GROUPWEB=`GROUP`-www|
and then run the following in custombuild (but you probably don't want to do that unless you have added the -www users and fixed the file permissions for all accounts!):
Code:
./build rewrite_confs
Another issue: by default, the /home/username folder has owner and group
username:access (owner username, group access)
and permissions of 0750.
That creates a problem: PHP scripts in the web site will not be able to reach public_html,
since they run as user username-www and group username-www, which have no x permission on /home/username.
To solve that problem we can again use ACLs. Run the command:
Code:
setfacl -m u:username-www:--x /home/username
That command grants execute (traverse) permission on the customer's home folder to username-www,
which will now be able to reach and run the site's PHP scripts.
Shouldn't the dual-user setup with ACLs as described here be the default for DirectAdmin?