patrickkasie
Verified User
Dear DirectAdmin forum,
I have been trying to set up a default robots.txt for all domains without overriding any site-specific robots.txt files that already exist on the websites.
Here's what I have:
vpsxx.domain0.nl
domain1.nl
domain2.com
domain3.be
They all get a default robots.txt, which is configured in /etc/httpd/conf/extra/httpd-alias.conf:
Code:
(...)
Alias /robots.txt /var/www/html/robots.txt
This gives every single domain on the server a default robots.txt. However, when domain1.nl wants to have its own unique robots.txt, the Alias takes precedence and the default robots.txt is served instead of the domain's own file.
My question is: how do I make the default robots.txt apply only to domains that don't have their own, so that a site-specific file such as domain1.nl/robots.txt wins over the default?
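One direction I have been considering (a sketch only, assuming mod_rewrite and mod_alias are loaded; the /default-robots alias path is just a name I made up) is to replace the unconditional Alias with a rewrite that first checks whether the requested domain's document root already contains a robots.txt, and only falls back to the server-wide default when it doesn't:

```apache
# In the global server config (e.g. replacing the current Alias line).
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Only rewrite if the vhost's own document root has no robots.txt.
    RewriteCond %{DOCUMENT_ROOT}/robots.txt !-f
    # [PT] passes the rewritten URL back so the Alias below can map it.
    RewriteRule ^/robots\.txt$ /default-robots/robots.txt [PT,L]
</IfModule>

# Maps the internal fallback path to the server-wide default file.
Alias /default-robots/robots.txt /var/www/html/robots.txt
```

Whether this is the right place for the rules on a DirectAdmin-managed httpd.conf layout I am not sure; I would welcome corrections.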