Moving to a new server

I've pretty much given up on making the DA server I've got in colo ever properly manage email again, and I'm going to build another one tomorrow.

Does anyone have a good working script or set of scripts for moving all sites between two DirectAdmin servers?

Jeff
 
I need a tool like this too, since I'm moving clients from one machine to another (DA) machine.

I saw this:

http://www.directadmin.com/upcoming.html

And it says "upcoming"...but so does FreeBSD, and that's available now...so maybe this tool is too?

If not, I'm going to have to write something myself. It shouldn't be hard.
 
I don't think it's ready yet, at least not according to the reply I got when I emailed DA support :(.

Yes, I think one of us should write one ourselves.

Jeff
 
Easy task, very easy; the problem is keeping privileges for individual accounts... That's where the problems begin when writing a script: without going into a long process, it's not easily done. Or at least as far as I know it's not.

Chris
 
It's actually really easy and permissions aren't an issue so long as the accounts are recreated on the new server with the same usernames. User data/DA configs would be located in the same spots on each machine, so all you have to do is tar and ftp them over.

I'm getting another machine this month, I'll write the script and give it out when it's done.
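
For anyone who wants to try it before that script exists, here's a minimal sketch of the approach. It assumes the usual DirectAdmin layout of /home/<username> for user data and /usr/local/directadmin/data/users/<username> for the panel's per-user config; check the paths on your own install before running anything.

Code:
#!/bin/sh
# Sketch: tar up each user's home directory and DA config so the
# tarballs can be moved to the new server. Paths are assumptions
# based on a stock DirectAdmin install -- verify them first.
BACKUPDIR=/home/admin/user_backups
mkdir -p "$BACKUPDIR"
for user in $(ls /usr/local/directadmin/data/users); do
    tar -czf "$BACKUPDIR/$user.tar.gz" \
        -C / "home/$user" "usr/local/directadmin/data/users/$user"
done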
 
loopforever said:
It's actually really easy and permissions aren't an issue so long as the accounts are recreated on the new server with the same usernames. User data/DA configs would be located in the same spots on each machine, so all you have to do is tar and ftp them over.

I'm getting another machine this month, I'll write the script and give it out when it's done.

FTP from your local machine to your server is often very slow; tarring a directory, wgetting it, then unpacking it is 100X quicker. You just have to then chown all the folders as well as ensure the chmod remains what it was originally.

Even that way would be far quicker than FTP, and would probably be easier to do using a shell script... it's really figuring out how to get the script to know what name to chown a directory as.

Chris
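
If you do go the download-then-fix-ownership route, the chown step described above might look something like this sketch. It assumes the tarballs are named <username>.tar.gz and unpack back under /home, which is purely a naming convention for the example; adjust it to match how the archives were actually made.

Code:
#!/bin/sh
# Sketch of the download-then-fix-ownership approach described above.
# Assumes tarballs named <username>.tar.gz that unpack under /home --
# adjust names and paths to match how the archives were created.
for tarball in /tmp/*.tar.gz; do
    user=$(basename "$tarball" .tar.gz)
    tar -xzpf "$tarball" -C /                # -p keeps the original chmod bits
    chown -R "$user":"$user" "/home/$user"   # re-apply ownership explicitly
done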
 
ProWebUK said:
FTP from your local machine to your server is often very slow; tarring a directory, wgetting it, then unpacking it is 100X quicker. You just have to then chown all the folders as well as ensure the chmod remains what it was originally.

Even that way would be far quicker than FTP, and would probably be easier to do using a shell script... it's really figuring out how to get the script to know what name to chown a directory as.

Chris

I suggested tarring the user's files, the reason being that the -p flag will preserve the permissions of all the tarred files. This is critical. Once the tarball is made, FTP or SFTP would be the best option; I wouldn't suggest wget. With wget you'd have to make the tarballs publicly available via the web, which is a waste of time and not very safe, since anyone could download the files. FTP can be done directly via the command line and it would be very quick, especially since the FTP protocol is faster than anything Apache can pump out through port 80.
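
As a rough illustration of the non-web transfer: where SSH is available, something like the following would do it (scp shown here as the SFTP-family option; a scripted command-line FTP session would be the plain-FTP equivalent). The hostname and username are placeholders. Note that tar records permissions and ownership when the archive is made; the -p flag matters on the extract side.

Code:
#!/bin/sh
# Sketch: create one user's tarball and push it straight to the new box.
# Hostname and username below are placeholders.
user=someuser
tar -czf "/tmp/$user.tar.gz" -C / "home/$user"
scp "/tmp/$user.tar.gz" root@newserver.example.com:/tmp/
rm -f "/tmp/$user.tar.gz"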
 
I mentioned placing password protection on the directory further up in the thread, so it acted like the DirectAdmin upgrade/setup system with the information in the URL. I guess either way the FTP would be better... What's the resource usage like, though? I'm not sure how the Ensim backup works, but I'm sure I heard rumours it uses up 80% of your resources when running a full server backup... quite shocking figures, I think. I always preferred spreading large backups over time anyway, with other control panels anyway :)

Chris
 
Yeah, the biggest issue here is the resources consumed tarring the files, but this won't be much of an issue, since I can set my script to schedule when the transfer takes place. Of course, if we want to do it manually we can, but it would be wise to do it at times when server access is not heavy.
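
For the scheduling part, a cron entry along these lines would run the transfer in the quiet hours; the script path and time are just examples.

Code:
# Added to root's crontab (crontab -e): run the hypothetical transfer
# script at 3:30 am, when server access is light.
30 3 * * * /usr/local/sbin/transfer_users.sh >> /var/log/transfer_users.log 2>&1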
 
One of the disadvantages of the Plesk backup utility we've come across as a Plesk Gold Partner is that it shuts down Plesk (and all internet access, including mail and web access) while it runs.

We've got one client who runs his backup every night; he starts it at 11 pm PST (-0800), and that time seems to work well for him.

Jeff
 
loopforever said:
I suggested tarring the user's files, the reason being that the -p flag will preserve the permissions of all the tarred files.

You still get the chown problem if the file is untarred as root though, I assume?

The man page description simply says:

-p, --same-permissions, --preserve-permissions
extract all protection information

Which, for some reason, appears to me to cover chmod only... or am I wrong in thinking that?

Chris
 
Untar as root; the files will have the same permissions as they did on the server they were tarred on.

May I suggest that no one implement something as important as backups based on someone else's say-so; please try it yourself first on another machine, just to see if it works.

The small amount you'll pay for a temporary license for one month for another machine will be well worth it.

Jeff
 
jlasman said:
Untar as root; the files will have the same permissions as they did on the server they were tarred on.

Jeff hit the nail on the head. You won't lose ownership of files as long as root tars and untars them. But, -p is still critical, otherwise your permissions will get fscked up :p.
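
Putting those two points together, the extract side would look roughly like this, run as root on the new server: -p restores the stored permission bits, and GNU tar restores the stored owner/group by default when the superuser extracts.

Code:
#!/bin/sh
# Sketch of the extract step, run as root on the destination server.
# -p preserves the stored permission bits; running as root means the
# stored owner/group are restored as well (GNU tar's default for root).
# Assumes the matching usernames have already been created on this box.
tar -xzpf /tmp/someuser.tar.gz -C /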
 
loopforever said:
But, -p is still critical, otherwise your permissions will get fscked up :p.

I figured the -p would hold the chmod permissions, but as you stated, 'ownership' isn't always what you would call permissions, which is why I was asking. Usually when root puts a file somewhere, it gets root ownership.

Chris
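
For what it's worth, you can check what an archive actually stores before extracting it; a verbose listing shows both the mode bits and the owner/group, which is a quick way to settle the chmod-vs-chown question on a test tarball.

Code:
# List archive contents with permissions and owner/group (output illustrative):
tar -tzvf someuser.tar.gz | head
# drwx--x--x someuser/someuser  0 2003-06-16 10:00 home/someuser/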
 