Howto: Full DirectAdmin backup

prophecy

Verified User
Joined
Jul 8, 2003
Messages
205
ProWebUK said:

/etc/passwd and /etc/group may be an idea....... I will look into backing up full users.
That would be awesome if you could just transfer all the users along too, probably skipping the root user and all the users and groups that DA creates as well. I guess only the users for the sites themselves should be moved.
 

Icheb

Verified User
Joined
Sep 15, 2003
Messages
556
Location
The Netherlands
This system seems to be a bit better developed than the one I'm using at the moment, but I would really like to be able to transfer the files offsite using rsync, since my company's data traffic is limited to 7.5 GB/month at the moment.
At the moment I use rsync to first transfer all the needed files offsite and then use our intranet server to tar them, after which they go to a certain backup drive...

Or is the best solution just to use the current version and transfer the /home/backup/ dir?
I only need to make sure there isn't a <date> variable in the names, since that would force a full re-transfer of all files when using rsync.

(Yes, it's rsync over an SSH tunnel :D, so not the really insecure method :D)
 

ProWebUK

Verified User
Joined
Jun 9, 2003
Messages
2,326
Location
UK
With this backup script you get the option to:

- store data on the local server
- FTP to an external location
- SSH to an external location

:)

Chris
 

sander815

Verified User
Joined
Jul 29, 2003
Messages
474
When we add /home to the backup config it takes ages to finish, so we took it off the backup...
What could cause that? /home isn't that big, 15 GB I think.
 

ProWebUK

Verified User
Joined
Jun 9, 2003
Messages
2,326
Location
UK
sander815 said:
When we add /home to the backup config it takes ages to finish, so we took it off the backup...
What could cause that? /home isn't that big, 15 GB I think.
Give it a few minutes... the script may appear to die in a few areas, however it's more than likely working... you could try using tar -czf to tar it manually and see how long that takes... it's more than likely not instant :)
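For example, a rough way to time the same compression by hand (just a sketch; the output path is a placeholder, pick somewhere with enough free space):

time tar -czf /home/backup/home-test.tar.gz /home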

Chris
 

prophecy

Verified User
Joined
Jul 8, 2003
Messages
205
sander815 said:
When we add /home to the backup config it takes ages to finish, so we took it off the backup...
What could cause that? /home isn't that big, 15 GB I think.
It took about half an hour for mine on the home directory, but that is because it's doing two processor-intensive tasks:

1. Compressing
2. MD5 checksumming

Both are worth the wait.
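As a rough illustration of the checksum step (a sketch; the archive name is just a placeholder):

md5sum /home/backup/home.tar.gz > /home/backup/home.tar.gz.md5
md5sum -c /home/backup/home.tar.gz.md5   # verify later, e.g. after transferring it offsite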
 

prophecy

Verified User
Joined
Jul 8, 2003
Messages
205
Icheb said:
This system seems to be a bit better developed than the one I'm using at the moment, but I would really like to be able to transfer the files offsite using rsync, since my company's data traffic is limited to 7.5 GB/month at the moment.
At the moment I use rsync to first transfer all the needed files offsite and then use our intranet server to tar them, after which they go to a certain backup drive...

Or is the best solution just to use the current version and transfer the /home/backup/ dir?
I only need to make sure there isn't a <date> variable in the names, since that would force a full re-transfer of all files when using rsync.

(Yes, it's rsync over an SSH tunnel :D, so not the really insecure method :D)
I think you are probably looking for something different; the backup guide here is for entire-server backups in general, something you might do once a week or month, or when you plan to move servers. This isn't a good system for daily backups or anything like that. rsync is much better suited for that, and if you just rsynced your /home directory it would keep things backed up daily without taking long.

I guess the best thing for keeping things completely backed up would be to run this system say once a week and have an rsync run once each night.
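Roughly something like this in root's crontab (just a sketch; the host, user and the sysbk path are placeholders, and it assumes key-based SSH login is already set up):

# nightly /home sync at 02:00, weekly full sysbk run on Sunday at 04:00
0 2 * * * rsync -az -e ssh /home/ backupuser@backuphost:/backups/home/
0 4 * * 0 /usr/local/sysbk/sysbk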
 

Icheb

Verified User
Joined
Sep 15, 2003
Messages
556
Location
The Netherlands
prophecy said:
I guess the best thing for keeping things completely backed up would be to run this system say once a week and have an rsync run once each night.
It sounds great in theory, but really, at the moment we don't have the bandwidth to do this. The server bandwidth is OK, but our own company's isn't...
But it seems like a nice enough idea to implement, a /home-only rsync and a weekly backup transfer :)


Edit:
The backup system seems to be working :)
The only problem I've found so far is that it doesn't really work on a production server, since all open MySQL tables are skipped instead of being dumped with their current values...
So half my MySQL backup is missing.
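(A common workaround, not part of sysbk itself, is to dump the databases to a file before the file-level backup runs; a minimal sketch, assuming you have the MySQL root password:

mysqldump -u root -p'rootpassword' --all-databases > /home/backup/all-databases.sql

The resulting .sql file then gets picked up by the normal backup.)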
 
Last edited:

pilpelet

Verified User
Joined
Oct 12, 2003
Messages
108
installation

Chris,

The .install.sh doesn't work; it's more like make install -d ./....
 
Last edited:

ProWebUK

Verified User
Joined
Jun 9, 2003
Messages
2,326
Location
UK
Re: installation

pilpelet said:
Chris,

The .install.sh doesn't work; it's more like make install -d ./....
A typo I have just noticed - it should be

./install.sh

not:

.install.sh

Chris
 

ClayRabbit

Verified User
Joined
Jan 3, 2004
Messages
260
Location
Russia
Nice program. Thank you :)

Just my 2 cents:

1) Remove /home from custom.dirs
2) Remove all entries in cpanel.dirs and cpanel.files
3) Set CPANEL_BK="1" in conf.sysbk

And you will get nice separate archives for each user's homedir :) (a rough sketch of applying these edits follows below)
I think that's much better than ONE HUGE home backup.
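For instance, something like this (untested; the file locations are guesses for a default sysbk install, adjust to wherever sysbk actually lives on your box):

# 1) drop /home from custom.dirs
sed -i '/^\/home$/d' /usr/local/sysbk/custom.dirs
# 2) empty cpanel.dirs and cpanel.files
: > /usr/local/sysbk/cpanel.dirs
: > /usr/local/sysbk/cpanel.files
# 3) enable per-user backups in conf.sysbk
sed -i 's/^CPANEL_BK=.*/CPANEL_BK="1"/' /usr/local/sysbk/conf.sysbk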

(But anyway, the homedirs backup process still takes too much time... :(
I think we need some smart script for backing up the user dirs in /home. The script should get the latest ctime among a user's files (but ignore files in userdir/domains/*/logs and userdir/domains/*/stats) and compare it to the mtime of the previous backup file. If the first is older than the second, we don't need to create a new archive - we can just copy the previous backup. Something along those lines is sketched below.
Unfortunately, I'm not very familiar with programming in *nix and don't know how to write such a script. Of course I could write it in Perl, but...)
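A very rough sketch of that idea (untested; the user name, backup paths and the GNU find/stat options are all assumptions):

# hypothetical user and backup locations
U=someuser
PREV=/home/backup/old/$U.tar.gz
NEW=/home/backup/new/$U.tar.gz

# newest ctime (in seconds) among the user's files, skipping logs and stats
NEWEST=$(find /home/$U \
  -path "/home/$U/domains/*/logs" -prune -o \
  -path "/home/$U/domains/*/stats" -prune -o \
  -type f -printf '%C@\n' | sort -n | tail -1 | cut -d. -f1)

# mtime of the previous backup archive (0 if there isn't one yet)
PREVTIME=$(stat -c '%Y' "$PREV" 2>/dev/null || echo 0)

if [ "${NEWEST:-0}" -le "$PREVTIME" ]; then
  cp -p "$PREV" "$NEW"            # nothing changed since the last run, reuse the old archive
else
  tar -czf "$NEW" -C /home "$U"   # something changed, build a fresh archive
fi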
 
Last edited:

pilpelet

Verified User
Joined
Oct 12, 2003
Messages
108
Hi,

Worth mentioning that the FTP option won't work on most DA installations without installing "ncftpput"; it isn't installed by default.


NcFTP installation: [updated]

wget ftp://ftp.ncftp.com/ncftp/binaries/ncftp-3.1.7-linux-x86-export.tar.gz
tar zxvf ncftp-3.1.7-linux-x86-export.tar.gz
rm -f -R ncftp-3.1.7-linux-x86-export.tar.gz
cd ncftp-3.1.7
make install

The installation does it all and puts all the needed files in place; after that, the FTP option of sysbk will work.
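For reference, a typical ncftpput call looks something like this (the host, login and paths are just placeholders):

ncftpput -u backupuser -p 'secret' ftp.example.com /backups /home/backup/sysbk-full.tar.gz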

If you need to scp to server B and automate the backup then it's a bit more complicated; I wrote a full howto in this post:

http://www.directadmin.com/forum/showthread.php?s=&postid=14122#post14122
 
Last edited:

XYZed

Verified User
Joined
Mar 15, 2004
Messages
32
Hi,

Just checking whether the main thread has been updated with all the little recommendations/tweaks?

Also, is this suitable for moving an entire server? We are replacing all our hardware and need to move all servers to the new ones. I want to keep the same IPs etc. (well, one new IP for the new server so I can transfer the files). Yes, I could just stick in the old HDDs, but they are PATA whilst the new ones are SATA in a RAID1 config.

So is this suitable for a complete move onto a clean RH install?

I guess it's treating it like a full restore from a dead HDD or hacked system. The fun part is there are only 20 servers to do :-(

Thanks.
 

netswitch

Verified User
Joined
Dec 15, 2003
Messages
175
Location
Belgium
The main problem is that there is no automated restore script yet (or I missed it), and restoring a full server by hand is quite a huge piece of work...
 

XYZed

Verified User
Joined
Mar 15, 2004
Messages
32
Wonderful, I can see it's time to go and hire a few linux gurus for a few weeks.

or maybe:

cd /
tar -zcvf all.tar.gz .

then wget and untar it, LOL - if only.

Amazing there isn't a restore script for the backups yet - thankfully I've never needed it.
 

nobaloney

NoBaloney Internet Svcs - In Memoriam †
Joined
Jun 16, 2003
Messages
26,119
Location
California
One of the posts in this thread says it doesn't back up open MySQL files, but I've checked the code and it DOES stop MySQL before it backs up, so it shouldn't have any problems with MySQL database backups. Is anyone still having problems with this?
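In other words, roughly this sequence (a simplified sketch, not the actual sysbk code; the init script name and data directory are assumptions for a typical RedHat/DA box):

/etc/init.d/mysqld stop
tar -czf /home/backup/mysql-data.tar.gz -C /var/lib mysql
/etc/init.d/mysqld start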

Also...

I'm testing this backup program as a prelude to offering more serviceability...

One thing I'm not sure of...

Where it wants: MYSQL_ROOTPW=
in conf.sysbk...

Do I use the value of

mysql=

in /usr/local/directadmin/scripts/setup.txt?

Thanks.

Jeff
 

ProWebUK

Verified User
Joined
Jun 9, 2003
Messages
2,326
Location
UK
jlasman said:
Also...

I'm testing this backup program as a prelude to offering more serviceability...

One thing I'm not sure of...

Where it wants: MYSQL_ROOTPW=
in conf.sysbk...

Do I use the value of

mysql=

in /usr/local/directadmin/scripts/setup.txt?
Unless you have changed that password (which you should), that is your MySQL root password... so yes :)
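So, roughly (the key names are taken from the posts above; treat the exact setup.txt field as something to double-check on your own box):

# see what DA generated at install time
grep '^mysql=' /usr/local/directadmin/scripts/setup.txt

# then in conf.sysbk, using that value
MYSQL_ROOTPW="the_value_from_setup.txt"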

Chris
 

nobaloney

NoBaloney Internet Svcs - In Memoriam †
Joined
Jun 16, 2003
Messages
26,119
Location
California
This is a client's machine and he hasn't changed the password.

I searched the forums but I can't find where I have to put the new password if I change it (besides in the sysbk script of course; that I know :) ).

So if I change the password, where do I put the new one?

Thanks.

Jeff
 

ProWebUK

Verified User
Joined
Jun 9, 2003
Messages
2,326
Location
UK
jlasman said:
So if I change the password, where do I put the new one?
Not sure what you mean - *how* do you change the root MySQL password? Or whether you have changed it..?

Chris
 