Best way to back up

Hi

I was wondering what people think the best way to back up accounts is ... I have been backing up using the DA backup utility, but it backs up to the server itself. What I am worried about is that if something went wrong with the box, having the backups on the same server may not make sense ... I noticed there was an FTP option ... would it make sense to set up a dedicated FTP server for backups?
 
We use the DirectAdmin admin level reseller backup option, and we ftp it to a dedicated backup server (4.75TB drive space).

We run rotation scripts on the server.

Jeff
 
I'm backing up all accounts to the local server as you do, but then run a script copying everything to a remote FTP server, so I have the same backup on two servers. Either way, you should get the backups onto a remote server.

There are also people out there using rsync and MySQL replication for more efficient resource handling. With the DA backup system everything gets tarred, which uses a lot of CPU, and then the whole thing needs to be sent to another server, which uses a lot of traffic. With rsync and MySQL replication, only the changes would be transferred. I'm going to look into it when I get the time, as it's a bit tricky and requires testing.
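A first rough sketch of what the rsync side could look like (untested; the paths and hostname are made up, and the DA backups would need to be left uncompressed for rsync to save much):

Code:
# Sync only changed files to the backup server over SSH.
# /home/admin/admin_backups and backuphost are example names.
rsync -avz --delete /home/admin/admin_backups/ backup@backuphost:/backups/web1/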
 
What OS and FTP program?

Ya, I have a pretty nice Dell 2850 (700 GB) that I thought I would use for a backup server ... but what OS and FTP software do you guys use? ... This is probably a dumb question, but how much does a gig crunch down to during a backup (just wondering how much drive space I would need on a backup server)?

Lastly, is your backup server behind a NAT or do you keep it in front of it like the DA servers?

P.S. Are these random questions working ... lol ... I am getting them all wrong and they should be correct :p
 
Our current backup server (4.25TB, expandable to 6TB backup space) runs CentOS; the OS is on a different partition from the various home partitions, so we can keep CentOS at the latest level without worrying about our backups.

We build a standard minimal default CentOS server install, using the standard ftp rpm provided with the CentOS install (sorry; I don't recall which it is right now). Our backup server is in our datacenter and runs over an internal non-routable network, but we do have shell access (only) to the backup server from the public internet.

If you don't compress, then a gigabyte is a gigabyte. If you use compressed tarballs it can be significantly smaller, but you'll have to test it yourself, as different kinds of files yield different averages.
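A quick way to check the ratio for one account might be something like this (the path is just an example; du -sb assumes GNU du):

Code:
# Compressed size in bytes of one account's home directory:
tar czf - /home/username | wc -c
# Raw size in bytes, for comparison:
du -sb /home/username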

Jeff
 
Jeff, may I ask whether you use rsync, or FTP directly from DirectAdmin?

I'm thinking about building a server for backups, and I'm wondering what's best: FTP directly to the remote server, or back up locally and rsync to the remote (which should be nice for less traffic and a redundant copy).

It's true that with a RAID5/RAID6 backup server it shouldn't be necessary to have another remote backup, since the backup server has redundancy ... but Murphy is always around :p

Regards
 
I don't use rsync; I FTP directly from DirectAdmin, using a private network inside our datacenter. We use the admin level reseller backup system.

RAID (along with hot-swap drive bays, which we also use) should be considered only for help with keeping good uptime; it should never be considered a backup solution.

Traffic doesn't matter on a private network.

Jeff
 
Thanks for your reply Jeff.

I wasn't saying that RAID is a backup solution, but having RAID disks could be a reason NOT to have a second backup location, because if a disk fails, the RAID gives you the possibility of not losing the backups stored on it.

Of course RAID is not for backups, just a good reason not to keep another backup on another server/location.

Regards
 
@Arieh

Ya ... I was kind of on the same thought pattern ... back up everything locally, then use a RAID network drive to download the backup files ... that way the network drive is behind the NAT and a bit more secure ... the only problem is that if a server died, I would have to rebuild it, then repopulate the local backup files, then do the restore ... (I was wondering if that was going to be a pain ... lol)

The other option was just to build a backup server like Jeff's, outside the NAT, and back up and restore from that. I was just curious what you think would be the best option, and, in your opinion, what would be the best FTP software to install on the server (if I went that route).
 
Having your backup server in a private network always seems safer to me, and that way you don't really have to worry about traffic either, in most cases. I would still prefer an incremental backup, but the easiest way for now is still DA's own backup feature.

And like Jeff said, the FTP server doesn't really matter; they all seem to do the job. I myself use wget to pull the backups from the production server, but you could run any FTP daemon on any OS you want on a backup server if you'd rather push them. If you don't use a private network, you could also consider a secure transfer, like SFTP/SSH, using for example rsync (still talking about the normal .tar.gz backups).
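For the pull approach, the cron job on the backup server might be as simple as this (hostname, credentials, and paths are made up for the example):

Code:
# Mirror the night's backups from the production server over FTP.
wget -m -nv --user=backupuser --password='secret' \
    ftp://web1.example.com/backups/ -P /backups/web1/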
 
Sorry, my thought was about server performance, not network traffic ... my fault.

Anyway ... in my opinion, the way that should work best (for me, of course) is to back up without compression (now possible with DirectAdmin), then rsync to a backup server on a private network (so you can have incremental transfers), then compress (or remove, if you prefer) the local backup, and compress the remote one by script and copy it to another directory.

That way you always have one directory with the rsynced .tar files and another directory with (for example) weekly backups of all the compressed files ... and, as I do, remove anything older than 4 weeks each time the script runs; see the sketch below.
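A rough sketch of that remote-side script, assuming the rsynced .tar files land in /backups/current and the weekly copies go under /backups/weekly (all paths are hypothetical; untested):

Code:
#!/bin/sh
# Runs weekly on the backup server. Paths are examples only.
STAMP=$(date +%Y-%m-%d)
mkdir -p /backups/weekly/$STAMP
# Compress a copy of each rsynced .tar into this week's directory.
for f in /backups/current/*.tar; do
    gzip -c "$f" > /backups/weekly/$STAMP/$(basename "$f").gz
done
# Remove weekly directories older than 4 weeks.
find /backups/weekly -mindepth 1 -maxdepth 1 -type d -mtime +28 -exec rm -rf {} +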

As I said, this is just my own opinion; everyone will use whatever best fits their needs, but maybe someone will find some useful ideas in what I wrote.

Regards
 
If anybody wants backups stored in a remote data center (it's good practice to have a copy in another geographic location), then they might need to encrypt them. A VPS with a big HDD is a good solution in that case. No need to worry about transferring over the Internet, and no worry that the data can be stolen, as long as it's encrypted with OpenSSL.
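For example, encrypting a tarball before it leaves the box might look like this (the filenames and the passphrase file are made-up examples):

Code:
# Encrypt the backup with AES-256 using a passphrase kept in a root-only file.
openssl enc -aes-256-cbc -salt -in backup.tar.gz \
    -out backup.tar.gz.enc -pass file:/root/.backup_passphrase
# Decrypt it again when restoring:
openssl enc -d -aes-256-cbc -in backup.tar.gz.enc \
    -out backup.tar.gz -pass file:/root/.backup_passphrase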
 
We use R1Soft for our backups. Incremental block-level backups to external storage, as well as incremental MySQL backups. Our backups are stored compressed and encrypted and are securely sent over a private VLAN.

Every backup takes about 50 minutes per hardware node. We have an average of 6GB changed data per node, including MySQL. This is compressed 2:1 most of the time.

We can do immediate restores up to ten days back, on a per-file basis (or per-table, if we're talking MySQL). However, we keep an archive of every backup from the 1st of the month up to a year back, just in case.

Whether this is the best way to do backups is up to you. We're more than happy with the current solution, and besides a few glitches every now and then, there has not been a case where we had to disappoint our clients or ourselves.
 
R1 does this for us. It's not open source, nor is it very cheap, but it's good, and that's what's important.
 
We use the DirectAdmin admin level reseller backup option, and we ftp it to a dedicated backup server (4.75TB drive space).

We run rotation scripts on the server.

Jeff

What kind of rotation scripts?
 
Our backups are saved to (for example):
Code:
/home/username/today
This example would work for one week of daily backups; you can create your own procedure for saving backups for as long as you'd like, depending on your needs and available space:

At the same time backups are scheduled to start on the hosting server, we run something similar to this as username (so the starting directory is /home/username/ and the today directory will be built with the proper ownership & permissions):
Code:
$ rm -Rf day7
$ mv day6 day7
$ mv day5 day6
$ mv day4 day5
$ mv day3 day4
$ mv day2 day3
$ mv today day2
$ mkdir today

Note I haven't tested this and I'm not logged into my backup server; test it before you use it.

Jeff
 
Reseller Backups (Specific for Resellers)

Hi,

Is there a script that can run as a cron job to delete backups older than 7 days from the remote FTP storage?

We were using cPanel and recently migrated to DirectAdmin; please help.
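Something along these lines might work, assuming the backups are uploaded into date-named directories and lftp is available (hostname, credentials, and layout are all assumptions; untested):

Code:
#!/bin/sh
# Cron job: remove the date-named backup directory from 8 days ago
# on the remote FTP host, keeping a rolling 7 days. GNU date assumed.
OLD=$(date -d "8 days ago" +%Y-%m-%d)
lftp -u backupuser,secret ftp://backup.example.com -e "rm -r /backups/$OLD; bye"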
 
My quick 5c:

1. Keep your local backups on a separate hard drive, not on the main one. E.g. mount it at /home/admin/backups.

2. Never rely on the local backup only. Always keep a remote backup as well.

3. Never keep only the last backup (don't overwrite backups). Always keep at least one very old (a month or more), one old (more than a week), and one recent (daily) backup. The more, the better. Sometimes customers screwed something up a long time ago, and sometimes the backups themselves can be screwed up.

4. Test your backups regularly. Have one account which you regularly delete and restore. You must be sure the backups work. And when you test, test everything: website, e-mails, database.
 
P.S. The above is not about "live backups". Keeping a "live" backup for guaranteed uptime is another complicated story.
 