Backup features

davidd1

Hello and sorry for my bad English.

We have 3 servers:

1) Linux (centos 7) with Directadmin - 150 users
2) Linux (centos 6) with Directadmin - 200 users
3) Windows server with storage disk and ftp server.

In the Directadmin panel we run an FTP backup (Admin Tools -> Admin Backup/Transfer) from server 1 to 3 and from server 2 to 3 every week.

When creating a backup (Admin Backup/Transfer - step 3 - ftp), could you please add 2 features:
1) Limit QoS (bandwidth)? = we don't want to use all the server bandwidth during the backup transfer...
2) File sync? = I mean before the backup moves from server 1 to 3, check if the file size/date is equal and skip it...

Thank you!
 
Hello,

Though I don't work for Directadmin and I'm not a decision-maker for new features to be included, I can suggest that you do the following. It seems it's the right time for you to use your own script for copying backups:

/usr/local/directadmin/scripts/custom/ftp_upload.php

- How to convert ftp_upload.php to use ncftpput or curl instead of php
https://help.directadmin.com/item.php?id=111 (here you can learn how directadmin uses a script to copy backups).

- How to slow down the rate of backup to reduce load and to not flood a remote ftp server
https://help.directadmin.com/item.php?id=366
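
To sketch the direction those articles point in: Directadmin calls /usr/local/directadmin/scripts/custom/ftp_upload.php when it uploads backup files, and (as the first article shows) the PHP FTP code inside it can be replaced with a curl upload that caps the transfer rate. This is only a rough illustration; the $ftp_* names below are placeholders for whatever environment variables Directadmin actually passes to the script (they are listed in that article):

#!/bin/sh
# Hypothetical custom ftp_upload.php: upload one backup file with curl while
# limiting the upload speed so the transfer does not saturate the link.
# The $ftp_* variables are placeholders; use the names documented in help item 111.
LOCAL_FILE="$ftp_local_file"
REMOTE_URL="ftp://$ftp_username:$ftp_password@$ftp_ip:$ftp_port$ftp_path/"
# --limit-rate caps the speed (about 10 MB/s here), --ssl-reqd forces FTPS,
# --ftp-create-dirs creates the remote directory if it does not exist yet.
curl --silent --show-error --limit-rate 10M --ssl-reqd --ftp-create-dirs -T "$LOCAL_FILE" "$REMOTE_URL"
exit $?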


For existing users the final .tar.gz backup file will always have a different mtime and size (so you won't achieve what you want with rsync here). And if you still have backups of removed users... why not remove them?
 
Thank you for the reply.

- How to slow down the rate of backup to reduce load and to not flood a remote ftp server > This is not a QoS (bandwidth) limit, it only sleeps 20 seconds between each user (we have users with 200 GB backup files).

so you won't achieve what you want with rsync here = NOT correct. We have a lot of users that have only 1 minisite (STATIC, not dynamic) that does not change every week, AND we have some users with data (mp3/mp4 files) who don't upload every week, so the backup does not change!

thank you :) !!!

 
When creating backups Directadmin includes the user's metadata in them, i.e. history, bandwidth usage and so on... so you will get a different .tar.gz file every time you create it in Directadmin.

As for the QoS limit, use a firewall in this case, or write your own script in any programming language for it.
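
If you want an OS-level limit, one rough example (not a firewall as such, but the usual Linux traffic-shaping tool) is tc with a token bucket filter; note that this caps the whole interface, and eth0/80mbit are only example values:

# Cap all outgoing traffic on eth0 to roughly 80 Mbit/s during the backup window (example values)
tc qdisc add dev eth0 root tbf rate 80mbit burst 64kb latency 400ms
# Remove the limit again afterwards
tc qdisc del dev eth0 root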


Those are ideas on how to get it working right now... of course you may wait for an official answer.
 
OK my English is so BAD!!! I'm very very sorry!

I'll try to explain:

File sync? = I mean before the backup moves from server 1 to 3, check if the file size/date is equal and skip it...

We have a lot of users that don't change for months... so we don't run backups for these users every day/week/2 weeks, but we transfer all users from server 1 to server 3 every week.

I want a NEW BUTTON (let's call it "MOVE BACKUP SYNC"). When we click on it:
1) Do NOT create a new backup!
2) Only move all existing files inside "/home/admin/admin_backups/*.*" to the other server via FTPS AND SYNC.
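
To illustrate what such a button would do (a rough sketch only; the credentials, host name and remote path below are placeholders, and it assumes server 3 accepts explicit FTPS): upload the existing backup files, skip any file whose size and timestamp already match on the remote side, and optionally cap the bandwidth:

lftp -u backupuser,PASSWORD -e "set ftp:ssl-force true; set ftp:ssl-protect-data true; set net:limit-total-rate 10000000; mirror -R --only-newer /home/admin/admin_backups /backups; quit" ftp://server3.example.com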



As for the QoS limit, use a firewall in this case, or write your own script in any programming language for it.

A firewall is not an option.
Write your own script?? > Because of this we are asking for a new feature :)
 
Why should Directadmin build a feature if this actually has nothing to do with the control panel and you can use a one-liner to accomplish it?

You could drop something like this into a daily, weekly, whatever cronjob on serverC and it will only sync backups that have changed. You can even change the bandwidth used with the --bwlimit parameter.

/usr/bin/rsync --bwlimit=8000 -a --delete --numeric-ids --relative --rsh="/usr/bin/ssh -p 22 -o Compression=no" root@serverA:/home/admin/admin_backups/ /home/admin/admin_backups/

Just make sure you enable passwordless SSH login from serverC to serverA for the admin/root user. Whatever floats your boat :)
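
For example, a weekly entry in root's crontab on serverC could look like this (the schedule and log path are only examples):

# Every Sunday at 03:00, pull changed backups from serverA with a bandwidth cap
0 3 * * 0 /usr/bin/rsync --bwlimit=8000 -a --delete --numeric-ids --relative --rsh="/usr/bin/ssh -p 22 -o Compression=no" root@serverA:/home/admin/admin_backups/ /home/admin/admin_backups/ >> /var/log/backup-sync.log 2>&1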
 
File sync? = I mean before the backup moves from server 1 to 3, check if the file size/date is equal and skip it...

It's clear that you want Directadmin to skip transferring files if the local copy and the remote one have the same size and mtime. That's good. Modification time is not the only criterion for identifying whether or not a file was modified; Directadmin would need to check its size and probably an md5 sum as well. OK, probably a good request. But check the list of members who have read this thread; there is no Directadmin staff in the list yet, so you need to contact them via tickets...
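
As a rough sketch of that check (the file name, credentials and server address are placeholders, and with Directadmin's per-backup metadata the sizes will usually differ anyway): compare the local size with the size the FTP server reports and only upload when they differ:

# Hypothetical pre-upload check: skip the transfer if the remote copy already has the same size
LOCAL=/home/admin/admin_backups/user.tar.gz
REMOTE_URL="ftp://backup:PASSWORD@server3/backups/user.tar.gz"
local_size=$(stat -c%s "$LOCAL")
remote_size=$(curl -sI "$REMOTE_URL" | awk '/Content-Length/ {print $2}' | tr -d '\r')
if [ "$local_size" = "$remote_size" ]; then
    echo "unchanged, skipping upload"
else
    curl -sS -T "$LOCAL" "$REMOTE_URL"
fi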

Write your own script?? > Because of this we are asking for a new feature :)

Directadmin developers will probably add this feature if there are enough requests for it. Does anybody else want it?

Judging by the date of this article https://help.directadmin.com/item.php?id=366 (it was last modified Apr 20, 2011, 12:45 am), the feature has not been requested much since then, so it will hardly be added. But you still have a chance, and in that case you'd better open a ticket for a better chance of being noticed.
 
I would like a feature like that. Currently my backups run for many hours, and I'd love to save the effort of backing up files that are already in the backup. Mostly I have a few users with FTP folders containing huge files that very rarely change, ISOs and such. I currently have removed one such user from backups because it is just too much; this feature would allow me to put them back in. I've tried clearing the "ftp accounts" checkbox, but that doesn't seem to be enough.
 