[How-To] Add DirectAdmin current admin backup/restore cloud support

jca

Hello,

For the longest time I've used the admin backup/restore function to back up all my users twice a week to a remote FTP server. Lately I've been having issues with curl getting stuck during uploads, forcing me to kill the process for the backup to finish. This is a work in progress and I would love some feedback.

I also have a couple of G Suite accounts that include unlimited Drive space, as well as some Office 365 accounts with up to 25TB of storage, so I started looking into how to use that space to hold more historical data. As this is public cloud storage, using encryption was a must.

I found this tool: https://rclone.org. It is very easy to use and supports a lot of cloud providers (including basic ones like FTP), so I decided to give it a go, and so far I've been very happy with the results.

Please feel free to share any modifications or ideas for this.

All commands are run as root.

First we install rclone:

Code:
curl https://rclone.org/install.sh | sudo bash


Then we need to add our cloud storage. Simply follow the relevant guide from rclone:

Google Drive: https://rclone.org/drive/
Microsoft OneDrive: https://rclone.org/onedrive/
Amazon S3: https://rclone.org/s3/

to name just a few.
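
As a rough illustration of how to verify a remote once it's configured (the remote name JCAFTP is just the name I use later in my script; yours may differ):

Code:
# Walk through the interactive wizard to add a remote
rclone config

# List all configured remotes
rclone listremotes

# Sanity check: list the top-level directories of the new remote
rclone lsd JCAFTP: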

Next, you add an encryption layer: https://rclone.org/crypt/

Finally, I encrypted my configuration file as an additional precaution: https://rclone.org/docs/#configuration-encryption
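
For reference, once configured, the crypt remote ends up as a section in your rclone.conf that looks roughly like this (the remote and folder names are just examples from my setup; the passwords are stored obscured, not in plain text):

Code:
[Crypt]
type = crypt
# The crypt layer wraps another remote, here a folder on Google Drive
remote = GDrive:backups
# "standard" also encrypts file and directory names
filename_encryption = standard
password = <obscured>
password2 = <obscured>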


Now we need to configure DirectAdmin to perform the backups as usual, just to a local path so rclone can read them. In my example I use /home/user_backups.

The only thing missing is to upload the files after DirectAdmin is done with the backup. For this we use: /usr/local/directadmin/scripts/custom/all_backups_post.sh

Here is my current script:

Bash:
#!/bin/bash

# Local path DirectAdmin writes its backups to
LOCAL=/home/user_backups
# Day-of-week abbreviation in English (Mon, Tue, ...), independent of system locale
DOW=$(LC_ALL=C date +"%a")

# Exit if another instance of this script is already running
if [[ $(/usr/sbin/pidof -x "$(basename "$0")" -o %PPID) ]]; then exit; fi

# Copy to the remote FTP server, keeping one backup set per day of the week
/usr/bin/rclone copy "$LOCAL" JCAFTP:"$DOW"/ --ftp-concurrency 2 --log-level INFO --log-file /home/backup_logs/jcaftp.log
# Move to the encrypted cloud remote into a dated folder, deleting local files once uploaded
/usr/bin/rclone move "$LOCAL" Crypt:"$(date +"%Y%m%d")"/ --fast-list --drive-stop-on-upload-limit --drive-chunk-size 128M --delete-empty-src-dirs --log-level INFO --log-file /home/backup_logs/crypt.log
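
Remember to make the hook executable, or DirectAdmin won't run it (700 is a common choice; the key point is the execute bit):

Code:
chmod 700 /usr/local/directadmin/scripts/custom/all_backups_post.sh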

You can also mount the GDrive partition using rclone in case you need to restore: https://rclone.org/commands/rclone_mount/
This will allow you to simply navigate to the folder and use the DirectAdmin restore directly, instead of having to copy the files locally. If you decide to do this, I highly recommend adding the check below to your script, as it's easy to make the mistake of moving files from GDrive onto GDrive itself through the mount and losing those backup files:

Bash:
# Abort if the local backup path is actually a FUSE-mounted cloud remote
if /bin/findmnt "$LOCAL" -o FSTYPE -n | grep -q fuse; then
    echo "fuse file system, exiting"
    exit 1
fi

Now, the advantage of rclone over other solutions is that it will error out and automatically retry up to 3 times (you can configure this) to upload a file, and you can also do parallel uploads, which can speed things up in some scenarios. It supports a lot of backends, adding a missing one is easy, and the main dev is very active and helpful.
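
For example, both the retry count and the parallelism are plain flags you can add to the commands above (the values here are just illustrative):

Code:
# Retry the whole transfer up to 5 times (the default is 3) and
# run 8 file transfers in parallel (the default is 4)
rclone copy /home/user_backups JCAFTP:backups --retries 5 --transfers 8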

So far I've been using rclone for close to 3 months, and by now I've learned enough to share it; I hope people benefit from this.

Ideally DirectAdmin could integrate it and allow us to do this directly, even allowing multiple destinations (as my current setup has), all via a nice GUI screen.

Please let me know your thoughts and any questions you may have.

Jose
 
Great job @jca, and I look forward to seeing your plugin @youds!
Thanks to you both for sharing and for your commitment.

I am already testing backups to OneDrive and would like to see other options.
I already mounted a OneDrive folder and it seems to work fine, but I need it to be mounted on startup.
- Is this useful? Maybe it is best to mount OneDrive when DA starts backing up to the local path and then, once rclone has copied to OneDrive, unmount it.

I had not thought about encryption and it seems very obvious. Security first.
- But what about the CPU cost of encrypting large files?

Thanks in advance,
Santiago
 
I am already testing backups to OneDrive and would like to see other options.
I already mounted a OneDrive folder and it seems to work fine, but I need it to be mounted on startup.
- Is this useful? Maybe it is best to mount OneDrive when DA starts backing up to the local path and then, once rclone has copied to OneDrive, unmount it.

I would recommend not mounting and copying files through the FUSE layer. It will slow things down a lot and you lose retry and hash protection. I highly recommend using copy/move instead. You can keep OneDrive mounted while you use the other commands; you would just need to perform a VFS refresh, since OneDrive is not a polling backend.

Mounting is a little more complicated depending on what you are going to do. I do not recommend using the cache backend as it's EOL; use the VFS cache from the mount command instead if it helps your usage. Just consider that any error that happens through the mount won't be traced back to DirectAdmin, and you could end up without a proper backup.
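
As a minimal sketch of that refresh, assuming the mount was started with the remote-control API enabled (the OneDrive: remote name and mount point are placeholders):

Bash:
# Start the mount in the background with the remote-control API enabled
rclone mount OneDrive: /mnt/onedrive --rc --daemon

# After copy/move has uploaded new files, ask the mount's VFS layer
# to re-read the directory tree so the new files become visible
rclone rc vfs/refresh recursive=true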

I had not thought about encryption and it seems very obvious. Security first.
- But what about the CPU cost of encrypting large files?

Honestly, CPU usage is very low and should not have any real impact on your system. I've seen people do encryption on a Raspberry Pi with no performance hit, and in my tests I get around 800 Mbps toward GDrive regardless of encryption, and load does not even go above 1.0.
 
Thanks @jca!
These backup procedures are new to me (10 years with cPanel) and I misunderstood the "mount" part.
I realized it is only for the "restoring" process.

I followed your steps and it works great, except for the part about encryption. You suggest encrypting with rclone; what about DA encryption?
What would be the steps to decrypt an rclone backup moved to OneDrive?

- Is using this kind of cloud storage service good practice for backing up a web hosting service?

Thanks in advance!
 
I followed your steps and it works great, except for the part about encryption. You suggest encrypting with rclone; what about DA encryption?
What would be the steps to decrypt an rclone backup moved to OneDrive?

You can use either one or both. It depends on your legislation, as I'm not sure rclone encryption is GDPR compliant (Europe). I use rclone encryption as well as DirectAdmin encryption, since this lets me encrypt the filenames as well as the data in another layer, which I appreciate when sending data to a 3rd party for storage.

- Is using this kind of cloud storage service good practice for backing up a web hosting service?

That's up to you, based on your needs and comfort. I also keep another backup set on another server managed by me in another physical location (the FTP backup in my script), so I will always try to use that set before pulling from the cloud. Having that extra set in the cloud with my Google Workspace subscription (or Office 365 in your case) is an easy and inexpensive way to keep (based on your comfort level) your set of backups.

I'm personally about to start syncing to OneDrive as well to keep a 3rd copy of my data. My subscription should also include unlimited storage, but based on what I read online you can directly increase it to 5TB, then contact support to increase it to 25TB, and once you are full they will create separate 25TB accounts, forcing you to divide content. But even 5TB seems like more than enough to keep a fair amount of historical backups.
 
Thanks @jca, I will then use both encryption methods, DA + rclone.
I am using 1TB of OneDrive (6TB family pack) and everything is fine for now. I do not know if there is a daily transfer limit.
It is cheaper than Backblaze B2, but I do not know if it is 100% reliable.

- How do you proceed to restore with both encryption methods?
- Do you have a script for that?

Thanks in advance!
I hope DA adds this plugin soon.
 
I am using 1TB of OneDrive (6TB family pack) and everything is fine for now. I do not know if there is a daily transfer limit.
It is cheaper than Backblaze B2, but I do not know if it is 100% reliable.
That should be enough. GDrive has a 10TB daily download and 750GB daily upload limit. Paid OneDrive customers have no specific limits but are throttled based on load; rclone should handle that throttling automatically.
Nothing is 100% reliable, and there have been documented Azure outages lately, but it's a pretty stable platform.
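
If you want to stay under the GDrive 750GB/day upload cap proactively, you can also throttle rclone yourself (the value is just an example):

Code:
# Cap upload bandwidth at 8 MBytes/s, roughly 675GB/day, safely under the cap
rclone move /home/user_backups Crypt:backups/ --bwlimit 8M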

- How do you proceed to restore with both encryption methods?
- Do you have a script for that?
The easiest way would be to just mount the crypt partition, for example:

Code:
rclone mount Crypt: /backup --allow-other

That command will automatically decode the rclone encryption (it's totally transparent and done on the fly).

I added the --allow-other flag so other accounts can access the mounted partition. Just be sure the /backup folder exists and is empty.

From there just open DirectAdmin, go to the backup/restore settings and browse the folder to see the backup files. If you are using DirectAdmin encryption you will need to provide that password.
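
When you're done restoring, unmount it again (using the /backup mount point from the example above):

Code:
fusermount -u /backup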

If you use your 1TB OneDrive space for other purposes, I would recommend using a folder and encrypting just that folder, so it's easier for you. Also note that OneDrive has a pretty low path length limit, so depending on your folder structure and filenames it might throw errors due to paths being too long. If that's the case, try not encrypting the filenames (just the content); you could also use the obfuscate filename option, but remember that it is not encryption and can be decoded quite easily.
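
For reference, the filename handling is a single setting on the crypt remote in rclone.conf (remote names here are examples):

Code:
[Crypt]
type = crypt
remote = OneDrive:backups
# "standard" encrypts names, "obfuscate" only scrambles them (reversible),
# and "off" keeps names readable and encrypts the file contents only
filename_encryption = off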
 
Thank you very much @jca!

I asked about DA encryption + rclone encryption because when I encrypt a backup file with DA and then want to restore it, the *.tar.gz.enc file is not visible in the Restore list. Would rclone encryption alone be enough?

Now I am testing OneDrive with my personal account; later I am going to use one entire account (1TB) for these backups.
And one monthly backup to Backblaze.

I am looking for good practices for cloud backups.
What do you recommend?
 
I asked about DA encryption + rclone encryption because when I encrypt a backup file with DA and then want to restore it, the *.tar.gz.enc file is not visible in the Restore list. Would rclone encryption alone be enough?
They should show up. If they don't, I recommend reaching out to DirectAdmin support for assistance. If local encrypted backups are not showing up, neither will they show up when stored remotely.

Now I am testing OneDrive with my personal account; later I am going to use one entire account (1TB) for these backups.
And one monthly backup to Backblaze.
1TB should be enough unless you have a lot of data or want to keep a lot of historical data. You could also use rclone for Backblaze, as it's very well supported too.

I am looking for good practices for cloud backups.
What do you recommend?
It depends on your requirements, budget and agreement with your customers. I always keep a copy on a remote server and another in the cloud.
My SLA for restore requests is 24 hours if the backup is less than a week old and 96 hours if older, which gives me time to grab the backup and absorb potential downtime from the cloud provider. I include one weekly backup for free and offer daily backups for an additional (hefty) fee.
Following the 3-2-1 rule, you should always hold 3 copies of your data (remember, RAID is not backup), 2 of the backup copies need to be on different devices, and at least 1 copy must be off site. As with everything, not everyone agrees on this, but ideally keep a copy with two different cloud providers at all times, so that if your OneDrive account is suspended or there is an outage you are not left with no backups.
 
Is this a pull process or a push process? Can it be used for a pull backup server?
 
Is this a pull process or a push process? Can it be used for a pull backup server?
It's a push process. DirectAdmin needs to generate the backups locally, and at the end the all_backups_post.sh script will push them to the remote servers.

For restore, you need to mount the remote partition to a local folder using FUSE so DirectAdmin can see it, so it would be a pull made by DirectAdmin.

They are planning on using Borg.

Thanks, I saw that. My understanding, though, is that Borg does not offer cloud support. Borg is simply a tool that would allow DirectAdmin to support incremental backups and a more robust backup solution, but it does not solve the cloud backup part, which is what rclone addresses, since rclone is a cloud-enabled rsync-style tool rather than a tool designed to make backups.
 
@jca, last night I made a backup of all the accounts on my server and when copying to OneDrive, two of them were missing. I used your steps exactly.

I see some errors in my rclone log file:
2020/11/03 03:37:22 ERROR : Attempt 2/3 failed with 6 errors and: invalidRange: fragmentOverlap: The uploaded fragment overlaps with data that has already been received.
- Do I have to change the rclone "move" command to avoid this issue?
- Any advice?
 
You can also mount the GDrive partition using rclone in case you need to restore: https://rclone.org/commands/rclone_mount/
Too bad there is no all_restore_pre.sh.

all_restores_post.sh

This hook script is called after all Admin or Reseller restores.

Environment variables

  • ip_choice (select|file): restore user with IP from the file or from selection
  • ip (valid IP or "free_random"): when ip_choice="select" is set, DA will restore the User with the selected IP; if set to "free_random", restore the User with a random free IP
  • type (admin|reseller): type of backup
  • where (ftp|local): whether the backup is local or done using FTP
  • if ftp is used:
    • ftp_username
    • ftp_password
    • ftp_path
    • ftp_ip
    • ftp_port
    • ftp_secure
  • local_path: path used in local backups
  • select[X]: selected user backup file relative to the path; created for every selected backup file
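
As a rough, untested sketch using the variables from the list above, an all_restores_post.sh could for example unmount the rclone FUSE mount once a local restore finishes:

Bash:
#!/bin/bash
# /usr/local/directadmin/scripts/custom/all_restores_post.sh
# Example only: after a restore from a local path, unmount the
# rclone FUSE mount if one is present at that path
if [ "$where" = "local" ] && /bin/findmnt "$local_path" -o FSTYPE -n | grep -q fuse; then
    fusermount -u "$local_path"
fi
exit 0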
 
@jca, last night I made a backup of all the accounts on my server and when copying to OneDrive, two of them were missing. I used your steps exactly.

I see some errors in my rclone log file:

- Do I have to change the rclone "move" command to avoid this issue?
- Any advice?
Let's cover our basics: what rclone version are you running? Run
Code:
rclone version
and paste the result.

Also, can you share the exact command you used for the transfer?

Are you using your own client id for OneDrive? Check https://rclone.org/onedrive/#getting-your-own-client-id-and-key

Not using your own client id results in more severe throttling that can cause this error in both OneDrive and Google Drive.

If that doesn't fix it, we can try limiting it to fewer parallel operations.
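
For example (the values and the dated folder name are illustrative), concurrency can be reduced with a couple of flags:

Code:
# One file at a time, fewer checkers, and at most 4 API calls per second
rclone move /home/user_backups Crypt:20201103/ --transfers 1 --checkers 2 --tpslimit 4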
 