Hello,
For the longest time I've used the admin Backup/Restore function to do twice-weekly backups of all my users to a remote FTP server. Lately I've been having issues with curl getting stuck during uploads, forcing me to kill the process for the backup to finish. This setup is a work in progress, and I'd love some feedback.
I also have a couple of G Suite accounts that include unlimited Drive space, as well as some Office 365 accounts with up to 25TB of storage, so I started looking into how to use that space to hold more historic data. As this is a public cloud, encryption was a must.
I found rclone (https://rclone.org), a very easy-to-use tool that supports a lot of cloud providers (including basic protocols like FTP), so I decided to give it a go, and so far I've been very happy with the results.
Please feel free to share any modifications or ideas about this setup.
All the commands below are run as root.
First we install rclone:
Code:
curl https://rclone.org/install.sh | sudo bash
Then we need to add our cloud storage; simply follow the relevant guide from rclone:
Google Drive: https://rclone.org/drive/
Microsoft OneDrive: https://rclone.org/onedrive/
Amazon S3: https://rclone.org/s3/
to name just a couple.
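If you'd rather script the remote setup than walk through the interactive wizard, `rclone config create` can create a remote in one command. This is only a sketch: the remote name matches the FTP remote used in my script below, but the host and credentials are placeholders you must replace.

```shell
# Non-interactive remote creation (host/user/password are placeholders).
# FTP passwords in rclone.conf are stored obscured, hence "rclone obscure".
if command -v rclone >/dev/null 2>&1; then
    rclone config create JCAFTP ftp \
        host=ftp.example.com \
        user=backupuser \
        pass="$(rclone obscure 'secret-password')"
    # Confirm the remote exists:
    rclone listremotes
else
    echo "rclone not installed; skipping"
fi
REMOTE=JCAFTP   # name referenced by the backup script later in this post
```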
Next, you add an encryption layer: https://rclone.org/crypt/
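For reference, a crypt remote is just a section in rclone.conf that wraps another remote. The fragment below is illustrative only — the remote names and options are examples, and the password lines are generated by the wizard:

```ini
[GDrive]
type = drive
# ...token and scope filled in by the interactive setup...

[Crypt]
type = crypt
remote = GDrive:backups
filename_encryption = standard
directory_name_encryption = true
# password / password2 are written (obscured) by "rclone config"
```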
Finally, I encrypted my configuration file as an additional precaution: https://rclone.org/docs/#configuration-encryption
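One thing to keep in mind with an encrypted configuration file: unattended runs (like the backup hook below) can no longer prompt for the password. rclone reads it from the `RCLONE_CONFIG_PASS` environment variable; the value below is a placeholder, and in practice it should live somewhere readable only by root.

```shell
# Placeholder password for an encrypted rclone.conf; without this,
# unattended rclone invocations would hang waiting for a prompt.
export RCLONE_CONFIG_PASS="your-config-password"
```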
Now we need to configure DirectAdmin to perform the backups as usual, but to a local path that rclone can read; in my example I use /home/user_backups.
The only thing missing is to upload the files after DirectAdmin finishes the backup. For this we use: /usr/local/directadmin/scripts/custom/all_backups_post.sh
Here is my current script:
Bash:
#!/bin/bash
# Local backup directory (must match the DirectAdmin backup path)
LOCAL=/home/user_backups
# Day-of-week abbreviation; LANG=C forces English names (Mon, Tue, ...)
DOW=$(LANG=C date +"%a")
# Exit if another instance of this script is already running
if [[ $(/usr/sbin/pidof -x "$(basename "$0")" -o %PPID) ]]; then exit; fi
# Copy to the FTP remote, rotating over a 7-day window
/usr/bin/rclone copy "$LOCAL" JCAFTP:"$DOW"/ --ftp-concurrency 2 --log-level INFO --log-file /home/backup_logs/jcaftp.log
# Move to the encrypted remote, in dated folders, for historic retention
/usr/bin/rclone move "$LOCAL" Crypt:"$(date +"%Y%m%d")"/ --fast-list --drive-stop-on-upload-limit --drive-chunk-size 128M --delete-empty-src-dirs --log-level INFO --log-file /home/backup_logs/crypt.log
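To illustrate the two folder-naming schemes the script relies on — a rotating day-of-week name for the FTP copies and a permanent dated name for the encrypted historic copies — here is what they resolve to (run on any Linux box):

```shell
# Rotating 7-day window: LANG=C keeps the abbreviation English
# regardless of the system locale, so remote folder names stay stable.
DOW=$(LANG=C date +"%a")
echo "FTP copy goes to JCAFTP:$DOW/"        # e.g. JCAFTP:Mon/

# Dated folder for historic retention on the encrypted remote:
STAMP=$(date +"%Y%m%d")
echo "Encrypted move goes to Crypt:$STAMP/" # e.g. Crypt:20240101/
```

Also remember that DirectAdmin will only run the hook if it is executable (chmod 700 or 755).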
You can also mount the Google Drive remote using rclone in case you need to restore: https://rclone.org/commands/rclone_mount/
This allows you to simply navigate to the folder and use the DirectAdmin restore directly, instead of having to copy the files locally first. If you decide to do this, I highly recommend adding the following guard to your script, because if the backup path is still mounted you could end up moving files from Google Drive back onto Google Drive and losing those backups:
Bash:
if /bin/findmnt "$LOCAL" -o FSTYPE -n | grep fuse; then
    echo "fuse file system, exiting"
    exit 1
fi
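A restore session could then look like the sketch below. This assumes the "Crypt" remote from earlier and uses an example dated folder; mounting read-only is a second safety net on top of the guard above, since a read-only mount can never be the source of an accidental move.

```shell
# Sketch of a restore mount (requires rclone and the Crypt remote).
MNT=$(mktemp -d)   # or a fixed path such as /mnt/restore
if command -v rclone >/dev/null 2>&1; then
    # Read-only, backgrounded mount of one dated backup folder:
    rclone mount Crypt:20240101/ "$MNT" --read-only --daemon || true
    # ...browse $MNT or point the DirectAdmin restore at it, then:
    fusermount -u "$MNT" 2>/dev/null || true
else
    echo "rclone not installed; skipping mount"
fi
```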
Now, the advantages of rclone over other solutions are that it handles errors automatically, retrying each upload up to 3 times by default (configurable), and that it can do parallel uploads, which can speed things up in some scenarios. It supports a lot of backends, adding a missing one is straightforward, and the main developer is very active and helpful.
So far I've been using rclone for close to 3 months, and I've now learned enough to share this; I hope people benefit from it.
Ideally, DirectAdmin could integrate it and allow us to do this directly, even allowing multiple destinations (as in my current setup), all via a nice GUI screen.
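The retry and parallelism behaviour mentioned above is controlled by a few global flags; the values here are examples, not recommendations (the destination is a placeholder, and --dry-run keeps the sketch harmless):

```shell
# Retry/parallelism knobs (values are examples; defaults in parentheses):
#   --retries    retries for a failing operation (3)
#   --transfers  files uploaded in parallel       (4)
#   --checkers   parallel size/checksum checks    (8)
FLAGS="--retries 5 --transfers 8 --checkers 16"
if command -v rclone >/dev/null 2>&1; then
    rclone copy /home/user_backups Crypt:test/ $FLAGS --dry-run || true
else
    echo "rclone not installed; skipping"
fi
```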
Please let me know your thoughts and any questions you may have.
Jose