[release] sFTP backups for DirectAdmin (version: 0.1.poralix)

Getting an error

Hi Alex,

Thanks so much for this script. I've been trying to make backups to my STACK environment (SFTP over SSH, port 22) for ages.

I've installed the script and set up the cron; however, it gives this error:

Code:
User justin has been backed up. <9:25:03>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<9:25:03>

Although a backup error has occurred, the upload of valid backups would have still been attempted to ftps://stack.*.nl/October/11/ <9:25:03>

When I start an SFTP connection in Cyberduck, for example, it works perfectly. I see that the URL says ftps instead of sftp. I'm probably doing something wrong, but I'm not sure what. Do you have any idea?

My cron settings:
[Screenshot of the cron settings]
 
Try changing secure FTPS to plain FTP and see whether it helps. The custom script detects sFTP by the port number specified in the backup task.
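Roughly, the detection works like this (a minimal shell sketch of the idea only, assuming a simple port comparison; the actual check lives inside ftp_upload.php):

Code:
# Sketch only -- not the real ftp_upload.php logic.
# Port 22 is taken to mean sFTP; anything else goes over plain/secure FTP.
ftp_port="22"
if [ "$ftp_port" = "22" ]; then
    transfer="sftp"
else
    transfer="ftp"
fi
echo "Using $transfer transfer"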
 
Thanks, I just tried a few different settings, and normal FTP with port 22 works.

But it only works with default paths. When I use a custom path such as %B/%d or %B/%d/ it gives an error. When I choose one from the list it works perfectly.

So no big issue right here, but if you have a fix, let me know please :) Thanks again for this great script!
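(In case the placeholders look unfamiliar: they follow date(1)/strftime formatting, which is also where the October/11 path in the log above came from. A quick illustration:)

Code:
# %B = full month name, %d = day of month
date +%B/%d
# October/11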
 
Hi @zEitEr

On a new server we are getting the error:

Code:
sftp return code: 3
Failed to change pseudo terminal's permission: Operation not permitted
FTP information invalid.

Can you help at all, please? We have the same settings working on other servers, so I know they are correct, but for some reason we are getting this error.

Thanks in advance.
 

Hi @Livonias, I just started using the script as well and had a similar issue. Just to state the obvious, but check for a double // in the URL.
If you have, for example, a remote path like /here/are/mybackups/ with a date as a custom variable,
it will generate the following dir: /here/are/mybackups//01-dec-2019/
This is because the custom variable starts with /.
In this case, remove the trailing / in the remote path field.
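A quick shell illustration of the concatenation (the values are hypothetical):

Code:
remote_path="/here/are/mybackups/"   # trailing slash in the remote path field
custom_dir="/01-dec-2019"            # custom variable that starts with /
echo "${remote_path}${custom_dir}/"
# /here/are/mybackups//01-dec-2019/  <-- note the double slash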

@zEitEr Lovely script works like a charm
 
Oops, my assumption was wrong. It keeps failing.
Error:
2019:12:01-06:00:02: uploadFtpFiles('admin.tar.gz', , /home/tmp/admin.15045, diradmin) failed (is_missing_user_ok=0)

I have now manually created the dir to see if that works.

Nope, something changed indeed, since it worked the first few days.
Further troubleshooting:
Code:
2019:12:03-01:39:03: uploadFtpFiles('user.tar.gz', , /home/tmp/admin.10888, diradmin) failed (is_missing_user_ok=0)
2019:12:03-01:41:28: uploadFtpFiles('user2.tar.gz', , /home/tmp/admin.10888, diradmin) failed (is_missing_user_ok=0)
2019:12:03-01:41:38: uploadFtpFiles('user3.tar.gz', , /home/tmp/admin.10888, diradmin) failed (is_missing_user_ok=0)
2019:12:03-01:41:38: uploadFtpFiles('user4.tar.gz', , /home/tmp/admin.10888, diradmin) failed (is_missing_user_ok=0)
2019:12:03-01:41:42: uploadFtpFiles('user5.tar.gz', , /home/tmp/admin.10888, diradmin) failed (is_missing_user_ok=0)
2019:12:03-01:41:42: uploadFtpFiles('user6.tar.gz', , /home/tmp/admin.10888, diradmin) failed (is_missing_user_ok=0)
This might be relevant:
 
Update: OK, found the cause.

I have a custom path "%d-%b-%Y/00-00" for 4 scheduled tasks (06-00, 12-00, 18-00) just so it looks nice and tidy. I started out with "%d-%b-%Y/%H-%M"; I reverted back to this and now backups are running again. I checked with the provider, and they told me DA tries to create the path but fails because the underlying dir is not present.
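For clarity, this is how the two variants expand (date(1) syntax; 00-00 is literal text, not a format code):

Code:
date +%d-%b-%Y/%H-%M   # e.g. 03-Dec-2019/01-41 (a new subdir every run)
date +%d-%b-%Y/00-00   # e.g. 03-Dec-2019/00-00 (the same subdir name every run)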

Not sure if this is a bug in DA or an issue with @zEitEr's script. Perhaps @smtalk could enlighten us?
 
DirectAdmin should create folders recursively. Local backup creates parent directories too, and the FTP curl upload uses --ftp-create-dirs, which should take care of that as well.
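(For reference, that flag makes curl create any missing remote directories during an upload. A hand-written illustration with placeholder host and credentials, not the exact command DirectAdmin runs:)

Code:
# --ftp-create-dirs tells curl to mkdir any missing remote path components
curl --ftp-create-dirs -T admin.tar.gz "ftp://user:password@backup.example.com/October/11/admin.tar.gz"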
 
Alrighty, thanks for that. Then I presume it must be in the sFTP alteration.
 
Hello,

This is an scp script to back up all files to another server.

Code:
#!/bin/bash
# Copy every local backup archive to the remote server over SCP.
password="xxxxxxxxxxxxxxxxxx"
username="root"
ip="xxxxxxxxxxxxxx"
sshpass -p "$password" scp /backup/*.gz "$username@$ip:/home"


You can run this script from crontab -e.
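For example, a nightly entry could look like this (the script path here is only an assumption for illustration):

Code:
# crontab -e: copy the backups every night at 03:00
# adjust /usr/local/bin/scp_backup.sh to wherever you saved the script
0 3 * * * /usr/local/bin/scp_backup.sh >/dev/null 2>&1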

Thanks,
Melih
 
I updated the scripts:

Code:
git pull
Updating 104bf62..ef2a9de
error: Your local changes to the following files would be overwritten by merge:
README.md
ftp_list.php.patch
ftp_upload.php.old
ftp_upload.php.old.patch
ftp_upload.php.patch
Please commit your changes or stash them before you merge.
Aborting

Err, odd but understandable. The reason I am showing this is for full clarity: I had to recover half my file system due to a huge error on CentOS called (rm -rf) :oops:
So I moved them all to a backup dir and did:
Code:
git pull -f
Updating 104bf62..ef2a9de
Fast-forward
README.md | 6 ++--
ftp_download.php | 6 ++--
ftp_list.php | 26 ++++++++--------
ftp_list.php.patch | 24 +++++----------
ftp_upload.php | 99 +++++++++++++++++++++++++++--------------------------------
ftp_upload.php.old | 287 ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------
ftp_upload.php.old.patch | 124 --------------------------------------------------------------------------
ftp_upload.php.patch | 60 +++++++++++++++++++++++-------------
8 files changed, 112 insertions(+), 520 deletions(-)
delete mode 100644 ftp_upload.php.old
delete mode 100644 ftp_upload.php.old.patch
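(As an aside, git's own suggestion from the error message would also have worked, without moving the files away by hand. A generic git sketch:)

Code:
git stash       # shelve the local modifications
git pull        # fast-forward to the latest version
git stash pop   # reapply them afterwards if still wanted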

I duplicated one of the crons and changed it to run at 20:00.
Unfortunately, I get the exact same result.
Result:
Code:
User admin has been backed up. <20:00:05>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<20:00:05>
User bfeldman has been backed up. <20:04:09>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<20:04:09>
User hcappon has been backed up. <20:04:21>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<20:04:22>
User minecrafth has been backed up. <20:04:22>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<20:04:23>
User minecrafts has been backed up. <20:04:27>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<20:04:28>
User poralix has been backed up. <20:04:28>
ftp_upload.php exit code: 1
ftp_upload.php output: [upload] sftp return code: 1
<20:04:28>

Although a backup error has occurred, the upload of valid backups would have still been attempted to ftps://spacecabbie.stackstorage.com/DA_user.backup/11-Dec-2019/00-00 <20:04:28>

Code:
2019:12:11-20:00:05: uploadFtpFiles('admin.tar.gz', , /home/tmp/admin.18334, diradmin) failed (is_missing_user_ok=0)
2019:12:11-20:02:05: UnBlock IP '185.36.81.229': Script output: [OK] The IP 185.36.81.229 was unblocked Reason: when=1575918064 + (60 * unblock_brute_ip_time=2880) <= now=1576090922
2019:12:11-20:04:09: uploadFtpFiles('bfeldman.tar.gz', , /home/tmp/admin.18334, diradmin) failed (is_missing_user_ok=0)
2019:12:11-20:04:22: uploadFtpFiles('hcappon.tar.gz', , /home/tmp/admin.18334, diradmin) failed (is_missing_user_ok=0)
2019:12:11-20:04:23: uploadFtpFiles('minecrafth.tar.gz', , /home/tmp/admin.18334, diradmin) failed (is_missing_user_ok=0)
2019:12:11-20:04:28: uploadFtpFiles('minecrafts.tar.gz', , /home/tmp/admin.18334, diradmin) failed (is_missing_user_ok=0)
2019:12:11-20:04:28: uploadFtpFiles('poralix.tar.gz', , /home/tmp/admin.18334, diradmin) failed (is_missing_user_ok=0)
 
Make sure you've got version 1.2.poralix.2 $ Wed Dec 11 20:30:29 +07 2019 installed. You can check the top lines of the installed scripts.
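For example (assuming the scripts live in DirectAdmin's usual custom scripts directory):

Code:
# the version string sits in the header comment of each script
head -n 5 /usr/local/directadmin/scripts/custom/ftp_upload.php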
 
Alex, thank you very much!!!

Hi Alex, thanks for sharing this script, really needed.

I hope the staff will integrate this into the default panel software so that others can benefit.

Really appreciated ;)
Totally agree with you.
 