Backup problem: file too big?

Hi,

It appears that my backups are failing because the backup of one site is just too big (8GB). The backup runs fine until this file gets transferred.

With Ensim, you can split the tar files into manageable pieces (like 250MB or so). This prevents these problems.
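
(For reference, a rough sketch of that splitting approach using the standard split utility; the paths here are made up, and this isn't how Ensim actually implements it:)

<?php
// Hypothetical sketch: cut a large tarball into 250MB pieces before
// transfer, Ensim-style. Assumes the standard `split` utility is installed.
$backup = '/home/tmp/backup.tar.gz';   // made-up path
exec('split -b 250m ' . escapeshellarg($backup) . ' ' . escapeshellarg($backup . '.'));
// Produces backup.tar.gz.aa, backup.tar.gz.ab, ...
// Reassemble on the target with: cat backup.tar.gz.* > backup.tar.gz
?>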

The error message I get in the "message system" is:

Subject: An error occurred during the backup. Today at 20:55
User user1 has been backed up.
User user2 has been backed up.
User user3 has been backed up.
User problem has been backed up.
User user4 has been backed up.
...etc. for each user. Now the transfer starts...
Error while uploading problem.tar.gz

Is there a way to work around this? Will there be a fix for this? Does no one else have problems with large sites?

Tino
 
Well,

Could be, but I'd rather have an automated solution... :-)

Where are the files located when I set up a backup job that FTPs to another server? Will these files be kept after a failed transfer?

For now, I'll make a local backup and try to transfer it through FTP, just for testing.

Tino
 
Hello,

They're put in /home/tmp/resellername/*, but removed after the backup attempt, regardless of outcome.

John
 
Okay,

I've backed up the site (that went well), and I'm manually FTPing it to another server. It's pretty fast, but it's 23:45 over here, so I'm not sure if I can wait for the upload to finish. It's past my bedtime already... :o

It's now at 2.2GB, and still going strong... Couldn't it be a PHP timeout error? I remember a setting in php.ini that sets a maximum on the time PHP scripts can run. Backing up this amount of data will take at least an hour... Could that be it?!
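
(For reference, a quick sketch of how a script can inspect and lift that limit; whether the DA backup script does this is just a guess:)

<?php
// Hypothetical sketch: check the php.ini limit and lift it for this script.
echo ini_get('max_execution_time') . "\n";  // seconds; 0 means no limit
set_time_limit(0);                          // remove the limit entirely
?>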

Tino
 
2.2 Gig?

Many Linux servers have a 2GB limit on file sizes.

Could that be the problem on one of your boxes?

Jeff
 
The 2GB limit is for the ext2 filesystem. I use only ext3 filesystems, so that's not the issue.
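
(One way to test this directly instead of reasoning from the filesystem type; a rough sketch with a made-up test path:)

<?php
// Hypothetical test: try writing past the 2GB mark. On a system or PHP
// build without large file support, the write should fail near 2GB.
$fp = fopen('/home/tmp/lfs-test.bin', 'wb');
if (!$fp) die("could not open test file\n");
$chunk = str_repeat('x', 1024 * 1024);   // 1MB block
for ($i = 1; $i <= 2200; $i++) {         // aim for ~2.2GB
    if (fwrite($fp, $chunk) === false) {
        echo "write failed after about $i MB\n";
        break;
    }
}
fclose($fp);
unlink('/home/tmp/lfs-test.bin');        // clean up the test file
?>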

I've FTP'd the file manually, and that works. Yet from DA it won't. Perhaps a PHP time-out issue or something?

Tino
 
Are you sure?

We've got a bunch of RHL7.3 servers running ext3 filesystems which had problems until we started breaking up our backup files into smaller pieces.

Jeff
 
I still cannot perform a normal scheduled backup.

Does anyone else have this problem?

It must be due to large sites (10GB or so).

Tino
 
Are you having problems with a user backup, a reseller backup, or a system backup?

It appears as if your backup software is running up against the largest file size allowed by your OS.

Check your OS for file size limits and report back to us :) .

And let us know which backup you're having problems with.

Jeff
 
I have this problem with a reseller backup, doing all users.

I am running RedHat 9 with all ext3 filesystems.

A local backup appears to work, so the problem is in the transfer step.

Tino
 
If the local backup works, then it may not be the file size.

But I'm not sure because I've not examined the backup code.

If the backup code makes one tarball for transfer to the other system, and if your OS has a 2 gig limit, and if the resulting tarball is over 2G, it might fail.

But that failure wouldn't show up as the backup not working; it would show up as a backup you can't restore from.

What are the symptoms?

Jeff
 
After the backup runs, I get a status message.

The status message is built like this:

First a few lines with:

User userX has been backed up.

But after a few successful backups, the rest of the backups fail with this error:

Warning: ftp_put(): userY.tar.gz: Permission denied in - on line 31
Error while uploading userY.tar.gz

When I do a backup to the machine itself, so not using FTP, it works. And when I then manually FTP these files to my target machine, that works as well.

So it's not the target machine's filesystem that's causing the problem.
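
(A possible way to narrow down the "Permission denied": check whether the file DA is trying to send is readable at all; a rough sketch with a made-up path:)

<?php
// Hypothetical debugging sketch: ftp_put()'s "Permission denied" can come
// from either side, so check the local file's permissions first.
$ftp_local_file = '/home/tmp/reseller/userY.tar.gz';   // made-up path
var_dump(is_readable($ftp_local_file));
echo substr(sprintf('%o', fileperms($ftp_local_file)), -4) . "\n";  // e.g. 0600
?>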

Tino
 
tino said:
Warning: ftp_put(): userY.tar.gz: Permission denied in - on line 31
Error while uploading userY.tar.gz

What does line 31 of the backup program say?

tino said:
When I do a backup to the machine itself, so not using FTP, it works. And when I then manually FTP these files to my target machine, that works as well.

But perhaps you're doing it differently.

I'm sorry I don't have a better answer; perhaps DA support needs to get involved, though I'm not sure how they could help; it may not be a DA problem either :(.

Jeff
 
GXX said:
My issue is, it does a local full system backup just fine. I can see the directory and all files in it. It just does not FTP it over to another server when it's done.

You may have incorrect settings somewhere in your backup FTP configuration.

Can you FTP the backup manually? If so, make sure DA's settings match what you do manually.

I'm sorry I don't have any better ideas.

Jeff
 
This is the code causing the problem:

if (ftp_put($conn_id, $ftp_remote_file, $ftp_local_file, FTP_BINARY))
{
    exit(0);    // upload succeeded
}
else
{
    echo "Error while uploading $ftp_remote_file";
    exit(4);    // upload failed
}


My idea is that it has something to do with $conn_id. Perhaps the connection to the server timed out because tarring the huge file takes some time.

DA uses $conn_id = ftp_connect($ftp_server);

I will try to change it into:

$conn_id = ftp_connect($ftp_server, 21, 600);

(In this case, the timeout is 600 seconds instead of the default 90.)

Of course, you can also use ftp_set_option() and ftp_get_option() to change this value.
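
(A sketch of the same change using those functions; whether 600 seconds is enough is just my guess:)

<?php
// Hypothetical sketch: raise the FTP timeout on an existing connection
// instead of passing it to ftp_connect().
$conn_id = ftp_connect($ftp_server);               // default 90s timeout
echo ftp_get_option($conn_id, FTP_TIMEOUT_SEC);    // show the current value
ftp_set_option($conn_id, FTP_TIMEOUT_SEC, 600);    // raise it to 10 minutes
?>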

I surely would like DA to confirm this.

Tino
 
90 seconds seems much too low to me. I'd go for at least 10 minutes; perhaps 15.

As I've mentioned previously, the forums are NOT an official DA support venue; you could always send them an email :) .

Jeff
 
Can you use passive FTP?

It could be a firewall issue if you can't use passive FTP.
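
(For reference, passive mode is toggled with ftp_pasv() in PHP; a minimal sketch, assuming the same variable names the DA script uses:)

<?php
// Hypothetical sketch: request passive mode after logging in, which often
// helps when a firewall blocks active-mode data connections.
$conn_id = ftp_connect($ftp_server, 21, 600);
ftp_login($conn_id, $ftp_user, $ftp_pass);
ftp_pasv($conn_id, true);   // use passive mode for data transfers
?>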

Jeff
 
Since it always fails starting at the same domain, I don't think passive FTP is the problem.

Tino
 