Backup problem: file too big?

GXX said:
My issue is, it does a local full system backup just fine. I can see the directory and all files in it. It just does not FTP it over to another server when it's done.


Yes, we have exactly the same problem. Only one of our customers' backup files is over 2GB, and that is the one the FTP keeps failing on.

We really don't want to separate this customer's backup out and set up a manual method, as that creates an administrative nightmare.

Anyone got over this problem yet, 'cos it's a right pain in the ass!

Neil :-)
 
Some OS distributions can't handle files over 2GB.

What OS distribution are you using?

Are you using the system backup or the reseller backup?

If you're using the system backup, have you tried setting DA to upload each file as it's created? That might help by not building one huge file. Worth a try.

Jeff
 
Googling finds that the maximum file size on an ext2 filesystem is 2GB, and that an ext3 filesystem is the same as ext2, but with journaling added.

By default, CentOS systems use the ext3 file system, so it's logical to presume the problem is file sizes over 2GB.

Have you tried copying each file as it's created, as mentioned in my last post?

Jeff
 
Ah, I see how you do it as the admin user, but we do this under the reseller user.

Neil :D
 
As far as I know you cannot do it as a reseller user.

I did a bit more research after my last post; it appears that while Linux itself, and both the 2.4 and 2.6 kernels, can create files much larger than 2GB, the libraries compiled into most Linux systems cause the 2GB limit.
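For illustration, large-file support on glibc systems has to be requested when a program is compiled, roughly along these lines (the source file name here is just an example):

# Sketch: building a 32-bit program with large-file support enabled.
# Without these defines, off_t stays 32-bit and file access tops out at 2GB.
gcc -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -o copytool copytool.c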

Jeff
 
It looks like this issue was fixed with DA version 1.26.1? Can someone confirm this, as it wasn't mentioned in the release notes.

We updated DA to v1.26.1 on 25.12.05, and since then the FTP on the backups has not failed on files over 2GB.

No other updates were done at that time (e.g. Apache, PHP, Red Hat), so it was definitely something in the 1.26.1 release that made this work. We had updated Apache to v1.3.34 and PHP to v4.4.1 a few weeks before.


Neil :-)
 
The 2GB limit will occur sporadically, depending on your OS distribution and the way programs on it were compiled against certain libraries.

I'm sorry I can't be more specific.

You may never have a problem, but you may.

Jeff
 
I don't know about sporadic. The 2GB issue has been consistent every day (we back up customer files daily) for the last six months or so, ever since some customers' backup files exceeded 2GB. Since the DA 1.26.1 update there have been no more FTP failures on the accounts exceeding 2GB.

I guess time will tell.

Neil :-)
 
I use RHEL 3.0 w/ ext3 on my main shared server. No 2GB file size limit there; however, I back up remotely to a backup server provided by The Planet, and they do have a 2GB file size limit.

So what I ended up doing was having DA do a system backup to a staging area on my local disk. Then a shell script, run through cron a few hours later, tars up the backup directory, splits it into 1.5GB files, and copies those to my network share (the backup drive on the server The Planet provides). Works like a charm now that I have all the bugs worked out...
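Roughly, the cron job boils down to something like this (a rough sketch only; the paths, split size, and mount point are examples, not the actual script):

#!/bin/sh
# Sketch of the tar/split/copy approach -- paths and names are placeholders.
STAGING=/backup/staging            # where DA writes the system backup
PIECES=/backup/pieces              # scratch space for the split archive
SHARE=/mnt/remote-backup           # network share on the backup server

DAY=`date +%Y%m%d`
mkdir -p "$PIECES/$DAY"

# Tar the whole staging area and split it into 1.5GB chunks so no single
# file on the remote server exceeds its 2GB limit.
tar -cf - -C "$STAGING" . | split -b 1500m - "$PIECES/$DAY/backup.tar."

# Copy the chunks to the network share, then clean up locally.
cp "$PIECES/$DAY"/backup.tar.* "$SHARE/" && rm -rf "$PIECES/$DAY"

# To restore later: cat backup.tar.* | tar -xf -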

Anyhow, when I was doing Reseller backups directly to FTP, I was getting the same error with files larger than 2GB.
 
The update to DirectAdmin version 1.26.3 did not fix the upload problem for me here. I run DirectAdmin under CentOS 4.1 and am trying to back up to an Apple Xserve G5 running Mac OS Server 10.4.5.

If I have DirectAdmin back up to a local directory and then manually FTP the files over, everything works just fine; when DirectAdmin tries to automatically send the files using scheduled Reseller-level backups, it fails on any file over 2GB.

The fact that performing the FTP manually works seems to indicate to me that the problem is some sort of bug in DA. Am I wrong in my thinking here? Has anyone found a solution for this problem yet?

Thanks.
 
Same problem here. I have a customer with 5+ GB of photos and the backup fails (the backup server is indeed an ext2 filesystem). I would urge DirectAdmin to fix the backup code to allow multiple volumes.
 
I believe that the problem we are experiencing has to do with the fact that PHP does not support file sizes larger than 2GB. This limitation appears to exist whether or not the machine it has been installed on supports files larger than 2GB. See http://bugs.php.net/bug.php?id=24411 for a bit more info on this.

The DirectAdmin script that is failing is located in /usr/local/directadmin/scripts and is called ftp_upload.php.

I have rewritten it on my system using a shell_exec() call that runs curl to FTP the file, instead of using PHP's built-in FTP support, and my backups are working again.

I would be happy to post the script if someone wants to see it, but I won't take any responsibility if it messes up your system (I am quite new to PHP). Also, since I am always backing up to an FTP server that uses PASV, I didn't bother to have the script retry without it if the upload fails, as the current DirectAdmin-supplied script appears to do (I am both the admin and the only reseller on this particular server, so I can get away with this kind of sloppiness).
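For anyone who wants to try the same approach, the heart of it is just a curl upload like the one below (a sketch only; the host, credentials, and file name are placeholders, and the real ftp_upload.php is handed its values by DirectAdmin):

# Sketch: upload one backup file over FTP with curl.
# curl streams the file itself, so PHP's 2GB limit never comes into play.
LOCALFILE=/home/admin/user_backups/user.tar.gz

curl --ftp-pasv \
     --user backupuser:secret \
     --upload-file "$LOCALFILE" \
     "ftp://backup.example.com/backups/"

Something equivalent can be run from inside the PHP script with shell_exec().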
 