Incremental backup feature

moja (Verified User, joined Mar 4, 2010, 27 messages)
Hi,

please could you integrate an incremental backup feature directly into DirectAdmin? If you have a lot of domains on a server, you cannot make a daily FTP backup, because the server spends the whole day compressing data - it cannot be done just overnight...

I believe there are many people who would like to see an incremental backup system integrated into DA. Thank you.
 
Hmm, how would that be useful?

I remember hearing about a flag you can add to the gzip command to make a compressed backup that is still rsync-friendly (something like the sketch below), but it doesn't seem related here, since the gzip step comes afterwards...
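
(For reference: this is most likely gzip's --rsyncable option, which is a distro patch - Debian/Ubuntu ship it, but not every gzip build has it. A rough illustration, with invented paths and hostname:)

# create an rsync-friendly compressed archive; --rsyncable makes gzip
# restart its compression periodically, so a small change in the input
# only changes a small region of the compressed output
tar cf - /home/username | gzip --rsyncable > /backup/username.tar.gz

# later runs can then transfer only the changed blocks
rsync -av /backup/username.tar.gz backupserver:/backups/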

For adding additional files, of course, that's perfect, but I can't see the relation between incremental/non-compressed/rsync backups at all...

But it's probably just me not seeing it right now :)

Regards
 
Hmm, OK, that's right, but it would be a copy first and then compression... and if you move the backup while it's running, it will fail, I suppose... So yes, it can be done like that, but wouldn't that be a risk?
 
I think DA should just drop the current script and use duply. It supports lots of different backends, does incremental backups, encrypts everything, and can be scripted with pre and post scripts.
DA would just need to add the proper crons, just as it does now, and fill in the right config files with all the info - something like the cron line sketched below.
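
(A rough sketch of the kind of cron entry DA could install - the profile name 'da_users' and the log path are invented here; the batch command itself comes from duply's own examples:)

# hypothetical /etc/cron.d entry; 'da_users' is a made-up profile name
0 3 * * * root /usr/bin/duply da_users backup_verify_purge --force >> /var/log/duply.log 2>&1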
 
You'll need to teach us about duply, since their website says nothing. Their main page tells us its history and how to download, and the only links are to:

License
Download
Support / Bugs / Feature-Requests
Documentation
Changelog / Todo
Code

Documentation appears to be notable only by its absence.

Jeff
 
Here is the help text that comes with it:
VERSION:
duply version 1.5.4.2
(http://duply.net)

DESCRIPTION:
Duply deals as a wrapper for the mighty duplicity magic.
It simplifies running duplicity with cron or on command line by:

- keeping recurring settings in profiles per backup job
- enabling batch operations eg. backup_verify_purge
- executing pre/post scripts for every command
- precondition checking for flawless duplicity operation

For each backup job one configuration profile must be created.
The profile folder will be stored under '~/.duply/<profile>'
(where ~ is the current users home directory).
Hint:
If the folder '/etc/duply' exists, the profiles for the super
user root will be searched & created there.

USAGE:
first time usage (profile creation)
duply <profile> create

general usage in single or batch mode (see EXAMPLES)
duply <profile> <command>[_<command>_...] [<options> ...]

Non duply options are passed on to duplicity (see OPTIONS).
All conf parameters can also be defined in the environment instead.

PROFILE:
Indicated by a profile _name_ (<profile>), which is resolved to
'~/.duply/<profile>' (~ expands to environment variable $HOME).

Superuser root can place profiles under '/etc/duply' if the
folder is manually created before running duply.
ATTENTION:
If '/etc/duply' is created, old profiles under
'~root/.duply/<profile>' have to be moved manually
to the former or will cease to work.

example 1: duply humbug backup

Alternatively a _path_ might be used. This might be useful for quick testing,
restoring or exotic locations. Shell expansion should work as usual.
ATTENTION:
The path must contain at least one '/', e.g './test' instead of only 'test'.

example 2: duply ~/.duply/humbug backup

COMMANDS:
  usage      get usage help text
  changelog  print changelog / todo list
  create     creates a configuration profile
  backup     backup with pre/post script execution (batch: pre_bkp_post),
             full (if full_if_older matches or no earlier backup is found)
             incremental (in all other cases)
  pre/post   execute '<profile>/pre', '<profile>/post' scripts
  bkp        as above but without executing pre/post scripts
  full       force full backup
  incr       force incremental backup
  list [<age>]
             list all files in backup (as it was at <age>, default: now)
  status     prints backup sets and chains currently in repository
  verify     list files changed since latest backup
  purge [--force]
             shows outdated backup archives (older than $MAX_AGE)
             [actually delete these files]
  purge-full [--force]
             shows outdated backups (more than $MAX_FULL_BACKUPS,
             the number of 'recent' full backups and associated
             incrementals to keep) [actually delete these files]
  cleanup [--force]
             shows broken backup archives (e.g. after unfinished run)
             [actually delete these files]
  restore <target_path> [<age>]
             restore the backup to <target_path> [as it was at <age>]
  fetch <src_path> <target_path> [<age>]
             restore single file/folder from backup [as it was at <age>]

OPTIONS:
  --force    passed to duplicity (see commands: purge, purge-full, cleanup)
  --preview  do nothing but print out generated duplicity command lines
  --disable-encryption
             disable encryption, overrides profile settings

PRE/POST SCRIPTS:
All internal duply variables will be readable in the scripts.
Some of interest might be
CONFDIR, SOURCE, TARGET_URL_<PROT|HOSTPATH|USER|PASS>,
GPG_<KEYS_ENC|KEY_SIGN|PW>, CMD_<PREV|NEXT>
The CMD_* variables were introduced to allow different actions according to
the command the scripts were attached to e.g. 'pre_bkp_post_pre_verify_post'
will call the pre script two times, but with CMD_NEXT variable set
to 'bkp' on the first and to 'verify' on the second run.

EXAMPLES:
  create profile 'humbug':
    duply humbug create (now edit the resulting conf file)
  backup 'humbug' now:
    duply humbug backup
  list available backup sets of profile 'humbug':
    duply humbug status
  list and delete obsolete backup archives of 'humbug':
    duply humbug purge --force
  restore latest backup of 'humbug' to /mnt/restore:
    duply humbug restore /mnt/restore
  restore /etc/passwd of 'humbug' from 4 days ago to /root/pw:
    duply humbug fetch etc/passwd /root/pw 4D
    (see "duplicity manpage", section TIME FORMATS)
  a one line batch job on 'humbug' for cron execution:
    duply humbug backup_verify_purge --force

FILES in the profile folder (~/.duply/<profile>):
  conf          profile configuration file
  pre,post      pre/post scripts (see above for details)
  gpgkey.*.asc  exported GPG key file(s)
  exclude       a globbing list of included or excluded files/folders
                (see "duplicity manpage", section FILE SELECTION)

IMPORTANT:
Copy the _whole_ profile folder after the first backup to a safe place.
It contains everything needed to restore your backups. You will need
it if you have to restore the backup from another system (e.g. after a
system crash). Keep access to these files restricted as they contain
_all_ informations (gpg data, ftp data) to access and modify your backups.

Repeat this step after _all_ configuration changes. Some configuration
options are crucial for restoration.

And Google's #1 result:
http://trick77.com/2010/01/01/how-to-ftp-backup-a-linux-server-duply/
 
The number one reason geeks don't do well at marketing:

They completely ignore the features I need to know about before I invest time, energy and money in a product.

I still don't know what the mighty duplicity magic is.

Maybe I should try it on my linux desktop.

Jeff
 
Well, a lot of admin products work that way before they get bought, bloated and sold with insane margins by large corporations ;)

In a few words: duply lets you make incremental backups, in a secure way, to a location of your choosing.

Things that duply does that the current DA tool doesn't (a conf sketch follows the list):
- encrypts all data
- can back up to S3
- does incremental backups
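
(A rough sketch of a profile conf covering all three points - the variable names are the ones duply's help text mentions, but the key ID, passphrase, and bucket are invented:)

# ~/.duply/<profile>/conf - all values below are hypothetical
GPG_KEY='DEADBEEF'                       # everything is encrypted with this key
GPG_PW='_GPG_PASSWORD_'
TARGET='s3+http://my-bucket/da-backups'  # duplicity's S3 backend
SOURCE='/home'
MAX_AGE=1M                               # 'purge' deletes sets older than this
MAX_FULL_BACKUPS=2                       # full chains kept by 'purge-full'

# after the first full run, 'duply <profile> backup' is incremental automatically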
 
You'll need to teach us about duply, since their website says nothing. Their main page tells us its history and how to download, and the only links are to:

License
Download
Support / Bugs / Feature-Requests
Documentation
Changelog / Todo
Code

Documentation appears to be notable only by its absence.

Jeff

I see Documentation there. Right between Support... and Changelog... Might be just my imagination.
 
Sure, but nothing about what it does, or why anyone should use it.

This will be my last post on duply.

Jeff
 