Sunday, January 16, 2011

Incremental backup of site

I want to make a periodic archive of my site. I have an lftp script that downloads the site content via FTP into a directory named for today's date (date +%Y%m%d). What is the best way to make an incremental, compressed backup without a lot of duplicated data?
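For reference, the kind of lftp script mentioned here might look roughly like this. The host, credentials, and remote path are placeholders, not values from the question:

```shell
#!/bin/sh
# Hypothetical sketch of the download script described in the question:
# mirror the site over FTP into a directory named for today's date.
FTP_HOST="ftp.example.com"   # placeholder
FTP_USER="user"              # placeholder
FTP_PASS="password"          # placeholder

mirror_site() {
    dir="backup-$(date +%Y%m%d)"       # dated target directory
    mkdir -p "$dir"
    lftp -u "$FTP_USER,$FTP_PASS" "$FTP_HOST" \
         -e "mirror --verbose / $dir; quit"
}

# Computing the dated directory name needs no FTP connection:
echo "today's target directory: backup-$(date +%Y%m%d)"
```

Run daily (e.g. from cron) this produces one full dated copy per day; the question is how to turn those copies into compact incremental backups.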

  • Did you try rsync?

    Vlad : I can't use rsync - I have access only via FTP
    Vlad : and I want to make a compressed incremental backup
    From quamis
  • Duplicity may fit your needs.

    It is incremental: after a full backup is performed, all subsequent backups are simply difference files. Note that this is the opposite of backup tools that store a mirror of the latest state plus difference files to recreate earlier backup points.

    It is compressed: Duplicity produces encrypted backups (perhaps useful in your case, since you're stuck with FTP), and the encrypted volumes are compressed (as I understand it). You can also bypass the encryption with --no-encryption and simply get gzipped archives.

    It works over FTP: Duplicity can use many remote protocols (including FTP); the problem in your case is that duplicity would need to be run from your server. I don't believe you can use duplicity to back up a remote source to a local destination (only a local source to a remote destination).

    In your case, if you're not looking for compression while transferring the data, only while storing it, then you could keep your FTP script: after the current 'image' is transferred, have duplicity back up that temporary image to your existing archive, then delete the image. This way you would have a series of backup files that could be used to restore your site at any backup point, and those files would be gzipped archives of only the changes since the last backup point.

    Just a note: every so often it would be wise to do a 'full' backup, since duplicity's incremental backups form a chain going forward from the last full backup.

    Another solution (again assuming that temporarily storing an FTP'd copy locally is acceptable) would be to simply use rdiff-backup. This would give you a mirror of your site (as of the last backup), with past backups stored as reverse differences. I'm not sure whether those are compressed, but even if they aren't, you would only be storing the changes to files for each backup point.

    Vlad : so I can't run it from my local Linux machine?
    Tim Lytle : Not sure what you mean there. I don't believe you can run duplicity locally to back up a remote path (that would make the encryption somewhat meaningless). But you can run duplicity locally and back up to a local path (essentially making gzipped archives of the changes since the last full backup).
    From Tim Lytle
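As a rough sketch of the workflow described in this answer - FTP the site to a temporary local mirror, let duplicity diff it against a local archive, then delete the mirror. The paths here are illustrative assumptions, not values from the thread:

```shell
#!/bin/sh
# Illustrative paths - adjust to your own layout.
MIRROR="/tmp/site-mirror"           # temporary copy made by the FTP script
TARGET="file:///var/backups/site"   # local duplicity archive

if command -v duplicity >/dev/null 2>&1 && [ -d "$MIRROR" ]; then
    # First run creates a full backup; later runs are incremental.
    # --no-encryption leaves plain gzipped volumes, as noted above.
    duplicity --no-encryption "$MIRROR" "$TARGET"
    rm -rf "$MIRROR"                # drop the temporary image
else
    echo "duplicity not available or mirror missing; nothing to do"
fi
```

An occasional `duplicity full --no-encryption "$MIRROR" "$TARGET"` starts a fresh chain, per the note about periodic full backups, and `duplicity restore file:///var/backups/site /tmp/restored` recreates the site from the archive. The rdiff-backup alternative mentioned above would be a single `rdiff-backup /tmp/site-mirror /var/backups/site-rdiff` per run.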
  • backup2l is a very simple tool that builds incremental compressed archives, which you can then download via FTP.
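If backup2l looks like a fit, its setup is a configuration file plus a cron-driven run. The fragment below is a hedged sketch: the variable names follow a typical backup2l.conf, but the paths and limits are illustrative assumptions:

```shell
# Illustrative backup2l.conf fragment (paths and limits are examples):
VOLNAME="site"                  # name prefix for the archive files
SRCLIST="/var/www/site"         # directories to back up
BACKUP_DIR="/var/backups/site"  # where the archives are written
MAX_LEVEL=3                     # depth of the incremental hierarchy
MAX_PER_LEVEL=8                 # backups per level before stepping up
MAX_FULL=2                      # full backups to keep

# A backup run would then be something like:
#   backup2l -c /path/to/backup2l.conf -b
```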
