4

I'm trying to back up a folder containing several folders and files to a remote location (I will be uploading zipped files). Are there any existing scripts that could help me, which check if the files have been modified after the date of the last backup, and only back up files created or modified after that?

The current size of the data is around 1 GB, and I expect to add 50-200 MB each month.

Also, what would be the best way to extract the state of the files on a specific date?

tshepang
Dogbert
  • Does it need to be PHP? Do you have any access to a Linux or Windows command line? I'm sure there are solutions in PHP for this, but there are more of them, and more flexible and stable ones, in other languages (`rsync`, for example). – Pekka Jun 10 '10 at 11:38
  • `rsync` would definitely work, but I was going to suggest using `git`, `mercurial` or the like to get the job done. – Lieven Keersmaekers Jun 10 '10 at 11:53
  • @Pekka, I'm specifically looking for PHP based solutions, as this needs to run on a shared host. `rsync` is definitely the best for such things, but I was looking for something that could run seamlessly on shared hosts. – Dogbert Jun 10 '10 at 12:11
  • @Dogbert Did you find any PHP solution? – EmptyData May 25 '17 at 08:31

5 Answers

3

An incremental backup script using PHP:

http://web4u.mirrors.phpclasses.org/package/4841-PHP-Manage-backup-copies-of-files-.html

ftrotter
2

I would use Subversion for this. If you have a shell on the remote system, then it's easy to do this with a cron job.

If you have a shared host, then you could automate this process over SFTP/FTP by mounting the remote drive (perhaps with FUSE) and then running an `svn commit` via a cron job.
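A minimal sketch of that cron approach, assuming the remote drive is already mounted at `/mnt/backup` and checked out as a working copy (the mount point, schedule, and message are placeholders, not anything prescribed by the answer):

```shell
# Hypothetical crontab entry: every night at 02:00, register any new
# files in the mounted working copy and commit the changes.
# "svn add --force ." recursively adds unversioned files without
# erroring on paths that are already under version control.
0 2 * * * cd /mnt/backup && svn add --force . && svn commit -m "nightly backup"
```

Each commit then becomes a restorable snapshot, so `svn update -r {2010-06-01}` (or a peg revision) gives you the state of the files on a specific date.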

rook
2

You can call/execute `rsync` from PHP. As its name implies, `rsync` is a command that synchronizes directories, including remote ones. The good thing about `rsync` is that it only adds new resources, sends only diffs for updated resources, and deletes anything that is not in the source directory (if you want it to). Note that with this you don't get incremental backups. For that you should use a VCS (Git, SVN or CVS), as stated in other answers.

Here is a step-by-step rsync + PHP tutorial for using it from within PHP

redben
1

I don't think anything like this exists.

First, you'll need a recursive function to find all the files in a directory and all its subdirectories. There are a lot of examples of that problem; the idea is to use the `scandir()` function recursively.

Then, for each file found, you'll need to check if the file has been modified since your last backup, and if it has been modified, add it to the list of files to back up. You could do something like:

if (filemtime($filename) > $last_backup_time)
{
    $files_to_backup[] = $filename;
}

Finally, you just have to `copy()` or `ftp_put()` each file to back up (or an archive of the modified files) to the remote location.
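The same "modified since the last backup" selection can be sketched at the command line with GNU `find`, which is a handy way to cross-check what the PHP loop should pick up (the sandbox paths and dates below are made-up examples):

```shell
# Create a sandbox with one file older and one file newer than the
# last-backup date, then list only the files modified after it.
mkdir -p /tmp/inc_demo
touch -d "2010-01-01" /tmp/inc_demo/old.txt
touch -d "2010-06-15" /tmp/inc_demo/new.txt
# -newermt selects files whose modification time is after the given
# date, i.e. the same test as filemtime($f) > $last_backup_time.
find /tmp/inc_demo -type f -newermt "2010-06-01"   # -> /tmp/inc_demo/new.txt
```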

analogue
  • The RecursiveDirectoryIterator from the SPL is a much nicer way than recursive scandir calls ;) http://stackoverflow.com/questions/2418068/php-spl-recursivedirectoryiterator-recursiveiteratoriterator-retrieving-the-full – Tobias P. Jun 18 '10 at 12:31
  • You can also execute rdiff command for that also. See http://en.wikipedia.org/wiki/Rsync#Variations – redben Jun 18 '10 at 18:37
1

Instead of rsync, you could consider rdiff-backup. Using rsync techniques, it's able to make incremental remote backups. Git could also do this, but the downside is that you can't remove older increments from the repository (due to Git's nature).

I don't see why you'd write your own solution when other excellent solutions already exist.

Bram Schoenmakers