6

Need to find a way to upload files to my server through FTP. But only the ones that have been modified. Is there a simple way of doing that? Command line ftp client or script is preferred. Thanks, Jonas.

DavidG
user52965

7 Answers

6

The most reliable way would be to make md5 hashes of all the local files you care about and store them in a file. So the file will contain a list of filenames and their md5 hashes. Store that file on your ftp server. When you want to update the files on your ftp server, download the file containing the list, compare it against all your local files, and upload the files that have changed (or are new). That way you don't have to worry about archive bits, modified dates, or file sizes, none of which can ever be 100% reliable.

Using file sizes isn't reliable for the obvious reason - a file could change but have the same size. I'm not a fan of using the archive bit or modified date because either of those could be confused if you backup or restore your local directory with another backup program.
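A minimal sketch of this manifest idea, assuming GNU coreutils (`md5sum`, `grep`); the file names and manifest names here are made up for illustration:

```shell
# Minimal sketch of the hash-manifest approach; GNU md5sum assumed,
# file names are hypothetical.
workdir=$(mktemp -d) && cd "$workdir"
printf 'one\n' > a.txt
printf 'two\n' > b.txt

# Build the manifest; in real use you would also keep a copy on the server.
md5sum a.txt b.txt > manifest.md5

# Later: b.txt has changed and c.txt is new.
printf 'TWO\n' > b.txt
printf 'three\n' > c.txt
md5sum a.txt b.txt c.txt > manifest.new

# Lines in the new manifest that don't appear verbatim in the old one
# belong to changed or new files -- the only ones that need uploading.
grep -vxFf manifest.md5 manifest.new | awk '{print $2}' > upload.txt
cat upload.txt   # prints b.txt and c.txt
```

The `grep -vxF -f` comparison matches whole `hash  filename` lines exactly, so a file is listed whenever either its content hash or its name is new.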

mhenry1384
4

git-ftp works really nicely:

apt-get install git-ftp

in the application folder:

git config git-ftp.xxxx.url ftpservice.server.com/root/dir/for/ftp
git config git-ftp.xxxx.user myUsername
git config git-ftp.xxxx.password myPassword

Run the following only if the target is already up to date via some other ftp client (it sets ftpservice.server.com/root/dir/for/ftp/git-ftp.log to the current commit):

git ftp catchup --scope xxxx

After editing your sources:

git commit -m "new version"
git ftp push --scope xxxx

and you can see what happens:

git ftp show --scope xxxx
git ftp log --scope xxxx
mirek
4

Do you really insist on ftp, or can you use rsync instead?


If ftp is required, mhenry1384's idea is roughly what rsync does (well, half of it anyway; rsync also looks at modification times for files that differ...).

Community
dmckee --- ex-moderator kitten
2

(Waiting for the comment on the main question to be answered before expanding)

The strategy would be:

  1. Find the files changed using dates, times, archive bits or hashes (depending on OS)
  2. Using this list, generate an FTP script PUTting those files
  3. Run the FTP script.
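Steps 2 and 3 might be sketched like this with the stock command-line ftp client; the host, credentials, and file names are placeholders, and `changed.txt` stands in for whatever step 1 produced:

```shell
# Hypothetical sketch: build an ftp batch script from a changed-file list.
workdir=$(mktemp -d) && cd "$workdir"
printf 'index.html\nstyle.css\n' > changed.txt   # stand-in for step 1's output

{
  echo 'open ftp.example.com'
  echo 'user myUsername myPassword'
  echo 'binary'
  while read -r f; do echo "put $f"; done < changed.txt
  echo 'bye'
} > upload.ftp

# Step 3 would then be (not run here, since the host is made up):
# ftp -n < upload.ftp
```

The `-n` flag stops the client from auto-logging in, so the `user` line in the script supplies the credentials instead.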
ColinYounger
0

By using git-ftp it is relatively easy. But it works only if your computer is the only one that modifies the folder on the ftp server...

I have heard that rsync works too, but I am not sure of that; I have never used rsync... And it's not ftp...

inf3rno
0

NetBeans has a nice FTP client; it supports selective upload and download.

Junaid Qadir Shekhanzai
0

Jonas,

How often are the files likely to change?

If they change on a predictable time frame, you could consider using the modified and other date attributes on the file.
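One hedged sketch of that date-based approach, using a marker file and `find -newer`; the file names here are hypothetical, and as mhenry1384's answer notes, this is only safe if nothing (such as a backup/restore tool) disturbs the files' modification times:

```shell
# Hypothetical sketch: track "modified since last upload" with a marker file.
workdir=$(mktemp -d) && cd "$workdir"
touch .last_upload
sleep 1                       # ensure a visible mtime difference
printf 'new\n' > page.html

# Anything newer than the marker is a candidate for upload; the marker
# and the output list themselves are excluded.
find . -type f -newer .last_upload \
    ! -name .last_upload ! -name changed.txt > changed.txt
cat changed.txt               # ./page.html
touch .last_upload            # reset the marker for the next run
```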

Nick