
A customer gave us ftp-access to download PDF files.

Unfortunately I don't know if the file on the remote is ready to download.

This command line works:

ncftpget -u user -p pwd foo.example.com import created/*pdf

But I am afraid that the files may be incomplete. I don't want to download files which are not completely created on the remote site.

Client and server run on Linux. File locking is not available.

Just for the record: we switched from FTP to HTTP. Up to now we used FTP, but now we use a simple tool to upload files via HTTP: tbzuploader.

guettli

2 Answers


Check the size of the file every five seconds. If the size changes between consecutive checks, the file is still being written and therefore partial. If it stays the same, the file is complete.
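A minimal sketch of this polling idea in Python, using the standard library's ftplib. The host name, credentials, and `created` directory are taken from the question's command line; the wait interval and function names are my own:

```python
import time
from ftplib import FTP

def is_complete(get_size, path, wait=5):
    """True if the size reported for `path` is unchanged after `wait` seconds."""
    first = get_size(path)
    time.sleep(wait)
    return get_size(path) == first

def fetch_stable_pdfs(host, user, password, directory="created"):
    """Download only the *.pdf files whose size stayed constant while we waited."""
    ftp = FTP(host)
    ftp.login(user, password)
    ftp.voidcmd("TYPE I")  # binary mode, so the SIZE command is reliable
    for name in ftp.nlst(directory):
        if name.endswith(".pdf") and is_complete(ftp.size, name):
            with open(name.rsplit("/", 1)[-1], "wb") as local_file:
                ftp.retrbinary("RETR " + name, local_file.write)
    ftp.quit()
```

Note that this is a heuristic, not a guarantee: an upload that pauses for longer than the wait interval can still show the same size twice.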

Ramsad
  • Yes, that should work. A stateless solution would be easier, but I guess this is not possible at all (unless you switch from FTP to a nice HTTP API). – guettli Apr 10 '17 at 12:48

I use ftputil to implement this work-around:

  1. connect to ftp server
  2. list all files of the directory
  3. call stat() on each file
  4. wait N seconds
  5. For each file: call stat() again. If the result differs, skip this file, since it was modified during the last N seconds.
  6. If the stat() result is unchanged, download the file.
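The steps above can be sketched as follows. This assumes the third-party ftputil package (`pip install ftputil`); the server name, credentials, directory, and function names are placeholders, and the cache handling reflects that ftputil caches stat() results by default:

```python
import time

def download_stable(host, names, wait=10):
    """Download each file whose stat() result is unchanged after `wait` seconds."""
    before = {name: host.stat(name) for name in names}
    time.sleep(wait)
    host.stat_cache.clear()  # ftputil caches stat() results; force a fresh look
    downloaded = []
    for name in names:
        if host.stat(name) == before[name]:
            host.download(name, name)  # keep the same name locally
            downloaded.append(name)
    return downloaded

def fetch_pdfs(server, user, password, directory="created"):
    import ftputil  # third-party: pip install ftputil
    with ftputil.FTPHost(server, user, password) as host:
        host.chdir(directory)
        names = [n for n in host.listdir(host.curdir) if n.endswith(".pdf")]
        return download_stable(host, names)
```

Comparing the full stat() result (size and modification time) is slightly more robust than comparing the size alone, but it is still only a heuristic.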

This whole FTP fetching is old and obsolete technology. I hope the customer will offer a modern HTTP API next time :-)

guettli
  • Would you please rather add this solution to your other answer at http://stackoverflow.com/a/43339916/850848 not to scatter the information over different posts? And close this question as duplicate? – Martin Prikryl Apr 13 '17 at 10:42
  • 1
    @MartinPrikryl, yes ok. Let's close my question. – guettli Apr 13 '17 at 10:57