I'm building a desktop application for Windows in Python 2.7. Its primary function is to watch a folder for new files. Whenever a new file appears in this folder, the app uploads it to a remote server. A process on the remote server creates a database record for the file and stores the remote file path in that record.
Currently I'm using watchdog to monitor the directory and httplib for the file upload.
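For reference, the current setup looks roughly like this (a minimal sketch; the server host, upload path, and `X-Filename` header are placeholders, not my real endpoint):

```python
import os
import httplib

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

# Placeholder server details, just for illustration.
SERVER_HOST = 'example.com'
UPLOAD_PATH = '/upload'

def upload_file(path):
    # Naive single-shot upload: read the whole file into memory
    # and POST it in one request.
    with open(path, 'rb') as f:
        body = f.read()
    conn = httplib.HTTPConnection(SERVER_HOST)
    conn.request('POST', UPLOAD_PATH, body,
                 {'X-Filename': os.path.basename(path)})
    status = conn.getresponse().status
    conn.close()
    return status

class NewFileHandler(FileSystemEventHandler):
    # Upload every file that appears in the watched folder.
    def on_created(self, event):
        if not event.is_directory:
            upload_file(event.src_path)

observer = Observer()
observer.schedule(NewFileHandler(), r'C:\watched', recursive=False)
observer.start()
```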
What approach should I take to ensure that a new file is uploaded reliably, regardless of network conditions or loss of the internet connection?
Update: By a reliable upload I mean that the app will finish uploading the file even if the app restarts, the way Dropbox does. Some of the files are quite big (> 100 MB), so simple solutions like wrapping the code in a `try`/`except` block and restarting the upload from the beginning are not very efficient. I know Dropbox uses librsync, but it might be overkill in this case. And what if the source file changes during the upload? Should I stop the upload and start over?
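To make it concrete, the simple approach I'm dismissing looks something like the sketch below. It reuses the `upload_file` helper from above; the exponential backoff and the mtime/size check for detecting a changed source file are my own illustrative assumptions, not a proven design:

```python
import os
import time
import httplib

def upload_with_retry(path, max_attempts=5):
    # On any failure the whole file is re-sent from the beginning,
    # which is exactly what's wasteful for files over 100 MB.
    before = os.stat(path)
    for attempt in range(max_attempts):
        try:
            status = upload_file(path)
        except (httplib.HTTPException, IOError):
            time.sleep(2 ** attempt)  # crude backoff before retrying
            continue
        after = os.stat(path)
        if (after.st_mtime, after.st_size) != (before.st_mtime, before.st_size):
            # Source file changed while we were uploading, so the copy
            # on the server is stale; start over with the new contents.
            before = after
            continue
        return status
    raise IOError('giving up on %s' % path)
```

This also doesn't survive an app restart, since the retry state lives only in memory.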