The situation: files are dumped into a folder that we monitor constantly with our own logic. When files appear in the folder, they are processed automatically, but we only want to process files that have been fully copied into the directory. If a large file (e.g. 100 MB) is being copied into the folder, we don't want to process it until the copy is complete.
Currently we test this with this code:
FileStream fs = null;
try
{
    fs = fileInfo.Open(FileMode.Open, FileAccess.Read, FileShare.Read);
    // The open succeeded, so the file is 'complete'.
}
catch (IOException)
{
    // The file is locked, don't do anything (maybe Windows Explorer is still copying it).
    // Note: a sharing violation surfaces as an IOException, not a SecurityException,
    // so this is the exception to catch here.
}
finally
{
    fs?.Close();
}
As I think the SO user Hans Passant once said, the only reliable way to test this is to try to open the file.
This code works, but it feels dated.
Are there more efficient methods or techniques to implement and test this? Performance is critical: the faster, the better.
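For context, this is roughly how the check plugs into our monitoring loop. A hedged sketch, not our production code: the drop-folder path, the 250 ms poll interval, and blocking inside the event handler are all placeholder simplifications.

```csharp
using System;
using System.IO;
using System.Threading;

class DropFolderWatcher
{
    static void Main()
    {
        // @"C:\drop" is a placeholder path for the monitored folder.
        var watcher = new FileSystemWatcher(@"C:\drop") { EnableRaisingEvents = true };
        watcher.Created += (sender, e) =>
        {
            // Poll until the copy finishes; the interval is arbitrary.
            while (!IsFileReady(e.FullPath))
                Thread.Sleep(250);
            Console.WriteLine($"Processing {e.FullPath}");
        };
        Console.ReadLine(); // keep the process alive while watching
    }

    // Same try-open test as above: exclusive open succeeds only
    // when no other process still holds the file.
    static bool IsFileReady(string path)
    {
        try
        {
            using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                return true;
        }
        catch (IOException)
        {
            return false;
        }
    }
}
```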