
I have a Windows service that monitors a folder for new files and then sends them to an S3 bucket for processing. The service works for the first file, but then I get this exception for any file after that:

Message: 'The process cannot access the file 'C:\PDFs\sample-2.pdf' because it is being used by another process.' when writing an object

Here is part of the code for the upload:

    private static async Task UploadFileAsync()
    {
        try
        {
            var fileTransferUtility = new TransferUtility(s3Client);

            //await fileTransferUtility.UploadAsync(filePath, bucketName);

            Log.WriteLine("FilePath: " + filePath);

            using (var fileToUpload = new FileStream(filePath, FileMode.Open, FileAccess.Read))
            {
                await fileTransferUtility.UploadAsync(fileToUpload, bucketName, fileToUpload.Name);
            }

            Log.WriteLine("File uploaded: " + filePath);
        }
        catch (AmazonS3Exception e)
        {
            Log.WriteLine("Error encountered on server. Message: '"+ e.Message +"' when writing an object");
        }
        catch (Exception e)
        {
            Log.WriteLine("Unknown encountered on pc. Message: '" + e.Message + "' when writing an object");
        }
    }

I originally had the FileMode set to Read and the FileAccess set to Read as well.

Any clues?

linuxer
  • Are you waiting enough time for the sample-2.pdf file to finish writing before you pick it up? Like are the PDFs being uploaded from an FTP, or dumped there by another process? Sometimes the file will take a second or 2 to finish uploading, and during that time it is locked by the OS. – Jon Apr 24 '19 at 17:55
  • @Jon Yeah, I've waited minutes to hours. If I restart the service then I can upload a single file again. I am just copying and pasting the file from one dir to the monitored dir. – linuxer Apr 24 '19 at 18:00
  • Shouldn't this line be fileToUpload = new FileStream(filePath, FileMode.Open, FileAccess.Read)) ? – Jon Apr 24 '19 at 18:20
  • @Jon That is what I originally had and it doesn't work. Tried again just barely to be sure. – linuxer Apr 24 '19 at 19:34
  • @Jon It works if I copy multiple files in at the same time. The issue appears to be one at a time. – linuxer Apr 24 '19 at 19:43
  • @Jon Could it have something to do with it being an async task? – linuxer Apr 24 '19 at 19:48
  • Is it possible that you have 2 threads trying to access the same file at the same time? Try just uploading the files serially, using one thread, one at a time. – Jon Apr 25 '19 at 14:38
  • OK, so I figured something out. It appears to only have the issue if I copy and paste. If I move the file to the directory, it works every time. Is there a way to detect if the file has finished copying before attempting to upload? – linuxer Apr 29 '19 at 18:54
  • Yes, I use the IsFileLocked() answer from here: https://stackoverflow.com/questions/876473/is-there-a-way-to-check-if-a-file-is-in-use – Jon Apr 29 '19 at 19:07
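The approach Jon links to can be sketched roughly as below: try to open the file exclusively, and retry with a delay until the copying process releases it, then upload. This is a minimal sketch, not code from the thread; the helper names (`IsFileLocked`, `WaitForFileReadyAsync`), the retry count, and the delay are my own choices.

    using System;
    using System.IO;
    using System.Threading.Tasks;

    // Returns true if another process still holds the file (e.g. a copy in progress).
    static bool IsFileLocked(string path)
    {
        try
        {
            // Opening with FileShare.None fails while the file is still being written.
            using (new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
            {
                return false;
            }
        }
        catch (IOException)
        {
            return true;
        }
    }

    // Polls until the file is no longer locked, then returns; throws if it never unlocks.
    static async Task WaitForFileReadyAsync(string path, int maxAttempts = 10)
    {
        for (int attempt = 0; attempt < maxAttempts; attempt++)
        {
            if (!IsFileLocked(path))
                return;
            await Task.Delay(500); // back off before re-checking
        }
        throw new IOException("File '" + path + "' is still locked after " + maxAttempts + " attempts.");
    }

With something like this in place, the service would call `await WaitForFileReadyAsync(filePath);` before opening the `FileStream` in `UploadFileAsync`, so a file dropped by a slow copy is not picked up mid-write.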

0 Answers