
In my Windows Service solution, I have a FileSystemWatcher monitoring a directory tree for new files, and whenever it fires a Created event I try to move the files asynchronously to another server for further processing. Here's the code:

foreach (string fullFilePath in
         Directory.EnumerateFiles(directoryToWatch, "*.*",
                                  SearchOption.AllDirectories)
                  .Where(filename => fileTypes.Contains(Path.GetExtension(filename))))
{
    string filename = Path.GetFileName(fullFilePath);

    // Open the source by its full path; the bare file name would be resolved
    // against the process's current directory.
    using (FileStream sourceStream = File.Open(fullFilePath, FileMode.Open, FileAccess.Read))
    {
        using (FileStream destStream = File.Create(Path.Combine(destination, filename)))
        {
            await sourceStream.CopyToAsync(destStream);
        }
    }
}

The problem is that as these files are being copied into the folder I'm watching, they're not always unlocked and available to me yet. I want to retry when I hit a locked file, but I'm not used to thinking asynchronously, so I have no idea how to put the failed file back in the queue.

Scott Baker
  • possible duplicate of [Filesystem watcher and large files](http://stackoverflow.com/questions/3822175/filesystem-watcher-and-large-files) – Jim Mischel Jul 23 '13 at 16:39

1 Answer


First of all, you need to detect the exceptions thrown during the asynchronous execution. This can be done with something like this:

        try
        {
            await sourceStream.CopyToAsync(destStream);
        }
        catch (Exception copyException)
        {
            //Inspect copyException here and decide whether it warrants a retry.
        }

Once an exception is detected and properly handled, i.e. you decide that a particular exception is a reason for a retry, you will have to maintain your own queue of copy tasks (sources and destinations) that are due for a retry.
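
A locked file typically surfaces as an IOException, so one way to make that decision is to catch it separately from everything else. A sketch, not a complete classification of retryable errors:

    try
    {
        await sourceStream.CopyToAsync(destStream);
    }
    catch (IOException lockException)
    {
        //Locked or still-being-written files surface as IOException;
        //treat these as retryable and re-enqueue the task here.
    }
    catch (Exception fatalException)
    {
        //Anything else is unlikely to succeed on a retry; log it and drop the task.
    }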

Then you will have to organize a new entry point that triggers the retry itself. Such an entry point could be a timer, or the next event from the file system monitor you use (which I would not recommend). You will also have to detect overflow of your queue in case of repeated failures. Keep in mind that this kind of overflow also exists inside the file system monitor itself, which can simply skip notifications when there are too many file system events (for example, when many files are copied into the monitored folders at once).
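
For reference, the standard FileSystemWatcher reports such skips through its Error event, and its internal buffer can be enlarged to make them less likely. A minimal sketch, reusing directoryToWatch from the question:

    FileSystemWatcher watcher = new FileSystemWatcher(directoryToWatch);
    //The default buffer is 8 KB; a larger one makes dropped events less likely.
    watcher.InternalBufferSize = 64 * 1024;
    watcher.Error += (sender, e) =>
    {
        if (e.GetException() is InternalBufferOverflowException)
        {
            //Notifications were lost; fall back to a full rescan of the tree here.
        }
    };
    watcher.EnableRaisingEvents = true;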

If these issues do not bother you much, I would suggest implementing a timer, or more precisely a timeout, to retry the copy task. If, on the other hand, you need a robust solution, I would implement the file system monitor myself.
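
A hand-rolled monitor in that spirit could simply rescan the watched tree on a timer and diff it against the set of files already handed out. A minimal sketch, reusing directoryToWatch, destination and queue from the question and this answer; the seenFiles set and the timer wiring are assumptions:

    private readonly HashSet<string> seenFiles = new HashSet<string>();

    //Invoke this from a System.Threading.Timer on a fixed interval.
    private void ScanTimerTick(object state)
    {
        foreach (string path in Directory.EnumerateFiles(directoryToWatch, "*.*",
                                                         SearchOption.AllDirectories))
        {
            //HashSet<T>.Add returns false for paths already seen,
            //so each file is enqueued exactly once.
            if (seenFiles.Add(path))
            {
                lock (queue)
                {
                    queue.Enqueue(new myCopyTask
                    {
                        sourcePath = path,
                        destinationPath = Path.Combine(destination, Path.GetFileName(path))
                    });
                }
            }
        }
    }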

As for the timer-based timeout, it could look like this:

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

internal class Program
{
    private readonly Queue<myCopyTask> queue = new Queue<myCopyTask>();
    private readonly Timer retryTimeout;

    public Program()
    {
        retryTimeout = new Timer(QueueProcess, null, Timeout.Infinite, Timeout.Infinite);
    }

    private void FileSystemMonitorEventhandler()
    {
        //New tasks are provided by the file system monitor.
        myCopyTask newTask = new myCopyTask();
        newTask.sourcePath = "...";
        newTask.destinationPath = "...";

        //Keep in mind that the queue is touched from different threads.
        lock (queue)
        {
            queue.Enqueue(newTask);
        }

        //Keep in mind that the Timer is touched from different threads.
        lock (retryTimeout)
        {
            retryTimeout.Change(1000, Timeout.Infinite);
        }
    }

    //Start this routine only via the Timer.
    private void QueueProcess(object iTimeoutState)
    {
        while (true)
        {
            myCopyTask task = null;

            //Keep in mind that the queue is touched from different threads.
            lock (queue)
            {
                if (queue.Count > 0)
                {
                    task = queue.Dequeue();
                }
            }

            //Stop once the queue has been drained.
            if (task == null)
            {
                break;
            }

            CopyTaskProcess(task);
        }
    }

    //Note: async void is fire-and-forget; the caller cannot await this method or
    //observe its exceptions, so every failure must be handled inside it.
    private async void CopyTaskProcess(myCopyTask task)
    {
        FileStream sourceStream = null;
        FileStream destStream = null;

        try
        {
            sourceStream = File.OpenRead(task.sourcePath);
            //File.Create truncates any partial file left behind by a failed attempt.
            destStream = File.Create(task.destinationPath);
            await sourceStream.CopyToAsync(destStream);
        }
        catch (Exception copyException)
        {
            task.retryCount++;

            //To avoid instant retries of several problematic tasks in a row you
            //probably want to delay the retry. Keep in mind that this approach
            //blocks the thread pool worker that the await keyword resumed on.
            Thread.Sleep(100);

            //Keep in mind that the queue is touched from different threads.
            lock (queue)
            {
                queue.Enqueue(task);
            }

            //Keep in mind that the Timer is touched from different threads.
            lock (retryTimeout)
            {
                retryTimeout.Change(1000, Timeout.Infinite);
            }
        }
        finally
        {
            if (sourceStream != null)
            {
                sourceStream.Close();
            }

            if (destStream != null)
            {
                destStream.Close();
            }
        }
    }
}

internal class myCopyTask
{
    public string sourcePath;
    public string destinationPath;
    public long retryCount;
}
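
One thing the sketch leaves open is the overflow detection mentioned above: retryCount grows but is never checked. A minimal cap, with a hypothetical maxRetryCount constant, would replace the plain Enqueue in the catch block:

    private const long maxRetryCount = 10; //Hypothetical limit; tune to taste.

    //In CopyTaskProcess's catch block, re-enqueue only while the cap is not reached.
    if (task.retryCount < maxRetryCount)
    {
        lock (queue)
        {
            queue.Enqueue(task);
        }
    }
    else
    {
        //Give up on this task; log task.sourcePath so the file is not silently lost.
    }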
Zverev Evgeniy