
Not sure if this is even possible, but here's the idea. I have to process a bunch of files in a path, and I want to do them concurrently. My application uses external processes (such as FTP) that can time out or fail in other ways (not ideal, for sure, but I'm trying to fix one thing at a time).

I want to process all these files (or batches of them in parallel) but give each thread a limit on how long it can take (say, 5 minutes). So I kick off each thread, and if it isn't done within 5 minutes it should just be killed off.

Because it's possible (almost certain) that the threads won't all kick off at the same time, I don't want to give them five minutes total; each thread should get its own five minutes.

Here's some pseudocode:

DirectoryInfo di = new DirectoryInfo(somePath);
FileInfo[] files = di.GetFiles("*.txt"); // grab all text files in path
Parallel.ForEach(files, currFile => ProcessFile(currFile));

This will nicely spin off the threads for each file and run ProcessFile on each .txt file in the path, but is there a way to limit each call to ProcessFile to only a certain amount of time?

sedavidw

1 Answer


I think this is very much possible, if I understand your question correctly. You are trying to spin off a bunch of tasks that all execute the same logic, just on different files, and you don't want any one of them to take too long; it should time out if it does. If that's right, then Tasks with a timeout are what you're looking for. A quick search brought me to this SO solution, which I think is similar to what you need. Essentially, you create the method that processes your files and run each call as a Task with a timeout. Using Tasks in this manner is one of the simplest and, I think, cleanest ways to handle this. Other options include creating timers inside each thread, or a class that manages each thread with timers. I've done this almost every way it can be done (except cancellation tokens), and Task with async/await is by far the best option IMO.
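Here is a minimal sketch of the per-file-timeout idea using a `CancellationTokenSource` that cancels itself after a timeout, with each file's clock starting only when its own task starts. The file names, durations, and the simulated `ProcessFile` body are placeholders; in your case the work would be the FTP call, and it should observe the token so it can actually stop.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    // Stand-in for the real per-file work (FTP transfer, parsing, etc.).
    // 'duration' simulates how long the work takes; a real implementation
    // would pass the token into any cancellable calls it makes.
    static void ProcessFile(string file, TimeSpan duration, CancellationToken token)
    {
        // Wait either for the simulated work to finish or for cancellation;
        // WaitOne returns true if the token was signalled first.
        if (token.WaitHandle.WaitOne(duration))
            token.ThrowIfCancellationRequested();
    }

    // Runs ProcessFile for one file, cancelling it after 'timeout'.
    // Returns true if the file finished in time, false if it timed out.
    static async Task<bool> ProcessWithTimeoutAsync(string file, TimeSpan work, TimeSpan timeout)
    {
        // This CancellationTokenSource overload cancels itself after 'timeout'.
        using (var cts = new CancellationTokenSource(timeout))
        {
            try
            {
                await Task.Run(() => ProcessFile(file, work, cts.Token), cts.Token);
                return true;
            }
            catch (OperationCanceledException)
            {
                return false; // this file exceeded its own budget
            }
        }
    }

    static async Task Main()
    {
        // Each file gets its own timeout clock; in your scenario both
        // timeouts would be TimeSpan.FromMinutes(5).
        Task<bool> fast = ProcessWithTimeoutAsync("fast.txt",
            work: TimeSpan.FromMilliseconds(50), timeout: TimeSpan.FromSeconds(5));
        Task<bool> slow = ProcessWithTimeoutAsync("slow.txt",
            work: TimeSpan.FromSeconds(30), timeout: TimeSpan.FromMilliseconds(200));

        bool[] results = await Task.WhenAll(fast, slow);
        Console.WriteLine($"fast finished: {results[0]}, slow finished: {results[1]}");
    }
}
```

Note that cancellation here is cooperative: the token can't forcibly kill a call that ignores it, so a truly stuck external process may need to be killed via `Process.Kill` instead.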

xtreampb