Not sure if this is even possible, but here's the idea. I have to process a bunch of files in a path, and I want to do them concurrently. My application uses external processes (such as FTP) that can time out or error in other ways (not ideal, for sure, but I'm trying to fix one thing at a time).
I want to process all these files (or batches of them) in parallel, but give each thread a limit on how long it can take (say 5 minutes). So I kick off each thread, and if it isn't done within 5 minutes, it should just be killed off.
Because it's possible (almost certain) that all the threads won't kick off at the same time, I don't want a five-minute limit overall; each thread should get its own five minutes.
Here's some pseudocode:
DirectoryInfo di = new DirectoryInfo(somePath);
FileInfo[] files = di.GetFiles("*.txt"); // grab all the .txt files in the path
Parallel.ForEach(files, currFile => ProcessFile(currFile));
This spins off the threads nicely and runs ProcessFile on each .txt file in the path, but is there a way to limit each call to ProcessFile to only a certain amount of time?
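For reference, here's the direction I was considering, just as a sketch (it needs System, System.IO, and System.Threading.Tasks). I'm not sure it's right: as far as I can tell, Task.Run plus Wait with a timeout only abandons the work after 5 minutes rather than actually killing it, which is part of what I'm asking about.

DirectoryInfo di = new DirectoryInfo(somePath);
FileInfo[] files = di.GetFiles("*.txt");

Parallel.ForEach(files, currFile =>
{
    // Run the work on its own task so we can wait with a timeout.
    Task work = Task.Run(() => ProcessFile(currFile));

    // Wait(TimeSpan) returns false if the timeout elapses first,
    // but the underlying task keeps running; it isn't killed.
    if (!work.Wait(TimeSpan.FromMinutes(5)))
    {
        Console.WriteLine("Timed out: " + currFile.Name);
    }
});

The unsatisfying part is that an abandoned task keeps running in the background, so with enough timeouts the stuck FTP calls could pile up. Is there a cleaner way to actually cut a call off?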