For a project we need to code a Windows service that processes a job queue, doing the necessary processing and making calls to different APIs. It will run on a stand-alone machine.
Context: the service will poll the SQL database every X seconds (e.g. 5), get the top Y jobs ordered by priority and creation date, and start processing them. We expect a huge volume for this Windows service, so we want to make it multi-threaded or asynchronous. Once the maximum number of parallel jobs is reached, it must not launch more threads. But at the same time, if 7 jobs take 30 seconds each and one takes 5 minutes, we don't want to wait for the 5-minute job to finish before looping and starting another batch of 8.
The first option we looked at was BackgroundWorker: on every timer tick, check the status of each BackgroundWorker and, if one is available, instruct it to process a new job. But in the newer versions of the .NET Framework, BackgroundWorker is effectively obsolete in favor of async methods.
The second option we looked at was Parallel.ForEach and Task.WhenAll. The problem: if 7 of the 8 threads take 1 minute but the last one takes 6 minutes, we do not want to wait for that last one to finish before starting 7 new jobs.
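To make the batch problem concrete, here is a minimal sketch of what we mean (`ProcessJobAsync` and the durations are hypothetical placeholders, not our real code):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class BatchExample
{
    // Hypothetical job processor: simulates work with a delay.
    static Task ProcessJobAsync(int jobId, TimeSpan duration) => Task.Delay(duration);

    static async Task Main()
    {
        // 7 short jobs (30 s) and one long job (5 min).
        var durations = Enumerable.Repeat(TimeSpan.FromSeconds(30), 7)
                                  .Append(TimeSpan.FromMinutes(5));

        var batch = durations.Select((d, i) => ProcessJobAsync(i, d)).ToList();

        // Task.WhenAll only completes when the SLOWEST job does, so the
        // 7 slots that finished after 30 s sit idle for ~4.5 minutes.
        await Task.WhenAll(batch);
    }
}
```

This is exactly the idle time we want to avoid: freed slots should pick up new jobs immediately.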
The most appropriate approach seems to be Tasks (the Task Parallel Library).
My questions are: is there a way to track the status of multiple running tasks, and to limit the number of tasks running simultaneously? Should I instantiate all my tasks in the OnStart() of the service and, on each Elapsed event of my timer, check the status of each task and relaunch any idle one with a new job? Or am I completely wrong about how tasks work?
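For reference, this is the kind of pattern we are considering (only a sketch under our assumptions; `DispatchAsync` and the delegate passed to it are placeholders): a SemaphoreSlim caps the number of concurrent jobs, and each job runs as its own free-running task that releases its slot the moment it finishes.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class JobDispatcher
{
    private readonly SemaphoreSlim _slots;

    public JobDispatcher(int maxParallel) => _slots = new SemaphoreSlim(maxParallel);

    // Called from the timer's Elapsed handler, once per job fetched from the DB.
    public async Task DispatchAsync(Func<Task> processJob)
    {
        // Waits (asynchronously) only when all slots are busy.
        await _slots.WaitAsync();

        // Fire-and-forget: each job releases its OWN slot when done,
        // so a 5-minute job never blocks the other slots.
        _ = Task.Run(async () =>
        {
            try { await processJob(); }
            finally { _slots.Release(); }
        });
    }
}
```

With this shape there is no batch boundary at all: the timer just keeps dispatching, and `WaitAsync` enforces the limit. Is this the idiomatic way, or is there a built-in mechanism we are missing?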
The number of allowed parallel jobs will be defined in an app.config file and read in the OnStart of the Windows service.
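Reading that limit would presumably look something like this (the `"MaxParallelJobs"` key name is an assumption; this is a config-reading fragment, not our actual OnStart):

```csharp
using System.Configuration;   // requires a reference to System.Configuration.dll

// Inside OnStart(string[] args):
int maxParallel;
if (!int.TryParse(ConfigurationManager.AppSettings["MaxParallelJobs"], out maxParallel))
    maxParallel = 8;          // fallback default if the key is missing or invalid
```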