As you want to achieve parallel processing, using a Task and async processing is a good approach. First, define a method that will serve as your Task/Action and that includes a retry pattern internally:
public const int MAX_RETRY_COUNT = 3;

private void ProcessItem(Item item, int retryCount)
{
    // Note: Exceptions thrown here will pop up
    // as AggregateException in Task.WaitAll()
    if (retryCount >= MAX_RETRY_COUNT)
        throw new InvalidOperationException(
            "The maximum amount of retries has been exceeded");

    retryCount++;

    try
    {
        // Do stuff with item
    }
    catch (Exception ex)
    {
        // Log ex here if relevant; if not, just retry.
        // The recursive call sits outside the try block, so the
        // InvalidOperationException above is not caught here.
        ProcessItem(item, retryCount);
    }
}
Once you have defined that method, you can process a bulk of tasks at once:
public const int BULK_AMOUNT = 10;

private async Task ProcessSqlData()
{
    List<Item> lstItems = await UnitOfWork.Items.GetAllAsync();

    for (var i = 0; i < lstItems.Count; i += BULK_AMOUNT)
    {
        // Running the process in parallel with too many items
        // might slow down the whole process, so just take a bulk
        var bulk = lstItems.Skip(i).Take(BULK_AMOUNT).ToArray();
        var tasks = new Task[bulk.Length];

        for (var j = 0; j < bulk.Length; j++)
        {
            // Copy the current element into a local variable so the
            // lambda doesn't capture the shared loop variable j
            var item = bulk[j];

            // Create and start a task, using ProcessItem as the action
            tasks[j] = Task.Run(() => ProcessItem(item, 0));
        }

        // Wait for the bulk to complete
        try
        {
            Task.WaitAll(tasks);
        }
        catch (AggregateException e)
        {
            Log.WriteLine(String.Format(
                "The maximum amount of retries has been exceeded in bulk #{0}. Error message: {1}",
                i / BULK_AMOUNT,
                e.InnerException != null
                    ? e.InnerException.Message
                    : e.Message));
        }
    }
}
However, if you know that the machine running this has enough headroom, you can increase BULK_AMOUNT. Test different values to find the optimal amount for your environment.
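As an alternative to fixed bulks (where each bulk only completes as fast as its slowest item), you could throttle the degree of parallelism with a SemaphoreSlim. The following is just a minimal sketch built on the same assumptions as above (the Item type, UnitOfWork.Items.GetAllAsync() and the ProcessItem method); the limit of 10 concurrent items is as arbitrary as BULK_AMOUNT and should be tuned the same way:

// Requires System.Linq, System.Threading and System.Threading.Tasks
private async Task ProcessSqlDataThrottled()
{
    List<Item> lstItems = await UnitOfWork.Items.GetAllAsync();

    // At most 10 items run at any time, but a slot is freed as soon
    // as any single item finishes instead of waiting for a whole bulk
    using (var throttle = new SemaphoreSlim(10))
    {
        var tasks = lstItems.Select(async item =>
        {
            await throttle.WaitAsync();
            try
            {
                await Task.Run(() => ProcessItem(item, 0));
            }
            finally
            {
                throttle.Release();
            }
        }).ToArray();

        await Task.WhenAll(tasks);
    }
}

Note that an awaited Task.WhenAll rethrows the first inner exception directly rather than wrapping it in an AggregateException, so the logging would have to be adjusted accordingly.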