As @JuanR mentioned, the issue arises because too many requests are sent to the API faster than they can be handled.
To overcome this issue and have more control over sending requests, I can think of two approaches:
## Using an async method

You can define `GetItemsByID` as `async` and await the response to each request before sending the next one.
```csharp
// Requires: using System.Net.Http;
private static async Task GetItemsAsync(List<dynamic> list)
{
    var client = new HttpClient();
    foreach (var n in list)
    {
        var res = await client.GetAsync("http://api/users/" + n.ID);
        // Do whatever you want with `res` here
    }
}
```
## Calling the async method

- Call it inside an `async` method: `await GetItemsAsync(list);`
- Call it inside a non-`async` method (this blocks the calling thread): `GetItemsAsync(list).Wait();`
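Note that `.Wait()` blocks the calling thread and wraps any failure in an `AggregateException`. If you must block, `GetAwaiter().GetResult()` also blocks but rethrows the original exception; a minimal sketch (the `try`/`catch` is just for illustration):

```csharp
// Blocking alternative that surfaces the original exception type:
try
{
    GetItemsAsync(list).GetAwaiter().GetResult();
}
catch (HttpRequestException ex)
{
    // Handle failed requests here
    Console.WriteLine(ex.Message);
}
```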
## More control using ActionBlock

You can use `ActionBlock` to send multiple calls to the API at the same time, which *could* make the whole process faster, while limiting the number of concurrent calls with `MaxDegreeOfParallelism`:
```csharp
// Requires: using System.Net.Http; and using System.Threading.Tasks.Dataflow;
private static void GetItemsByID(List<dynamic> list)
{
    var client = new HttpClient();
    var workerBlock = new ActionBlock<string>(async id =>
    {
        var res = await client.GetAsync("http://api/users/" + id);
        // Do whatever you want with `res` here
    },
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

    foreach (var n in list)
    {
        workerBlock.Post(n.ID);
    }
    workerBlock.Complete();
    // Wait for all posted items to be processed.
    workerBlock.Completion.Wait();
}
```
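If the caller is itself `async`, you can avoid blocking on `Wait()` by awaiting the block's `Completion` task instead. A sketch of that variant, under the same assumptions as the code above:

```csharp
// Non-blocking variant: awaits completion instead of blocking with Wait().
private static async Task GetItemsByIDAsync(List<dynamic> list)
{
    var client = new HttpClient();
    var workerBlock = new ActionBlock<string>(async id =>
    {
        var res = await client.GetAsync("http://api/users/" + id);
        // Do whatever you want with `res` here
    },
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

    foreach (var n in list)
    {
        workerBlock.Post(n.ID);
    }
    workerBlock.Complete();
    await workerBlock.Completion; // frees the calling thread while work finishes
}
```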
## Calling the method

- Call it like any other method: `GetItemsByID(list);`