
I'm creating an app that needs to do parallel HTTP requests, and I'm using HttpClient for this. I loop over the URLs, and for each URL I start a new Task to do the request. After the loop I wait until every task finishes. However, when I check the calls being made with Fiddler, I see that the requests are being made synchronously: not a bunch of requests at once, but one by one. I've searched for a solution and found that other people have experienced this too, but not with UWP. The solution was to increase the DefaultConnectionLimit on the ServicePointManager. The problem is that ServicePointManager does not exist for UWP. I've looked through the APIs and thought I could set the DefaultConnectionLimit on HttpClientHandler, but no.

So I have a few questions. Is DefaultConnectionLimit still a property that can be set somewhere? If so, where do I set it? If not, how do I increase the connection limit? Is there still a connection limit in UWP at all?
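For reference, this is what the fix looks like on the full .NET Framework, where ServicePointManager exists (a minimal sketch, only to show what I'm trying to replicate in UWP):

```csharp
using System.Net;

class Program
{
    static void Main()
    {
        // Full .NET Framework only: raise the per-host limit of
        // concurrent HTTP connections (the default for client apps is 2,
        // which serializes requests beyond the first two).
        ServicePointManager.DefaultConnectionLimit = 10;
    }
}
```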

this is my code:

var requests = new List<Task>();
var client = GetHttpClient();
foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew((x) =>
    {
        ((Show)x).NextEpisode = GetEpisodeAsync(((Show)x).NextEpisodeUri, client).Result;
    }, show));
}
await Task.WhenAll(requests.ToArray());

and this is the request:

public async Task<Episode> GetEpisodeAsync(string nextEpisodeUri, HttpClient client)
{
    try
    {
        if (String.IsNullOrWhiteSpace(nextEpisodeUri)) return null;
        HttpResponseMessage content = await client.GetAsync(nextEpisodeUri);
        if (content.IsSuccessStatusCode)
        {
            return JsonConvert.DeserializeObject<EpisodeWrapper>(await content.Content.ReadAsStringAsync()).Episode;
        }
    }
    catch (Exception ex)
    {

        Debug.WriteLine(ex.Message);
    }

    return null;
}
Sebastian S
  • you might wonder why I don't use async/await in the task. When I use it, I get my result pretty quickly. The problem, however, is that Task.WhenAll doesn't wait until all tasks are complete. When I debug, I still see calls being made after I pass Task.WhenAll. Maybe the problem lies there. Is there a way to make Task.WhenAll wait until all tasks are complete while using async/await in the task? – Sebastian S Apr 30 '16 at 22:04
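A minimal, self-contained sketch of the behavior the comment describes (the names here are illustrative, not from the question): StartNew does not understand async lambdas, so it returns a `Task<Task>`, and WhenAll only waits for the outer task.

```csharp
using System;
using System.Threading.Tasks;

class Demo
{
    static async Task Main()
    {
        bool innerDone = false;
        var gate = new TaskCompletionSource<bool>();

        // StartNew wraps the async lambda's Task in an outer Task<Task>;
        // the *outer* task completes as soon as the lambda hits its
        // first await.
        Task<Task> wrapped = Task.Factory.StartNew(async () =>
        {
            await gate.Task; // the "request" is still in flight here
            innerDone = true;
        });

        // This only awaits the outer task, so it passes while the inner
        // work is still pending.
        await Task.WhenAll(wrapped);
        Console.WriteLine(innerDone); // False

        gate.SetResult(true);
        await wrapped.Unwrap(); // awaiting the inner task actually waits
        Console.WriteLine(innerDone); // True
    }
}
```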

2 Answers


Okay, I have the solution. I do need to use async/await inside the task. The problem was that I was using StartNew instead of Run, but I have to use StartNew because I'm passing along a state. With StartNew, the task inside the task is not awaited unless you call Unwrap, so Task.Factory.StartNew(...).Unwrap(). This way Task.WhenAll() will wait until the inner task is complete. When you are using Task.Run() you don't have to do this.

Task.Run vs Task.StartNew

The stackoverflow answer

var requests = new List<Task>();
var client = GetHttpClient();
foreach (var show in shows)
{
    requests.Add(Task.Factory.StartNew(async (x) =>
    {
        ((Show)x).NextEpisode = await GetEpisodeAsync(((Show)x).NextEpisodeUri, client);
    }, show)
    .Unwrap());
}
await Task.WhenAll(requests.ToArray());
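For comparison, here is a self-contained sketch of the Task.Run variant mentioned above. GetEpisodeAsync is replaced with a stub (no HttpClient), and the state is passed by closing over the loop variable instead of a state object, which is safe since C# 5 because foreach gives each iteration its own variable:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class Show
{
    public string NextEpisodeUri;
    public string NextEpisode;
}

class Demo
{
    // Stand-in for the question's GetEpisodeAsync (no HTTP involved).
    static async Task<string> GetEpisodeAsync(string uri)
    {
        await Task.Yield();
        return "episode for " + uri;
    }

    static async Task Main()
    {
        var shows = new List<Show>
        {
            new Show { NextEpisodeUri = "a" },
            new Show { NextEpisodeUri = "b" },
        };

        var requests = new List<Task>();
        foreach (var show in shows)
        {
            // Task.Run understands async lambdas and returns the already
            // unwrapped Task, so no Unwrap() is needed. The loop variable
            // is captured by the closure instead of a state parameter.
            requests.Add(Task.Run(async () =>
            {
                show.NextEpisode = await GetEpisodeAsync(show.NextEpisodeUri);
            }));
        }

        await Task.WhenAll(requests);
        Console.WriteLine(shows[0].NextEpisode); // "episode for a"
    }
}
```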
Sebastian S

I think an easier way to solve this is not to "manually" start requests but instead to use LINQ with an async delegate to query the episodes and then set them afterwards.

You basically make it a two step process:

  1. Get all next episodes
  2. Set them in the for each

This also has the benefit of decoupling your querying code from the side effect of setting the show.

var shows = Enumerable.Range(0, 10).Select(x => new Show());
var client = new HttpClient();

(Show, Episode)[] nextEpisodes = await Task.WhenAll(shows
    .Select(async show =>
        (show, await GetEpisodeAsync(show.NextEpisodeUri, client))));

foreach ((Show Show, Episode Episode) tuple in nextEpisodes)
{
    tuple.Show.NextEpisode = tuple.Episode;
}

Note that I am using the new tuple syntax of C# 7. Change to the old tuple syntax accordingly if it is not available.
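If C# 7 tuples are not available, the same query with System.Tuple might look like this (a self-contained sketch: Episode is replaced with a string and GetEpisodeAsync with a stub, since the real ones also involve an HttpClient):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class Show
{
    public string NextEpisodeUri;
    public string NextEpisode;
}

class Demo
{
    // Stand-in for the answer's GetEpisodeAsync.
    static async Task<string> GetEpisodeAsync(string uri)
    {
        await Task.Yield();
        return "ep:" + uri;
    }

    static async Task Main()
    {
        var shows = new[] { new Show { NextEpisodeUri = "a" } };

        // Pre-C# 7: System.Tuple instead of the (Show, Episode) value tuple.
        Tuple<Show, string>[] nextEpisodes = await Task.WhenAll(shows
            .Select(async show =>
                Tuple.Create(show, await GetEpisodeAsync(show.NextEpisodeUri))));

        foreach (var tuple in nextEpisodes)
        {
            tuple.Item1.NextEpisode = tuple.Item2;
        }

        Console.WriteLine(shows[0].NextEpisode); // "ep:a"
    }
}
```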

Leonhard Bauer