
I'm having trouble understanding why this code:

// Example #1
foreach (var task in tasks)
{
    task.Start();
    task.Wait();
}

runs much, much faster than:

// Example #2
foreach (var task in tasks)
{
    task.Start();
}

foreach (var task in tasks)
{
    task.Wait();
}

While example #1 executes all the tasks in 1-2 seconds, example #2 takes almost 20 s. The `tasks` variable is of type `Task[]`.

There are about a dozen tasks in the array, and each takes 500-1000 ms to execute. The work is not CPU-bound, because the tasks just send HTTP requests to a server.
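For what it's worth, with truly independent IO-bound tasks I would expect the opposite result: example #1 serializes the waits, while example #2 overlaps them. Here is a minimal sketch that simulates the work (the 200 ms `Thread.Sleep` and the `LongRunning` option are illustration-only assumptions, not my real code):

```csharp
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

static class TimingSketch
{
    // Build a fresh batch of unstarted tasks, each sleeping ~200 ms to stand in
    // for an IO-bound HTTP request. LongRunning gives each task its own thread,
    // so the concurrent run isn't throttled by thread-pool ramp-up.
    static Task[] MakeTasks(int count)
    {
        var tasks = new Task[count];
        for (int i = 0; i < count; i++)
            tasks[i] = new Task(() => Thread.Sleep(200), TaskCreationOptions.LongRunning);
        return tasks;
    }

    public static TimeSpan RunSequential()
    {
        var sw = Stopwatch.StartNew();
        foreach (var task in MakeTasks(12)) { task.Start(); task.Wait(); } // like Example #1
        return sw.Elapsed; // roughly 12 x 200 ms: each wait finishes before the next start
    }

    public static TimeSpan RunConcurrent()
    {
        var sw = Stopwatch.StartNew();
        var tasks = MakeTasks(12);
        foreach (var task in tasks) task.Start(); // like Example #2
        foreach (var task in tasks) task.Wait();
        return sw.Elapsed; // roughly 200 ms: all delays overlap
    }

    static void Main()
    {
        Console.WriteLine($"Sequential: {RunSequential().TotalMilliseconds:F0} ms");
        Console.WriteLine($"Concurrent: {RunConcurrent().TotalMilliseconds:F0} ms");
    }
}
```

With this simulation the second pattern is an order of magnitude faster, which is why the measurements above confuse me.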


It doesn't make any sense to me.

stil
    It's quite easy to create code that behaves like that, intentionally or not. So it depends on what your tasks are doing (and how many you're starting) which at this point is unknown to anyone reading this. In your case it also depends on what the web server you're accessing is doing, and how well it handles concurrent requests, or if that server again hits a database, it might be the database at the other end of the world that is the actual bottleneck. – nos Nov 17 '15 at 20:43
    It would be easier to help you if you show us more relevant code. – Jonathan Carroll Nov 17 '15 at 20:44
    Could this be due to the server? Perhaps it only handles a few requests at a time? Or perhaps all the tasks hit the same tables in the DB and interfere with each other. We need to know what these tasks are doing (and what the server is doing with the HTTP requests) in order to give a useful answer. – Vlad274 Nov 17 '15 at 20:45
  • @stil have you tried using `Task.WaitAll(tasks);` ? Also, are all your tasks using the same connection to make the requests? There might be some kind of race condition if it's the case... – Philippe Paré Nov 17 '15 at 20:46
    @PhilippeParé That wouldn't be a meaningful change. – Servy Nov 17 '15 at 20:48
    `There is no CPU bound, because tasks just send HTTP requests to server.` Then you shouldn't be creating multiple threads in the first place. You should have the *only thread in your program* kick off all of the asynchronous requests and then handle all of the responses. Creating a bunch of threads that will do nothing but sit around waiting for IO is counter-productive. – Servy Nov 17 '15 at 20:50
  • @nos, if there were a server issue, example #1 would not execute so fast. In both examples I use the same set of HTTP requests to the same servers. I'm trying to reproduce the issue in a form simple and short enough to paste on Stack Overflow, but it's not easy. – stil Nov 17 '15 at 21:21
    @stil Your assumptions could be right, but they could also be wrong - the important part is to realize that you are making assumptions, and you should verify them somehow. One can write a server (or web app) that is very fast at handling sequential requests, and one can write a server that simply crawls to a halt if it's hit with too many requests at the same time. The same goes for a database: it might handle 1 single query blazingly fast, but if you throw 500 SQL queries at it at once, it might bog down and exponentially increase the response time for all queries. – nos Nov 17 '15 at 21:29
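The single-threaded approach suggested in the comments, where one thread kicks off all the asynchronous requests and no thread blocks on IO, might look like this sketch (the URLs are placeholders, and `async Task Main` assumes C# 7.1 or later):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class AsyncRequests
{
    static async Task Main()
    {
        // Hypothetical endpoints -- stand-ins for whatever the real tasks request.
        var urls = new[] { "http://example.com/a", "http://example.com/b" };

        using (var client = new HttpClient())
        {
            // Kick off every request without blocking any thread...
            Task<string>[] requests = Array.ConvertAll(urls, url => client.GetStringAsync(url));

            // ...then asynchronously wait for all the responses at once.
            string[] bodies = await Task.WhenAll(requests);
            Console.WriteLine($"Fetched {bodies.Length} responses.");
        }
    }
}
```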

1 Answer


I solved my problem with @StephenCleary's helper library: https://github.com/StephenCleary/AsyncEx

AsyncContext.Run(async () =>
{
    foreach (var task in tasks) task.Start(); // start all tasks without blocking
    await Task.WhenAll(tasks);                // asynchronously wait for all of them
});

Now it runs as quickly as example #1 but also correctly waits until all requests are complete.
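For completeness, in a plain console app (which has no `SynchronizationContext` to deadlock against) the same thing should work without the extra library. A self-contained sketch, where the `Thread.Sleep` tasks are placeholders for my real HTTP-request tasks:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // Stand-in for the question's Task[]; the 100 ms sleep is a placeholder
        // for an HTTP request.
        var tasks = new Task[12];
        for (int i = 0; i < tasks.Length; i++)
            tasks[i] = new Task(() => Thread.Sleep(100));

        foreach (var task in tasks) task.Start();

        // Blocking here is safe in a console Main: there is no
        // SynchronizationContext for the continuations to deadlock against.
        // GetAwaiter().GetResult() rethrows the first exception directly,
        // unlike .Wait(), which wraps it in an AggregateException.
        Task.WhenAll(tasks).GetAwaiter().GetResult();

        Console.WriteLine("All tasks completed.");
    }
}
```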

stil
  • Can you achieve the same result by simply storing the tasks in an array and using await Task.WhenAll(tasks)? In other words what does AsyncContext buy you? –  Nov 17 '15 at 23:25
  • @Sam, in a console application you cannot simply call `await Task.WhenAll(tasks)` in the Main method. From what I understand, my previous code caused deadlocks. You may find more information here: http://stackoverflow.com/a/9212343/1420356 – stil Nov 17 '15 at 23:29
  • Please see my answer. –  Nov 17 '15 at 23:58