
We're developing a WebAPI that includes decryption logic for around 200 items (possibly more). Each decryption takes around 20 ms. We've tried to parallelize the tasks so the work finishes as quickly as possible, but we seem to be hitting some kind of limit: threads are being reused only after older ones complete (and only a few are in use), so the overall operation takes around 1-2 seconds to complete...

What we basically want to achieve is to have x threads start at the same time and finish after those ~20 ms.

We tried this: Await multiple async Task while setting max running task at a time

But that question describes how to set a limit, while we want to lift one...

Here's a snippet:

    var tasks = new List<Task>();
    foreach (var element in Elements)
    {
        var task = new Task(() =>
        {
            element.Value = Cipher.Decrypt((string)element.Value);
        });
        task.Start();
        tasks.Add(task);
    }
    Task.WaitAll(tasks.ToArray());

What are we missing here?

Thanks, Nir.


2 Answers


I cannot recommend parallelism on ASP.NET. It will certainly impact the scalability of your service, particularly if it is public-facing. I have thought "oh, I'm smart enough to do this" a couple of times and added parallelism in an ASP.NET app, only to have to tear it right back out a week later.

However, if you really want to...

it seems we're getting some kind of a limit

Is it the limit of physical cores on your machine?

We tried this: Await multiple async Task while setting max running task at a time

That solution is specifically for asynchronous concurrent code (e.g., I/O-bound). What you want is parallel (threaded) concurrent code (e.g., CPU-bound). Completely different use cases and solutions.
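For contrast, the throttling pattern from that linked question usually looks something like the sketch below (SemaphoreSlim and Task.WhenAll are real BCL APIs; RunThrottledAsync is a hypothetical helper name). Note that it is designed to *cap* concurrency for I/O-bound work, which is the opposite of what you want for CPU-bound decryption:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public static class Throttled
{
    // Sketch of the "max N at a time" pattern: a SemaphoreSlim gates how
    // many work items may be in flight simultaneously. Useful for I/O-bound
    // calls; CPU-bound decryption gains nothing from this throttle.
    public static async Task RunThrottledAsync(IEnumerable<Func<Task>> work, int maxConcurrent)
    {
        using var gate = new SemaphoreSlim(maxConcurrent);
        var tasks = new List<Task>();
        foreach (var item in work)
        {
            await gate.WaitAsync();          // blocks once maxConcurrent are running
            tasks.Add(Task.Run(async () =>
            {
                try { await item(); }
                finally { gate.Release(); }  // free a slot for the next item
            }));
        }
        await Task.WhenAll(tasks);
    }
}
```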

What are we missing here?

Your current code is throwing a ton of simultaneous tasks at the thread pool, which will attempt to handle them as best as it can. You can make this more efficient by using a higher-level abstraction, e.g., Parallel:

Parallel.ForEach(Elements, element =>
{
  element.Value = Cipher.Decrypt((string)element.Value);
});

Parallel is more intelligent in terms of its partitioning and (re-)use of threads (i.e., not exceeding number of cores). So you should see some speedup.
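If you do want explicit control over that partitioning, Parallel accepts a ParallelOptions with a MaxDegreeOfParallelism property (both are real TPL APIs). A minimal sketch, with a trivial stand-in for the Cipher.Decrypt call so it is self-contained:

```csharp
using System;
using System.Threading.Tasks;

// Cap the parallel loop at the machine's logical core count explicitly.
// (This is also roughly what Parallel does by default.)
var options = new ParallelOptions
{
    MaxDegreeOfParallelism = Environment.ProcessorCount
};

var results = new int[200];
Parallel.For(0, results.Length, options, i =>
{
    results[i] = i * i; // stand-in for element.Value = Cipher.Decrypt(...)
});
```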

However, I would expect it only to be a minor speedup. You are likely being limited by your number of physical cores.

Stephen Cleary

Assuming no hyper-threading:

If it takes 20 ms for one item, you can look at it as one core being busy for 20 ms. If you want 200 items to complete in 20 ms, you need 200 cores all to yourself. If you don't have that many, it simply can't be done...

Under normal circumstances, as many tasks will be scheduled in parallel as is optimal for your system.
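That arithmetic can be sketched directly (numbers taken from the question; the core count is whatever the machine reports):

```csharp
using System;

int items = 200;
double msPerItem = 20.0;

// Theoretical lower bound for CPU-bound work: total work divided evenly
// across cores, assuming perfect scheduling and no hyper-threading.
int cores = Environment.ProcessorCount;
double lowerBoundMs = items * msPerItem / cores;

// e.g. with 4 cores: 200 * 20 / 4 = 1000 ms, which lines up with the
// "around 1-2 seconds" the question reports.
Console.WriteLine($"{cores} cores -> at best ~{lowerBoundMs} ms");
```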

Wim Reymen