
I'm doing some tests with the new Background tasks with hosted services in ASP.NET Core feature introduced in version 2.1, specifically with queued background tasks, and a question about parallelism came to mind.

I'm currently following the tutorial provided by Microsoft to the letter, and when I tried to simulate a workload with several requests from the same user enqueuing tasks, I noticed that all work items are executed in order, so there is no parallelism.

My question is: is this behavior expected? And if so, in order to make the requests execute in parallel, is it OK to fire and forget instead of awaiting each work item's completion?

I've searched for a couple of days about this specific scenario without luck, so if anyone has any guidance or examples to provide, I would be really glad.

Edit: The code from the tutorial is quite long, so here is the link to it: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/host/hosted-services?view=aspnetcore-2.1#queued-background-tasks

The method which executes the work item is this:

public class QueuedHostedService : IHostedService
{
    ...

    public Task StartAsync(CancellationToken cancellationToken)
    {
        _logger.LogInformation("Queued Hosted Service is starting.");

        _backgroundTask = Task.Run(BackgroundProcessing);

        return Task.CompletedTask;
    }

    private async Task BackgroundProcessing()
    {
        while (!_shutdown.IsCancellationRequested)
        {
            var workItem = 
                await TaskQueue.DequeueAsync(_shutdown.Token);

            try
            {
                await workItem(_shutdown.Token);
            }
            catch (Exception ex)
            {
                _logger.LogError(ex, 
                    $"Error occurred executing {nameof(workItem)}.");
            }
        }
    }

    ...
}
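For reference, I enqueue the work items roughly like this (a sketch, not the exact code; `_queue` is the tutorial's `IBackgroundTaskQueue` injected into the controller):

```csharp
// Sketch of the enqueuing side, assuming the tutorial's
// IBackgroundTaskQueue is injected into the controller as _queue.
[HttpPost]
public IActionResult Enqueue()
{
    _queue.QueueBackgroundWorkItem(async token =>
    {
        // Simulate a long-running work item.
        await Task.Delay(TimeSpan.FromSeconds(5), token);
    });

    // The request returns immediately; the work item runs in the background.
    return Accepted();
}
```

Each request returns right away, but the queued items themselves are processed one after another.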

The main point of the question is to find out whether anyone out there knows how to use this specific technology to execute several work items at the same time, since the server can handle that workload.

I tried the fire-and-forget approach when executing the work item and it worked the way I intended, with several tasks executing in parallel at the same time. I'm just not sure whether this is an acceptable practice, or whether there is a better or more proper way of handling this situation.

marceloatg
  • If you implemented it after [this](https://learn.microsoft.com/en-us/aspnet/core/fundamentals/host/hosted-services?view=aspnetcore-2.1#queued-background-tasks) example then it is no wonder that the tasks are executed in order, because that's the whole point of the model. The `BackgroundProcessing` method always awaits the next task. This means that tasks run in parallel to the web server, but in order relative to each other. – a-ctor Jul 15 '18 at 15:52
  • So, what you are saying is that once deployed to a server, with multiple requests coming from different users, those tasks will be executed in parallel? And because I'm new to Stack Overflow, what does the -1 on my question mean? – marceloatg Jul 15 '18 at 16:02
  • your question could be improved by showing us what you have done; a code snippet would be great. So I assume you used the example code that I linked? – a-ctor Jul 15 '18 at 16:04
  • The code from the example runs in parallel (parallel to the threads that handle the web requests), but only one task is executed at a time. This means that adding two items to the hosted service will execute them in the order they were added, but they will run in parallel to the web request handling. – a-ctor Jul 15 '18 at 16:07
  • yes, I assumed that was specific enough since that's the only official documentation for that specific topic for that specific technology, thanks for the clarification. But most importantly, can you confirm if once deployed to a server, having multiple requests coming from different users those tasks will be executed in parallel? That's the whole point of the question and it takes some time until I can find a server and deploy the code and then run the tests (which by the way I'm doing right now with free tier AWS). – marceloatg Jul 15 '18 at 16:08
  • ok, I guess I'm not expressing myself very well. What I really want to know is whether there is a way to execute several tasks at the same time, since the server can handle that. I tried the fire-and-forget method and it worked the way I intended; I'm just not sure if this is an ok practice. – marceloatg Jul 15 '18 at 16:11
  • Check this out: [ASP .Net Core Queued background tasks parallel processing](https://stackoverflow.com/a/75319776/4354755). That answer is much more effective. – htekir Feb 02 '23 at 07:48

2 Answers


The code you posted executes the queued items in order, one at a time, but in parallel to the web server. An IHostedService by definition runs in parallel to the web server. This article provides a good overview.

Consider the following example:

_logger.LogInformation("Before()");
for (var i = 0; i < 10; i++)
{
    // Capture the loop variable so each work item logs its own index.
    var j = i;
    _backgroundTaskQueue.QueueBackgroundWorkItem(async token =>
    {
        var random = new Random();
        await Task.Delay(random.Next(50, 1000), token);
        _logger.LogInformation($"Event {j}");
    });
}
_logger.LogInformation("After()");

We add ten tasks, each of which waits a random amount of time. If you put this code in a controller method, the events will still be logged even after the controller method returns. But each item is executed in order, so the output looks like this:

Event 0
Event 1
...
Event 8
Event 9

In order to introduce parallelism we have to change the implementation of the BackgroundProcessing method in the QueuedHostedService.


Here is an example implementation that allows two Tasks to be executed in parallel:

private async Task BackgroundProcessing()
{
  // Allow at most two work items to run at the same time.
  var semaphore = new SemaphoreSlim(2);

  void HandleTask(Task task)
  {
    // Observe any fault so the exception is not lost, then free a slot.
    if (task.IsFaulted)
    {
      _logger.LogError(task.Exception, "Error occurred executing work item.");
    }
    semaphore.Release();
  }

  while (!_shutdown.IsCancellationRequested)
  {
    await semaphore.WaitAsync(_shutdown.Token);
    var item = await TaskQueue.DequeueAsync(_shutdown.Token);

    // Start the work item without awaiting it; the continuation
    // releases the semaphore when the item finishes.
    var task = item(_shutdown.Token);
    _ = task.ContinueWith(HandleTask);
  }
}

Using this implementation, the events are no longer logged in order, since each task waits a random amount of time. So the output could be:

Event 0
Event 1
Event 2
Event 3
Event 4
Event 5
Event 7
Event 6
Event 9
Event 8

Edit: Is it OK in a production environment to execute code this way, without awaiting it?

I think the reason why most devs have a problem with fire-and-forget is that it is often misused.

When you execute a Task using fire-and-forget you are basically telling me that you do not care about the result of this function. You do not care if it exits successfully, if it is canceled or if it threw an exception. But for most Tasks you do care about the result.

  • You do want to make sure a database write went through
  • You do want to make sure a Log entry is written to the hard drive
  • You do want to make sure a network packet is sent to the receiver

And if you care about the result of the Task then fire-and-forget is the wrong method.

That's it in my opinion. The hard part is finding a Task where you really do not care about the result.
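If you do end up starting a work item without awaiting it, a common mitigation (a sketch, not part of the tutorial; the helper name is mine) is to at least attach a continuation that observes and logs failures:

```csharp
// Hypothetical helper inside QueuedHostedService: runs a work item
// fire-and-forget while still observing any exception it throws.
private void FireAndForget(Func<CancellationToken, Task> workItem)
{
    var task = workItem(_shutdown.Token);

    // OnlyOnFaulted: the continuation runs only if the work item threw.
    // Reading t.Exception also marks the exception as observed, so it
    // will not surface later as an unobserved task exception.
    _ = task.ContinueWith(
        t => _logger.LogError(t.Exception, "Background work item failed."),
        TaskContinuationOptions.OnlyOnFaulted);
}
```

This does not make fire-and-forget safe for work you care about; it only prevents exceptions from disappearing silently.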

a-ctor
  • Thanks for your answer and for your time. This approach is basically what I tried with fire-and-forget for the work item; the only difference in your code is the SemaphoreSlim limiting the concurrent work. So we come back to the question (and please, this is an honest question, I'm just trying to learn more about this topic): is it OK in a production environment to execute code this way, without awaiting it? I ask because in every related question on Stack Overflow people condemn this practice, and I don't know whether it has a real-world impact in this case. – marceloatg Jul 15 '18 at 17:27
  • @marceloatg see my edit. Hope it answers your question :) – a-ctor Jul 15 '18 at 18:13
  • yes it did, this is the kind of question that people usually don't answer straight to the point like you did in your edit, so thank you very much for the explanation. – marceloatg Jul 15 '18 at 18:41

You can add the QueuedHostedService once or twice for every CPU in the machine.

So something like this:

for (var i = 0; i < Environment.ProcessorCount; ++i)
{
    services.AddHostedService<QueuedHostedService>();
}

You can hide this in an extension method and make the concurrency level configurable to keep things clean.
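A sketch of such an extension method (the name `AddQueuedHostedServices` is mine, not from any library). Note that `AddHostedService<T>` uses `TryAddEnumerable` internally, so calling it repeatedly with the same type registers only one instance; registering the `IHostedService` mapping directly avoids that:

```csharp
// Hypothetical extension method for registering N parallel queue workers.
public static class QueuedHostedServiceExtensions
{
    public static IServiceCollection AddQueuedHostedServices(
        this IServiceCollection services, int concurrencyLevel)
    {
        for (var i = 0; i < concurrencyLevel; i++)
        {
            // AddHostedService<T> would collapse repeated registrations of
            // the same type into one, so register the interface directly.
            services.AddSingleton<IHostedService, QueuedHostedService>();
        }

        return services;
    }
}
```

Usage: `services.AddQueuedHostedServices(Environment.ProcessorCount);`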

alex.pino
  • Hi Alex, have you used this approach in a production environment? is it reliable? I did something similar but I couldn't find any documentation about it. – Gabriel Cerutti May 01 '19 at 03:12
  • Yes I have. I’ve also looked into the source code on github to make sure it would work. – alex.pino Jul 15 '19 at 19:18
  • For whatever reason this is not working for me. The accepted answer is. –  May 18 '20 at 13:43
  • Approach with 'ProcessorCount' will fail in case of kubernetes/docker hosting. What is more, sometimes you may want 20 Queded services, even on 2 CPU machine (in case each service is handling calls to external resources in async way). – Maciej Pszczolinski Dec 20 '20 at 06:20
  • Calling this multiple times potentially doesn't work. Though there is a workaround. https://github.com/dotnet/runtime/issues/38751 – benmccallum Dec 03 '21 at 15:07