
I needed a very basic serial execution queue. I wrote the following based on this idea, but since I needed the queue to guarantee FIFO ordering, I added an intermediate ConcurrentQueue<>.

Here is the code:

public class SimpleSerialTaskQueue
{
    private SemaphoreSlim _semaphore = new SemaphoreSlim(0);
    private ConcurrentQueue<Func<Task>> _taskQueue = new ConcurrentQueue<Func<Task>>();

    public SimpleSerialTaskQueue()
    {
        Task.Run(async () =>
        {
            Func<Task> dequeuedTask;
            while (true)
            {
                if (await _semaphore.WaitAsync(1000))
                {
                    if (_taskQueue.TryDequeue(out dequeuedTask) == true)
                    {
                        await dequeuedTask();
                    }
                }
                else
                {
                    Console.WriteLine("Nothing more to process");
                    // If I don't do this, memory pressure is never released
                    //GC.Collect();
                }
            }
        });
    }

    public void Add(Func<Task> o_task)
    {
        _taskQueue.Enqueue(o_task);
        _semaphore.Release();
    }
}
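
For comparison, this serial-execution idea is often implemented by chaining continuations onto a single tail task instead of pumping a semaphore; that variant also hands the caller a Task to await. A minimal sketch of that alternative (the class and member names here are illustrative, not from the original code):

```csharp
using System;
using System.Threading.Tasks;

public class ChainedSerialTaskQueue
{
    private readonly object _lock = new object();
    private Task _tail = Task.CompletedTask;

    // Each delegate starts only after the previous one has finished,
    // so FIFO order is preserved without an explicit queue or semaphore.
    public Task Add(Func<Task> work)
    {
        lock (_lock)
        {
            _tail = _tail
                .ContinueWith(_ => work(), TaskScheduler.Default)
                .Unwrap();
            return _tail;
        }
    }
}
```

Returning `_tail` lets the caller observe completion and exceptions, which a fire-and-forget design cannot do.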

When I run this in a loop simulating heavy load, I see what looks like a memory leak. Here is the test code:

static void Main(string[] args)
{
    SimpleSerialTaskQueue queue = new SimpleSerialTaskQueue();
    for (int i = 0; i < 100000000; i++)
    {
        queue.Add(async () =>
        {
            await Task.Delay(0);
        });
    }
    Console.ReadLine();
}
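
To tell a real leak apart from a GC that simply hasn't run yet, one can force a full collection after the loop and read the managed heap size. A diagnostic sketch (not part of the original test):

```csharp
// Insert after the loop, before Console.ReadLine():
// GC.GetTotalMemory(true) forces a full blocking collection first,
// so a small result means the memory was collectible, not leaked.
long managedBytes = GC.GetTotalMemory(forceFullCollection: true);
Console.WriteLine($"Managed heap after collect: {managedBytes / (1024 * 1024)} MB");
```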

EDIT: I don't understand why, once the tasks have been executed, I still see around 750 MB of memory in use (based on the VS2015 diagnostic tools). I thought that once they had executed it would be very low. The GC doesn't seem to collect anything. Can anyone tell me what is happening? Is this related to the async state machine?

chrisdot
  • 24 MB isn't a memory leak; a memory leak is when memory grows and grows. Execute the same test with an external loop; if after running the external loop some times it still goes down to 24 MB, you don't have any leak. – Gusman Nov 06 '17 at 13:55
    What is the purpose of the semaphore here? – Fildor Nov 06 '17 at 13:57
  • You also have 100k of task results still in memory. – BugFinder Nov 06 '17 at 13:59
  • If you really need something that guarantees FIFO you can use [this version of the queue](https://stackoverflow.com/questions/32993301/rising-events-without-blocking-and-receiving-events-in-the-right-order/32993768#32993768) instead. Your queue has numerous problems. Most importantly, there's no way for the caller to know when the operation added to the queue finishes, what its result is, if it errored, etc. Yours only supports fire and forget. It also just kills itself randomly, and in a way that the caller can't know if it's still working or not. – Servy Nov 06 '17 at 14:16
  • @BugFinder What makes you say that? – Servy Nov 06 '17 at 14:16
  • @Fildor Without it the code would go into a busy loop. Although there are much better ways of solving this problem than what's shown, this solution *would* be much worse without the semaphore. – Servy Nov 06 '17 at 14:20
  • @Servy While SimpleSerialTaskQueue doesn't seem to have any direct documentation (in fact Google only finds 2 things), the other thing, as this was one, also demonstrates my thought, which is that it's a queue of tasks which get changed to done. As a result the queue still contains a reference to every task, and as there's 100k of them put in there in the first place, there's 100k tasks in the queue. – BugFinder Nov 06 '17 at 14:33
  • @BugFinder That queue for tasks is the class shown in the question. As the items are added to the queue they're also taken out of the queue and executed. Given that all of the tasks immediately complete, the queue will be empty very shortly after they finish being added. At the end of the loop there's most likely only a small number of tasks, and within a fraction of a second there won't be any. – Servy Nov 06 '17 at 14:38
  • @Servy interesting - the queues I used didn't do that - do you have docs on this queue? – BugFinder Nov 06 '17 at 14:39
  • @BugFinder Again, the code for the queue *is in the question*. – Servy Nov 06 '17 at 14:42
  • @Servy I worded the question badly. I am aware of what it does. I was wondering why it is used in the first place instead of the much better alternatives you mentioned. – Fildor Nov 06 '17 at 14:45
  • @BugFinder `if (_taskQueue.TryDequeue(out dequeuedTask) == true) { await dequeuedTask(); }` The Tasks are dequeued here and go out of scope ... I don't see where they should be kept in memory. Personally, I'd even _not_ call GC.Collect. – Fildor Nov 06 '17 at 14:52
  • @Gusman: you are right, I've updated the example with a bigger number of tries. Now I get 700 MB of memory that never gets released unless I run the GC myself – chrisdot Nov 06 '17 at 15:02
  • @Servy this example is here only to show the memory consumption. I know that it is a fire-and-forget task runner, without any exception/error handling, but the purpose for me here is to find out why the memory doesn't go down even after it has executed. – chrisdot Nov 06 '17 at 15:20
  • Servy and Gusman, clearly I need more caffeine on this one; normally I'm more awake than this. @Christophe when you are testing for memory, are you checking how much you have before you run your queue? While I got 24 MB after, it was at 19-20 before... by the time you lazy-load a reference about queues, that could be most of the difference – BugFinder Nov 06 '17 at 15:21
  • @BugFinder See my updated code; now I'm on a base of 700 MB, which shouldn't bring any confusion – chrisdot Nov 06 '17 at 15:33
  • I'm not seeing your results; I don't gain 700 MB, but it does depend where you test the memory. So output Process.PagedMemorySize64 before you make the queue and after the Console.ReadLine (and only hit go when it's done), and output PagedMemorySize64 again; I'm getting the same number. – BugFinder Nov 06 '17 at 16:02
  • @BugFinder This is because you didn't update the section where I commented out the GC.Collect(), I guess – chrisdot Nov 06 '17 at 16:05
  • Actually I did, but I also then set queue to null, so your queue has no leak – BugFinder Nov 06 '17 at 16:06
  • If the memory goes away when you do a GC.Collect you don't have a leak; you just have a GC that has decided that it is not necessary to do a collection yet. The collection only happens when your system does not have enough free memory to rapidly process requests. If you have a lot of free memory on your system you won't see frequent GCs. – Scott Chamberlain Nov 06 '17 at 16:13

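The before/after measurement BugFinder describes in the comments can be sketched like this (illustrative only; the queue and loop are stand-ins for the test code above):

```csharp
using System;
using System.Diagnostics;

var proc = Process.GetCurrentProcess();
long before = proc.PagedMemorySize64;   // baseline before creating the queue

var queue = new SimpleSerialTaskQueue();
// ... run the 100M-iteration test loop here ...

proc.Refresh();                         // re-read the process counters
long after = proc.PagedMemorySize64;
Console.WriteLine($"Paged memory delta: {(after - before) / (1024 * 1024)} MB");
```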
0 Answers