
I made a program that performs a long-running task. In this task I need to perform an `await` call, but I don't really know how to make this multithreaded. This is my code:

ApplicationDbContext db = new ApplicationDbContext();
foreach (var apiaccount in db.APIAccounts.ToArray())
{
    var max = db.Chats.Where(a => a.apiaccountid == apiaccount.id).Max(a => a.ended);
    var start = max != null ? max.Value : new DateTime(2014, 10, 1, 1, 0, 0);

    if (start.Hour == 0)
    {
        start = start.AddDays(-1);
    }

    var mod = new ImportAPIModel()
    {
        email = apiaccount.Email,
        token = apiaccount.Token,
        startdate = start,
        stopdate = null,
        importid = importid,
        apiaccountid = apiaccount.id
    };

    await ImportAPI(mod);
}

//async Task ImportAPI(object obj)

There are only a few accounts, so I want the imports to run next to each other instead of one after another. So I changed the code to:

ApplicationDbContext db = new ApplicationDbContext();
List<Thread> threads = new List<Thread>();

foreach (var apiaccount in db.APIAccounts.ToArray())
{
    var max = db.Chats.Where(a => a.apiaccountid == apiaccount.id).Max(a => a.ended);

    var start = max != null ? max.Value : new DateTime(2014, 10, 1, 1, 0, 0);
    if (start.Hour == 0)
    {
        start = start.AddDays(-1);
    }

    var mod = new ImportAPIModel()
    {
        email = apiaccount.Email,
        token = apiaccount.Token,
        startdate = start,
        stopdate = null,
        importid = importid,
        apiaccountid = apiaccount.id
    };

    //await ImportAPI(mod);

    Thread thread = new Thread(new ParameterizedThreadStart(ImportAPI));
    thread.Start(mod);

    threads.Add(thread);
}

foreach (var thread in threads) thread.Join();

//async Task ImportAPI(object obj)

But this doesn't compile, because ParameterizedThreadStart cannot take a method that returns Task. If I then change the return type to void, it does run, but it doesn't wait at the .Join() call.
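To be concrete, this is roughly what the void variant looks like (Task.Delay just stands in here for the real API work):

// Return type changed from Task to void so that ParameterizedThreadStart accepts it.
// It runs, but thread.Join() returns before the import has actually finished.
async void ImportAPI(object obj)
{
    var mod = (ImportAPIModel)obj;                // the model passed to thread.Start(mod)
    await Task.Delay(TimeSpan.FromMinutes(1));    // stand-in for the real API calls
}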

It seems to me that this should be really easy with async programming; I just don't see how. I've tried to find information about this on Google, but without any result, so that's why I'm asking. I hope this won't be flagged as a duplicate, but if it is, I'll be helped anyway :) Thank you all in advance! Stack Overflow is the best.

ikwillem
  • By the way, it is better to use a using block for your DbContext object, to dispose it automatically. Now you have to dispose it manually. – Peter Bruins Nov 27 '17 at 16:11
  • Is that really true? The thread will just exit the function and then the garbage collector will collect the DbContext, right? – ikwillem Nov 27 '17 at 16:15
  • 1
    You should not trust the garbage collector for that. For programs with low memory pressure a full garbage collection may never run through the lifetime of the program – Scott Chamberlain Nov 27 '17 at 16:18
  • Ok, I will use it like that in the future. But I thought there was this statement about garbage collection that goes something like "just let go". I did already use a using block with streams, but that is mostly because it is the shortest way to write code with streams. Thanks for the advice! – ikwillem Nov 27 '17 at 16:38
  • 2
    @ikwillem For all things that are managed by the GC, you should indeed just "let it go" and let the GC take care of it. But some resources *aren't* managed by the GC, these "unmanaged resources" are things that the GC isn't responsible for cleaning up for you. The `IDisposable` interface exists precisely to support these unmanaged resources. When you have a disposable object it's important for you to explicitly dispose of that unmanaged resource, because the GC *isn't* the one responsible for cleaning it up, you are. – Servy Nov 27 '17 at 21:23
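To illustrate the using-block advice from the comments, here is a minimal sketch with the same ApplicationDbContext as in the question (the loop body is elided):

// The DbContext wraps a database connection, which the GC is not responsible
// for cleaning up, so dispose it deterministically with a using block.
using (var db = new ApplicationDbContext())
{
    foreach (var apiaccount in db.APIAccounts.ToArray())
    {
        // ... build the ImportAPIModel and start the import, as in the question ...
    }
} // db.Dispose() runs here, even if an exception is thrown inside the block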

1 Answer


Add the task returned by ImportAPI to a list, then call Task.WhenAll and await it. Task.WhenAll returns a new task that completes when all of the tasks in the list have completed.

var tasks = new List<Task>();
foreach (var apiaccount in db.APIAccounts.ToArray())
{
    ....
    tasks.Add(ImportAPI(mod)); // start the import, but do not await it here
}
await Task.WhenAll(tasks);     // wait for all of the imports to complete
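Applied to the loop from the question, that would look roughly like this (a sketch that assumes ImportAPI keeps its original async Task signature and does not use the shared db context internally, since DbContext is not safe for concurrent use):

using (var db = new ApplicationDbContext())
{
    var tasks = new List<Task>();

    foreach (var apiaccount in db.APIAccounts.ToArray())
    {
        var max = db.Chats.Where(a => a.apiaccountid == apiaccount.id).Max(a => a.ended);
        var start = max != null ? max.Value : new DateTime(2014, 10, 1, 1, 0, 0);
        if (start.Hour == 0)
        {
            start = start.AddDays(-1);
        }

        var mod = new ImportAPIModel()
        {
            email = apiaccount.Email,
            token = apiaccount.Token,
            startdate = start,
            stopdate = null,
            importid = importid,
            apiaccountid = apiaccount.id
        };

        // Start the import but do not await it yet, so the imports run concurrently.
        tasks.Add(ImportAPI(mod));
    }

    // Wait asynchronously until every import has finished.
    await Task.WhenAll(tasks);
}

Task.WhenAll also propagates failures: the awaited task completes only after all of the started imports have finished, and if any of them threw, awaiting it rethrows that exception.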
Magnus