I wrote a piece of software to process images, and to reduce the processing time I tried to use multithreading. Below is the relevant snippet:
```csharp
bool Multithread = CheckMultithread();
UpdateParameters();
if (Multithread)
{
    Parallel.For(0, FileNames.Length, i => Solve(FileNames[i]));
}
else
{
    foreach (string s in FileNames)
    {
        Solve(s);
    }
}
```
This is the first time I have tried to write multithreaded code in C#, but I believe there are no threading issues, since the processing of one image does not interfere with the processing of any other.
The problem is: if `Multithread` is true, I get an `OutOfMemoryException` when roughly the 200th image is being processed. I imagine this kind of parallel implementation consumes N times more memory than the sequential equivalent, with N being the number of threads.
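For reference, my understanding is that `Parallel.For` schedules iterations across the thread pool, so the peak memory footprint scales with how many `Solve` calls run at once. The number of concurrent iterations can be capped explicitly via `ParallelOptions` (the value 2 below is just an illustrative example, not a recommendation):

```csharp
// Bound the number of Solve calls running concurrently.
// With MaxDegreeOfParallelism = 2, at most two images are
// held in memory at the same time.
var options = new ParallelOptions { MaxDegreeOfParallelism = 2 };
Parallel.For(0, FileNames.Length, options, i => Solve(FileNames[i]));
```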
I'm using unmanaged code in a single class, but every time that class is used, it is inside a `using` context. For reference, this class is a wrapper around `System.Drawing.Bitmap`.
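The actual wrapper isn't shown, but the usage pattern looks roughly like this sketch; `BitmapWrapper` and its members are hypothetical names, not the real class:

```csharp
using System;
using System.Drawing;

// Hypothetical sketch: the wrapper owns a Bitmap (which holds
// unmanaged GDI+ memory) and releases it deterministically.
public sealed class BitmapWrapper : IDisposable
{
    private readonly Bitmap bitmap;

    public BitmapWrapper(string path)
    {
        bitmap = new Bitmap(path);
    }

    public int Width => bitmap.Width;
    public int Height => bitmap.Height;

    public void Dispose()
    {
        // Frees the unmanaged GDI+ handle as soon as the
        // enclosing using block ends.
        bitmap.Dispose();
    }
}
```

Every use site then looks like `using (var image = new BitmapWrapper(fileName)) { /* process */ }`, so `Dispose` runs even if processing throws.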
Each thread consumes roughly 400 MB of RAM, and when the `OutOfMemoryException` is thrown, the program is using around 1,300 MB, even though I have over 9 GB of free memory.
As a workaround, I added the following code right at the beginning of the `Solve` method:
```csharp
if (GC.GetTotalMemory(false) > 1000 * 1000 * 1000)
{
    lock (Manager.dasLock)
    {
        Manager.sw.Start();
        GC.Collect();
        Manager.sw.Stop();
    }
}
```
With this workaround, the software was able to process all 2,000+ images without running out of memory, but my peers are complaining that I shouldn't touch the GC. So, how can I fix this issue without invoking the GC manually?