I'm working on an ASP.NET application that performs a few imaging operations on large picture files (up to 100 megapixels!). Processing a picture is actually relatively fast (under a second), but the memory usage is understandably huge (about 500MB per image). When multiple pictures are uploaded at the same time, the server accepts all requests at once and the host runs out of memory.
1) How do I minimise the memory impact?
2) Even with the memory impact minimised there is still a limit, so can I also cap the absolute number of images that are processed concurrently?
My own ideas and thoughts...
Because the execution time allows for some waiting (it's no problem if a request takes a couple of seconds), I want to solve this by queuing the image transformation work and allowing only 2 or 3 pictures to be processed at the same time. That caps memory usage at about 1.5GB, which is fine on my development machine. When moving to production, I'd like to raise this number, as more memory will be available there. A rough sketch of what I mean follows.
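Something like this, using BlockingCollection with a fixed pool of worker threads (an untested sketch; TransformQueue, WorkerCount and the WorkItem wrapper are names I made up, and blocking the request thread on .Result is a deliberate simplification):

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Sketch: a fixed pool of worker threads drains a shared queue of pending
// transforms, so at most WorkerCount transforms ever run at once.
public static class TransformQueue
{
    private const int WorkerCount = 2; // dev-machine budget; raise in production

    private class WorkItem
    {
        public Func<object> Work;
        public TaskCompletionSource<object> Completion = new TaskCompletionSource<object>();
    }

    private static readonly BlockingCollection<WorkItem> Queue = new BlockingCollection<WorkItem>();

    static TransformQueue()
    {
        for (int i = 0; i < WorkerCount; i++)
        {
            var worker = new Thread(() =>
            {
                // Blocks until an item is available; each worker runs one job at a time.
                foreach (var item in Queue.GetConsumingEnumerable())
                {
                    try { item.Completion.SetResult(item.Work()); }
                    catch (Exception ex) { item.Completion.SetException(ex); }
                }
            }) { IsBackground = true };
            worker.Start();
        }
    }

    // Called from the request thread; blocks until a worker has run the job.
    // (Exceptions surface wrapped in an AggregateException via .Result.)
    public static object Enqueue(Func<object> work)
    {
        var item = new WorkItem { Work = work };
        Queue.Add(item);
        return item.Completion.Task.Result;
    }
}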
Or, put more generally: how can I apply C#'s multi-threading classes (e.g. ConcurrentQueue, BlockingCollection, Interlocked) to ensure that a single method invoked by an ASP.NET request handler executes at most a fixed number of instances in parallel?
Note that the cost of the threading operations themselves is not a concern here; compared to the second-long image transform, their overhead is negligible. The handler in question looks roughly like this:
public ActionResult UploadLargePicture()
{
    // Some trivial stuff like authorization

    // This is the call that should have limited concurrency
    var result = VeryMemoryIntensiveFunction();

    return Json(...);
}
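Alternatively, would a simple gate around that one call be enough? A minimal sketch, assuming SemaphoreSlim is the right primitive for this (the class name ImageProcessingGate and the limit of 2 are my own placeholders):

using System;
using System.Threading;

// Sketch: a process-wide gate that lets at most two transforms run
// concurrently; further requests block until a slot frees up.
public static class ImageProcessingGate
{
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(2, 2);

    public static T Run<T>(Func<T> work)
    {
        Gate.Wait();        // blocks the request thread until a slot is free
        try
        {
            return work();
        }
        finally
        {
            Gate.Release(); // free the slot even if the transform throws
        }
    }
}

The handler body would then become:

var result = ImageProcessingGate.Run(() => VeryMemoryIntensiveFunction());

Note that the gate would have to be a single shared (static) instance; if each request constructed its own semaphore, nothing would actually be throttled.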