To make your memory usage more deterministic, you can write some integration tests that check where your memory actually ends up. You can do that now with WMemoryProfiler. I would first load the images at 1500x1500, clean everything up, and mark all current objects as known. Then I would reload the big images, check which new objects were allocated, and take a sharp look at how many of them there are and who owns them.
You say that many external modules are used. Perhaps some of them use memory unwisely and should be replaced with something better. Now you have a way to check.
If you are reaching the limit, you can still unload some images and load them on demand, provided you and your plugins support lazy structures such as IEnumerable&lt;Image&gt;, where you as the provider decide when to load an image and how long to keep it in a cache before you drop the reference to free up some memory.
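The lazy-loading idea could look roughly like this. This is only a sketch: LazyImageProvider is a hypothetical name, and System.Drawing's Image.FromFile stands in for however you actually load your images.

    using System.Collections.Generic;
    using System.Drawing;

    class LazyImageProvider
    {
        private readonly string[] _files;

        public LazyImageProvider(string[] files)
        {
            _files = files;
        }

        // Consumers enumerate images one at a time; each image is loaded
        // only when the enumerator reaches it and is disposed as soon as
        // the consumer advances to the next one.
        public IEnumerable<Image> GetImages()
        {
            foreach (string file in _files)
            {
                using (Image img = Image.FromFile(file))
                {
                    yield return img;
                }
            }
        }
    }

Because iterator blocks run lazily, at most one large image is alive at a time here; you could extend this with a small cache if consumers revisit images.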
    [Test]
    public void InstanceTracking()
    {
        // If you have problems, use new MemoryDumper(true, true) to see the debugger windows.
        using (var dumper = new MemoryDumper())
        {
            TestWith1500x1500();
            dumper.MarkCurrentObjects();
            TestWith3000x3000();
            ILookup<Type, object> newObjects = dumper.GetNewObjects()
                                                     .ToLookup(x => x.GetType());

            // Find out which types are holding most of the memory.
            MemoryStatistics stats = dumper.GetMemoryStatistics();
            foreach (var typeInfo in stats.ManagedHeapStats
                                          .OrderByDescending(x => x.Value.Count))
            {
                Console.WriteLine("Type {0} has {1} instances of total size {2:N0} bytes",
                                  typeInfo.Key,
                                  typeInfo.Value.Count,
                                  typeInfo.Value.TotalSize);
            }

            // Then use the info from above to check who is holding the most interesting new objects.
            // Strings are just an example; you will probably want to look at the images.
            Console.WriteLine("New Strings:");
            foreach (var newStr in newObjects[typeof(string)])
            {
                Console.WriteLine("Str: {0}", newStr);
            }
        }
    }