
What it does: for each encrypted base64 picture file, it reads the content, decrypts it, decodes the base64 string, and creates a PictureBox.

Where the problem is: insane memory usage! I guess some data is not being released properly after each loop. For example, 100 loops over roughly 100 MB of encrypted input, which should produce roughly 100 MB of image files, uses around 1.5 GB of memory! And when I try to decrypt just a little more data, around 150 MB, I get an OutOfMemoryException. Visual Studio's memory profiling report says that the "string fileContent = reader.ReadToEnd();" line is responsible for 80% of the allocations.

foreach (string encryptedFile in encryptedBase64PictureFiles)
{
    Rijndael rijAlg = Rijndael.Create();
    rijAlg.Key = ASCIIEncoding.ASCII.GetBytes(sKey);
    rijAlg.IV = ASCIIEncoding.ASCII.GetBytes(sKey);
    FileStream fsread = new FileStream(encryptedFile, FileMode.Open, FileAccess.Read);
    ICryptoTransform desdecrypt = rijAlg.CreateDecryptor();
    CryptoStream cryptostreamDecr = new CryptoStream(fsread, desdecrypt, CryptoStreamMode.Read);

    StreamReader reader = new StreamReader(cryptostreamDecr);
    string fileContent = reader.ReadToEnd(); // this should be the memory eater
    var ms = new MemoryStream(Convert.FromBase64String(fileContent));

    PictureBox myPictureBox = new PictureBox();
    myPictureBox.Image = Image.FromStream(ms);

    ms.Close();
    reader.Close();
    cryptostreamDecr.Close();
    fsread.Close();
}

So the question is: is there a way to deallocate the memory properly after each loop? Or is the problem something else? Thanks for any ideas!

EDIT: Of course I tried to Dispose() all four streams, but the result was the same...

ms.Dispose();
reader.Dispose();
cryptostreamDecr.Dispose();
fsread.Dispose();

EDIT: Found the problem. It was not Dispose(), but creating the picture from the stream. After disposing the pictures, memory usage went from 1.5 GB to 20 MB.

EDIT: The pictures are about 500 KB each in .jpg format, around 700 KB in base64-encoded encrypted form. But I really have no idea how big the PictureBox object is.

EDIT: "100 loops with input around 100MB" meant that each loop takes around 1 MB; 100 MB is the total for all 100 loops.
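For reference, a minimal sketch of the fix described in the edit above: dispose each Image once it is no longer needed. The names (`myPictureBox`, `fileContent`) are placeholders matching the question's code, not a definitive implementation:

```csharp
// Dispose the previously displayed image before replacing it;
// each Image holds the fully decoded bitmap in unmanaged GDI+ memory,
// which is far larger than the compressed .jpg on disk.
if (myPictureBox.Image != null)
{
    myPictureBox.Image.Dispose();
    myPictureBox.Image = null;
}

// GDI+ requires the source stream to stay alive for the lifetime of an
// Image created via Image.FromStream, so copy into a standalone Bitmap
// before the MemoryStream is disposed.
using (var ms = new MemoryStream(Convert.FromBase64String(fileContent)))
using (var tmp = Image.FromStream(ms))
{
    myPictureBox.Image = new Bitmap(tmp);
}
```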

user2054618
  • How large are the images you have there? – Lasse V. Karlsen Mar 07 '14 at 14:03
  • 32 bit? It MAY just be fragmentation. Dealing with 100 MB blocks in a 32-bit memory space is reckless. This is bound to create problems (large object heap fragmentation). – TomTom Mar 07 '14 at 14:13
  • 1
    Have you tested the assumption that X amount of encrypted data results in X amount worth of `PictureBox` + `Image` instances? Perhaps there's a factor > 1 at work here? (although a factor of 15 does sound too much) – Jon Mar 07 '14 at 14:14
  • 1
    If the images are .JPG images, then 100MB of JPEG is a lot of pixels, hence my original question. – Lasse V. Karlsen Mar 07 '14 at 14:18
  • @LasseV.Karlsen: That definitely sounds plausible. – Jon Mar 07 '14 at 14:19
  • Possible. Absolutely - one should try to run that on 64 bit and have a look at the generated image. A 100 MB JPEG is either REALLY highly compressed quality (and even then large) or the mother of all images in size. Could be a digital camera high-resolution shot. Not sure those can be sensibly bitmapped into an image in 32 bit (i.e. by standard libraries). – TomTom Mar 07 '14 at 14:20
  • 1
  • Why do you explicitly call `.Dispose()`? There is no good reason to; you really should change your code to use `using` statements. – Scott Chamberlain Mar 07 '14 at 14:30

2 Answers


Another answer: live with it.

As in: you work with 100 MB blocks in what appears to be a 32-bit application. This will not work without reusing buffers, due to large object heap and general memory fragmentation.

As in: the memory is there, just not in large enough contiguous blocks. This results in allocation errors.

There is no real way around this except going 64 bit, where the larger address space handles the issue.
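The buffer-reuse idea mentioned above could look roughly like this: one large buffer allocated up front and reused for every file, instead of a fresh multi-megabyte array per iteration. The buffer size and variable names are illustrative, not taken from the question:

```csharp
// Arrays of >= 85,000 bytes land on the large object heap, so allocating
// a fresh one per loop fragments the LOH. Reuse a single buffer instead,
// sized for the largest file you expect.
byte[] sharedBuffer = new byte[2 * 1024 * 1024];

foreach (string path in encryptedFiles)
{
    using (var fs = File.OpenRead(path))
    {
        int total = 0, read;
        while ((read = fs.Read(sharedBuffer, total, sharedBuffer.Length - total)) > 0)
            total += read;
        // ... decrypt and decode from sharedBuffer[0..total] here ...
    }
}
```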

Information about this can be found at:

https://connect.microsoft.com/VisualStudio/feedback/details/521147/large-object-heap-fragmentation-causes-outofmemoryexception

https://www.simple-talk.com/dotnet/.net-framework/large-object-heap-compaction-should-you-use-it/

The latter describes a possible mitigation available these days: enabling large object heap compaction:

GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect(); // forces compaction now; otherwise it happens on the next blocking Gen2 collection

LOH operations are expensive, but 100 MB areas flying around is not exactly a GC-recommended scenario. Not in 32 bit.

TomTom

Use a base64 transform when decrypting your stream. Do not use Convert.FromBase64String, as this requires all of the data to be in memory at once.

using (FileStream f64 = File.Open(fileout, FileMode.Open)) // content is in base64
using (var cs = new CryptoStream(f64, new FromBase64Transform(), CryptoStreamMode.Read)) // transform passed to constructor
using (var fo = File.Open(filein + ".orig", FileMode.Create))
{
    cs.CopyTo(fo); // stream is read as if it were already decoded
}

Code sample taken from this related answer: Convert a VERY LARGE binary file into a Base64String incrementally.
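Applied to the question's scenario, the decryptor and the base64 transform can be chained so that no giant intermediate string is ever built. A sketch under the question's assumptions (`rijAlg` is the configured Rijndael instance and `encryptedFile` a placeholder path; the file is assumed to decrypt to base64 text, as in the original loop):

```csharp
using (var fs = new FileStream(encryptedFile, FileMode.Open, FileAccess.Read))
using (var decrypt = new CryptoStream(fs, rijAlg.CreateDecryptor(), CryptoStreamMode.Read))
using (var decode = new CryptoStream(decrypt, new FromBase64Transform(), CryptoStreamMode.Read))
using (var ms = new MemoryStream())
{
    // Decrypts and base64-decodes incrementally, replacing both
    // reader.ReadToEnd() and Convert.FromBase64String.
    decode.CopyTo(ms);
    ms.Position = 0;

    // Copy into a standalone Bitmap so ms can be disposed safely.
    using (var tmp = Image.FromStream(ms))
        pictureBox.Image = new Bitmap(tmp);
}
```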

Gusdor