I'm getting OutOfMemoryExceptions in my relatively simple web API, which is deployed to an Azure server. I download two images using the same method, DownloadImageFromUrl(string url). I then draw some text on them, do some resizing, and return the resulting image. One of the source images is relatively large, anywhere from 1 to 12 MB.
Here is the implementation of DownloadImageFromUrl, which the error messages (and heap analysis, see below) point me to. Both methods produce the same errors:
private Bitmap DownloadImageFromUrl(string url)
{
    //try
    //{
    //    // METHOD A:
    //    WebRequest request = System.Net.WebRequest.Create(url);
    //    WebResponse response = request.GetResponse();
    //    Stream responseStream = response.GetResponseStream();
    //    return new Bitmap(responseStream);
    //}
    //catch (WebException e)
    //{
    //    return null;
    //}

    // METHOD B:
    try
    {
        using (WebClient client = new WebClient())
        {
            byte[] data = client.DownloadData(url);
            using (MemoryStream mem = (data == null) ? null : new MemoryStream(data))
            {
                return (data == null || mem == null) ? null : (Bitmap)Image.FromStream(mem);
            }
        }
    }
    catch (WebException e)
    {
        return null;
    }
}
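For reference, here is a disposal-safe variant of the same download I've been experimenting with (a sketch under one assumption: GDI+ requires the stream passed to Image.FromStream to stay open for the Bitmap's lifetime, so copying the decoded frame lets the MemoryStream be released immediately). Only DownloadImageFromUrl and the url parameter come from my original code; the rest is illustrative.

```csharp
using System.Drawing;
using System.IO;
using System.Net;

// Sketch: copy the decoded frame into a fresh Bitmap so the
// MemoryStream does not have to live as long as the returned image.
private Bitmap DownloadImageFromUrl(string url)
{
    try
    {
        using (WebClient client = new WebClient())
        using (MemoryStream mem = new MemoryStream(client.DownloadData(url)))
        using (Image decoded = Image.FromStream(mem))
        {
            // new Bitmap(decoded) makes an independent copy, so the
            // stream and the decoded Image can be disposed right here.
            return new Bitmap(decoded);
        }
    }
    catch (WebException)
    {
        return null;
    }
}
```

The caller would still need to dispose the returned Bitmap (e.g. in a using block) once the drawing and resizing are done.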
My API handles single requests very well. The problem comes in when blasting it with requests. When I send 20 all at the same time, memory shoots up to over 1 GB and the API throws OutOfMemoryExceptions. Looking at snapshots of the heap, I notice that a new MemoryStream object is created for each API call (20 of them), and one alone is using 30,410,112 bytes at the peak of the memory spike.
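The numbers seem to line up with the decoded size rather than the on-disk size: once GDI+ decodes a JPEG, it holds roughly width × height × 4 bytes per image. A back-of-the-envelope check (the 4000×3000 dimensions here are hypothetical, just to illustrate the scale):

```csharp
using System;

class MemoryEstimate
{
    static void Main()
    {
        // Hypothetical 4000x3000 photo, 32 bits (4 bytes) per pixel once decoded
        long bytesPerBitmap = 4000L * 3000L * 4L;  // 48,000,000 bytes per image
        long concurrentRequests = 20;              // simultaneous API calls
        long total = bytesPerBitmap * concurrentRequests;
        Console.WriteLine(total);                  // prints 960000000 (~1 GB)
    }
}
```

So 20 concurrent requests each decoding one large image would plausibly account for the ~1 GB spike on their own, before any drawing or resizing buffers are counted.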
Can someone help me alter my API to handle more concurrent user requests? Strangely enough, scaling out to more instances of my app on Azure actually yielded worse results.
Edit: I've also considered switching to ImageMagick.NET to handle the drawing and resizing of images, but that would require a big overhaul.