I am showing the images captured from two cameras in a single PictureBox (the same one for both); each camera's image is drawn in a different ROI of the PictureBox.
My problem is that the memory used by the application increases continuously. Most probably I am failing to free some resource, but I cannot find what I am missing.
This is the code:
// bitmap: the new input image
// srcRoi: a rectangle with the ROI of the input image
// dstRoi: a rectangle with the ROI of the PictureBox
// reset: true for the first camera, false for the second one
if (reset)
{
    // Dispose the previous bitmap before replacing it
    pictureBoxPreview1.Image?.Dispose();
}
if (pictureBoxPreview1.Image == null || reset)
{
    pictureBoxPreview1.Image = new Bitmap(pictureBoxPreview1.Width,
                                          pictureBoxPreview1.Height);
}
using (Graphics g = Graphics.FromImage(pictureBoxPreview1.Image))
{
    // Draw this camera's frame into its region of the shared image
    using (var img = Image.FromHbitmap(bitmap.GetHbitmap()))
    {
        g.DrawImage(img, dstRoi, srcRoi, GraphicsUnit.Pixel);
    }
}
if (reset)
{
    pictureBoxPreview1.Invalidate();
}
The problem does not happen if pictureBoxPreview1.Image.Dispose() is called for both cameras, but then only the image of one camera is shown at a time.
I don't understand why, when I create a new image and dispose the old one for every frame, the problem goes away, but when I do the same thing for only half of the frames (the first camera's), the memory keeps growing.
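The only candidate I can think of is the GDI handle returned by bitmap.GetHbitmap(): according to the documentation, the caller is responsible for freeing that handle with gdi32's DeleteObject, and Image.FromHbitmap only makes a copy of the GDI bitmap, so disposing img would not release it. This is a minimal sketch of what I mean, though I have not verified that this handle is really what accumulates:

// Assumption: the leaked resource is the HBITMAP from GetHbitmap().
// Image.FromHbitmap copies the GDI bitmap; the handle itself must
// still be freed by the caller with gdi32's DeleteObject.
[System.Runtime.InteropServices.DllImport("gdi32.dll")]
private static extern bool DeleteObject(IntPtr hObject);

// ...replacing the inner using block in the drawing code:
IntPtr hBitmap = bitmap.GetHbitmap();
try
{
    using (var img = Image.FromHbitmap(hBitmap))
    {
        g.DrawImage(img, dstRoi, srcRoi, GraphicsUnit.Pixel);
    }
}
finally
{
    DeleteObject(hBitmap); // disposing img does not free this handle
}

Is this the resource I am missing, or is there something else in the code above that leaks?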