
I have a memory issue with Python 2.7 on Windows 10.

I have acquired a few thousand images (TIFF format). I then want to reconstruct the whole object. The first part of my code works well, extracting some metadata that is used later.

The image folder has a size of 1.1 GB, and the final image has approximately the same size. I loop through the images to reconstruct my final image. The trouble comes after a few hundred iterations: the memory used by Python suddenly starts to increase until it reaches 100% (32 GB of RAM!). It manages to finish, but the computer freezes for a few minutes. And one day I will need to manipulate bigger image files.

I tried the timeit module to locate the trouble, garbage collection (see the example code below), and optimizing my code (using numpy), all without success.

So I wonder if there are some tricks for processing batches of images in Python, or some other way to force the memory to be freed, if possible without using the subprocess module.

Here is the portion of code that causes me trouble:

import gc
import numpy as np
from PIL import Image

garbage = []

ImageSize = np.array(mean.shape, dtype=np.int16)                 # Typical single-image size (mean comes from the earlier metadata step)

ReconstructImage = np.zeros((Xlgth*ImageSize[0], Ylgth*ImageSize[1]), dtype=np.int16)     # Final stitched image
for x in range(len(ImageList)):
    image1 = Image.open(mypath+'/'+ImageList[x])
    image = np.array(image1, dtype=np.int16)                     # Convert the image to a numpy array
    image1.close()
    image = SingleImageTreatment(image)                          # Image modification (filtering)
    pos = np.array((PositionCoordonate[x][0]*image.shape[0], PositionCoordonate[x][1]*image.shape[1]))  # Top-left coordinate of this tile
    ReconstructImage[pos[0]:pos[0]+image.shape[0], pos[1]:pos[1]+image.shape[1]] = image  # Paste the tile into the global image
    image, image1 = None, None                                   # Drop the references so the arrays can be garbage collected
    garbage.append(gc.get_count())
    gc.collect()
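
As an aside, timeit measures speed rather than memory, so it would not localize the growth. A minimal sketch of per-iteration memory tracking, assuming the third-party psutil package is available (psutil and the log_memory helper are not part of my original code):

import os
import psutil

process = psutil.Process(os.getpid())                            # handle to this Python process

def log_memory(iteration):
    rss_mb = process.memory_info().rss / (1024.0 * 1024.0)      # resident set size in MB
    print('iteration %d: %.1f MB' % (iteration, rss_mb))

Calling log_memory(x) at the end of each loop iteration would show whether the growth comes from the loop body itself or from code that runs afterwards.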
  • In my case, I do not use any compression, so the TIFF is the same size as the original folder (`ImageList`). – simon LECLERC Sep 14 '16 at 00:06
  • Task Manager shows a sudden increase in memory consumption that goes exponential after a few hundred iterations. The first `garbage` count looks fine; after that, it's always the same thing. `SingleImageTreatment` always rewrites the image array after some filtering. I am just wondering if a `numpy` array has a size limit. My array in this case has half a billion elements. Can filling it little by little cause a memory overload? – simon LECLERC Sep 14 '16 at 00:11
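
On the size question in the comment above: numpy arrays are limited only by available memory and the platform's index type, and the footprint is easy to verify with nbytes. A quick sketch (the 20000 x 25000 shape is a hypothetical example of the same order of magnitude as the reconstruction):

import numpy as np

# Roughly half a billion int16 elements, comparable to the final image
big = np.zeros((20000, 25000), dtype=np.int16)
print('%.2f GB' % (big.nbytes / (1024.0 ** 3)))                  # ~0.93 GB: 5e8 elements * 2 bytes each

So the array itself should only account for about 1 GB of the 32 GB; by itself it does not explain the blow-up.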

1 Answer


Shame on me. The error was not in the code that I posted, but in the next line:

plt.imshow(ReconstructImage)

I use matplotlib.pyplot (aka plt) for pre-visualization, and the function imshow() is very memory-hungry, which made the computer freeze. I deleted that line and there was no more trouble.
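
If I still want a quick pre-visualization, one workaround is to display only a decimated view, so that matplotlib builds its internal display copy from about 1% of the data. A sketch, assuming ReconstructImage is the stitched array from the question:

import matplotlib.pyplot as plt

# Show every 10th pixel in each direction: ~1% of the data,
# plenty for a visual sanity check of the stitching
plt.imshow(ReconstructImage[::10, ::10], cmap='gray')
plt.show()

The strided slice is a view, so the decimation itself allocates almost nothing.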

The best answer to my trouble: Excessive memory usage in Matplotlib imshow, with the link to the details here.

Next time I need help, I will take the time to make a standalone piece of code that reproduces the trouble.
