I have an issue with memory in Python 2.7 on Windows 10. I have acquired a few thousand images (TIFF format) and I want to reconstruct the whole object from them. The first part of my code works well: it extracts some metadata that is used later.

The images folder is about 1.1 GB, and the final image is approximately the same size. I loop through the images to reconstruct my final image.
The trouble comes after a few hundred iterations: the memory used by Python suddenly starts to increase until it reaches 100% (32 GB of RAM!). The script manages to finish, but the computer freezes for a few minutes. And one day I will need to handle even bigger image files.
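To pin down where the memory goes, one thing I can do is log the resident size of the process at each iteration. A minimal sketch, assuming the third-party psutil package is installed (the log_memory helper name is mine):

    import os
    import psutil

    process = psutil.Process(os.getpid())

    def log_memory(label):
        # resident set size of this Python process, in MB
        rss_mb = process.memory_info().rss / 1024.0 / 1024.0
        print('%s: %.1f MB' % (label, rss_mb))

    log_memory('start')  # I would call this inside the loop, e.g. log_memory('iter %d' % x)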
I tried the timeit module to locate the trouble, garbage collection (see the example code below), and optimizing my code with numpy, all without success. So I wonder if there are some tricks for this kind of batch image processing in Python, or some other way to force the memory to be freed, if possible without using the subprocess module.
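One trick I keep wondering about is keeping the final image on disk instead of in RAM, with numpy's memmap, so that only the pages currently being written stay resident. A minimal sketch of what I mean (the file name and shape are made up):

    import numpy as np

    # the array is backed by a file on disk; only touched pages stay in RAM
    final = np.memmap('reconstructed.dat', dtype=np.int16,
                      mode='w+', shape=(1024, 1024))
    final[0:100, 0:100] = 42  # tiles can be written with normal slicing
    final.flush()             # push dirty pages out to disk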
Here is the portion of code that causes the trouble:
import gc
import numpy as np
from PIL import Image

garbage = []  # snapshots of the gc counters, for debugging
ImageSize = np.array(mean.shape, dtype=np.int16)  # classical image size
RecontructImage = np.zeros((Xlgth*ImageSize[0], Ylgth*ImageSize[1]), dtype=np.int16)  # final image
for x in range(len(ImageList)):
    image1 = Image.open(mypath + '/' + ImageList[x])
    image = np.array(image1, dtype=np.int16)  # transform the image into an array
    image1.close()
    image = SingleImageTreatment(image)  # image modification
    # coordinates of this tile inside the global image
    pos = np.array((PositionCoordonate[x][0]*image.shape[0], PositionCoordonate[x][1]*image.shape[1]))
    # put the tile into the global image
    RecontructImage[pos[0]:pos[0]+image.shape[0], pos[1]:pos[1]+image.shape[1]] = image
    image, image1 = None, None  # drop the references so gc can reclaim them
    garbage.append(gc.get_count())
    gc.collect()
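Another restructuring I am considering is wrapping the per-image work in a function, so that every temporary goes out of scope as soon as the call returns (the paste_tile helper is just my own sketch, reusing the same names as above):

    def paste_tile(x, target):
        # everything created here is local and can be reclaimed
        # as soon as the function returns
        img = Image.open(mypath + '/' + ImageList[x])
        tile = np.array(img, dtype=np.int16)
        img.close()
        tile = SingleImageTreatment(tile)
        r = PositionCoordonate[x][0] * tile.shape[0]
        c = PositionCoordonate[x][1] * tile.shape[1]
        target[r:r + tile.shape[0], c:c + tile.shape[1]] = tile

    for x in range(len(ImageList)):
        paste_tile(x, RecontructImage)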