
I want to render around 500 images and save them to separate PNG files, so I wrote a small class that contains my dataset and a render function:

from mayavi import mlab
import gc
import os

mlab.options.offscreen = True

class Dataset(object):
    def __init__(self):
        # some init stuff etc.
        # ...
        pass

    def save_current_frame_to_png(self, filename):
        mlab.contour3d(self.frame_data, contours=30, opacity=0.2)
        mlab.savefig(filename)
        mlab.clf()
        mlab.close()
        gc.collect()

    def create_movie_files(self):
        folder_name = "animation"
        try:
            os.makedirs(folder_name)
        except OSError:
            raise OSError("Directory already exists.")

        self.__go_to_first_frame()

        for i in range(self.frames):
            filename = "".join([folder_name, "/%.5i" % i, ".png"])
            print filename
            self.save_current_frame_to_png(filename)
            self.read_single_frame()

        self.__go_to_first_frame()

So everything seemed to work fine, but then I had a look at the memory usage, which keeps growing until the system crashes. I tried using mlab.clf() and gc.collect() to keep the memory low, but that didn't work. I found a solution with mlab.close(), which seems to solve the memory problem, but it brings a new one: every time a new image is rendered, Mayavi also creates a new window, so after around 200 windows the program crashes. Is there maybe a possibility to disable the windows completely? It seems to me that mlab.options.offscreen = True only disables drawing inside the current window.

EDIT: self.frame_data is a numpy array of shape (100, 100, 100), and self.read_single_frame() just reads the next frame from a text file and stores it in self.frame_data. These functions do not increase the RAM usage; if I turn the rendering off, the memory usage stays at 1.2%.
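For reference, here is a stripped-down loop that shows the same behaviour. The data here is synthetic (random values standing in for my real frames) and the file reading is left out, since those parts do not affect the memory usage:

from mayavi import mlab
import numpy as np
import gc

mlab.options.offscreen = True

# Synthetic stand-in for self.frame_data, same shape (100, 100, 100).
frame_data = np.random.rand(100, 100, 100)

for i in range(500):
    mlab.contour3d(frame_data, contours=30, opacity=0.2)
    mlab.savefig("frame_%05i.png" % i)
    mlab.clf()      # clearing alone does not release the memory
    mlab.close()    # releases memory, but a new window is created each iteration
    gc.collect()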

  • Can you show the `some init stuff`? Particularly, what is `self.frame_data`? Also, can you post `self.read_single_frame()` please? Your problem is due to the fact that you store a lot of data in memory, so you are keeping a reference to it somewhere when it's no longer needed. – Aleksander Lidtke Jun 13 '14 at 08:54
  • self.frame_data is a numpy array of shape (100, 100, 100) and self.read_single_frame() just reads the next frame from a text file and stores it in self.frame_data. These functions do not increase the RAM; if I turn the rendering off, the memory usage stays at 1.2%. – jrsm Jun 13 '14 at 08:59
  • This is a great question. I was not able to reproduce all of your errors -- when using `mlab.close` to basically close each figure that is spawned, I never ran out of memory and was able to scroll through 500 sets of (very simple) images. I think the reason is that either my scenes had a lower memory constraint or my system has more memory, and that in any case spawning the mayavi scenes and then closing them causes some amount of memory leakage. It is definitely bothersome that functions like `source.remove()` and `mlab.clf()` do not do a good job with memory leaking. – aestrivex Jun 25 '14 at 15:34

1 Answer


You should not call mlab.show() while keeping mlab.close(); then it will work.
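In other words: keep mlab.options.offscreen = True, never call mlab.show(), and close the figure right after saving it. A rough sketch of the saving method with that change (the rest of the class as in the question):

def save_current_frame_to_png(self, filename):
    mlab.contour3d(self.frame_data, contours=30, opacity=0.2)
    # No mlab.show() here -- with offscreen rendering we only want the file.
    mlab.savefig(filename)
    mlab.close()  # close the (offscreen) figure so its memory is released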