
I am trying to plot a 3D volume from a large data set. The data set can be larger than RAM (and even than local disk), and because of this I get a MemoryError.

I tried to use virtual memory (memory mapping), hoping that when I interact with the 3D volume slices it would fetch the data and extract only the needed part, but it didn't work. Is it possible to visualize data this big?

My data set is a 3D NumPy array.
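
For reference, here is a minimal sketch of the memory-mapping idea, assuming the volume is stored as a raw float32 binary file; the file name, shape and dtype below are just placeholders:

import numpy as np
from mayavi import mlab

# Placeholder file name, dtype and dimensions for the on-disk volume
shape = (1000, 1000, 1000)
data = np.memmap('volume.dat', dtype=np.float32, mode='r', shape=shape)

# Only an explicitly extracted slice is copied into RAM
xy_slice = np.array(data[:, :, shape[2] // 2])
mlab.imshow(xy_slice, colormap='gray')
mlab.show()

As far as I can tell, passing the memory-mapped array straight to mlab.pipeline.scalar_field still seems to pull the whole volume into memory (VTK keeps its own in-memory copy), which is presumably why this approach didn't help.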

My code:

from mayavi import mlab

# data is the 3D NumPy array holding the volume
source = mlab.pipeline.scalar_field(data)
source.spacing = [1, 1, -1]

# Add an interactive slicing plane along each axis
for axis in ['x', 'y', 'z']:
    plane = mlab.pipeline.image_plane_widget(source,
                                             plane_orientation='{}_axes'.format(axis),
                                             slice_index=100, colormap='gray')
    plane.module_manager.scalar_lut_manager.reverse_lut = True

mlab.show()
  • Assuming you already profiled your code, this is not an easy answer as there are so many technical details to sort out. You should take a look at the answers to [this question](https://stackoverflow.com/q/15749100/3637404) to set up Paraview. Do remember that `mayavi` includes `tvtk`. – Felipe Lema Jun 14 '19 at 21:38
  • (hint: you _should_ profile your code first, maybe you're doing unnecessary allocations) – Felipe Lema Jun 14 '19 at 21:39
  • How big is your array? – Patol75 Jun 19 '19 at 07:03
  • @Patol75 I am working with seismic data, so it can go over 100 GB – mtkilic Jun 19 '19 at 14:18

0 Answers