
I have a large .mat-file MEG data set (more than 2 GB) and I want to open it to figure out what is in it.

I have tried this

import h5py

with h5py.File('subject_1.mat', 'r') as file:
    print(list(file.keys()))

which gives

['#refs#', 'Data'] 

so I try

myvar = file['Data'][()]

but then my kernel shuts down and crashes.

I expect to see a time series, but it is possibly too large for my computer. Note that my machine has 16 GB of RAM.
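For reference, here is a minimal sketch of inspecting an HDF5 dataset without reading it all into memory at once. The `'Data'` key comes from the output above; the file name, shape, and contents of the stand-in file are illustrative, not from the real recording:

```python
import h5py
import numpy as np

# Build a tiny stand-in file with the same layout reported above.
# (Shape and file name are illustrative assumptions.)
with h5py.File('example_subject.mat', 'w') as f:
    f.create_dataset('Data', data=np.random.rand(4, 1000))

with h5py.File('example_subject.mat', 'r') as f:
    dset = f['Data']                # a handle; nothing is read into RAM yet
    print(dset.shape, dset.dtype)   # inspect size and type before loading
    chunk = dset[0, :10]            # slicing reads only this region from disk

print(chunk.shape)
```

Slicing an h5py dataset reads only the requested region from disk, so it avoids pulling the full multi-gigabyte array into RAM the way `file['Data'][()]` (or the deprecated `.value`) does.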

asked by payam; edited by merv
  • Did you try loading the .MAT file in MATLAB? Also, are you sure that your .MAT file uses the HDF5 binary data format (.MAT version 7.3)? – m_power Aug 06 '19 at 17:17
  • Assuming your .mat file is an HDF5 file (version 7.3 or higher), you still have a challenge. The schema Matlab uses is not simple or easy (especially if you are new to HDF5). There are A LOT of object references. I wrote an Answer that explains how object references work for the SVHN data in .MAT/HDF5 format with Python/h5py. Take a look at this SO Q&A for insights: [the difference between the two ways of accessing the hdf5](https://stackoverflow.com/a/55643382/10462884). Also, it has links to some github projects that might help you. – kcw78 Aug 06 '19 at 18:13

0 Answers