
I am trying to read a dataset that has the following properties:

[screenshot of the dataset's properties]

I am using the HDF5 static library built for Visual Studio 2017. I have mainly been using the C++ API and have had no problems reading uncompressed datasets. How should I read an LZF-compressed dataset in this case?

1 Answer


You need to build the lzf filter and make it available to HDF5.
The h5py repo has some information on how to do this.

Basically you need to clone the h5py repo, build the lzf filter as a shared library, put it into /usr/local/hdf5/lib/plugin, and optionally point the HDF5_PLUGIN_PATH environment variable to that location:

git clone https://github.com/h5py/h5py.git
cd h5py/lzf && gcc -O2 -fPIC -shared lzf/*.c lzf_filter.c -o liblzf_filter.so
mkdir -p /usr/local/hdf5/lib/plugin
cp liblzf_filter.so /usr/local/hdf5/lib/plugin
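
Since the question is about Visual Studio, note that on Windows the filter has to be built as a DLL instead of a .so, and the search location can likewise be set with HDF5_PLUGIN_PATH. If you want to check from code (rather than with h5dump) whether HDF5 can see the filter, a minimal sketch along these lines should work; 32000 is the filter ID registered for LZF, and depending on the HDF5 version a dynamically loadable plugin may only be picked up once a compressed chunk is actually read:

#include <hdf5.h>
#include <cstdio>

int main()
{
    // 32000 is the filter ID the h5py project registered for LZF with the HDF Group.
    const H5Z_filter_t kLzfFilterId = 32000;

    htri_t avail = H5Zfilter_avail(kLzfFilterId);
    if (avail > 0)
        std::printf("LZF filter is available\n");
    else
        std::printf("LZF filter not registered; it may still be loaded lazily "
                    "from the plugin path when a compressed chunk is read\n");
    return 0;
}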

You can then test it with h5dump dataset.hdf5. If the lzf filter is properly detected and loaded, it will dump the contents of the compressed dataset; if not, it will show an error.
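
Once the plugin loads, reading the dataset from the C++ API is the same as reading an uncompressed one; HDF5 applies the filter transparently during the read call. A minimal sketch, assuming a 2-D float dataset named /data inside dataset.hdf5 (adjust the name, rank and type to whatever your file actually contains):

#include "H5Cpp.h"
#include <iostream>
#include <vector>

int main()
{
    // Open the file and the LZF-compressed dataset read-only.
    H5::H5File file("dataset.hdf5", H5F_ACC_RDONLY);
    H5::DataSet dset = file.openDataSet("/data"); // hypothetical dataset name

    // Size the buffer from the dataspace (assumed rank 2 here).
    H5::DataSpace space = dset.getSpace();
    hsize_t dims[2] = {1, 1};
    space.getSimpleExtentDims(dims);

    // Decompression happens inside read() as long as the LZF filter can be loaded.
    std::vector<float> buffer(dims[0] * dims[1]);
    dset.read(buffer.data(), H5::PredType::NATIVE_FLOAT);

    std::cout << "read " << buffer.size() << " values\n";
    return 0;
}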

Ümit
  • Can you tell me how to statically link the h5py/lzf filter with HDF5? The README says `gcc -O2 lzf/*.c lzf_filter.c myprog.c -lhdf5 -o myprog`, but I am not sure how to statically link it with the HDF5 source code. – coffeenator Jun 17 '20 at 16:59
  • I don't think you need to specifically link the lzf filter into your static binary. It should be enough to compile just the lzf filter as described in the README and put it into the right location (/usr/local/hdf5/lib/plugin); your binary which uses HDF5 should then be able to read the lzf-compressed datasets, because HDF5 will dynamically load the lzf filter plugin. Try to run h5dump on the lzf-compressed dataset: without the plugin you should get an error, with the plugin it should display the contents of the dataset. (If you do want to compile the filter in, see the sketch below.) – Ümit Jun 18 '20 at 07:47
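
Regarding the static-linking question in the comments: the README line quoted above compiles the filter sources into your own program, in which case you register the filter yourself instead of relying on the plugin mechanism. A rough sketch, assuming the register_lzf() helper that lzf_filter.h in the h5py checkout declares:

#include "H5Cpp.h"

extern "C" {
#include "lzf_filter.h" // from h5py/lzf; assumed to declare register_lzf()
}

int main()
{
    // Register the LZF filter with the library before opening the file;
    // register_lzf() is assumed to return a negative value on failure.
    if (register_lzf() < 0)
        return 1;

    // From here on, compressed chunks decompress without any plugin on disk.
    H5::H5File file("dataset.hdf5", H5F_ACC_RDONLY);
    H5::DataSet dset = file.openDataSet("/data"); // hypothetical dataset name
    // ... read as in the sketch above ...
    return 0;
}

In Visual Studio this would mean adding lzf/*.c and lzf_filter.c to the project and linking against the static HDF5 libraries, mirroring the gcc line from the README.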