
I have a bunch of HDF5 chunks, and their structures are all the same. I would like to load them as a whole in Python; is there an easy way to do this?

(For now I would have to use a `for` loop to load them file by file and then concatenate the results, but that seems clumsy and I believe there should be something easier to use.)
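For reference, a minimal sketch of that loop-and-concatenate approach. The file pattern `chunk_*.h5` and the dataset name `data` are placeholders, not taken from the question:

```python
import glob

import h5py
import numpy as np

# Placeholder file pattern and dataset name; adjust to your schema.
paths = sorted(glob.glob("chunk_*.h5"))

arrays = []
for path in paths:
    with h5py.File(path, "r") as f:
        # [...] reads the whole dataset into a NumPy array in memory.
        arrays.append(f["data"][...])

# Stack everything along the first axis into one array.
combined = np.concatenate(arrays, axis=0)
```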

  • Please add more details to your question. Have you written any code? Also, what do you mean by "chunks"? (FYI chunks has a specific meaning with HDF5: it designates _**chunked storage**_ of a dataset.) I suspect you mean you have "_a bunch of **HDF5 files**_". If so, please share basic info about schema and dataset dtype/shape. – kcw78 May 09 '22 at 13:54
  • Does this answer your question? [How to Combine Two HDF5 Datasets without intermediate buffer](https://stackoverflow.com/questions/68025342/how-to-combine-two-hdf5-datasets-without-intermediate-buffer) – kcw78 May 09 '22 at 13:57
  • @kcw78 In the end I just wrote a `for` loop in Python rather than following that answer. I used a large cache to make reading easier. – Hangci Du May 10 '22 at 01:29

1 Answer


I found the solution: we can use the `.copy` method in h5py.

Answered by: [How to Combine Two HDF5 Datasets without intermediate buffer](https://stackoverflow.com/questions/68025342/how-to-combine-two-hdf5-datasets-without-intermediate-buffer)
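A minimal sketch of how `Group.copy` could be used to gather every top-level object from several files into one combined file. The file pattern, output name, and one-group-per-file layout are my assumptions, not part of the linked answer:

```python
import glob
import os

import h5py

# Assumed input pattern and output file name; adjust to your files.
paths = sorted(glob.glob("chunk_*.h5"))

with h5py.File("combined.h5", "w") as out:
    for path in paths:
        # One group per source file to avoid name collisions.
        grp = out.create_group(os.path.splitext(os.path.basename(path))[0])
        with h5py.File(path, "r") as src:
            for name in src:
                # Group.copy preserves datasets, attributes, and sub-groups.
                src.copy(src[name], grp)
```

Because `copy` operates at the HDF5 level, the data does not have to pass through an intermediate NumPy array in Python, which is the point of the linked answer's title.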
