
Suppose a folder contains 100 binary .npy files, each holding a 2D array. All of the arrays have the same dtype and shape, and each file takes 1 GB on disk.

Is it possible to create a huge .npy file in which all of the arrays are merged, say along axis=0, without having to load them into memory?

RolleRugu
  • Possible duplicate of [How to put many numpy files in one big numpy file without having memory error?](https://stackoverflow.com/questions/42385334/how-to-put-many-numpy-files-in-one-big-numpy-file-without-having-memory-error) – Max Crous Nov 21 '19 at 15:40
  • Try using a [memory mapped array](https://docs.scipy.org/doc/numpy/reference/generated/numpy.memmap.html) – Matt Eding Nov 21 '19 at 16:01
  • The `npy` format is not designed for appending. It has an initial block with basic info like shape, strides and dtype, and a big 1d dump of the array's data-buffer. If you concatenate several arrays in memory, the new array has a new shape, and the data-buffer joins all of the original buffers. – hpaulj Nov 21 '19 at 17:29
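Following the memory-mapped-array suggestion in the comments, here is a minimal sketch of one way to do the merge. It uses `np.lib.format.open_memmap` to create the output `.npy` file directly on disk (with a valid header), then copies each source array in with `np.load(..., mmap_mode="r")`, so only the chunk currently being copied sits in RAM. The file names, the small demo arrays, and the output name `merged.npy` are illustrative stand-ins for the 100 x 1 GB files in the question.

```python
import os
import tempfile

import numpy as np

# Demo setup: write a few small .npy files standing in for the large
# files described in the question (the names here are hypothetical).
tmp = tempfile.mkdtemp()
files = []
for i in range(3):
    path = os.path.join(tmp, f"arr_{i:03d}.npy")
    np.save(path, np.full((4, 5), i, dtype=np.float32))
    files.append(path)

# Read shape and dtype from the first file without loading its data;
# mmap_mode="r" keeps the array on disk.
first = np.load(files[0], mmap_mode="r")
rows, cols = first.shape

# open_memmap creates merged.npy on disk with the correct .npy header
# for the final concatenated shape along axis=0.
merged_path = os.path.join(tmp, "merged.npy")
out = np.lib.format.open_memmap(
    merged_path,
    mode="w+",
    dtype=first.dtype,
    shape=(rows * len(files), cols),
)

# Copy each source array into its slice of the output; both sides are
# memory-mapped, so RAM usage stays bounded regardless of file sizes.
for i, name in enumerate(files):
    out[i * rows:(i + 1) * rows] = np.load(name, mmap_mode="r")
out.flush()

print(np.load(merged_path, mmap_mode="r").shape)  # (12, 5)
```

This sidesteps the append limitation hpaulj describes: rather than growing an existing `.npy` file, it writes the final header once (the total shape is known up front, since all inputs share the same shape) and fills the data buffer piece by piece.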

0 Answers