I am pickling a 1544 x 1936 x 256 array of 2-byte integers created by `numpy.zeros((1544,1936,256), dtype='int16')`, which should logically take up ~1.5 GB of memory and did when I used to run this program on Windows. Recently I moved to Ubuntu, and when I run this command the resulting file is 6.1 GB. Why is the resulting file nearly 4x bigger than expected on Linux?
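Roughly what I am doing, as a minimal sketch (the file name and the exact dump call here are illustrative; I just pickle the array with the default protocol):

```python
import pickle
import numpy as np

arr = np.zeros((1544, 1936, 256), dtype='int16')

# Expected raw size: 1544 * 1936 * 256 elements * 2 bytes each
print(arr.nbytes / 1e9)  # ~1.53 GB

# Pickle the array with the default protocol
# ('frames.pkl' is just a placeholder name)
with open('frames.pkl', 'wb') as f:
    pickle.dump(arr, f)
```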

– Nolan McCarthy
- You should use `numpy.save` and `numpy.load` instead; it will keep you much closer to the "expected" size. Pickle is actually a transformation of the data, so there is no expectation of maintaining the native data size. – Benjamin Dec 14 '16 at 18:04
- See for example: http://stackoverflow.com/questions/30253976/pickling-pandas-dataframe-does-multiply-by-5-the-file-size – Benjamin Dec 14 '16 at 18:04
- `numpy.save` and `numpy.load` solved my problem. Thanks. – Nolan McCarthy Dec 14 '16 at 19:00
1 Answer
You should use `numpy.save` and `numpy.load` instead; they will keep you much closer to the "expected" size. Pickle is actually a transformation of the data, so there is no expectation of maintaining the native data size.

See for example: http://stackoverflow.com/questions/30253976/pickling-pandas-dataframe-does-multiply-by-5-the-file-size – Benjamin
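A minimal sketch of the save/load round trip, assuming the array from the question (the file name `frames.npy` is just an example):

```python
import numpy as np

arr = np.zeros((1544, 1936, 256), dtype='int16')

# np.save writes the raw int16 buffer plus a small header, so the
# .npy file stays close to the in-memory size (~1.5 GB here).
np.save('frames.npy', arr)

# np.load restores the shape and dtype from the header.
restored = np.load('frames.npy')
assert restored.dtype == np.int16
assert restored.shape == (1544, 1936, 256)
```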

– Armali