I have a large `numpy` array (188,995 values, to be exact) containing 18-digit integers. Here are the first 5:

```
array([873205635515447425, 872488459744513265, 872556415745513809,
       872430459826834345, 867251246913838889])
```

The array's dtype is `dtype('int64')`. I'm currently storing this array in a `.npy` file that's 1.5 MB in size.
I'll be storing a couple of these arrays every day, and I want to be conscious of storage. If it helps, the integers are always 18 digits long. They don't have any discernible pattern, so dividing them down won't work.
I was able to decrease the file size to 1.4 MB by gzip-compressing it and storing it as a `.npy.gz` file, but that's the lowest it'll go.
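For reference, here's a minimal sketch of what I'm doing; the array below is synthetic (generated with `numpy.random`) rather than my real data, but it has the same shape, dtype, and 18-digit range:

```python
import gzip
import io

import numpy as np

# Synthetic stand-in for the real data: 188,995 random 18-digit int64 values.
rng = np.random.default_rng(0)
arr = rng.integers(10**17, 10**18, size=188_995, dtype=np.int64)

# Serialize to .npy bytes in memory, then gzip-compress (the .npy.gz approach).
buf = io.BytesIO()
np.save(buf, arr)
raw = buf.getvalue()
compressed = gzip.compress(raw, compresslevel=9)

# Round-trip to confirm the data survives compression intact.
restored = np.load(io.BytesIO(gzip.decompress(compressed)))
assert np.array_equal(restored, arr)

print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

On my data this only shaves off about 0.1 MB, roughly matching the small margin gzip can find in high-entropy 8-byte integers.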
Is there a way to compress the array down further?