I have a few million JSON objects taking up too much disk space, roughly 20 GB in total, and I need to compress them.
Ideally what I'd like to do is compress each object individually and then, when I need to read them, iteratively load and decompress each one. I tried doing this by creating a text file where each line is a zlib-compressed JSON object, but reading it back fails with a decompress error due to a truncated stream, which I believe is because the compressed bytes themselves can contain newlines.
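For reference, here is a minimal reproduction of what I think is happening (the loop just hunts for an object whose compressed form happens to contain a `0x0A` byte; the `{"id": i}` objects are placeholders, not my real data):

```python
import json
import zlib

# zlib output is raw binary, so a compressed record can legitimately
# contain the newline byte 0x0A. A reader that splits the file on
# b"\n" then hands zlib a truncated stream.
blob = None
for i in range(100_000):
    candidate = zlib.compress(json.dumps({"id": i}).encode("utf-8"))
    if b"\n" in candidate:
        blob = candidate  # a record that a line-based reader would split
        break

first_piece = blob.split(b"\n")[0]  # what readline()/split would return
raised = False
try:
    zlib.decompress(first_piece)
except zlib.error:  # "incomplete or truncated stream"
    raised = True
```

So the one-object-per-line format itself seems to be the problem, not the compression.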
Anyone know of a good method to do this?