
For some reason, I need to pass some large numerical vectors through a JSON file.

A trivial serialization leads to the following plain text:

[[0, 0.0003963873381352339], [1, 0.0008297143834970196], [2, 0.0007295088369519877],
 [3, 0.0007836414989179601], [4, 0.0007501355526877312], ...
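
(For reference, the sample above comes from something like the following, where `values` stands in for one of the vectors:)

    import json

    # "values" is a stand-in for one of the large vectors (illustrative).
    values = [0.0003963873381352339, 0.0008297143834970196]

    # Index/value pairs serialized as nested arrays, as in the sample above.
    print(json.dumps([[i, v] for i, v in enumerate(values)]))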

This is quite inefficient: we are spending roughly 20 bytes of text to store each 4-byte floating-point value. What would be a more efficient way to encode the numbers in a JSON file? Efficient as in the size of the generated files; parsing time/resources are somewhat less important.

The later consumers of this JSON are Python and JavaScript scripts, with common open-source libraries available for parsing.
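
For concreteness, here is a minimal sketch of one possible direction, assuming float32 precision is acceptable and that the indices in the sample are just consecutive positions (so they need not be stored): pack the values into raw bytes and base64-encode them inside the JSON, which costs about 5.3 text bytes per value instead of 20+. All names below are illustrative.

    import base64
    import json
    import struct

    # "values" is a stand-in for one of the large vectors (illustrative).
    values = [0.0003963873381352339, 0.0008297143834970196, 0.0007295088369519877]

    # Pack as little-endian float32 (4 bytes per value), then base64-encode
    # (base64 expands by 4/3, so ~5.3 text bytes per value).
    packed = struct.pack(f"<{len(values)}f", *values)
    payload = json.dumps({"dtype": "float32",
                          "data": base64.b64encode(packed).decode("ascii")})

    # Decoding on the Python side:
    raw = base64.b64decode(json.loads(payload)["data"])
    decoded = struct.unpack(f"<{len(raw) // 4}f", raw)

A JavaScript consumer can recover the same numbers with atob plus a Float32Array over the decoded bytes, so no exotic libraries are needed.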

00__00__00
  • Does this answer your question? [Which is the best way to compress json to store in a memory based store like redis or memcache?](https://stackoverflow.com/questions/15525837/which-is-the-best-way-to-compress-json-to-store-in-a-memory-based-store-like-red) (sketched below these comments) – oskros Feb 28 '23 at 08:23
  • Do you have control of both ends? Can you use Protobuf? – Tom McLean Feb 28 '23 at 08:33
  • to some extent yes, please propose it in an answer – 00__00__00 Feb 28 '23 at 08:44
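
The whole-file compression route from the first comment would look roughly like this in Python, using only the standard library; the trade-off is that the stored payload is no longer plain-text JSON:

    import gzip
    import json

    values = [0.0003963873381352339, 0.0008297143834970196]

    # Serialize as before, then gzip the UTF-8 bytes of the document.
    text = json.dumps([[i, v] for i, v in enumerate(values)])
    compressed = gzip.compress(text.encode("utf-8"))

    # Consumers must decompress before handing the text to a JSON parser.
    parsed = json.loads(gzip.decompress(compressed).decode("utf-8"))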

0 Answers