
I'm working on a web application for searching, filtering, and exploring a fixed data set. The data set fits in about 10 MB of JSON. It will change a few times a year, and it can be updated incrementally.

I would like to store this data in the browser to avoid latency. This is a hard requirement -- the profound gain in responsiveness is the entire reason to create this application when many excellent alternatives exist. I accept that it won't work well for some users.

What is the best way to store this data persistently in the browser? I could pack it into localStorage, but I'd have to decompress and JSON.parse it on every page load. I've heard good things about IndexedDB, but I'm not sure that it's well-suited to storing a single giant blob.

Thom Smith

1 Answer


If your data is smaller than 10 MB, you could store it in localStorage, but the storage quota varies by browser and can be customized by the user, so a 10 MB data set is right at the limit. You would also be stuck running JSON.parse/JSON.stringify over the whole payload frequently, and you could only filter in memory.
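For reference, the localStorage approach would look something like this (a minimal sketch; the key name and data shape are placeholders, and a real app should catch the quota-exceeded error that `setItem` can throw):

```javascript
const STORAGE_KEY = 'cardData'; // hypothetical key name

// Persist the whole data set as one JSON string.
// Throws (e.g. QuotaExceededError) if the quota is exceeded.
function saveData(data) {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(data));
}

// Reload and re-parse the entire payload on every page load.
function loadData() {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw === null ? null : JSON.parse(raw);
}
```

The cost that matters here is the full stringify/parse of ~10 MB on every save and every page load, regardless of how little of the data a given page actually needs.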

Using IndexedDB is probably your best option, as it can store much more. There is apparently no hard limit on the size of a single record, so you could store the whole data set as one giant value if you wanted to. It also stores structured clones of JavaScript objects directly, so you skip the stringify/parse step entirely.
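A single-blob version might look like this (a sketch only; the database name, store name, and key are placeholders):

```javascript
// Open (or create) a database with one object store for the blob.
function openBlobDB() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('cardData', 1); // hypothetical DB name
    req.onupgradeneeded = () => req.result.createObjectStore('blobs');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Store the whole data set under a single key.
// IndexedDB structured-clones the object; no JSON.stringify needed.
async function saveBlob(data) {
  const db = await openBlobDB();
  return new Promise((resolve, reject) => {
    const tx = db.transaction('blobs', 'readwrite');
    tx.objectStore('blobs').put(data, 'all');
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

// Load it back in one read.
async function loadBlob() {
  const db = await openBlobDB();
  return new Promise((resolve, reject) => {
    const req = db.transaction('blobs').objectStore('blobs').get('all');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}
```
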

Can you describe the structure of the data further? If it's a list, then your data is partitionable: you could store each item as its own record and leverage indexes to search for exact item values quite efficiently.
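As a sketch of that per-record approach (store, index, and field names here are hypothetical):

```javascript
// Open a database where each card is its own record, with an index on
// "name" so exact-match lookups don't require loading the whole set.
function openCardDB() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open('cards', 1); // hypothetical DB name
    req.onupgradeneeded = () => {
      const store = req.result.createObjectStore('cards', { keyPath: 'id' });
      store.createIndex('byName', 'name'); // assumed field on each record
    };
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

// Exact-match lookup via the index; reads only the matching record.
async function findCardByName(name) {
  const db = await openCardDB();
  return new Promise((resolve, reject) => {
    const req = db
      .transaction('cards')
      .objectStore('cards')
      .index('byName')
      .get(name);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}
```

Indexes only help with exact values and key ranges, though; more complex predicates still mean iterating records (or filtering in memory).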

Uzer
    The application is a Magic: The Gathering card directory, and the data is lists of cards, sets, printings, and other related information. I would like to do the filtering and other processing in memory anyway; it's a lot easier, it's probably faster, and it isn't *that* much data (fewer than 20,000 cards). Queries will often be much more complex than a simple index match. – Thom Smith Aug 03 '18 at 02:31