Our app is basically an image search engine. We have around 20,000 to 30,000 images and we want to search them by keyword, by category, or both.
The first idea is to fetch all the data asynchronously on page load, so the app can work as a SPA (no latency on each search and minimal load on the server).
Our JSON files are normalized to reduce their size. We won't modify data on the client side, so this is the only benefit we get from normalization.
categories.json:
{
  "language": "xxxx",
  "categories": [
    {
      "id": 1,
      "parent_id": null,
      "label": "House"
    },
    {
      "id": 2,
      "parent_id": 1,
      "label": "Furniture"
    },
    ....
  ]
}
keywords.json:
{
  "language": "xxx",
  "keywords": [
    {
      "id": "table",
      "images": [2381, 2746, 3602, 4038, 6572, 13176, 13273, 28659, 28660],
      "cat": [1, 2]
    },
    ....
  ]
}
images.json:
{
  "base-url": "http://www.xxxxx.org/images/",
  "images": [
    {
      "id": 2381,
      "license": 7,
      "type": 3,
      "file": "4.jpg"
    },
    .....
  ]
}
license.json and type.json are similar.
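To illustrate how these normalized files combine at display time, here is a minimal sketch of the client-side join, using inlined sample data in place of the fetched files (names like searchByKeyword are ours, not part of the app):

```javascript
// Sample data standing in for the fetched keywords.json / images.json.
const imagesJson = {
  "base-url": "http://www.xxxxx.org/images/",
  images: [{ id: 2381, license: 7, type: 3, file: "4.jpg" }]
};
const keywordsJson = {
  keywords: [{ id: "table", images: [2381], cat: [1, 2] }]
};

// Build an id -> image lookup once, so each search is O(matches).
const imagesById = new Map(imagesJson.images.map(img => [img.id, img]));

// Denormalize on demand: resolve image ids to full records with URLs.
function searchByKeyword(term) {
  const entry = keywordsJson.keywords.find(k => k.id === term);
  if (!entry) return [];
  return entry.images
    .map(id => imagesById.get(id))
    .filter(Boolean)
    .map(img => ({ ...img, url: imagesJson["base-url"] + img.file }));
}
```

With lookup maps built once after the initial fetch, searches over 20,000-30,000 records stay fast without ever materializing the fully denormalized dataset.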
Since our JSON is normalized, we have two problems:
We need to denormalize the data before showing it to the user.
We should cache the data so the JSON files aren't downloaded almost every time a user opens the page.
These are the solutions we've thought of, and we'd like to know which one you would choose:
We could denormalize the data with JavaScript and pass it as props to the React components. I'm fairly new to React/Redux, so I don't know whether there is an established pattern or library for this.
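There is a common pattern for this in Redux apps: keep normalized data in the store and denormalize it in memoized selectors (the normalizr and reselect libraries exist for exactly this). A dependency-free sketch of the selector idea, with illustrative names:

```javascript
// Minimal memoized selector, mimicking what reselect's createSelector does:
// recompute the derived (denormalized) value only when its inputs change.
function createSelector(inputFns, resultFn) {
  let lastArgs = null, lastResult;
  return state => {
    const args = inputFns.map(fn => fn(state));
    if (!lastArgs || args.some((a, i) => a !== lastArgs[i])) {
      lastResult = resultFn(...args);
      lastArgs = args;
    }
    return lastResult;
  };
}

// Denormalize search results: resolve ids against the byId lookup.
// State shape here is hypothetical.
const selectImageList = createSelector(
  [state => state.images.byId, state => state.search.resultIds],
  (byId, ids) => ids.map(id => byId[id])
);
```

The component then receives the denormalized list as a prop; because the selector is memoized, repeated renders with unchanged state reuse the same array.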
Maybe the best way to normalize/denormalize the data would be IndexedDB, although it isn't broadly supported. Normalizing/denormalizing might be easier than doing it in plain JavaScript, and we would also get our data cached.
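Whatever storage backend ends up underneath (IndexedDB, or localStorage for smaller payloads), the caching side can be kept independent of it. A sketch of a versioned cache over a simple key/value interface; all names here are ours:

```javascript
// Versioned client-side cache over any getItem/setItem storage
// (localStorage in the browser, or an IndexedDB key/value wrapper).
// Bumping the version on the server invalidates old cached copies.
function makeJsonCache(storage, version) {
  return {
    get(key) {
      const raw = storage.getItem(key);
      if (!raw) return null;
      const { v, data } = JSON.parse(raw);
      return v === version ? data : null; // stale version -> refetch
    },
    set(key, data) {
      storage.setItem(key, JSON.stringify({ v: version, data }));
    }
  };
}
```

On page load the app would check the cache first and only fetch keywords.json / images.json on a miss or version change.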
Forget the initial idea and request denormalized JSON from the API server instead, making an AJAX request on every search. Our API hasn't been built yet, so this is still feasible, and the design is more scalable as our image collection grows.
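Under this option the client shrinks to building a query URL and fetching it per search. A sketch with a hypothetical endpoint and parameter names:

```javascript
// Build the search URL for a hypothetical /api/images endpoint
// supporting keyword and/or category filters.
function buildSearchUrl(base, { keyword, category } = {}) {
  const url = new URL("/api/images", base);
  if (keyword) url.searchParams.set("q", keyword);
  if (category) url.searchParams.set("cat", String(category));
  return url.toString();
}

// Usage in the app (network call shown for illustration):
// fetch(buildSearchUrl("http://www.xxxxx.org", { keyword: "table" }))
//   .then(res => res.json());
```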