  • I was curious to know whether there is any limit to data caching in single-page applications using a shared service or NgRx.
  • Does caching too much data on the front end impact the overall performance of the web application (DOM)?

Let's say I have a very big, complex nested object that I am caching in memory.

[image: single page application caching]

Now assume that I want to use different subsets of this object in different modules/components of the application, and for that I may need to do a lot of mapping operations on the UI (looping and matching by ids, etc.).

I was thinking the other way around: instead of doing so many operations on the UI to extract the relevant data, why not use a simple API with an id parameter to fetch the relevant information, if it doesn't take much time to get the data from the backend?

url = some/url/{id}

So is it worth caching more complex nested objects if we can't use their subsets simply by property access (obj[prop]) and instead need to do a lot of calculations on the UI (looping, etc.), which is actually more time-consuming than getting the data from the REST API?
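One middle ground between "loop over the big object in every component" and "hit the API on every access" is to index the cache by id, so lookups are O(1) instead of a scan. A minimal sketch, assuming a hypothetical `Item` shape and an injected `fetchItem` function standing in for the `some/url/{id}` call:

```typescript
// Illustrative only: "Item" and "fetchItem" are assumed names, not from
// any specific framework API.
type Item = { id: number; [key: string]: unknown };

class ItemCache {
  // Entries keyed by id, so components never loop over a nested object.
  private cache = new Map<number, Item>();

  constructor(private fetchItem: (id: number) => Promise<Item>) {}

  // Return the cached entry if present; otherwise fall back to the API
  // (e.g. GET some/url/{id}) and remember the result.
  async get(id: number): Promise<Item> {
    const hit = this.cache.get(id);
    if (hit !== undefined) return hit;
    const item = await this.fetchItem(id);
    this.cache.set(id, item);
    return item;
  }
}
```

With this shape, repeated reads of the same id cost a map lookup, and only the first read pays the network round trip, so the "cache vs. refetch" trade-off is no longer all-or-nothing.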

Any help/explanation will be appreciated!

Thanks

shreyansh
    The real answer here is "it depends". It depends on your application, its users, its requirements…there is no one-size-fits-all answer. You may wish to make the backend do more work, but that may also increase the number of requests (which you may want to reduce). Same goes for frontend calculations: depends on the dataset, the browser, the types of calculation…it really all depends on the specifics of your application. That's why architecture is a large part of our job :) – Will Alexander Aug 01 '20 at 09:41
  • Thanks @WillAlexander, but is there any limit (in size) on caching data in memory for a typical SPA? – shreyansh Aug 01 '20 at 09:49
  • Last I saw, Chrome limited things to 2GB per tab…so for NGRX type data, you're pretty safe… – Will Alexander Aug 01 '20 at 09:53
  • It depends on the user's browser and computer, as browser configuration and RAM are strict limiting factors – Will Alexander Aug 01 '20 at 09:54

1 Answer


Caching too much data in memory is not a good idea: it will affect your application's performance, and it causes degradation on machines with less memory. In-memory caches are meant for small amounts of data; Chrome, for instance, reportedly supports only up to about 2 GB per tab. For keeping large data on the client side, never use the in-memory cache; use a client-side database/datastore instead, which uses disk space rather than memory. There are a number of web technologies that store data on the client side:

  • Indexed Database
  • Web SQL
  • LocalStorage
  • Cookies

Which of these to use can be decided depending upon the client application framework and the size of the data.

By default, browsers allot a portion of disk space (around 10% in some implementations) for these data stores, and there are options to increase that quota.
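Of the options above, localStorage is the simplest to sketch. It is synchronous, string-only, and typically capped at around 5-10 MB per origin, so writes can fail with a quota error; larger datasets belong in IndexedDB. A hedged sketch (the `KeyValueStore` interface, `saveCache`, and `loadCache` are illustrative names; in the browser you would pass `localStorage` itself):

```typescript
// Minimal Storage-like interface so the helpers are testable outside a browser.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Serialize the cached object to a string and store it on disk rather than
// in the JS heap. Returns false if the quota is exceeded.
function saveCache(store: KeyValueStore, key: string, data: unknown): boolean {
  try {
    store.setItem(key, JSON.stringify(data));
    return true;
  } catch {
    // QuotaExceededError (or similar): fall back to IndexedDB or trim the data.
    return false;
  }
}

// Read the value back and parse it; null means nothing was stored.
function loadCache<T>(store: KeyValueStore, key: string): T | null {
  const raw = store.getItem(key);
  return raw === null ? null : (JSON.parse(raw) as T);
}

// In the browser: saveCache(localStorage, "app-cache", bigNestedObject);
```

Because everything goes through `JSON.stringify`/`JSON.parse`, this only suits plain data (no functions, Dates become strings), which is usually fine for API responses.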

sreejithsdev