
People use in-memory caches to get faster response times and to avoid re-computing results that are still valid.

A simple in-memory cache looks like this:

const cache = {};
cache["id"] = { title: "...", score: "2", computations: {...} };

This is usually how I have seen people use an in-memory cache: typically a map, an array, or some other data structure declared globally and accessible to the whole application.

But here is my experience with an in-memory cache in a very data-heavy application: whenever I start storing lots of data in it, it eventually throws a "heap out of memory" error. I understand why this happens, but what, then, is the effective way to use an in-memory cache?

If I understand this correctly, everything (the object, i.e. the cache) resides on the heap. Since the heap is a finite resource, pumping in more data will eventually produce errors like this. But I have no idea how to use an in-memory cache effectively. Are there established best practices? Should there be a routine that continuously checks the "size" of the cache object and frees memory when necessary?

What if I need to cache around 10 GB of data? I know (as already pointed out in the link above) that I could always increase the heap size (node --max-old-space-size=xMB yourFile.js), but is that really all there is to know when working with an in-memory cache or the heap? Continuously increasing the heap size does not seem right.

I have always imagined an in-memory cache to be a very powerful tool, but I have never been able to use it effectively, since at some point I always run into the heap out of memory error.

trincot
Suhail Gupta

1 Answer


It's time to decide when your cache should expire entries and how it removes old data.

First, specify how long data should remain in the cache before being removed: set each key's TTL (time-to-live) to fit the needs of your cache.
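As a minimal sketch of the TTL idea (the `TtlCache` class name and its methods are illustrative, not from any particular library): each entry stores an expiry timestamp, and a read past that timestamp is treated as a miss and the entry is dropped.

```javascript
// Minimal TTL cache sketch: entries carry an expiry timestamp,
// and stale entries are deleted lazily on read.
class TtlCache {
  constructor(defaultTtlMs) {
    this.defaultTtlMs = defaultTtlMs;
    this.store = new Map();
  }

  set(key, value, ttlMs = this.defaultTtlMs) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazy expiry: free memory on access
      return undefined;
    }
    return entry.value;
  }
}
```

A real implementation would usually also sweep expired entries periodically (e.g. with a timer), so that keys that are never read again still get freed.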

On the other hand, cache eviction policies are used to ensure that a cache does not exceed its maximum limit by removing older objects as new ones are added. There are several algorithms to choose from such as Least Recently Used (LRU), Least Frequently Used (LFU), Most Recently Used (MRU), and First In, First Out (FIFO). These algorithms vary in how they decide which objects to evict from the cache. It is important to keep in mind that a global policy may not be appropriate for every item and a combination of policies may be needed for optimal results.
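For instance, LRU can be sketched with a plain `Map` (the `LruCache` class here is illustrative; in practice you might use a package such as `lru-cache`). A `Map` iterates in insertion order, so re-inserting a key on every access keeps recently used keys at the end, and the first key in the map is always the least recently used one.

```javascript
// Minimal LRU eviction sketch: the Map's insertion order doubles as
// the recency order, with the least recently used key at the front.
class LruCache {
  constructor(maxEntries) {
    this.maxEntries = maxEntries;
    this.store = new Map();
  }

  get(key) {
    if (!this.store.has(key)) return undefined;
    const value = this.store.get(key);
    this.store.delete(key); // move key to "most recently used"
    this.store.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.store.has(key)) this.store.delete(key);
    this.store.set(key, value);
    if (this.store.size > this.maxEntries) {
      // evict the least recently used (first) key
      const oldest = this.store.keys().next().value;
      this.store.delete(oldest);
    }
  }
}
```

Capping the entry count like this is what keeps the cache from growing until the heap is exhausted; for a 10 GB working set you would size the cap so the hot subset fits in memory and let colder data be recomputed or fetched from an external store.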

There is a comprehensive article about caching here. I hope you find it helpful: Caching In Node.js Applications

Cong