I have an issue with caching data on the first request a user sends and then using the cached data in all subsequent requests. A lot of data is fetched on that first request and it's pretty resource-heavy, which is why I'm caching it.
The issue is that on some pages the user/frontend sends multiple requests simultaneously, which triggers the resource-heavy mechanism and creates the cache entry multiple times. Is there a good way to "consolidate" these requests so that the caching logic only runs once?
Currently it works like this:
Name | Time | Triggers caching logic? |
---|---|---|
Request 1 | 00:00 | Yes <-- Only this one should fetch the resources to cache |
Request 2 | 00:00 | Yes <-- This should wait for the cache to be populated by Request 1 |
Request 3 | 00:00 | Yes <-- This should wait for the cache to be populated by Request 1 |
Request 4 | 00:05 | No |
Request 5 | 00:15 | No |
I don't think my actual code is very relevant since I'm really looking for the general concept here. I've tried searching for solutions but can't find anything applicable. I've also attempted keeping a dictionary of <cacheKey, cachingTask> that holds the caching tasks while they're in flight, but the requests seem to arrive too close together for that to work either. A simplified sketch of that attempt is below.
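Roughly what my attempt looks like (simplified; `GetOrCreateAsync`, the key type and the `fetchData` delegate are stand-ins for my real code):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class DataCache
{
    // Completed results keyed by cache key.
    private readonly ConcurrentDictionary<string, object> _cache = new();

    // In-flight caching tasks, so later requests can await the first one.
    private readonly ConcurrentDictionary<string, Task<object>> _inFlight = new();

    public async Task<object> GetOrCreateAsync(string cacheKey, Func<Task<object>> fetchData)
    {
        // Fast path: the data has already been cached.
        if (_cache.TryGetValue(cacheKey, out var cached))
            return cached;

        // If another request already started fetching, await its task instead.
        if (_inFlight.TryGetValue(cacheKey, out var pending))
            return await pending;

        // Otherwise start the expensive fetch and register it as in-flight.
        // The gap between the TryGetValue check above and this TryAdd is,
        // I assume, where near-simultaneous requests all miss and all end up
        // starting their own fetch.
        var task = fetchData();
        _inFlight.TryAdd(cacheKey, task);

        var result = await task;
        _cache.TryAdd(cacheKey, result);
        _inFlight.TryRemove(cacheKey, out _);
        return result;
    }
}
```

Calling this from several near-simultaneous requests with the same key still kicks off the expensive fetch more than once, which is exactly the behaviour I'm trying to avoid.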