1

I'm trying to implement data caching for a web app in ASP.NET. This is for class, and I've been asked to limit the number of entries in the ObjectCache, not by memory size but by the number of entries itself. That part is easy since I can call ObjectCache.GetCount(), but when the cache grows beyond the established limit (5, just for testing) I can't figure out how to remove the oldest element stored, since enumeration over the cache is sorted alphabetically by key rather than by insertion time.

This is being implemented in a service at the data access layer, so I can't use any additional structure like a Queue to keep track of insertions into the cache.

What can I do? Is there a way to filter the cache or get its oldest element?

Here's the method code:

    public List<EventSummary> FindEvents(String keywords, long categoryId, int start, int count)
    {
        string queryKey = "FindEvent-" + start + ":" + count + "-" + keywords.Trim() + "-" + categoryId;
        ObjectCache cache = MemoryCache.Default;
        List<EventSummary> val = (List<EventSummary>)cache.Get(queryKey);
        if (val != null)
            return val;

        Category evnCategory = CategoryDao.Find(categoryId);
        List<Event> fullResult = EventDao.FindByEventCategoryAndKeyword(evnCategory, keywords, start, count);
        List<EventSummary> summaryResult = new List<EventSummary>();

        foreach (Event evento in fullResult)
        {
            summaryResult.Add(new EventSummary(evento.evnId, evento.evnName, evento.Category, evento.evnDate));
        }

        if (cache.GetCount() >= maxCacheSize)
        {
            //WHAT SHOULD I DO HERE?
        }

        cache.Add(queryKey, summaryResult, DateTime.Now.AddDays(cacheDays));

        return summaryResult;
    }
Trigork
  • Are you able to modify the CacheObject class? – DanielS May 31 '15 at 18:23
  • Another question is how many entries you'll need to preserve in a real world scenario? – DanielS May 31 '15 at 18:25
  • And a final remark, in my opinion, it's the responsibility of the ObjectCache to deal with max size issues, not the calling code – DanielS May 31 '15 at 18:27
  • No, the CacheObject class should be used as it's provided. The real-world scenario doesn't matter since this is just homework, but the number would probably be so huge that it would be better to control the cache through expiration dates rather than size. – Trigork May 31 '15 at 18:36
  • Since I can't modify ObjectCache and I don't know how to use its class methods I guess I have to control it from outside, but that's the question itself. – Trigork May 31 '15 at 18:37
    I agree with @DanielS. `MemoryCache` does have an LRU (Least Recently Used) policy on the `Trim` method, but it is only accessible through a percentage of items to remove. You might want to hack around it and use `100 / cache.Count()` to try to remove only one item, but the percentage parameter is an int, so you cannot have precision and, therefore, no real guarantee that only 1 object was removed (and it would be very sketchy anyway). – Phil Gref May 31 '15 at 18:37
  • The Trim thing might solve it, I'm gonna mark this as useful. Feel free to post that as an answer. – Trigork May 31 '15 at 18:38
  • If you can't modify the ObjectCache, I would create a wrapper object (Decorator pattern) that would store additional dictionary of the ids/ insertion dates and make all my cache interactions through that object – DanielS May 31 '15 at 18:44
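The decorator idea from the last comment can be sketched as follows. This is a minimal, hypothetical illustration (the class name `BoundedCache` and its members are mine, not from the question's code): the wrapper keeps a side dictionary of insertion order and evicts the oldest key before adding a new one. It assumes single-threaded access and ignores entries that expire on their own; a full version would also hook `CacheItemPolicy.RemovedCallback` to keep the side index in sync.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;

// Hypothetical decorator around ObjectCache: tracks insertion order in a
// side dictionary so the oldest entry can be evicted when the limit is hit.
public class BoundedCache
{
    private readonly ObjectCache _cache = MemoryCache.Default;
    // key -> monotonically increasing insertion counter
    private readonly Dictionary<string, long> _insertionOrder =
        new Dictionary<string, long>();
    private long _counter;
    private readonly int _maxSize;

    public BoundedCache(int maxSize) { _maxSize = maxSize; }

    public void Add(string key, object value, DateTimeOffset expiration)
    {
        if (_insertionOrder.Count >= _maxSize)
        {
            // Evict the key with the smallest counter (FIFO, not true LRU).
            string oldest = _insertionOrder.OrderBy(p => p.Value).First().Key;
            _cache.Remove(oldest);
            _insertionOrder.Remove(oldest);
        }
        _insertionOrder[key] = _counter++;
        _cache.Add(key, value, expiration);
    }

    public object Get(string key) { return _cache.Get(key); }
}
```

Routing every cache interaction through the wrapper sidesteps the "no extra structure at the call site" constraint, since the bookkeeping lives inside the decorator rather than in the service method.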

2 Answers

0

As mentioned in the comments, the Trim method on MemoryCache has an LRU (Least Recently Used) policy, which is the behavior you are looking for here. Unfortunately, the method is not based on an absolute number of objects to remove from the cache, but on a percentage passed as an int parameter. This means that if you try to hack your way around it and pass something like 100 / cache.Count() as the percentage, you have no control over how many objects have truly been removed from the cache, which is not ideal.
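For illustration, the workaround could look roughly like this. The class and method names (`TrimExample`, `TrimOneEntry`) are invented for the sketch; the key point is that `Trim` lives on `MemoryCache` (not `ObjectCache`), takes a whole-number percentage, and returns how many entries it actually removed, which may not be exactly one.

```csharp
using System;
using System.Runtime.Caching;

// Hedged sketch: asks Trim for roughly one entry's worth as a percentage.
// Because the parameter is an int percentage, the real number of removed
// entries is not guaranteed to be 1.
public static class TrimExample
{
    public static long TrimOneEntry(MemoryCache cache)
    {
        if (cache.GetCount() == 0)
            return 0;
        // e.g. 5 entries -> 100 / 5 = 20 (percent), clamped to at least 1
        int percent = Math.Max(1, (int)(100 / cache.GetCount()));
        return cache.Trim(percent); // returns the count of removed entries
    }
}
```

Checking the return value after the call is the only way to know how many entries were actually trimmed.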

Another way to do it would be a DIY approach: simply don't use the .NET caching utilities, since in this case they do not natively fit your needs. I'm thinking of something along the lines of a SortedDictionary with the timecode of your cache objects as the key and a list of the cache objects inserted at that timecode as your values. It would be a good and, IMO, not too daring exercise to try to reproduce the .NET cache behavior you are already using, with the additional benefit of directly controlling the removal policy yourself.
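A sketch of that DIY direction, under the assumption of single-threaded access (a real cache would also need expiration and locking); all names here are invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// DIY cache sketch: entries are indexed by insertion time in a
// SortedDictionary, so the oldest batch is always first and can be
// evicted when the size limit is reached.
public class TimestampedCache<TValue>
{
    private readonly SortedDictionary<DateTime, List<string>> _byTime =
        new SortedDictionary<DateTime, List<string>>();
    private readonly Dictionary<string, TValue> _values =
        new Dictionary<string, TValue>();
    private readonly int _maxSize;

    public TimestampedCache(int maxSize) { _maxSize = maxSize; }

    public void Add(string key, TValue value)
    {
        if (_values.Count >= _maxSize)
            EvictOldest();

        DateTime now = DateTime.UtcNow;
        List<string> keys;
        if (!_byTime.TryGetValue(now, out keys))
        {
            keys = new List<string>();
            _byTime[now] = keys;
        }
        keys.Add(key);
        _values[key] = value;
    }

    public bool TryGet(string key, out TValue value)
    {
        return _values.TryGetValue(key, out value);
    }

    private void EvictOldest()
    {
        // SortedDictionary keeps keys ordered, so First() is the oldest timecode.
        var oldest = _byTime.First();
        string oldestKey = oldest.Value[0];
        oldest.Value.RemoveAt(0);
        if (oldest.Value.Count == 0)
            _byTime.Remove(oldest.Key);
        _values.Remove(oldestKey);
    }
}
```

Because you own the eviction code, switching the policy (FIFO here, LRU by updating the timecode on reads, etc.) is a local change rather than a fight with the framework.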

Phil Gref
  • Combines the initial useful answer and the following comment by @DanielS about implementing a DIY solution. Good answer, thanks! – Trigork May 31 '15 at 19:17
0

As a side comment, not directly related to your question: the biggest problem with caches in managed memory models is the GC. The moment you start storing more than a few million entries, you are asking for eventual GC pauses, even with the most advanced non-blocking GCs.

It is hard to cache over 16 GB without pausing every now and then for 5-6 seconds (that is, a stop-the-world pause).

I have previously described here (https://stackoverflow.com/a/30584575/1932601) why caching objects as-is is eventually a bad choice if you need to store very many expiring entries (say, 100 million chat messages).

Take a look at what we did to store hundreds of millions of objects for a long time without killing the GC.

https://www.youtube.com/watch?v=Dz_7hukyejQ

itadapter DKh