
I'd like to add caching capabilities to my application using the System.Runtime.Caching namespace, and would probably want to use caching in several places and in different contexts. To do so, I want to use several MemoryCache instances.

However, I see here that using more than one instance of MemoryCache is discouraged:

MemoryCache is not a singleton, but you should create only a few or potentially only one MemoryCache instance and code that caches items should use those instances.

How would multiple MemoryCache instances affect my application? I find this kind of weird because it seems to me that using multiple caches in an application is a pretty common scenario.

EDIT: More specifically, I have a class that should keep a cache for each instance. Should I avoid using MemoryCache and look for a different caching solution? Is using MemoryCache in this situation considered bad, and if so, why?

Adi Lester
  • What's the difference between a single cache and many caches? After all, they're all going to do the same thing. – spender Dec 11 '11 at 12:33
  • For one, I won't have to worry about key collisions as much. Also, I believe it's more organized and easier to debug than having one object that holds everything the application caches. – Adi Lester Dec 11 '11 at 12:41
  • It means that you should not create many caches that cache the same thing. It is much better to cache as much as possible in one central cache. But it is perfectly OK to create many caches that cache different things. – Alois Kraus Dec 11 '11 at 12:53
  • If you are interested in a distributed cache, you could learn about the Windows AppFabric Cache. Comes in Win 2008 and provides a more powerful caching system that leverages external servers. Here is an article about it http://www.hanselman.com/blog/InstallingConfiguringAndUsingWindowsServerAppFabricAndTheVelocityMemoryCacheIn10Minutes.aspx – agarcian Dec 11 '11 at 13:23
  • I use several too. Generally one per type. It sure would be wonderful if documentation that said "you should..." would say "you should... because...". – Kit Aug 23 '12 at 19:45
  • @spender Also, sometimes I want to clear a cache of a specific context, but not others, which would be much easier if the context had a dedicated cache (that's how I wound up at this question anyway!) – jleach Oct 05 '16 at 12:59
  • #WhenYouWantSomethingThatYouAssumedWasEasyButTheyMakeItHardAndThenPeopleSeemToHaveNoClueWhyYouWantIt – Simon_Weaver Jan 12 '19 at 04:03
  • I guess the framework doesn't always provide everything. LazyCache looks promising : https://www.hanselman.com/blog/UsingLazyCacheForCleanAndSimpleNETCoreInmemoryCaching.aspx – Simon_Weaver Jan 12 '19 at 04:11

3 Answers


I recently went through this myself as well. Considering that an in-memory cache is process-specific (not shared across multiple instances of a website, a native business app, or multiple servers), there is really no benefit to having multiple MemoryCache instances except for code organization, which can be achieved in other ways.

MemoryCache is intended to be used as a single instance mostly because of its memory management capabilities. In addition to maintaining performance counters (which have some overhead), MemoryCache can also expire items when it runs low on allocated memory.

If the current instance of the cache exceeds the limit on memory set by the CacheMemoryLimit property, the cache implementation removes cache entries. Each cache instance in the application can use the amount of memory that is specified by the CacheMemoryLimit property.

from MemoryCache.CacheMemoryLimit Property
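As a sketch of what that limit looks like in practice, you can pass a NameValueCollection to the MemoryCache constructor to configure it (the 100 MB cap and polling interval below are arbitrary example values, not recommendations):

```csharp
using System;
using System.Collections.Specialized;
using System.Runtime.Caching;

var config = new NameValueCollection
{
    // Cap the cache at roughly 100 MB; entries are trimmed as the limit is approached.
    { "cacheMemoryLimitMegabytes", "100" },
    // Also trim when overall physical memory usage passes 75%.
    { "physicalMemoryLimitPercentage", "75" },
    // How often the cache re-checks its memory usage.
    { "pollingInterval", "00:02:00" }
};

var cache = new MemoryCache("AppCache", config);
Console.WriteLine(cache.CacheMemoryLimit); // the configured limit, in bytes
```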

By using only one instance of MemoryCache, it can apply this memory management efficiently across the entire application instance, expiring the least important items application-wide. This ensures maximum memory use without exceeding your hardware's capabilities. By limiting the scope of any one MemoryCache (say, to one instance of a class), it can no longer effectively manage memory for your application, because it can't "see" everything. If all of these caches were "busy", you would have a harder time managing memory, and it would never be nearly as efficient.

This is particularly sensitive in applications that don't have the luxury of a dedicated server. Imagine you are running your app on a shared server where you've been allocated only 150 MB of RAM (common on cheap $10/month hosting): you need to count on your cache to use that to the max without exceeding it. If you exceed that memory usage, your app pool will be recycled and your app loses all of its in-memory caches (a common cheap-hosting practice). The same could apply to a non-web app hosted in-house on a shared corporate server: same deal, you're told not to hog all the memory on that machine and to peacefully coexist with other line-of-business apps.

That memory-limit, app-pool-recycle, lost-cache cycle is a common Achilles' heel of web apps. When the apps are at their busiest, they reset most often because they exceed their memory allocation, losing all cache entries and therefore doing the most work re-fetching things that should have been cached in the first place. This means the app actually loses performance at max load instead of gaining it.

I know MemoryCache is the non-web-specific version of the System.Web.Caching.Cache implementation, but this illustrates the logic behind the cache's design. The same logic applies in a non-web project if you don't have exclusive use of the hardware. Remember that if your cache forces the machine to start doing pagefile swaps, your cache is no longer any faster than caching on disk. You'll always want a limit somewhere, even if that limit is 2 GB or so.

In my case, after reading up about this, I switched to using one public static MemoryCache in my app, and I simply segregate cached items by their cache keys. For example, to cache per instance you could use a cache key like "instance-{instanceId}-resourceName-{resourceId}". Think of it as namespacing your cache entries.
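A minimal sketch of that key-namespacing approach (the AppCache wrapper, Key helper, and 15-minute expiration are illustrative choices, not part of the answer):

```csharp
using System;
using System.Runtime.Caching;

public static class AppCache
{
    // One shared cache for the whole application instance.
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // "Namespace" entries by composing the key from instance and resource ids.
    private static string Key(string instanceId, string resourceId) =>
        $"instance-{instanceId}-resourceName-{resourceId}";

    public static void Set(string instanceId, string resourceId, object value)
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(15)
        };
        Cache.Set(Key(instanceId, resourceId), value, policy);
    }

    // Returns null on a cache miss.
    public static object Get(string instanceId, string resourceId) =>
        Cache.Get(Key(instanceId, resourceId));
}
```

Because every entry lives in the one shared cache, the memory limit and eviction policy apply across all "namespaces" at once.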

BenSwayne
  • Thanks, that was helpful. However, I'm not looking to limit my cache based on memory size, but possibly by capacity - where I want each instance to have a certain capacity and I don't care about the combined caches. Segregating cache items by a naming convention isn't good for that case, and it seems a bit forced either way. – Adi Lester Nov 19 '12 at 20:52
  • @AdiLester You've got it spot on. Ideally, Microsoft would finish implementing cache "Regions" for their `MemoryCache` so that you wouldn't need these forced "namespaced" keys and you could query the number of cached items in any region. But that's not currently supported for the non-web cache implementation. :-( In the meantime, it sounds like you know your limitations and are OK with them. – BenSwayne Nov 19 '12 at 22:00
  • Can you please show a code example that explains the benefit of the above usage over a single instance? That would make the answer more helpful. – Imad Alazani Jul 16 '13 at 04:26
  • Excellent answer. They should add this to their documentation! – jleach Oct 05 '16 at 13:04

I use several too. Generally one per type.

Looking at MemoryCache, I see that it hooks into AppDomain events and maintains performance counters. I suspect, then, that there is some per-instance resource overhead (e.g. CPU, counters, and memory) in using more than one, and that's why it's discouraged.

Kit
  • Then why don't they 'up' its capabilities and allow you to at least create ONE level of subcategorization with different rules. Right now I'm up to two caches (hence finding this question). An image cache which has large objects that I want constrained by memory, and also pretty much to not expire - and a tiny cache for verifying USPS addresses so if someone keeps verifying the same address within 15 minutes I want it to not keep hitting a third party service. It's funny because I find myself wanting something that's either simpler or more complex to solve the same problem. – Simon_Weaver Jan 12 '19 at 04:08

Out of the box, MemoryCache is one of the most baffling APIs to me.

Consider instead using something like LazyCache, which is a layer on top to make things much easier.
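For illustration, here is roughly what LazyCache usage looks like (API as described in the LazyCache project README; `LoadProductsFromDb` is a hypothetical placeholder for your own expensive call):

```csharp
using LazyCache;

IAppCache cache = new CachingService();

// GetOrAdd runs the factory delegate only on a cache miss, with built-in
// locking so concurrent callers for the same key don't duplicate the work.
var products = cache.GetOrAdd("all-products", () => LoadProductsFromDb());
```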

Simon_Weaver
  • I have submitted a [pull request](https://github.com/alastairtree/LazyCache/pull/187 "Improve the locking per key logic in the CachingService") to the alastairtree/LazyCache repository, and 5 months later I have yet to get feedback from the owner, which is not a good sign. But I agree that Microsoft's `MemoryCache` is even worse. – Theodor Zoulias Aug 23 '23 at 06:20