
I'm using the cache provided by System.Runtime.Caching.MemoryCache.

I'd like to enumerate over the cache's items so that I can invalidate (evict then reload) them, like so:

foreach (var item in MemoryCache.Default)
{
    // pseudocode: evict the entry, then reload it
    // (the enumerator yields KeyValuePair<string, object>, which has no such method)
    Invalidate(item.Key);
}

But the official MemoryCache documentation states:

Important: Retrieving an enumerator for a MemoryCache instance is a resource-intensive and blocking operation. Therefore, the enumerator should not be used in production applications.

Surely there must be a simple and efficient way to iterate over the cache's items?

Peter Marks
  • Maybe you will somehow need to know in ADVANCE what things are in the cache... But this would mean using an array or something. – sinni800 Nov 05 '11 at 21:53
  • Yep, that's what I'm doing now, but that's crazy as it means I'm tracking what the cache is already tracking! :) – Peter Marks Nov 05 '11 at 21:56
  • Exactly. But if the cache holds the large amounts of data and your array only the information to REACH that data it holds up well. – sinni800 Nov 05 '11 at 22:10
  • True. But I'd like the cache to do it if possible. I'm sure I've just missed something obvious, I mean this is a heavily used caching provider, and I'm sure people are iterating over it in production systems... – Peter Marks Nov 05 '11 at 22:13
  • @Peter IIRC `MemoryCache` already has various expiry options; can you be more specific what the scenario is that requires this, so we can try to give appropriate answers? – Marc Gravell Nov 05 '11 at 22:56
  • @MarcGravell I've set the data to never expire. When some data changes, I'd like to invalidate the stale items in the cache. This can be done with `ChangeMonitors`, but I wanted to avoid over-engineering my code by doing that manually (it's also easier as I am the one changing the db, so I want to just go ahead and invalidate the stale data). It looks like there's no lightweight way to enumerate, so maybe the only option is to use a dummy list as described above. – Peter Marks Nov 05 '11 at 23:29
  • It seems the ChangeMonitors are the only effective way of achieving this. I don't think this is over engineering, it's only a sensible choice. – sinni800 Nov 05 '11 at 23:33
  • @Peter just a thought, but have you considered other storage metaphors here? I use redis a lot, and while `KEYS` is also discouraged, the pub/sub stuff makes it easy to do change-notification – Marc Gravell Nov 05 '11 at 23:34
  • Wanted something quick-and-simple, and MemoryCache is close to the old ASP.NET caching classes, so migrating old code was easy. Would have been nice though if it had more functionality out of the box, maybe in the next BCL. – Peter Marks Nov 05 '11 at 23:38
  • @sinni800 Probably going to use the list approach as it's the easiest way to solve the problem without the overhead of ChangeMonitors. If you'd like to add your comment as an answer, I'll accept it. – Peter Marks Nov 05 '11 at 23:39
  • Killing a cache, in general, is an expensive operation, because you need to get a lock for each item. – Sklivvz Nov 06 '11 at 09:23
  • @Sklivvz Yes I now agree with your thinking, as I noted in my "answer" below. I guess my question is now whether to use the enumerator as stated above, or to use some other approach. Perhaps the list approach is still valid, where you check whether the item is still in the cache, and if it is, you invalidate it. But then of course you also need to repopulate your tracking list. – Peter Marks Nov 07 '11 at 18:09
  • One implementation that gets the keys via reflection (not recommended in production due to unreliable performance): https://github.com/alastairtree/LazyCache/issues/56#issuecomment-582238774 – Michael Freidgeim Nov 20 '20 at 04:48
  • Does this answer your question? [How to retrieve a list of Memory Cache keys in asp.net core?](https://stackoverflow.com/questions/45597057/how-to-retrieve-a-list-of-memory-cache-keys-in-asp-net-core) – Michael Freidgeim Nov 20 '20 at 04:55

3 Answers


Suggestions made so far have been great, but my need is still as stated: to iterate over the cache's items. It seems like such a simple task, and I expect that the cache internally has some sort of list structure anyway. The docs and the feature set for MemoryCache are wanting.

So as discussed above, I've added a list to my cache adapter class, which holds a reference to each item I place in the cache. If I need to iterate over the cache--not just for invalidation, but for gathering statistics, etc.--then I iterate over my list.

If the number of items placed in the cache does not change, then this is a reasonable solution. If the number does change, then you need to insert/remove via the adapter class, so as to keep the list in sync with the actual cache. Messy but it works, and avoids the perf penalties alluded to in the docs.
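A minimal sketch of such an adapter (the class and member names are illustrative, not from the original post): it wraps MemoryCache.Default and mirrors the keys of every item it inserts in its own set, guarded by a lock, so the keys can be enumerated without touching MemoryCache's enumerator.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;

// Illustrative adapter: all inserts and removals must go through this
// class, or the tracked key set drifts out of sync with the cache.
public class CacheAdapter
{
    private readonly MemoryCache _cache = MemoryCache.Default;
    private readonly HashSet<string> _keys = new HashSet<string>();
    private readonly object _sync = new object();

    public void Set(string key, object value, CacheItemPolicy policy)
    {
        lock (_sync)
        {
            _cache.Set(key, value, policy);
            _keys.Add(key);
        }
    }

    public void Remove(string key)
    {
        lock (_sync)
        {
            _cache.Remove(key);
            _keys.Remove(key);
        }
    }

    // Returns a snapshot; note the cache may still have evicted an item
    // on its own (e.g. under memory pressure), so check Contains/Get
    // before acting on a key.
    public IReadOnlyCollection<string> Keys
    {
        get { lock (_sync) { return _keys.ToList(); } }
    }
}
```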

Hopefully the MemoryCache provider will be fleshed out in the next platform release.

Peter Marks
  • Hehe, I was too slow :). No worries! – sinni800 Nov 06 '11 at 11:48
  • I finally realised why the class was written the way it was. When iterating over the cache, you may encounter a race condition because an item is evicted during the iteration! So you need to lock the cache. BUT, if you have the use case, as I did, where you set the items to never expire, then this is not applicable (of course my items may be evicted anyway, because of low memory, but that's a border case if your cache is small and you're using a modern server). – Peter Marks Nov 07 '11 at 17:59
  • More notes. This got me thinking that my solution is actually quite *dangerous* and possibly stupid. It is far more sensible to accept the perf hit, and allow the MemoryCache to be locked during iteration--or to use some other way to invalidate all its items. If the cache is not invalidated often, then it's fine to just live with the perf hit: just think of the time it takes to warm a cache--invalidating the whole thing is no different. Nonetheless, the docs do warn not to use the enumerator in production code: I wonder why? Perhaps there are other issues I'm not aware of? – Peter Marks Nov 07 '11 at 18:04
  • I believe the final answer to this question is "yes", but the real answer is "don't do it". This class needs more features, but unsynchronised enumeration is not one of them. – Peter Marks Nov 07 '11 at 22:01
  • This makes everything so clear. Multithreadedness offers so many different concerns to programmers... If your application was single threaded it obviously wouldn't even remotely make a difference if you did the iterating. – sinni800 Nov 07 '11 at 22:02
  • Ah yes... the glorious days of singlethreadedness. When you only had to worry about such esoteric things as heaps, stacks, and even stackoverflows! :P – Peter Marks Nov 07 '11 at 22:20

Consider using ChangeMonitors, which allow you to automatically evict stale entries when certain conditions are met.

See Is there some sort of CacheDependency in System.Runtime.Caching?

This is similar to CacheDependency in System.Web.Caching, which lets you evict entries when files or other cache entries change.
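A hedged sketch of this approach (the keys and values are made up for illustration): a CacheEntryChangeMonitor links a dependent entry to a parent entry, so updating the parent evicts the child automatically.

```csharp
using System.Runtime.Caching;

var cache = MemoryCache.Default;

// The monitored key must already exist when the monitor is created.
cache.Set("parent", "v1", new CacheItemPolicy());

var policy = new CacheItemPolicy();
// Evict "child" automatically whenever "parent" changes or is removed.
policy.ChangeMonitors.Add(
    cache.CreateCacheEntryChangeMonitor(new[] { "parent" }));
cache.Set("child", "derived-from-v1", policy);

// Updating the parent fires the monitor and removes "child" from the cache.
cache.Set("parent", "v2", new CacheItemPolicy());
```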

Will

In 2014,

This is the correct way to get all the items:

' Note: Select(Of ItemType) as originally written does not compile;
' project out the values, then filter by type.
Dim AllItems = MemoryCache.Default.Select(Function(item) item.Value).OfType(Of ItemType)().ToList()

Hope this helps someone.
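For C# readers, a rough equivalent (ItemType is a placeholder from the answer, not a real type). Be aware this still walks the blocking, resource-intensive enumerator the documentation warns about:

```csharp
using System.Linq;
using System.Runtime.Caching;

// Enumerates the whole cache; the docs discourage this in production.
var allItems = MemoryCache.Default
    .Select(kvp => kvp.Value)
    .OfType<ItemType>()   // ItemType: placeholder element type
    .ToList();
```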

Dejisys
  • Downvote. This is the same as 'enumerating' the cache items, which, as the OP quotes from the official documentation, is 'resource-intensive' and should be avoided. See [The problem with enumeration](http://stackoverflow.com/a/22388943/219516). – publicgk Jan 02 '15 at 08:22