I have created a cache using the MemoryCache class. I add some items to it but when I need to reload the cache I want to clear it first. What is the quickest way to do this? Should I loop through all the items and remove them one at a time or is there a better way?
- For .NET Core, check [this](https://stackoverflow.com/a/49425102/3170087) answer. – Makla Mar 27 '18 at 09:34
12 Answers
- I initially used MemoryCache.Default, causing Dispose to give me some grief. Still, Dispose ended up being the best solution I could find. Thanks. – LaustN Dec 06 '10 at 11:16
- @LaustN can you elaborate on the "grief" caused by MemoryCache.Default? I'm currently using MemoryCache.Default... MSDN's MemoryCache documentation makes me wonder if disposing and recreating is recommended: "Do not create MemoryCache instances unless it is required. If you create cache instances in client and Web applications, the MemoryCache instances should be created early in the application life cycle." Does this apply to .Default? I'm not saying using Dispose is wrong, I'm honestly just looking for clarification on all this. – ElonU Webdev Oct 11 '11 at 20:19
- Thought it was worth mentioning that `Dispose` _does_ invoke any `CacheEntryRemovedCallback` attached to current cached items. – Mike Guthrie Jul 31 '12 at 19:19
- @ElonU: The following Stack Overflow answer explains some of the grief you may encounter disposing of the default instance: http://stackoverflow.com/a/8043556/216440 . To quote: "The state of the cache is set to indicate that the cache is disposed. Any attempt to call public caching methods that change the state of the cache, such as methods that add, remove, or retrieve cache entries, might cause unexpected behavior. For example, if you call the Set method after the cache is disposed, a no-op error occurs." – Simon Elms Jan 31 '13 at 23:34
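A minimal sketch of the dispose-and-recreate approach these comments discuss (the wrapper class and the cache name "MyCache" are illustrative, not from the original answer; it assumes you hold your own named instance rather than MemoryCache.Default, for the reasons described above):

```csharp
using System.Runtime.Caching;

public class CacheHolder
{
    // Use a named instance rather than MemoryCache.Default, so that
    // disposing it does not cause the issues described above.
    private MemoryCache _cache = new MemoryCache("MyCache");

    public void Clear()
    {
        // Swap in a fresh cache first, then dispose the old one.
        // Note: Dispose invokes any CacheEntryRemovedCallback
        // attached to the current entries.
        MemoryCache old = _cache;
        _cache = new MemoryCache("MyCache");
        old.Dispose();
    }
}
```

In multithreaded code you would want to swap the reference atomically (e.g. with `Interlocked.Exchange`) so readers never see a disposed cache.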
The problem with enumeration
The MemoryCache.GetEnumerator() Remarks section warns: "Retrieving an enumerator for a MemoryCache instance is a resource-intensive and blocking operation. Therefore, the enumerator should not be used in production applications."
Here's why, explained in pseudocode of the GetEnumerator() implementation:
Create a new Dictionary object (let's call it AllCache)
For Each per-processor segment in the cache (one Dictionary object per processor)
{
    Lock the segment/Dictionary (using lock construct)
    Iterate through the segment/Dictionary and add each name/value pair one-by-one
        to the AllCache Dictionary (using references to the original MemoryCacheKey
        and MemoryCacheEntry objects)
}
Create and return an enumerator on the AllCache Dictionary
Since the implementation splits the cache across multiple Dictionary objects, it must bring everything together into a single collection in order to hand back an enumerator. Every call to GetEnumerator executes the full copy process detailed above. The newly created Dictionary contains references to the original internal key and value objects, so your actual cached data values are not duplicated.
The warning in the documentation is correct: avoid GetEnumerator(), including the answers here that use LINQ queries, since LINQ operators such as ToList() and Where() call GetEnumerator() under the covers.
A better and more flexible solution
Here's an efficient way of clearing the cache that simply builds on the existing change monitoring infrastructure. It also provides the flexibility to clear either the entire cache or just a named subset and has none of the problems discussed above.
// By Thomas F. Abraham (http://www.tfabraham.com)
namespace CacheTest
{
using System;
using System.Diagnostics;
using System.Globalization;
using System.Runtime.Caching;
public class SignaledChangeEventArgs : EventArgs
{
public string Name { get; private set; }
public SignaledChangeEventArgs(string name = null) { this.Name = name; }
}
/// <summary>
/// Cache change monitor that allows an app to fire a change notification
/// to all associated cache items.
/// </summary>
public class SignaledChangeMonitor : ChangeMonitor
{
// Shared across all SignaledChangeMonitors in the AppDomain
private static event EventHandler<SignaledChangeEventArgs> Signaled;
private string _name;
private string _uniqueId = Guid.NewGuid().ToString("N", CultureInfo.InvariantCulture);
public override string UniqueId
{
get { return _uniqueId; }
}
public SignaledChangeMonitor(string name = null)
{
_name = name;
// Register instance with the shared event
SignaledChangeMonitor.Signaled += OnSignalRaised;
base.InitializationComplete();
}
public static void Signal(string name = null)
{
if (Signaled != null)
{
// Raise shared event to notify all subscribers
Signaled(null, new SignaledChangeEventArgs(name));
}
}
protected override void Dispose(bool disposing)
{
SignaledChangeMonitor.Signaled -= OnSignalRaised;
}
private void OnSignalRaised(object sender, SignaledChangeEventArgs e)
{
if (string.IsNullOrWhiteSpace(e.Name) || string.Compare(e.Name, _name, true) == 0)
{
Debug.WriteLine(
_uniqueId + " notifying cache of change.", "SignaledChangeMonitor");
// Cache objects are obligated to remove entry upon change notification.
base.OnChanged(null);
}
}
}
public static class CacheTester
{
public static void TestCache()
{
MemoryCache cache = MemoryCache.Default;
// Add data to cache
for (int idx = 0; idx < 50; idx++)
{
cache.Add("Key" + idx.ToString(), "Value" + idx.ToString(), GetPolicy(idx));
}
// Flush cached items associated with "NamedData" change monitors
SignaledChangeMonitor.Signal("NamedData");
// Flush all cached items
SignaledChangeMonitor.Signal();
}
private static CacheItemPolicy GetPolicy(int idx)
{
string name = (idx % 2 == 0) ? null : "NamedData";
CacheItemPolicy cip = new CacheItemPolicy();
cip.AbsoluteExpiration = System.DateTimeOffset.UtcNow.AddHours(1);
cip.ChangeMonitors.Add(new SignaledChangeMonitor(name));
return cip;
}
}
}

- Very nice. I've been trying to implement something using chained MemoryCache monitors and GUIDs, but it was starting to get a bit ugly as I tried to tighten up the functionality. – Chao Apr 14 '14 at 16:15
- I would not recommend this pattern for general use. 1. It's slow; no fault of the implementation, but the Dispose method is extremely slow. 2. If you're evicting items from the cache with an expiration, the change monitor still gets called. 3. My machine was swallowing all of the CPU and taking a really long time to clear 30k items from the cache when I was running performance tests. A few times, after waiting 5+ minutes, I just killed the tests. – Aaron M Sep 24 '15 at 20:15
- @PascalMathys Unfortunately, there isn't a better solution than this. I ended up using it, despite the disadvantages, as it's still a better solution than using the enumeration. – Aaron M Dec 02 '15 at 16:05
- @AaronM Is this solution still better than just disposing of the cache and instantiating a new one? – RobSiklos May 24 '16 at 17:06
- Here's an improved version of this solution: https://stackoverflow.com/a/55414488/56621 – Alex from Jitbit Nov 14 '20 at 21:34
The workaround is:
List<string> cacheKeys = MemoryCache.Default.Select(kvp => kvp.Key).ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}

- From the [documentation](http://msdn.microsoft.com/en-us/library/system.runtime.caching.memorycache.getenumerator.aspx): _Retrieving an enumerator for a MemoryCache instance is a resource-intensive and blocking operation. Therefore, the enumerator should not be used in production applications._ – TrueWill Dec 18 '12 at 19:52
- I don't think this is the same as retrieving an enumerator. This is actually pretty fast and doesn't enumerate anything. Still... I like disposing the cache myself. – hal9000 Sep 02 '16 at 16:56
- @emberdude It's exactly the same as retrieving an enumerator - what do you think the implementation of `Select()` does? – RobSiklos Nov 16 '16 at 15:33
- Personally, I'm using this in my unit test [TestInitialize] function to clear out the memory cache for each unit test. Otherwise the cache persists across unit tests, giving unintended results when trying to compare performance between 2 functions. – Jacob Morrison May 17 '17 at 19:26
- @JacobMorrison arguably, unit tests are not a "production application" :) – Mels Jan 01 '18 at 15:42
- @Mels arguably, unit tests should be written to the same standards as a "production application"! :) – Etherman Jun 13 '20 at 16:43
var cacheItems = cache.ToList();
foreach (KeyValuePair<String, Object> a in cacheItems)
{
cache.Remove(a.Key);
}

- This has the same risk as @Tony's response; please see my comment under that. – TrueWill Dec 18 '12 at 19:54
- @AlexAngas - He may have changed his name to magritte. See also http://stackoverflow.com/questions/4183270/how-to-clear-the-net-4-memorycache/22388943#22388943 – TrueWill May 14 '14 at 13:38
If performance isn't an issue then this nice one-liner will do the trick:
cache.ToList().ForEach(a => cache.Remove(a.Key));

It seems that there is a Trim method.
So to clear all contents you'd just do
cache.Trim(100)
EDIT: after digging some more, it seems that looking into Trim is not worth your time -- Trim only makes a best effort to remove the requested percentage of entries, and in practice it does not reliably empty the cache.

- Thanks for the edit, saved me from the pain of a pointless rabbit hole... – CajunCoding Jul 28 '22 at 23:54
Ran across this and, based on it, wrote a slightly more efficient, parallel clear method:
public void ClearAll()
{
    // Materialize the key list first so we aren't removing entries
    // while still enumerating the cache.
    var allKeys = _cache.Select(o => o.Key).ToList();
    Parallel.ForEach(allKeys, key => _cache.Remove(key));
}

You could also do something like this:
Dim _Qry = (From n In CacheObject.AsParallel()
            Select n).ToList()
For Each i In _Qry
    CacheObject.Remove(i.Key)
Next

You can dispose the MemoryCache.Default cache and then re-set the private field singleton to null, to make it recreate the MemoryCache.Default.
// Requires: using System.Reflection;
MemoryCache.Default.Dispose();

var field = typeof(MemoryCache).GetField("s_defaultCache",
    BindingFlags.Static | BindingFlags.NonPublic);
field.SetValue(null, null);

I was only interested in clearing the cache and found this as an option when using the C# GlobalCachingProvider:
var cache = GlobalCachingProvider.Instance.GetAllItems();
if (dbOperation.SuccessLoadingAllCacheToDB(cache))
{
cache.Clear();
}

A slightly improved version of magritte's answer, which only removes entries whose values are of a particular type:
var cacheKeys = MemoryCache.Default
    .Where(kvp => kvp.Value is MyType)
    .Select(kvp => kvp.Key)
    .ToList();
foreach (string cacheKey in cacheKeys)
{
MemoryCache.Default.Remove(cacheKey);
}

This question is also being discussed here: https://learn.microsoft.com/en-us/answers/answers/983399/view.html
I wrote an answer there and I'll transcribe it here:
using System.Collections.Generic;
using Microsoft.Extensions.Caching.Memory;
using ServiceStack;
public static class IMemoryCacheExtensions
{
static readonly List<object> entries = new();
/// <summary>
/// Removes all entries, added via the "TryGetValueExtension()" method
/// </summary>
/// <param name="cache"></param>
public static void Clear(this IMemoryCache cache)
{
for (int i = 0; i < entries.Count; i++)
{
cache.Remove(entries[i]);
}
entries.Clear();
}
/// <summary>
/// Use this extension method, to be able to remove all your entries later using "Clear()" method
/// </summary>
/// <typeparam name="TItem"></typeparam>
/// <param name="cache"></param>
/// <param name="key"></param>
/// <param name="value"></param>
/// <returns></returns>
public static bool TryGetValueExtension<TItem>(this IMemoryCache cache, object key, out TItem value)
{
entries.AddIfNotExists(key);
if (cache.TryGetValue(key, out object result))
{
if (result == null)
{
value = default;
return true;
}
if (result is TItem item)
{
value = item;
return true;
}
}
value = default;
return false;
}
}
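For illustration, usage of these extensions might look like the sketch below (the key name and value are arbitrary; it assumes the extension class above is in scope and, as in the original code, that the ServiceStack package supplies `AddIfNotExists`):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

class Demo
{
    static void Main()
    {
        IMemoryCache cache = new MemoryCache(new MemoryCacheOptions());
        cache.Set("answer", 42);

        // Read through TryGetValueExtension so the key gets tracked
        // in the static entries list.
        if (cache.TryGetValueExtension("answer", out int value))
        {
            Console.WriteLine(value);
        }

        // Later: remove every tracked entry in one call.
        cache.Clear();
    }
}
```

Note that the tracked-key list in the extension class is a single static field, so this approach assumes one cache instance; with multiple caches the `Clear()` call would remove keys from whichever cache it is invoked on.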
