
I’m working with the .NET 4.0 MemoryCache class in an application and trying to limit the maximum cache size, but in my tests it does not appear that the cache is actually obeying the limits.

I'm using the settings which, according to MSDN, are supposed to limit the cache size:

  1. CacheMemoryLimitMegabytes: "The maximum memory size, in megabytes, that an instance of an object can grow to."
  2. PhysicalMemoryLimitPercentage: "The percentage of physical memory that the cache can use, expressed as an integer value from 1 to 100. The default is zero, which indicates that MemoryCache instances manage their own memory based on the amount of memory that is installed on the computer." (This is not entirely correct -- any value below 4 is ignored and replaced with 4.)

I understand that these values are approximate and not hard limits, as the thread that purges the cache fires every x seconds and also depends on the polling interval and other undocumented variables. However, even taking these variances into account, I'm seeing wildly inconsistent cache sizes when the first item is evicted from the cache after setting CacheMemoryLimitMegabytes and PhysicalMemoryLimitPercentage together or individually in a test app. To be sure, I ran each test 10 times and calculated the average figure.

These are the results of testing the example code below on a 32-bit Windows 7 PC with 3 GB of RAM. The size of the cache is taken after the first call to CacheItemRemoved() in each test. (I am aware the actual size of the cache will be larger than this.)

MemLimitMB    MemLimitPct     AVG Cache MB on first expiry    
   1            NA              84
   2            NA              84
   3            NA              84
   6            NA              84
  NA             1              84
  NA             4              84
  NA            10              84
  10            20              81
  10            30              81
  10            39              82
  10            40              79
  10            49              146
  10            50              152
  10            60              212
  10            70              332
  10            80              429
  10           100              535
 100            39              81
 500            39              79
 900            39              83
1900            39              84
 900            41              81
 900            46              84

 900            49              1.8 GB approx. in Task Manager; no memory errors
 200            49              156
 100            49              153
2000            60              214
   5            60              78
   6            60              76
   7           100              82
  10           100              541

Here is the test application:

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Linq;
using System.Runtime.Caching;
using System.Text;
namespace FinalCacheTest
{       
    internal class Cache
    {
        private Object Statlock = new object();
        private int ItemCount;
        private long size;
        private MemoryCache MemCache;
        private CacheItemPolicy CIPOL = new CacheItemPolicy();

        public Cache(long CacheSize)
        {
            CIPOL.RemovedCallback = new CacheEntryRemovedCallback(CacheItemRemoved);
            NameValueCollection CacheSettings = new NameValueCollection(3);
            CacheSettings.Add("CacheMemoryLimitMegabytes", Convert.ToString(CacheSize)); 
            CacheSettings.Add("physicalMemoryLimitPercentage", Convert.ToString(49));  //set % here
            CacheSettings.Add("pollingInterval", Convert.ToString("00:00:10"));
            MemCache = new MemoryCache("TestCache", CacheSettings);
        }

        public void AddItem(string Name, string Value)
        {
            CacheItem CI = new CacheItem(Name, Value);
            MemCache.Add(CI, CIPOL);

            lock (Statlock)
            {
                ItemCount++;
                size = size + (Name.Length + Value.Length * 2); // rough estimate; two 36-char Guid strings -> 108
            }

        }

        public void CacheItemRemoved(CacheEntryRemovedArguments Args)
        {
            Console.WriteLine("Cache contains {0} items. Size is {1} bytes", ItemCount, size);

            lock (Statlock)
            {
                ItemCount--;
                size = size - 108; // matches the per-item estimate used in AddItem
            }

            Console.ReadKey();
        }
    }
}

namespace FinalCacheTest
{
    internal class Program
    {
        private static void Main(string[] args)
        {
            int MaxAdds = 5000000;
            Cache MyCache = new Cache(1); // set CacheMemoryLimitMegabytes

            for (int i = 0; i < MaxAdds; i++)
            {
                MyCache.AddItem(Guid.NewGuid().ToString(), Guid.NewGuid().ToString());
            }

            Console.WriteLine("Finished Adding Items to Cache");
        }
    }
}

Why is MemoryCache not obeying the configured memory limits?

Canacourse

  • for loop is wrong, without i++ – xiaoyifang Oct 17 '13 at 09:56
  • I've added a MS Connect report for this bug (maybe someone else already did, but anyway...) https://connect.microsoft.com/VisualStudio/feedback/details/806334/system-runtime-caching-memorycache-do-not-respect-memory-limits – Bruno Brant Oct 24 '13 at 17:19
  • It's worth noting that Microsoft has now (as of 9/2014) added a fairly thorough response on the connect ticket linked above. The TLDR of it is that MemoryCache **does not** inherently check these limits upon every operation, but rather that the limits are only respected upon internal cache trimming, which is periodic based on dynamic internal timers. – Dusty Jan 14 '15 at 20:46
  • Thanks Dusty. Interesting response from MS on the issue. – Canacourse Jan 15 '15 at 21:09
  • Can you shed some light on why you use "size = size + (Name.Length + Value.Length * 2);" and "size = size - 108;" – Abhijeet Patel Jul 02 '15 at 04:30
  • Looks like they updated the docs for MemoryCache.CacheMemoryLimit: "MemoryCache does not instantly enforce CacheMemoryLimit each time a new item is added to a MemoryCache instance. The internal heuristics which evicts extra items from the MemoryCache does it gradually..." https://msdn.microsoft.com/en-us/library/system.runtime.caching.memorycache.cachememorylimit%28v=vs.110%29.aspx – Sully Jul 06 '15 at 16:15
  • @bruno Link to that article does not seem correct...can you fix it? – Zeus Feb 02 '17 at 18:23
  • @Zeus, I think MSFT removed the issue. In any case, MSFT closed the issue after some discussion with me, where they told me that the limit is only applied after PollingTime has expired. – Bruno Brant Feb 13 '17 at 19:43
  • Anyone else feel like they would much prefer to have SoftReferences like Java rather than this unreliable cache? I've posted a feature request that (for now) people can vote on: https://github.com/dotnet/runtime/issues/63113 – Qwertie Dec 24 '21 at 03:52

7 Answers

104

Wow, so I just spent entirely too much time digging around in the CLR with reflector, but I think I finally have a good handle on what's going on here.

The settings are being read in correctly, but there seems to be a deep-seated problem in the CLR itself that looks like it will render the memory limit setting essentially useless.

The following code is reflected out of the System.Runtime.Caching DLL, for the CacheMemoryMonitor class (there is a similar class that monitors physical memory and deals with the other setting, but this is the more important one):

protected override int GetCurrentPressure()
{
  int num = GC.CollectionCount(2);
  SRef ref2 = this._sizedRef;
  if ((num != this._gen2Count) && (ref2 != null))
  {
    this._gen2Count = num;
    this._idx ^= 1;
    this._cacheSizeSampleTimes[this._idx] = DateTime.UtcNow;
    this._cacheSizeSamples[this._idx] = ref2.ApproximateSize;
    IMemoryCacheManager manager = s_memoryCacheManager;
    if (manager != null)
    {
      manager.UpdateCacheSize(this._cacheSizeSamples[this._idx], this._memoryCache);
    }
  }
  if (this._memoryLimit <= 0L)
  {
    return 0;
  }
  long num2 = this._cacheSizeSamples[this._idx];
  if (num2 > this._memoryLimit)
  {
    num2 = this._memoryLimit;
  }
  return (int) ((num2 * 100L) / this._memoryLimit);
}

The first thing you might notice is that it doesn't even try to look at the size of the cache until after a Gen 2 garbage collection, instead just falling back on the existing stored size value in cacheSizeSamples. So you won't ever be able to hit the target right on, but if the rest worked, we would at least get a size measurement before we got into real trouble.

So assuming a Gen2 GC has occurred, we run into problem 2, which is that ref2.ApproximateSize does a horrible job of actually approximating the size of the cache. Slogging through CLR junk I found that this is a System.SizedReference, and this is what it's doing to get the value (IntPtr is a handle to the MemoryCache object itself):

[SecurityCritical]
[MethodImpl(MethodImplOptions.InternalCall)]
private static extern long GetApproximateSizeOfSizedRef(IntPtr h);

I'm assuming that extern declaration means that it dives into unmanaged Windows land at this point, and I have no idea how to start finding out what it does there. From what I've observed, though, it does a horrible job of trying to approximate the size of the overall thing.

The third noticeable thing there is the call to manager.UpdateCacheSize, which sounds like it should do something. Unfortunately, in any normal sample of how this should work, s_memoryCacheManager will always be null. The field is set from the public static member ObjectCache.Host. This is exposed for the user to mess with if he so chooses, and I was actually able to make this thing sort of work as intended by slopping together my own IMemoryCacheManager implementation, setting it to ObjectCache.Host, and then running the sample. At that point, though, it seems like you might as well just make your own cache implementation and not bother with all this, especially since I have no idea whether setting your own class on ObjectCache.Host (which is static, so it affects every MemoryCache in the process) to measure the cache could mess up other things.
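For what it's worth, a minimal sketch of what such a host might look like. The class name, trim threshold, and trim percentage here are my own invention, and since ObjectCache.Host is process-wide and must be set before the first MemoryCache is created, treat this as an experiment rather than a recommendation:

```csharp
using System;
using System.Runtime.Caching;
using System.Runtime.Caching.Hosting;

// Hypothetical host that receives the size updates CacheMemoryMonitor
// reports via manager.UpdateCacheSize(). ObjectCache.Host must be an
// IServiceProvider; MemoryCache asks it for IMemoryCacheManager.
internal class SimpleCacheManager : IMemoryCacheManager, IServiceProvider
{
    public void UpdateCacheSize(long size, MemoryCache cache)
    {
        Console.WriteLine("Cache '{0}' reported ~{1} bytes", cache.Name, size);

        // crude reaction (arbitrary threshold): trim 10% of entries
        // whenever the reported size crosses 10 MB
        if (size > 10 * 1024 * 1024)
            cache.Trim(10);
    }

    public void ReleaseCache(MemoryCache cache) { }

    public object GetService(Type serviceType)
    {
        return serviceType == typeof(IMemoryCacheManager) ? this : null;
    }
}

// usage, before any MemoryCache is constructed:
// ObjectCache.Host = new SimpleCacheManager();
```

Again, because the host is a static, process-wide hook, installing it affects every MemoryCache instance in the process, including ones created by libraries you don't control.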

I have to believe that at least part of this (if not a couple parts) is just a straight up bug. It'd be nice to hear from someone at MS what the deal was with this thing.

TLDR version of this giant answer: assume that CacheMemoryLimitMegabytes is completely busted at this point in time. You can set it to 10 MB and then proceed to fill the cache up to ~2 GB, triggering an out-of-memory exception without ever tripping item removal.
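One practical workaround, sketched below with an invented helper name and an arbitrary item-count threshold, is to bound the cache yourself and call Trim() manually instead of relying on the memory limits:

```csharp
using System;
using System.Runtime.Caching;

internal static class BoundedCacheHelper
{
    // Arbitrary example threshold; tune to your item sizes.
    private const int MaxItems = 100000;

    // Adds an item and manually trims ~10% of entries once the count
    // exceeds the threshold, rather than trusting CacheMemoryLimitMegabytes.
    public static void AddBounded(MemoryCache cache, string key, object value)
    {
        cache.Add(key, value, new CacheItemPolicy());

        if (cache.GetCount() > MaxItems)
            cache.Trim(10); // evicts roughly 10% of entries
    }
}
```

This only works well when items are roughly uniform in size, but it sidesteps the broken size estimation entirely.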

David Hay
  • A fine answer, thank you. I gave up trying to figure out what was going on with this and instead now manage the cache size by counting items in/out and calling .Trim() manually as needed. I thought System.Runtime.Caching was an easy choice for my app as it seems to be widely used, and I thought it therefore would not have any major bugs. – Canacourse Sep 11 '11 at 12:11
  • Wow. That's why I love SO. I ran into the exact same behavior, wrote a test app and managed to crash my PC many times even though polling time was as low as 10 seconds and cache memory limit was 1MB. Thanks for all the insights. – Bruno Brant Oct 22 '13 at 20:11
  • I know I just mentioned it up there in the question but, for completeness sake, I'll mention it here again. I've opened an issue at Connect for this. http://connect.microsoft.com/VisualStudio/feedback/details/806334/system-runtime-caching-memorycache-do-not-respect-memory-limits – Bruno Brant Oct 31 '13 at 16:53
  • I'm using the MemoryCache for external service data, and when I test by injecting garbage into the MemoryCache, it *does* auto-trim content, but only when using the percentage limit value. Absolute size does nothing to limit size, at least when inspecting with a memory profiler. Not tested in a while loop, but by more "realistic" usages (it's a backend system, so I've added a WCF service which lets me inject data into the caches on demand). – Svend Oct 16 '14 at 10:21
  • Is this still an issue in .NET Core? – Павле Sep 10 '20 at 13:02
  • @Павле https://stackoverflow.com/questions/5547003/is-there-a-way-to-enforce-a-size-limit-of-memorycache-in-system-runtime-caching and https://learn.microsoft.com/en-us/aspnet/core/performance/caching/memory?view=aspnetcore-5.0 – Igor Beaufils Apr 08 '21 at 14:09
30

I know this answer is crazy late, but better late than never. I wanted to let you know that I wrote a version of MemoryCache that resolves the Gen 2 Collection issues automatically for you. It therefore trims whenever the polling interval indicates memory pressure. If you're experiencing this issue, give it a go!

http://www.nuget.org/packages/SharpMemoryCache

You can also find it on GitHub if you're curious about how I solved it. The code is somewhat simple.

https://github.com/haneytron/sharpmemorycache

Haney
  • This works as intended; I tested it with a generator which fills the cache with loads of strings of 1000 characters. Although, adding what should be about 100 MB to the cache actually adds 200-300 MB, which I found quite strange. Maybe some overheads I'm not counting. – Karl Cassar Nov 06 '14 at 12:13
  • @KarlCassar strings in .NET are roughly `2n + 20` in size with respect to bytes, where `n` is the length of the string. This is mostly due to Unicode support. – Haney Nov 06 '14 at 14:40
5

I've encountered this issue as well. I'm caching objects that are being fired into my process dozens of times per second.

I have found the following configuration and usage frees the items every 5 seconds most of the time.

App.config:

Take note of cacheMemoryLimitMegabytes. When this was set to zero, the purging routine would not fire in a reasonable time.

   <system.runtime.caching>
    <memoryCache>
      <namedCaches>
        <add name="Default" cacheMemoryLimitMegabytes="20" physicalMemoryLimitPercentage="0" pollingInterval="00:00:05" />
      </namedCaches>
    </memoryCache>
  </system.runtime.caching>  

Adding to cache:

MemoryCache.Default.Add(someKeyValue, objectToCache, new CacheItemPolicy { AbsoluteExpiration = DateTime.Now.AddSeconds(5), RemovedCallback = cacheItemRemoved });

Confirming the cache removal is working:

void cacheItemRemoved(CacheEntryRemovedArguments arguments)
{
    System.Diagnostics.Debug.WriteLine("Item removed from cache: {0} at {1}", arguments.CacheItem.Key, DateTime.Now.ToString());
}
Aaron Hudon
4

I have done some testing with the example of @Canacourse and the modification of @woany, and I think there are some critical calls that block the cleaning of the memory cache.

public void CacheItemRemoved(CacheEntryRemovedArguments Args)
{
    // this WriteLine() will block the thread of
    // the MemoryCache long enough to slow it down,
    // and it will never catch up the amount of memory
    // beyond the limit
    Console.WriteLine("...");

    // ...

    // this ReadKey() will block the thread of 
    // the MemoryCache completely, till you press any key
    Console.ReadKey();
}

But why does the modification of @woany seem to keep the memory at the same level? First, the RemovedCallback is not set, and there is no console output or waiting for input that could block the thread of the memory cache.

Secondly...

public void AddItem(string Name, string Value)
{
    // ...

    // this WriteLine will block the main thread long enough,
    // so that the thread of the MemoryCache can do its work more frequently
    Console.WriteLine("...");
}

A Thread.Sleep(1) every ~1000th AddItem() would have the same effect.
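For illustration, that yielding could be sketched in the add loop of the question's test program like this (the 1000-iteration interval is just the figure mentioned above):

```csharp
// inside Main() of the test program above
for (int i = 0; i < MaxAdds; i++)
{
    MyCache.AddItem(Guid.NewGuid().ToString(), Guid.NewGuid().ToString());

    // yield briefly every ~1000 adds so the MemoryCache's own
    // trimming thread gets some CPU time
    if (i % 1000 == 0)
        System.Threading.Thread.Sleep(1);
}
```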

Well, it's not a very deep investigation of the problem, but it looks as if the thread of the MemoryCache does not get enough CPU time for cleaning while many new elements are being added.

Jezze
3

It turns out this is not a bug. All you need to do is set the polling time span to enforce the limits; it seems that if you leave polling unset, it will never trigger. I just tested it, and there is no need for wrappers or any extra code:

private static readonly NameValueCollection Collection = new NameValueCollection
{
    { "CacheMemoryLimitMegabytes", "20" },
    { "PollingInterval", TimeSpan.FromMilliseconds(60000).ToString() } // check the limits every 60 seconds
};

Set the value of "PollingInterval" based on how fast the cache grows: if it grows too fast, increase the frequency of the polling checks; otherwise, keep the checks infrequent to avoid overhead.

sino
3

I (thankfully) stumbled across this useful post yesterday when first attempting to use the MemoryCache. I thought it would be a simple case of setting values and using the classes, but I encountered similar issues to those outlined above. To try to see what was going on, I extracted the source using ILSpy, then set up a test and stepped through the code. My test code was very similar to the code above, so I won't post it.

From my tests I noticed that the measurement of the cache size was never particularly accurate (as mentioned above) and, given the current implementation, would never work reliably. However, the physical measurement was fine, and if the physical memory was measured at every poll, then it seemed to me the code would work reliably. So, I removed the gen 2 garbage collection check within MemoryCacheStatistics; under normal conditions no memory measurements will be taken unless there has been another gen 2 garbage collection since the last measurement.

In a test scenario this obviously makes a big difference, as the cache is being hit constantly, so objects never get the chance to reach gen 2. I think we are going to use the modified build of this DLL on our project and use the official MS build when .NET 4.5 comes out (which, according to the connect article mentioned above, should have the fix in it). Logically I can see why the gen 2 check was put in place, but in practice I'm not sure it makes much sense. If the memory reaches 90% (or whatever limit it has been set to), then it should not matter whether a gen 2 collection has occurred or not; items should be evicted regardless.

I left my test code running for about 15 minutes with physicalMemoryLimitPercentage set to 65%. I saw the memory usage remain between 65-68% during the test and saw things getting evicted properly. In my test I set pollingInterval to 5 seconds, physicalMemoryLimitPercentage to 65, and cacheMemoryLimitMegabytes to 0 to default it.

Following the above advice, an implementation of IMemoryCacheManager could be made to evict things from the cache. It would, however, suffer from the gen 2 check issue mentioned; although, depending on the scenario, this may not be a problem in production code and may work sufficiently for some.

Ian Gibson
  • An update: I'm using .NET framework 4.5 and the problem is in no way corrected. The cache can grow large enough to crash the machine. – Bruno Brant Oct 22 '13 at 20:14
  • A question: do you have the link to the connect article you mentioned? – Bruno Brant Oct 22 '13 at 20:28
  • @BrunoBrant perhaps https://connect.microsoft.com/VisualStudio/feedback/details/661340/memorycache-evictions-do-not-fire-when-memory-limits-are-reached – Ohad Schneider Dec 28 '14 at 20:25
1

If you use the following modified class and monitor the memory via Task Manager, the cache does in fact get trimmed:

internal class Cache
{
    private Object Statlock = new object();
    private int ItemCount;
    private long size;
    private MemoryCache MemCache;
    private CacheItemPolicy CIPOL = new CacheItemPolicy();

    public Cache(double CacheSize)
    {
        NameValueCollection CacheSettings = new NameValueCollection(3);
        CacheSettings.Add("cacheMemoryLimitMegabytes", Convert.ToString(CacheSize));
        CacheSettings.Add("pollingInterval", Convert.ToString("00:00:01"));
        MemCache = new MemoryCache("TestCache", CacheSettings);
    }

    public void AddItem(string Name, string Value)
    {
        CacheItem CI = new CacheItem(Name, Value);
        MemCache.Add(CI, CIPOL);

        Console.WriteLine(MemCache.GetCount());
    }
}
woany
  • Are you saying it does or does not get trimmed? – Canacourse Jul 13 '12 at 17:11
  • Yep, it does get trimmed. Strange, considering all the problems people seem to have with `MemoryCache`. I wonder why this sample works. – Daniel Lidström Jun 26 '13 at 08:24
  • I don't follow it. I tried repeating the example, but the cache still grows indefinitely. – Bruno Brant Oct 22 '13 at 21:15
  • A confusing example class: "Statlock", "ItemCount", and "size" are unused... The NameValueCollection(3) only holds 2 items... In fact you created a cache with size-limit and pollingInterval properties, nothing more! The problem of items not being evicted is not addressed... – Bernhard Aug 21 '19 at 11:42