
This may be like asking for the moon on a stick, but is there a production-quality, thread-safe, in-memory LRU cache with expiry for C#? Or does anyone have a best-practices idea to achieve the same thing?

(LRU being "Least Recently Used" - http://en.wikipedia.org/wiki/Cache_algorithms#LRU)

To clarify: I want to support a memory cache in an ASP.Net MVC site with the following interface:

public interface ICache
{
    T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class;
    bool Remove(string key);
}
  • I want "GetOrAdd" because I want these operations to be atomic - i.e. avoiding race conditions when two threads query the cache at the same time
  • I want the create function parameter because creating these objects is expensive (it requires complex database access)
  • I need expiry, because these objects do expire after a period of time

The best solution from Microsoft seems to be "System.Runtime.Caching.MemoryCache"; however, it seems to come with a couple of caveats:

  • It needs to poll the cache periodically to obey the imposed memory limits. I can't have any possibility of running out of memory in my system. I have read this post, which worries me: MemoryCache does not obey memory limits in configuration
  • It only appears to have "AddOrGetExisting" to support my interface, which takes the constructed object as its second parameter - if this object is expensive to create, doesn't pre-constructing it kinda defeat the point of the cache?

The code would look something like:

public sealed class Cache : ICache
{
    private readonly MemoryCache _cache;

    public Cache()
    {
        _cache = MemoryCache.Default;
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        // This call kinda defeats the point of the cache ?!?
        var newValue = create();

        // AddOrGetExisting returns null when the key was not already present,
        // so fall back to the value we just created.
        return (_cache.AddOrGetExisting(key, newValue, DateTimeOffset.UtcNow + timeToLive) as T) ?? newValue;
    }

    public bool Remove(string key)
    {
        _cache.Remove(key);
        return true;
    }
}

Or maybe something better around Lazy<T>, which does ensure the result is created only once, but feels like a hack (are there consequences to caching a Func<T>?):

class Program
{
    static void Main(string[] args)
    {
        Func<Foo> creation = () =>
        {
            // Some expensive thing
            return new Foo();
        };

        Cache cache = new Cache();
        // Result 1 and 2 are correctly the same instance. Result 3 is correctly a new instance...
        var result1 = cache.GetOrAdd("myKey", creation, TimeSpan.FromMinutes(30));
        var result2 = cache.GetOrAdd("myKey", creation, TimeSpan.FromMinutes(30));
        var result3 = cache.GetOrAdd("myKey3", creation, TimeSpan.FromMinutes(30));

        return;
    }
}

public sealed class Foo
{
    private static int Counter = 0;
    private int Index = 0;

    public Foo()
    {
        Index = ++Counter;
    }
}

public sealed class Cache
{
    private readonly MemoryCache _cache;

    public Cache()
    {
        _cache = MemoryCache.Default;
    }

    public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
    {
        var newValue = new Lazy<T>(create, LazyThreadSafetyMode.PublicationOnly);
        var value = (Lazy<T>)_cache.AddOrGetExisting(key, newValue, DateTimeOffset.UtcNow + timeToLive);
        return (value ?? newValue).Value;
    }

    public bool Remove(string key)
    {
        _cache.Remove(key);
        return true;
    }
}

Other thoughts:

  • I've also found this implementation, but it doesn't allow you to specify a time-based expiry: Is there any LRU implementation of IDictionary?
  • Maybe there is an implementation out there that uses ReaderWriterLock?
  • Some wrapper around ConcurrentDictionary?
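
On the last idea: a minimal sketch of what a ConcurrentDictionary wrapper might look like, assuming entries are replaced lazily on access when their TTL has passed (so this is not a true LRU, and memory is only bounded by the number of distinct keys). All class and member names here are illustrative, not from any library:

    using System;
    using System.Collections.Concurrent;
    using System.Threading;

    public sealed class ConcurrentCache
    {
        private sealed class Entry
        {
            public Lazy<object> Value;
            public DateTimeOffset ExpiresAt;
        }

        private readonly ConcurrentDictionary<string, Entry> _entries =
            new ConcurrentDictionary<string, Entry>();

        public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
        {
            var entry = _entries.AddOrUpdate(
                key,
                _ => NewEntry(create, timeToLive),
                (_, existing) => DateTimeOffset.UtcNow < existing.ExpiresAt
                    ? existing                        // still fresh: keep it
                    : NewEntry(create, timeToLive));  // expired: replace atomically
            // The Lazy ensures create() runs at most once per Entry, even though
            // AddOrUpdate's factory delegates may be invoked more than once.
            return (T)entry.Value.Value;
        }

        public bool Remove(string key)
        {
            Entry ignored;
            return _entries.TryRemove(key, out ignored);
        }

        private static Entry NewEntry<T>(Func<T> create, TimeSpan ttl) where T : class
        {
            return new Entry
            {
                Value = new Lazy<object>(() => create(), LazyThreadSafetyMode.ExecutionAndPublication),
                ExpiresAt = DateTimeOffset.UtcNow + ttl
            };
        }
    }

The obvious gap is eviction: expired entries that are never touched again stay in the dictionary, so some periodic sweep or size cap would still be needed.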
Jon Rea
  • Try [this](https://www.nuget.org/packages/CSharpTest.Net.Library/) – Yuval Itzchakov May 27 '15 at 18:39
  • That project is deprecated. Are you referring to the CSharpTest.Net.Collections's LurchTable? https://github.com/csharptest/CSharpTest.Net.Collections - That does not support Expiry ... – Jon Rea May 28 '15 at 08:10
  • It's weird that there isn't a good LRU cache project. I might have to start one myself :) – Yuval Itzchakov May 28 '15 at 08:11
  • Your `GetOrAdd` method is fairly trivial to implement with the `Get` and `Set` methods available on `MemoryCache` - you don't have to use `AddOrGetExisting`. As for respecting memory limits, it's practically impossible to implement that in a strict sense in .NET, but unless you have no virtual memory configured on your system (which is a BAD idea and I doubt that's the case), in practice the cache will evict items very quickly once you reach high memory usage and should never cause you to get an out of memory error. – Mike Marynowski Jul 21 '18 at 04:00
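
The Get/Set approach described in the last comment could be sketched as follows - a hedged example, assuming a single lock guards insertion and a cached Lazy<T> keeps create() from running more than once (the class name is illustrative):

    using System;
    using System.Runtime.Caching;
    using System.Threading;

    public sealed class LazyMemoryCache
    {
        private readonly MemoryCache _cache = MemoryCache.Default;
        private readonly object _gate = new object();

        public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
        {
            var existing = _cache.Get(key) as Lazy<T>;
            if (existing == null)
            {
                lock (_gate) // double-checked: only one thread inserts the Lazy
                {
                    existing = _cache.Get(key) as Lazy<T>;
                    if (existing == null)
                    {
                        existing = new Lazy<T>(create, LazyThreadSafetyMode.ExecutionAndPublication);
                        _cache.Set(key, existing, DateTimeOffset.UtcNow + timeToLive);
                    }
                }
            }
            // The expensive create() runs here, outside the lock and outside Set,
            // so the cache never needs a pre-constructed value.
            return existing.Value;
        }
    }

Note that only the cheap Lazy<T> wrapper is constructed before Set; the expensive work is deferred until Value is first read.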

1 Answer


I implemented a thread-safe pseudo-LRU designed for concurrent workloads - it's currently in use in a production system. Performance is very close to ConcurrentDictionary, ~10x faster than MemoryCache, and the hit rate is better than a conventional LRU. A full analysis is provided in the GitHub link below.

Usage looks like this:

int capacity = 666;
var lru = new ConcurrentTLru<int, SomeItem>(capacity, TimeSpan.FromMinutes(5));

var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
bool removed = lru.TryRemove(1);

GitHub: https://github.com/bitfaster/BitFaster.Caching

Install-Package BitFaster.Caching
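
For completeness, the question's ICache interface could be adapted onto ConcurrentTLru roughly like this - a sketch, with one caveat: ConcurrentTLru takes a single TTL at construction, so the per-call timeToLive parameter cannot vary per entry (the adapter class name is hypothetical):

    using System;
    using BitFaster.Caching.Lru;

    public sealed class TLruCache : ICache
    {
        private readonly ConcurrentTLru<string, object> _lru;

        public TLruCache(int capacity, TimeSpan timeToLive)
        {
            _lru = new ConcurrentTLru<string, object>(capacity, timeToLive);
        }

        public T GetOrAdd<T>(string key, Func<T> create, TimeSpan timeToLive) where T : class
        {
            // timeToLive is ignored here; expiry is the cache-wide TTL set above.
            return (T)_lru.GetOrAdd(key, _ => create());
        }

        public bool Remove(string key)
        {
            return _lru.TryRemove(key);
        }
    }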
Alex Peck
  • I see that you have made comparisons with `System.Runtime.Caching.MemoryCache`. How does `BitFaster.Caching` compare with [`Microsoft.Extensions.Caching.Memory.MemoryCache`](https://learn.microsoft.com/en-us/dotnet/api/microsoft.extensions.caching.memory.memorycache)? – Theodor Zoulias Jun 16 '20 at 08:10
  • AFAIK the MemoryCache family of classes have similar performance characteristics (System.Runtime.Caching.MemoryCache, HttpRuntime.Cache, System.Web.Caching, MicrosoftExtensions.Caching.Memory.MemoryCache). The underlying implementations are all very similar. – Alex Peck Jun 16 '20 at 19:37
  • The [`Microsoft.Extensions.Caching.Memory.MemoryCache`](https://learn.microsoft.com/en-us/dotnet/api/microsoft.extensions.caching.memory.memorycache) is newer, has `object` keys, and offers more prioritization options. I have no idea if it's more efficient than the older implementation, but TBH my expectations are that it probably is! – Theodor Zoulias Jun 16 '20 at 20:25
  • I updated my cache hit benchmark to test Microsoft.Extensions.Caching.Memory.MemoryCache, and in my test it was slightly slower than System.Runtime.Caching.MemoryCache. This test is 100% cache hits, so it will never be a good predictor of real-world performance (where misses are expensive); it's just meant to give an indicator of raw speed. Regardless, the newer class has the same problems with sequential scan and runaway memory usage. The underlying algorithm is the same, and it has the same pros and cons. – Alex Peck Jun 16 '20 at 22:39
  • Thanks Alex for taking the time to make these benchmarks. I'll keep the [BitFaster.Caching](https://github.com/bitfaster/BitFaster.Caching) library in mind, in case I ever need a speedy memory cache implementation! – Theodor Zoulias Jun 16 '20 at 23:13