
I'm trying to keep an in-memory list of entries in my .NET cache, representing the last N HTTP requests to my app. What's the best .NET sequence to use for this?

Requirements

  • Fixed number of items (e.g. 50)
  • Serializable (I need to add the sequence to the .NET cache)
  • When I add the (max + 1)th entry, it should automatically remove the oldest item to make room
  • Don't really care about order of items
  • Need to be able to get all items in a single operation, in order to perform aggregate calculations.
  • Thread-safe
  • Non-unique entries (e.g. List<T>, not Dictionary<TKey,TValue>). I might hit the URL /foo 10 times; those entries aren't unique, but they all need to be added to the sequence.

Off the top of my head, I was thinking I could use a Queue<T>, and when enqueuing just check the length, and if it has hit the capacity, dequeue the oldest item. But I'm concerned about thread safety (ConcurrentQueue<T> perhaps?) and about picking the best approach, since this is a 'hot' area of my app that needs to be optimal.

Thanks!


1 Answer


It really depends on what you specifically mean by "oldest entry".

If you are looking for a FIFO structure, then it is possible to extend ConcurrentQueue<T> to eject the oldest item (the first item entered). (Copied from this answer.)

using System.Collections.Concurrent;

public class FixedSizedQueue<T> : ConcurrentQueue<T>
{
    private readonly object syncObject = new object();

    public int Size { get; private set; }

    public FixedSizedQueue(int size)
    {
        Size = size;
    }

    // Hides (does not override) ConcurrentQueue<T>.Enqueue, so the trimming
    // only happens when the object is used through this type.
    public new void Enqueue(T obj)
    {
        base.Enqueue(obj);
        lock (syncObject)
        {
            // Trim back down to Size, discarding the oldest items first.
            while (base.Count > Size)
            {
                T outObj;
                base.TryDequeue(out outObj);
            }
        }
    }
}
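
To show how this fits the requirements above, here is a short usage sketch (the size of 50 and the sample value are just placeholders): enqueue on every request, then call ToArray() to grab all items in a single snapshot for the aggregate calculations.

using System.Linq;

// Size of 50 matches the "last N requests" requirement in the question.
var recentRequests = new FixedSizedQueue<double>(50);

// On each request, record a measurement (e.g. a duration from a Stopwatch).
recentRequests.Enqueue(12.3);

// ToArray() (inherited from ConcurrentQueue<T>) snapshots all items in a
// single operation, so aggregates can be computed without extra locking.
double[] snapshot = recentRequests.ToArray();
double average = snapshot.Length > 0 ? snapshot.Average() : 0.0;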

If you are looking for a cache that keeps track of when the last item was accessed, and ejects the one that was accessed least recently (sort of like the sliding expiration functionality in System.Runtime.Caching), you could use a Least Recently Used (LRU) cache.

There is a high performance thread-safe .NET implementation of one called the LurchTable in the CSharpTest.Net.Collections project, which is available on NuGet.

Introducing the LurchTable as a C# version of LinkedHashMap
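
If it helps to see the mechanics behind that idea, below is a minimal, single-threaded LRU sketch. The class and member names are made up for illustration, and it is not thread-safe, so it is not a substitute for LurchTable.

using System.Collections.Generic;

// Illustration only: a dictionary for O(1) lookups plus a linked list that
// tracks access order, with the least recently used entry at the tail.
public class SimpleLruCache<TKey, TValue>
{
    private readonly int _capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> _order =
        new LinkedList<KeyValuePair<TKey, TValue>>();

    public SimpleLruCache(int capacity) { _capacity = capacity; }

    public bool TryGet(TKey key, out TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (_map.TryGetValue(key, out node))
        {
            // Reading an entry makes it the most recently used.
            _order.Remove(node);
            _order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default(TValue);
        return false;
    }

    public void Set(TKey key, TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (_map.TryGetValue(key, out node))
            _order.Remove(node);

        node = new LinkedListNode<KeyValuePair<TKey, TValue>>(
            new KeyValuePair<TKey, TValue>(key, value));
        _order.AddFirst(node);
        _map[key] = node;

        // When over capacity, eject the least recently used entry (the tail).
        if (_map.Count > _capacity)
        {
            var last = _order.Last;
            _order.RemoveLast();
            _map.Remove(last.Value.Key);
        }
    }
}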

For other options, see

  • Thanks for your answer. LRU/LurchTable seems to be the way to go... will check it out and report back. Cheers! – RPM1984 Oct 11 '17 at 06:16
  • I took a look at LurchTable/LRU, but it seems these are all dictionaries, i.e. unique key/value pairs. What if I just want a non-unique List? (I also updated the question to make that clear.) – RPM1984 Oct 11 '17 at 23:23
  • See my updated answer. You *could* just declare `LurchTable`, which would essentially give you a similar structure (keep in mind `Keys` is an `ICollection`). But I realized that an LRU cache might not be what you are looking for depending on what you mean by "oldest item". – NightOwl888 Oct 12 '17 at 00:04
  • BTW - You don't need a class to be serializable to use it with .NET caching. That only applies if you are thinking about using it in ASP.NET session state or user profiles. – NightOwl888 Oct 12 '17 at 00:05
  • `FixedSizeQueue` is the way to go. Thanks! – RPM1984 Oct 12 '17 at 00:46
  • 2
    I would be a little bit careful with using `public new void Enqueue(T obj)`; if the class gets cast to any other type it will not behave correctly. I would go with composition over inheritance: make a new class that implements `IProducerConsumerCollection` and have it contain a `ConcurrentQueue` as a private member that it forwards all of its calls to (a sketch of that approach follows after these comments). – Scott Chamberlain Oct 12 '17 at 20:03
  • @ScottChamberlain - Agreed. That's what I get for copying someone else's answer :). Thanks Microsoft for not making collection methods virtual so we have to build collections from scratch instead of just subclassing them. – NightOwl888 Oct 12 '17 at 20:26
  • Concurrent collections were designed for very high performance environments where a lock would be an unacceptable wait time; when dealing with time windows that small, things like a virtual method lookup take a non-trivial amount of time once multiplied by the volume of data to be processed. A similar reason applies to all of the classes in System.Collections.Generic; they are designed for performance too. If you need to override methods, you have to derive from a class in System.Collections (things like Queue from that namespace have a virtual Enqueue method), or use System.Collections.ObjectModel – Scott Chamberlain Oct 12 '17 at 20:31
  • ...or use the classes in System.Collections.ObjectModel to create your own generic types and use their virtual protected methods to do an action on an add or remove. – Scott Chamberlain Oct 12 '17 at 20:38
  • If you don't need the out value, you can use `out _` instead. – shtse8 Aug 06 '18 at 21:45
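
Following up on Scott Chamberlain's comment above, here is a rough sketch of the composition approach. The class name BoundedConcurrentQueue is made up, and this is a starting point rather than a finished implementation; the point is that the inner ConcurrentQueue<T> is never exposed, so the size limit cannot be bypassed by casting.

using System;
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Composition instead of inheritance: the bounded behaviour cannot be
// bypassed, because the inner ConcurrentQueue<T> is never handed out.
public sealed class BoundedConcurrentQueue<T> : IProducerConsumerCollection<T>
{
    private readonly ConcurrentQueue<T> _inner = new ConcurrentQueue<T>();
    private readonly object _sync = new object();
    private readonly int _capacity;

    public BoundedConcurrentQueue(int capacity)
    {
        if (capacity < 1) throw new ArgumentOutOfRangeException("capacity");
        _capacity = capacity;
    }

    public void Enqueue(T item)
    {
        lock (_sync)
        {
            _inner.Enqueue(item);

            // Trim back down to capacity, discarding the oldest items first.
            T overflow;
            while (_inner.Count > _capacity)
                _inner.TryDequeue(out overflow);
        }
    }

    // IProducerConsumerCollection<T> members, forwarded to the inner queue.
    public bool TryAdd(T item) { Enqueue(item); return true; }
    public bool TryTake(out T item) { return _inner.TryDequeue(out item); }
    public T[] ToArray() { return _inner.ToArray(); }
    public void CopyTo(T[] array, int index) { _inner.CopyTo(array, index); }
    public int Count { get { return _inner.Count; } }
    public IEnumerator<T> GetEnumerator() { return _inner.GetEnumerator(); }
    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }

    // Non-generic ICollection members.
    void ICollection.CopyTo(Array array, int index) { ((ICollection)_inner).CopyTo(array, index); }
    bool ICollection.IsSynchronized { get { return false; } }
    object ICollection.SyncRoot { get { throw new NotSupportedException(); } }
}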