
I need a thread-safe cache to store instances of a disposable class.

  • It will be used with .NET 4.0
  • The cache should be aware of whether a stored instance is being used or not.
  • When an instance is requested from the cache, it should look at the stored available instances and hand one out; if none is available, it should create a new instance and store it.
  • If the cache has not been used for a period of time, it should dispose the stored instances that are not in use and clear them.

This is the solution I wrote:

private class cache<T> where T : IDisposable
{
    Func<T> _createFunc;
    long livingTicks;
    int livingMillisecs;
    public cache(Func<T> createFunc, int livingTimeInSec)
    {
        this.livingTicks = livingTimeInSec * 10000000;
        this.livingMillisecs = livingTimeInSec * 1000;
        this._createFunc = createFunc;
    }
    Stack<T> st = new Stack<T>();
    public IDisposable BeginUseBlock(out T item)
    {
        this.actionOccured();
        if (st.Count == 0)
            item = _createFunc();
        else
            lock (st)
                if (st.Count == 0)
                    item = _createFunc();
                else
                    item = st.Pop();
        return new blockDisposer(this, item);
    }



    long _lastTicks;
    bool _called;
    private void actionOccured()
    {
        if (!_called)
            lock (st)
                if (!_called)
                {
                    _called = true;
                    System.Threading.Timer timer = null;
                    timer = new System.Threading.Timer((obj) =>
                    {
                        if ((DateTime.UtcNow.Ticks - _lastTicks) > livingTicks)
                        {
                            timer.Dispose();
                            this.free();
                        }
                    },
                    null, livingMillisecs, livingMillisecs);
                }

        _lastTicks = DateTime.UtcNow.Ticks;
    }
    private void free()
    {
        lock (st)
        {
            while (st.Count > 0)
                st.Pop().Dispose();
            _called = false;
        }
    }

    private class blockDisposer : IDisposable
    {
        T _item;
        cache<T> _c;
        public blockDisposer(cache<T> c, T item)
        {
            this._c = c;
            this._item = item;
        }
        public void Dispose()
        {
            this._c.actionOccured();
            lock (this._c.st)
                this._c.st.Push(_item);
        }
    }
}

This is a sample use:

class MyClass:IDisposable
{
    public MyClass()
    {
        //expensive work
    }
    public void Dispose()
    {
        //free
    }
    public void DoSomething(int i)
    {
    }
}
private static Lazy<cache<MyClass>> myCache = new Lazy<cache<MyClass>>(() => new cache<MyClass>(() => new MyClass(), 60), true);//free 60sec. after last call
private static void test()
{
    Parallel.For(0, 100000, (i) =>
    {
        MyClass cl;
        using (myCache.Value.BeginUseBlock(out cl))
            cl.DoSomething(i);
    });
}

My questions:

  • Is there a faster way of doing this? (I've searched for MemoryCache examples, but couldn't figure out how I could use it for my requirements. And it requires a key check. Stack.Pop would be faster than a key search, I thought; and for my problem, performance is very important.)
  • In order to dispose the instances after a while (60 sec. in the example code) I had to use a Timer. I just need a delayed function call that is re-delayed on each action happening on the cache. Is there a way to do that without using a timer?
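A sliding delay can be approximated without periodic polling by keeping a single one-shot `System.Threading.Timer` and pushing its due time back with `Timer.Change` on every cache action. The sketch below is illustrative only (the `SlidingTrigger` name and shape are not from the question), and it still relies on a timer object, just not on a repeating one:

```csharp
using System;
using System.Threading;

// Hypothetical helper: a one-shot timer whose due time is pushed back
// on every cache action, so the callback fires only after the cache
// has been idle for the full delay.
class SlidingTrigger : IDisposable
{
    readonly Timer _timer;
    readonly int _delayMs;

    public SlidingTrigger(Action onIdle, int delayMs)
    {
        _delayMs = delayMs;
        // Period = Timeout.Infinite makes the timer fire at most once
        // until it is re-armed by Touch().
        _timer = new Timer(delegate { onIdle(); }, null, delayMs, Timeout.Infinite);
    }

    // Call on every cache action: re-arms the timer, postponing onIdle.
    public void Touch()
    {
        _timer.Change(_delayMs, Timeout.Infinite);
    }

    public void Dispose()
    {
        _timer.Dispose();
    }
}
```

Note that `Touch` calls `Timer.Change` on every cache action; under very heavy contention that may cost more than the original scheme of updating `_lastTicks` and checking it from a periodic timer.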

Edit: I've tried @mjwills's suggestion from the comments. The performance is better with this:

    ConcurrentStack<T> st = new ConcurrentStack<T>();
    public IDisposable BeginUseBlock(out T item)
    {
        this.actionOccured();
        if (!st.TryPop(out item))
            item = _createFunc();
        return new blockDisposer(this, item);
    }

Edit 2: In my case it's not required, but if we need to control the size of the stack and dispose the unused objects, using a separate counter, incremented and decremented with Interlocked.Increment/Interlocked.Decrement, will be faster than checking the stack's Count (@mjwills).
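The counter idea from Edit 2 could be sketched as follows (names such as `CountedPool` are illustrative, not from the question). The point is that `ConcurrentStack<T>.Count` has to traverse the stack's internal list, while an `Interlocked` counter is a single atomic operation:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Hypothetical pool variant that tracks its approximate size with a
// separate counter, avoiding ConcurrentStack<T>.Count (which must
// traverse the stack's internal list to count nodes).
class CountedPool<T> where T : IDisposable
{
    readonly ConcurrentStack<T> _st = new ConcurrentStack<T>();
    readonly Func<T> _create;
    int _size; // approximate number of idle instances in the pool

    public CountedPool(Func<T> create)
    {
        _create = create;
    }

    public T Take()
    {
        T item;
        if (_st.TryPop(out item))
        {
            Interlocked.Decrement(ref _size);
            return item;
        }
        return _create();
    }

    public void Return(T item)
    {
        _st.Push(item);
        Interlocked.Increment(ref _size);
    }

    // Cheap size check for cleanup decisions, e.g. bail out when empty.
    public int ApproximateSize
    {
        get { return Thread.VolatileRead(ref _size); }
    }
}
```

`Thread.VolatileRead` is used instead of the `Volatile` class because the question targets .NET 4.0, where `Volatile.Read` is not yet available.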

Koray
  • This would probably be better suited to the Code Review Stack Exchange! – flakes Dec 17 '17 at 09:28
  • So am I right in assuming you want some sort of object pool, which automatically disposes and releases old objects? – mjwills Dec 17 '17 at 11:51
  • I would suggest not using `Lazy` since it caches exceptions. Consider using `LazyWithNoExceptionCaching` instead https://stackoverflow.com/a/42567351/34092 . Consider using `ConcurrentStack` rather than `Stack` to reduce the need for `lock` (but also read https://msdn.microsoft.com/en-us/library/dd287185%28v=vs.110%29.aspx?f=255&MSPPError=-2147217396#Remarks ). – mjwills Dec 17 '17 at 11:53
  • Can you talk us through the usage pattern of your disposable object? How often will it be used? _If it is used regularly you may be able to avoid the need for the timer (you could instead just deal with 'dead' objects when an instance is requested)._ – mjwills Dec 17 '17 at 11:56
  • @flakes I have defined some requirements and given the solution I was able to come up with. I'm not asking anyone to write code for me. I'm asking whether there is a model, pattern, or built-in .NET solution for what I'm describing. – Koray Dec 18 '17 at 02:43
  • @mjwills thank you for the information about `Lazy`! I didn't know that; I'll read more about it. I also didn't know about ConcurrentStack, thanks a lot. Yes, it is what I want. It will be used like an API; I mean, I have no control over when it is going to be used and when its usage is done. Some module may start and stop using it at any time. I need to respond quickly, and free the memory after a period of time. That time will most probably be 60 sec. by default, but it can be changed by the module that uses it, according to its needs. – Koray Dec 18 '17 at 03:03
  • Also it may not be used for a long time, and I get no notification that its usage is finished. So waiting for a request to free the dead objects is not an option. – Koray Dec 18 '17 at 03:06
  • Consider using a `MemoryCache` where the value stored is a `ConcurrentStack` or `ConcurrentQueue`. Set the expiry to a reasonable window (say 60 seconds). Wire up the https://msdn.microsoft.com/en-us/library/system.runtime.caching.cacheitempolicy.updatecallback(v=vs.110).aspx event and you will essentially get your timer (i.e. it will notify you when the item is about to be removed from cache). When the `UpdateCallback` fires, add the existing stack / queue back into the `MemoryCache` (i.e. it is always there), and then implement your code to remove 'old' entries from the Stack / Queue. – mjwills Dec 18 '17 at 07:56
  • You could do basically the same thing with a timer and without MemoryCache if you wanted. Either way, you also want to keep track of the size of the pool. You are best doing this with a separate `int` rather than checking the `Count`. Use `Interlocked.Increment` - increment whenever an object is added to the pool and decrement when it is removed. Only do cleanup if the `int` has increased between two cleanup cycles. – mjwills Dec 18 '17 at 09:42
  • @mjwills thank you very much for the valuable information. I've removed the locks and used a ConcurrentStack, and the performance was much better in most of my test cases. I've decided not to use MemoryCache since I would have to do an extra ContainsKey check. I only didn't see why we need a separate counter. Please see my edit, Stack.Count is no longer needed, thanks to you :) – Koray Dec 18 '17 at 13:06
  • It seems like you should build your solution on top of Redis instead :D – Matías Fidemraizer Dec 18 '17 at 13:07
  • @MatíasFidemraizer How would it work on Redis? _The challenge is how to create a disposable object pool. I am not sure how Redis can help with that._ – mjwills Dec 18 '17 at 19:37
  • `Count` isn't needed for retrieving items from the pool. It is needed to determine whether a pool cleanup is required. For example, you might say `if count > 100 for more than two consecutive timer cycles then pop (up to) 50 items off and dispose them, otherwise if count > 20 for more than three consecutive cycles then pop (up to) 4 items off and dispose them` or whatever. Otherwise you need to ask the Stack, and that is slow. – mjwills Dec 18 '17 at 19:39
  • @mjwills I may have understood you wrong, but I still don't get it. The timer does not check the size of the stack. Its size is not important for my requirements and cannot get too large in my cases. So the timer only looks at the time of the last call; if no use has been detected, it starts cleaning. Stack.Count is only used in the cleaning process. However, if we need to control the stack size in the future, keeping a separate counter as you mentioned is the way to go, I think. I have edited my question accordingly. – Koray Dec 19 '17 at 07:23
  • `So the timer only looks for the last time of any call.` Yep, that is certainly one way to solve the issue. The advantage of tracking the `Count` could be thought of as an optimisation. _For example, if the `Count` is 0 then bail out (the pool is empty, no point checking) or if it is trending down, don't bother doing any cleanup (since the pool is doing its job well)._ For your planned approach, how are you planning to keep track of the last time an object was used? – mjwills Dec 19 '17 at 08:03
  • @mjwills "actionOccured" is called when an instance is requested and when it is released. It updates "_lastTicks" so the timer knows whether it should finish or wait longer. Suppose the freeing process starts while an instance is being used: since it is in use, it is not in the stack at all, and when it is released it will be pushed back onto the stack, so a new timer will be created. So no problem, I guess. – Koray Dec 19 '17 at 12:35
  • Yep, that could definitely work. – mjwills Dec 19 '17 at 21:25

0 Answers