
I am looking to implement caching at a request level for a WCF Service. Each request to this service performs a large number of database calls. Think multiple data collectors. We need to allow one data collector to access the information already retrieved by a preceding data collector.

I was looking to use the new .NET 4.0 MemoryCache for this by creating a specific instance per request.

Is this a good idea? Or should I simply use a Dictionary object?

BTW: the data collection is going to happen in parallel, so there will be additional complexity around locking, but I could use concurrent collections for that as well.

Abhinav Gujjar

1 Answer


If you don't need some kind of expiration logic, I would suggest using concurrent collections. You can easily implement a caching mechanism that retrieves each entry exactly once by combining the ConcurrentDictionary and Lazy<T> classes.
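
For illustration, a minimal sketch of that combination (the `RequestCache` type and its `GetOrAdd<T>` method are made-up names for this answer, not a library API):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Illustrative request-scoped cache: ConcurrentDictionary guarantees a single
// Lazy<object> instance per key, and Lazy<T> with ExecutionAndPublication
// guarantees the value factory runs only once even when collectors race on the
// same key.
public class RequestCache
{
    private readonly ConcurrentDictionary<string, Lazy<object>> _items =
        new ConcurrentDictionary<string, Lazy<object>>();

    public T GetOrAdd<T>(string key, Func<T> valueFactory)
    {
        var lazy = _items.GetOrAdd(
            key,
            _ => new Lazy<object>(
                () => valueFactory(),
                LazyThreadSafetyMode.ExecutionAndPublication));

        return (T)lazy.Value;
    }
}

// Usage: each data collector asks the shared per-request instance for data,
// and the underlying database call executes at most once per key.
// var orders = cache.GetOrAdd("orders", () => LoadOrdersFromDatabase());
```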

If you need your items to expire, then you are better off using the built-in MemoryCache and implementing the double-checked locking pattern to guarantee that each cache item is retrieved only once. A ready-to-go implementation of double-checked locking can be found in Locking pattern for proper use of .NET MemoryCache.
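
A minimal sketch of that pattern around MemoryCache (not the linked answer verbatim; the single lock object and the 10-minute expiration are illustrative choices):

```csharp
using System;
using System.Runtime.Caching;

// Double-checked locking around the built-in MemoryCache so that the expensive
// retrieval runs only once even under concurrent access.
public static class ExpiringCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;
    private static readonly object SyncRoot = new object();

    public static T GetOrAdd<T>(string key, Func<T> valueFactory) where T : class
    {
        // First (lock-free) check.
        var cached = Cache.Get(key) as T;
        if (cached != null)
            return cached;

        lock (SyncRoot)
        {
            // Second check: another thread may have populated the entry
            // while we were waiting for the lock.
            cached = Cache.Get(key) as T;
            if (cached != null)
                return cached;

            var value = valueFactory();
            Cache.Set(key, value, DateTimeOffset.UtcNow.AddMinutes(10));
            return value;
        }
    }
}
```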

Yiğit Yener
  • Ok - I was thinking along the same lines, but I still don't understand the "why?" of it. MemoryCache provides me with cache-specific functions like AddOrGetExisting, which I'll lose when I move to concurrent collections. Is expiration functionality the only reason to choose between a concurrent collection and MemoryCache? – Abhinav Gujjar Sep 26 '12 at 07:35
  • @zync The idea of using ConcurrentDictionary with Lazy is to ensure that you retrieve cache items once per application domain and never change them. If that is not the case, MemoryCache is better. However, if you need single-retrieval logic, ConcurrentDictionary is more appropriate, because the MemoryCache.AddOrGetExisting method needs a value up front, so you would still have to implement locking while retrieving that value. But with ConcurrentDictionary.GetOrAdd(TKey key, Func<TKey, TValue> valueFactory) combined with Lazy, you leave all that locking to the framework library. – Yiğit Yener Sep 26 '12 at 08:00
  • By the way, if your cache items are well defined before your services start (i.e. you do not cache dynamically) and you can tolerate loading all the data into the cache once, you can use the static constructor of your service classes to load those cache items. This is better because static constructors are guaranteed to execute once per application domain and before any instance constructor of that type. It is simple (see the sketch after these comments). – Yiğit Yener Sep 26 '12 at 08:08
  • @YiğitYener Unless you have data only available through an `async` API, of course (though you could have a `ConcurrentDictionary<TKey, Task<TValue>>` or even a `ConcurrentDictionary<TKey, Lazy<Task<TValue>>>`). – Dai Jan 12 '20 at 15:14