
I have a long-running request to a web service whose result should be cached on the server side after completion. My problem is that I don't know how to prevent it from being called concurrently before the first request's result has been cached.

My thought is that I should create a data-request Task and store it in a concurrent dictionary, so that every other request can check whether the Task is already running and wait for it to complete.

I've ended up with this:

private static ConcurrentDictionary<string, Task> tasksCache = new ConcurrentDictionary<string, Task>();

public static T GetFromCache<T>(this ICacheManager<object> cacheManager, string name, Func<T> func)
{
    if (cacheManager.Exists(name))
        return (T)cacheManager[name];

    if (tasksCache.ContainsKey(name))
    {
        tasksCache[name].Wait();
        return (tasksCache[name] as Task<T>).Result;
    }

    var runningTask = Task.Run(() => func.Invoke());
    tasksCache[name] = runningTask;
    runningTask.Wait();

    var data = runningTask.Result;
    cacheManager.Put(name, data);
    tasksCache.TryRemove(name, out Task t);

    return data;
}

But this looks messy. Is there a better way?

Taylor Wood
Kindzoku

1 Answer


I'd consider wrapping these in a Lazy<T> for each task, which has built-in semantics for controlling concurrent initialization.

This example demonstrates the use of the Lazy<T> class to provide lazy initialization with access from multiple threads.

You'll want to specify an appropriate LazyThreadSafetyMode.

ExecutionAndPublication: Fully thread safe; uses locking to ensure that only one thread initializes the value.
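As a rough sketch of the idea (not using your ICacheManager; the `LazyCache` class, `GetOrCreate` method, and key/value names here are made up for illustration), you can store one Lazy<T> per key in the ConcurrentDictionary. GetOrAdd may construct extra Lazy instances under contention, but only one is ever stored, and ExecutionAndPublication guarantees its factory runs exactly once:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

public static class LazyCache
{
    // One Lazy<object> per cache key. Losing threads in GetOrAdd discard
    // their Lazy instance without ever running its factory; everyone reads
    // Value from the single stored instance.
    private static readonly ConcurrentDictionary<string, Lazy<object>> cache =
        new ConcurrentDictionary<string, Lazy<object>>();

    public static T GetOrCreate<T>(string name, Func<T> func)
    {
        var lazy = cache.GetOrAdd(name, _ => new Lazy<object>(
            () => func(),
            LazyThreadSafetyMode.ExecutionAndPublication));
        return (T)lazy.Value; // blocks until the single initialization finishes
    }
}

public class Program
{
    public static void Main()
    {
        int calls = 0;
        // Ten concurrent requests for the same key: func should run once.
        var tasks = new Task<int>[10];
        for (int i = 0; i < tasks.Length; i++)
        {
            tasks[i] = Task.Run(() => LazyCache.GetOrCreate("data", () =>
            {
                Interlocked.Increment(ref calls);
                Thread.Sleep(100); // simulate the slow web-service call
                return 42;
            }));
        }
        Task.WaitAll(tasks);
        Console.WriteLine(calls); // prints 1
    }
}
```

Compared to the hand-rolled Task dictionary, this removes the check-then-act races (a key could be removed between ContainsKey and the indexer) because GetOrAdd plus Lazy<T> makes the "create once, everyone waits" step atomic.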
