I am trying out LazyCache in C# and .NET 6. It is failing one of our tests, so I have made a smaller reproduction here.

I expect LazyCache to provide a thread-safe cache, so that one failed request does not affect others. Specifically, I have set up a FailingCacher such that the first underlying call fails and its result is not cached; subsequent calls should succeed and be cached, and the value should come from the cache thereafter.

When I make 5 calls in sequence, this works as expected. But when the same 5 calls are made concurrently, it fails: the same bad (null) value is returned for 4 or all 5 of the calls, which should not happen.

I expected the concurrent case to produce the same outcome as the sequential one. Can LazyCache do this? Do I need manual locking as well? But isn't thread safety exactly what LazyCache is supposed to provide? (A sketch of the kind of manual locking I mean is at the end of this post.)
The code is:
using LazyCache;

public class FailingCacher
{
    private int _callCount = 0;

    private const string CacheTokenKey = "abc123";

    private readonly IAppCache _appCache;

    public FailingCacher(IAppCache appCache)
    {
        _appCache = appCache;
    }

    public int CallCount => _callCount;

    public string GetTheValue()
    {
        try
        {
            return _appCache.GetOrAdd(CacheTokenKey, GetInternal);
        }
        catch (Exception)
        {
            return null;
        }
    }

    private string GetInternal()
    {
        var currentCallCount = Interlocked.Increment(ref _callCount);
        if (currentCallCount < 2)
        {
            throw new Exception($"Call {currentCallCount} fails, subsequent calls succeed");
        }
        return $"Success at call {currentCallCount}";
    }

    public void ClearCache()
    {
        _appCache.Remove(CacheTokenKey);
    }
}
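For what it is worth, my understanding of the LazyCache API is that IAppCache.Get&lt;T&gt; returns null for a missing string key, so the claim that a failed call leaves nothing in the cache can be sanity-checked directly, roughly like this (not part of the test itself):

var appCache = new CachingService();
var cacher = new FailingCacher(appCache);

var first = cacher.GetTheValue();            // the first call throws internally, so this returns null
var cached = appCache.Get<string>("abc123"); // read the same key straight from the cache

Console.WriteLine($"First result: {first ?? "<null>"}, cached value: {cached ?? "<null>"}");
// Expected: both <null>, i.e. the failed call left nothing behind.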
The main program is:
using LazyCache;
using System.Collections.Concurrent;

void VerifyResults(List<string> list, string testType)
{
    var allCount = list.Count;
    var failedCount = list.Count(x => x == null);
    var successCount = list.Count(x => x != null);
    var expectedValueCount = list.Count(x => x == "Success at call 2");

    string PassOrFail(bool cond) => cond ? "Pass" : "Fail";

    Console.WriteLine($"{testType} All result count {allCount}: {PassOrFail(allCount == 5)}");
    Console.WriteLine($"{testType} Failed count {failedCount}: {PassOrFail(failedCount == 1)}");
    Console.WriteLine($"{testType} Success count {successCount}: {PassOrFail(successCount == 4)}");
    Console.WriteLine($"{testType} Expected Value count {expectedValueCount}: {PassOrFail(expectedValueCount == 4)}");
}

void SequentialCallsReturnsFirstSuccessfulCachedValue()
{
    var cacher = new FailingCacher(new CachingService());
    cacher.ClearCache();

    var results = new List<string>();
    for (var i = 0; i < 5; i++)
    {
        var value = cacher.GetTheValue();
        results.Add(value);
    }

    Console.WriteLine($"Sequential inner call count: {cacher.CallCount}");
    VerifyResults(results, "Sequential");
}

async Task ConcurrentCallsReturnsFirstSuccessfulCachedValue()
{
    var cacher = new FailingCacher(new CachingService());
    cacher.ClearCache();

    var tasks = new List<Task>();
    var stack = new ConcurrentStack<string>();
    for (var i = 0; i < 5; i++)
    {
        var task = Task.Run(() =>
        {
            var value = cacher.GetTheValue();
            stack.Push(value);
        });
        tasks.Add(task);
    }
    await Task.WhenAll(tasks);

    var results = stack.ToList();
    Console.WriteLine($"Concurrent inner call count: {cacher.CallCount}");
    VerifyResults(results, "Concurrent");
}

SequentialCallsReturnsFirstSuccessfulCachedValue();
await ConcurrentCallsReturnsFirstSuccessfulCachedValue();
Console.WriteLine("Done");
and the .csproj file is:
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net6.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="LazyCache" Version="2.4.0" />
  </ItemGroup>

</Project>
The output is:
Sequential inner call count: 2
Sequential All result count 5: Pass
Sequential Failed count 1: Pass
Sequential Success count 4: Pass
Sequential Expected Value count 4: Pass
Concurrent inner call count: 1
Concurrent All result count 5: Pass
Concurrent Failed count 5: Fail
Concurrent Success count 0: Fail
Concurrent Expected Value count 0: Fail
Done
As you can see, the sequential case works as expected but the concurrent one does not. The concurrent output suggests that the failure from the first thread is being handed to the other threads as well, even though the failed result should never have been cached.

The behaviour is also timing-dependent; some runs show:
Concurrent inner call count: 2
Concurrent All result count 5: Pass
Concurrent Failed count 4: Fail
Concurrent Success count 1: Fail
Concurrent Expected Value count 1: Fail
Done
This is much simplified; the original test was fully async and used HTTP mocks, but the issue is exactly the same.
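For reference, the kind of manual locking I am hoping to avoid looks roughly like the sketch below, added to FailingCacher (the method and field names are just for illustration): a SemaphoreSlim serialising the whole GetOrAdd call so that a failing factory on one thread cannot surface as the result of another. This is only to illustrate the question, not code I actually want to ship.

private static readonly SemaphoreSlim CacheLock = new SemaphoreSlim(1, 1);

public string GetTheValueWithManualLocking()
{
    // Serialise all callers: only one thread at a time runs GetOrAdd and
    // therefore the factory, so a failure is confined to that caller.
    CacheLock.Wait();
    try
    {
        return _appCache.GetOrAdd(CacheTokenKey, GetInternal);
    }
    catch (Exception)
    {
        return null;
    }
    finally
    {
        CacheLock.Release();
    }
}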