I have a generic cache that manages objects of the specified type through WeakReferences; however, it's not quite working the way I expected it to. Here is my implementation:
public class GCManagedCache<T> where T : class
{
    private readonly Dictionary<string, WeakReference> _cache = new Dictionary<string, WeakReference>();

    public T this[string key]
    {
        get
        {
            WeakReference weakRef = null;
            if (!_cache.TryGetValue(key, out weakRef))
            {
                return null;
            }

            if (weakRef != null && weakRef.IsAlive)
            {
                return weakRef.Target as T;
            }

            _cache.Remove(key);
            return null;
        }
        set
        {
            _cache[key] = new WeakReference(value);
        }
    }
}
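(Side note: I realize that checking IsAlive and then reading Target is a small race, since the target can be collected between the two calls. That isn't the problem I'm asking about, but a getter that reads Target once, sketched below as a drop-in for the one above, avoids it.)

get
{
    WeakReference weakRef;
    if (!_cache.TryGetValue(key, out weakRef))
    {
        return null;
    }

    // Read Target exactly once; it is null if the object has been collected.
    T target = weakRef.Target as T;
    if (target != null)
    {
        return target;
    }

    // The entry is dead, so drop it from the dictionary.
    _cache.Remove(key);
    return null;
}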
And here is my test:
[TestMethod]
public void TestCaching()
{
    const string KEY = "test";
    GCManagedCache<byte[]> cache = new GCManagedCache<byte[]>();

    var o = new byte[1024];
    cache[KEY] = o;

    var y = cache[KEY]; // <-- seems to keep it alive even though it's nulled out after the next GC call.

    GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, true);

    o = null;
    y = null;

    GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, true);

    var x = cache[KEY]; // <--- returns a valid value when it should be null

    Assert.IsTrue(x == null);
}
This is the line that causes the unexpected behavior:
var y = cache[KEY];
With that line present, the final assignment
var x = cache[KEY];
always returns a valid object. If I remove the assignment to "y", the test works as expected.
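My working theory is that the locals (or compiler-generated temporaries) in the test method are what keep the array reachable, even after I set o and y to null. If that's the case (I'm not sure it is), one way to check would be to move the allocation and the first lookup into a separate, non-inlined method, so that the test body itself holds no reference to the array when GC.Collect runs. A sketch of that idea is below; UseCache and TestCaching_NoLocals are just names I made up, it needs using System.Runtime.CompilerServices; for the MethodImpl attribute, and in a Debug build or under a debugger the JIT may still extend local lifetimes, so the result would only be meaningful for an optimized Release run.

// Sketch only: keep the locals that reference the cached array inside a
// method that has already returned by the time the collection is forced.
[MethodImpl(MethodImplOptions.NoInlining)]
private static void UseCache(GCManagedCache<byte[]> cache, string key)
{
    var o = new byte[1024];
    cache[key] = o;
    var y = cache[key];   // o and y both go out of scope when this returns
}

[TestMethod]
public void TestCaching_NoLocals()
{
    const string KEY = "test";
    var cache = new GCManagedCache<byte[]>();

    UseCache(cache, KEY);

    // Force full, blocking collections now that nothing outside the cache
    // refers to the array.
    GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, true);
    GC.WaitForPendingFinalizers();
    GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced, true);

    var x = cache[KEY];   // expected to be null once the array is collected
    Assert.IsNull(x);
}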