
I'm using a

Dictionary<StockSymbol, List<CandleData>> _CACHE = new
    Dictionary<StockSymbol, List<CandleData>>();

to store data.

I'm using this to keep track of the 'most recent' minute for running subsequent queries. Essentially:

if (curr.Minute > StockSymbol.LastMinute)
{
    List<CandleData> new_candles = GetData(StockSymbol.symbol, StockSymbol.LastMinute,
        StockSymbol.LastMinute.AddMinutes(30));
    if (new_candles.Count > 0)
    {
        _CACHE[StockSymbol.symbol].AddRange(new_candles);
    }
}

This works fine when I have a small set of keys (FX, Crypto, etc.). But when I attempt this with stocks (9,900+ keys), _CACHE eventually throws an OutOfMemoryException. Some research online suggests there is a 2 GB limit in .NET, so what is a workaround for this? I have enough RAM on my machine (128 GB+).

asked by SAKK
  • What is the maximum number of `CandleData` that any of the `List` may contain at any moment? – Theodor Zoulias Sep 20 '21 at 20:40
  • @TheodorZoulias Last 24 hours worth of 1 minute data. ~10,000 keys, each with a max List size of 1,440. – SAKK Sep 20 '21 at 20:44
  • It sounds strange to me that an `OutOfMemoryException` can be triggered by so few data. Could you identify the line of code that throws the `OutOfMemoryException`? – Theodor Zoulias Sep 20 '21 at 20:46
  • Marked as duplicate: [C# dictionary - how to solve limit on number of items?](https://stackoverflow.com/questions/7876853/c-sharp-dictionary-how-to-solve-limit-on-number-of-items) – Theodor Zoulias Sep 20 '21 at 20:47
  • @TheodorZoulias In what way is the question *not* a duplicate? – Servy Sep 20 '21 at 20:49
  • Another one of the suggestions in that other SO post was targeting x64 runtime rather than x86 as well. Have you tried that? – emagers Sep 20 '21 at 20:50
  • @Servy it doesn't look like duplicate to me, because 13,000,000 items differ from 10,000 items by 3 orders of magnitude. Hence my reopen vote. – Theodor Zoulias Sep 20 '21 at 20:53
  • @TheodorZoulias The duplicate describes how to solve the problem (in many different ways). That they're running into problems with different numbers of objects doesn't change that they're running into the same limit. The limit isn't even a number of items after all; it's *sometimes* correlated. You can hit the size limit with a single object, if that single object is big enough. – Servy Sep 20 '21 at 20:57
  • @Servy the OP doesn't seem very enthusiastic about the solutions proposed in the [linked question](https://stackoverflow.com/questions/7876853/c-sharp-dictionary-how-to-solve-limit-on-number-of-items), and quite rightfully since the problem described in that question seems significantly different than their problem. But if they find a solution there (hopefully), everybody will be happy by closing this question again as a duplicate of the same question. – Theodor Zoulias Sep 20 '21 at 21:08
  • Don't bother with Large Address Aware (as @ThomasWeller). That requires that you boot the OS into a state compatible with that, and that causes all sorts of issues for normal application loads. Have you looked at System.Runtime.Caching.MemoryCache? Unlike a dictionary, you can give it policies that will evict old entries when there is memory pressure. Another option is to use a WeakReference to whatever is holding your cache. It will allow your cache to be collected by the Garbage Collector when there's memory pressure (you'll need to check that the reference is valid whenever you use it) – Flydog57 Sep 20 '21 at 21:23
  • @TheodorZoulias The fun thing about OOME is that the problem may be entirely unrelated to the code that throws the exception. – Tech Inquisitor Sep 20 '21 at 21:43
  • Soooo... Redis? – Caius Jard Sep 20 '21 at 22:02
  • @TechInquisitor my experience with `OutOfMemoryException`s is quite slim to be honest, but I can't imagine that knowing the line that throws the exception can be a totally useless information. – Theodor Zoulias Sep 20 '21 at 23:41
  • 2
    @TheodorZoulias Imagine that one part of a program periodically allocates and frees a moderately sized chunk of memory, while another part slowly leaks. It's very likely that the perfectly innocent part allocating and freeing memory will be the part that eventually throws OOME. You cannot diagnose OOME from its stack trace at all, only by reasoning about the program as a whole, preferably with the help of a memory analyzer. – Tech Inquisitor Sep 21 '21 at 14:20
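
A minimal sketch of the MemoryCache idea that Flydog57 suggests in the comments above; CandleData here is a hypothetical stand-in for the OP's type, and the 24-hour sliding expiration is just an illustrative policy:

using System;
using System.Collections.Generic;
using System.Runtime.Caching; // built into .NET Framework; NuGet package on .NET Core/.NET 5+

// Hypothetical candle shape, standing in for the OP's CandleData.
public class CandleData
{
    public DateTime Minute;
    public decimal Open, High, Low, Close;
    public long Volume;
}

public static class CandleCache
{
    // Unlike a plain Dictionary, MemoryCache can evict entries under memory pressure.
    private static readonly MemoryCache _cache = MemoryCache.Default;

    public static void Append(string symbol, List<CandleData> newCandles)
    {
        var list = _cache.Get(symbol) as List<CandleData>;
        if (list == null)
        {
            list = new List<CandleData>();
            var policy = new CacheItemPolicy
            {
                SlidingExpiration = TimeSpan.FromHours(24) // illustrative retention policy
            };
            _cache.Add(symbol, list, policy);
        }
        list.AddRange(newCandles);
    }

    public static List<CandleData> GetCandles(string symbol)
    {
        // May return null if the entry was evicted; the caller must re-query in that case.
        return _cache.Get(symbol) as List<CandleData>;
    }
}

A named MemoryCache can also be constructed with a CacheMemoryLimitMegabytes setting, so that entries are trimmed before the process runs out of memory.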

1 Answer


If you compile for 32 bit, you get 2 GB of virtual address space. While 2^32 would be 4 GB, that limit has nasty historical reasons: the highest bit of the pointer was misused as an additional boolean, leaving us with only 2^31.

If you compile for 32 bit and activate the Large Address Aware (LAA) flag, you get somewhere between 2 GB and 3 GB on 32 bit operating systems and 4 GB on 64 bit operating systems. Just skip that option - it has too many preconditions and needs modifications with editbin and even bcdedit on 32 bit OSes.

If you compile for 64 bit, you get tons of memory (8 TB), more than you can probably buy. If you compile for AnyCPU, which is the default in Visual Studio, you need to uncheck "Prefer 32-bit" in the project settings to unleash the power of 64 bit.

[Screenshot: where to make the settings in the Visual Studio project properties]
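
A quick way to verify which of these cases actually applies at runtime is to print the process bitness; this is a small sketch using standard framework APIs:

using System;

class BitnessCheck
{
    static void Main()
    {
        // A 64 bit process is not limited to the 2 GB virtual address space of a 32 bit one.
        Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
        Console.WriteLine($"64-bit OS:      {Environment.Is64BitOperatingSystem}");
        Console.WriteLine($"Pointer size:   {IntPtr.Size} bytes"); // 8 in a 64 bit process, 4 in a 32 bit one
    }
}

If Is64BitProcess prints False on a 64 bit machine, "Prefer 32-bit" or an x86 platform target is still in effect.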

There is also no 13-million-key limit, IMHO. I can easily get 95 million entries into a simple dictionary here. You can run the following test code to check:

using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        var d = new Dictionary<int, int>();
        int count = 0;
        try
        {
            // Keep adding entries until the runtime refuses to allocate more.
            while (count < int.MaxValue)
            {
                d.Add(count, count);
                count++;
            }
        }
        catch (OutOfMemoryException oom)
        {
            // Release the dictionary and collect, so the console output below
            // does not fail for lack of memory as well.
            d.Clear();
            GC.Collect(2);
            GC.WaitForPendingFinalizers();
            Console.WriteLine(oom.Message);
        }
        Console.WriteLine($"Added {count} items to dictionary");
        Console.ReadLine();
    }
}

Using random strings instead of ints as the value also shows that this will use more than 2 GB of memory - even more than my PC has available as physical RAM. Whatever does not fit into RAM gets swapped to disk.

[Screenshot: RAM usage]

If compiling as 64 bit does not help, you should start to find out what exactly eats your RAM, e.g. with a memory profiler.
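
As a rough first check before reaching for a profiler, you can compare the managed heap size before and after filling the cache. The sketch below mirrors the numbers from the comments (~10,000 symbols with up to 1,440 one-minute candles each); the CandleData fields are assumptions about the OP's type, so the resulting figure is only an estimate:

using System;
using System.Collections.Generic;

class CandleData
{
    public DateTime Minute;
    public decimal Open, High, Low, Close;
    public long Volume;
}

class CacheSizeEstimate
{
    static void Main()
    {
        long before = GC.GetTotalMemory(forceFullCollection: true);

        // ~10,000 symbols x 1,440 one-minute candles, as described in the comments.
        var cache = new Dictionary<string, List<CandleData>>();
        for (int s = 0; s < 10_000; s++)
        {
            var candles = new List<CandleData>(1_440);
            for (int m = 0; m < 1_440; m++)
                candles.Add(new CandleData());
            cache[$"SYM{s}"] = candles;
        }

        long after = GC.GetTotalMemory(forceFullCollection: true);
        Console.WriteLine($"Approx. managed bytes held by the cache: {after - before:N0}");
        GC.KeepAlive(cache); // keep the cache reachable until the measurement is done
    }
}

If that figure comes nowhere near 2 GB, the exception is more likely caused by something else entirely, as Tech Inquisitor points out in the comments, and a memory profiler is the right tool to find it.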

answered by Thomas Weller