I have a New Dictionary(Of String, Long()) with 3,125,000 unique string keys. I am distributing close to 1 billion (935,984,413) values (all Longs) amongst the keys, populating a Long() array for each key.
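For reference, simplified, the population pattern looks something like this (a minimal sketch; the actual data loading is omitted, and the append helper is illustrative rather than my exact code):

    Imports System
    Imports System.Collections.Generic

    Module PopulationSketch
        ' One Long() per key; ~3.1 million keys, ~936 million values in total.
        Dim data As New Dictionary(Of String, Long())(3125000)

        ' Appends one value to a key's array. Shown with ReDim Preserve for
        ' simplicity; this copies the array on every append, so a
        ' List(Of Long) per key would be cheaper in practice.
        Sub AddValue(key As String, value As Long)
            Dim arr As Long() = Nothing
            If data.TryGetValue(key, arr) Then
                ReDim Preserve arr(arr.Length)
                arr(arr.Length - 1) = value
                data(key) = arr
            Else
                data(key) = New Long() {value}
            End If
        End Sub

        Sub Main()
            AddValue("key01", 42L)
            AddValue("key01", 99L)
            Console.WriteLine(data("key01").Length) ' prints 2
        End Sub
    End Module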
This works fine and very fast for average-sized datasets: for example, 1,500,000 string keys and 500,000,000 Long values are distributed in about 2 hours.
However, for the above-mentioned dataset, once I get about halfway through the data, the process slows to a crawl and, at the current rate, may never finish.
I think I am running out of memory: the application is using 5 GB, and I believe it is now limited by my system's 8 GB of RAM.
How can I calculate the amount of memory I need for this situation? The string keys average around 5 characters in length.
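Here is my rough back-of-the-envelope attempt so far, in case it helps frame the question (the per-object overhead figures are assumptions about typical 64-bit .NET object layouts, not measured values):

    Imports System

    Module MemoryEstimate
        Sub Main()
            Const KeyCount As Long = 3125000
            Const ValueCount As Long = 935984413

            ' Raw Long payload: 8 bytes per value.
            Dim valueBytes As Long = ValueCount * 8L            ' ~7.0 GiB

            ' Per-key Long() array header: ~24 bytes each on 64-bit (assumed).
            Dim arrayOverhead As Long = KeyCount * 24L          ' ~72 MiB

            ' Keys: ~5 UTF-16 chars (2 bytes each) plus ~26 bytes
            ' of String object overhead (assumed).
            Dim keyBytes As Long = KeyCount * (5L * 2L + 26L)   ' ~107 MiB

            ' Dictionary entry: hash code, next index, key ref, value ref,
            ' roughly 32 bytes per entry plus the bucket array (assumed).
            Dim dictOverhead As Long = KeyCount * 32L           ' ~95 MiB

            Dim totalBytes As Long = valueBytes + arrayOverhead + keyBytes + dictOverhead
            Console.WriteLine("Estimated minimum: " & (totalBytes / 1024.0 ^ 3).ToString("F1") & " GiB")
            ' Prints roughly 7.2 GiB, before any GC or fragmentation headroom.
        End Sub
    End Module

If that estimate is in the right ballpark, the full dataset barely fits in 8 GB of RAM alongside the OS, which would match the slowdown I am seeing. Is this the right way to estimate it?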
Thanks!