I need to work with a large amount of data in memory. I am loading it from an SQLite database on an SSD and using EF6 to construct business objects from it. As soon as the Process Memory window shows usage hitting around 3.2 GB, Entity Framework throws an OutOfMemoryException.
For now I am just loading into lists. I had read somewhere that there are limits on the size of list structures, so instead of using one big list I have created multiple simple DataBlock container objects, each holding a chunk of the required data. It doesn't seem to make any difference. The PC has plenty of RAM (16 GB). I am using a new context to populate each DataBlock and disposing of it afterwards.
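For reference, the DataBlock container is roughly the following (simplified; the element types are my guesses at the entity class names EF generated for the HistoricalRecords, HistoricalRecordRealTimes and InstrumentStats sets):

Public Class DataBlock
    ' Time span this block covers
    Public Property StartDate As Date
    Public Property EndDate As Date
    ' Chunks of loaded data (entity type names assumed from the context sets)
    Public Property AllCandles As List(Of HistoricalRecord)
    Public Property AllRTs As List(Of HistoricalRecordRealTime)
    Public Property AllStats As List(Of InstrumentStat)
End Class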
For Each DataBlock In DataBlocks
    Using Context As New mainEntities
        ' Convert the block's date range to UTC timestamps for filtering
        Dim FirstRecordTimeUTC As Long = TimeFunctions.ConvertDateToUTC(DataBlock.StartDate)
        Dim LastRecordTimeUTC As Long = TimeFunctions.ConvertDateToUTC(DataBlock.EndDate)

        ' Candles within the block's time range
        Dim CandlesInRange = (From Cand In Context.HistoricalRecords
                              Where Cand.time_close_utc >= FirstRecordTimeUTC
                              Where Cand.time_close_utc <= LastRecordTimeUTC
                              Order By Cand.id
                              Select Cand).ToList()
        DataBlock.AllCandles = CandlesInRange

        ' Real-time records within the same range
        Dim RTsInRange = (From Cand In Context.HistoricalRecordRealTimes
                          Where Cand.time_close_utc >= FirstRecordTimeUTC
                          Where Cand.time_close_utc <= LastRecordTimeUTC
                          Order By Cand.id
                          Select Cand).ToList()
        DataBlock.AllRTs = RTsInRange

        ' Instrument statistics within the same range
        Dim StatsInRange = (From Cand In Context.InstrumentStats
                            Where Cand.time_close_utc >= FirstRecordTimeUTC
                            Where Cand.time_close_utc <= LastRecordTimeUTC
                            Order By Cand.id
                            Select Cand).ToList()
        DataBlock.AllStats = StatsInRange
    End Using
Next
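For what it is worth, I understand EF6 keeps a change-tracking entry for every entity it materialises, which adds overhead on top of the objects themselves. A no-tracking version of the first query would look like the sketch below (AsNoTracking comes from System.Data.Entity); I have not yet confirmed whether tracking is a factor here.

' Requires: Imports System.Data.Entity
Dim CandlesInRange = (From Cand In Context.HistoricalRecords.AsNoTracking()
                      Where Cand.time_close_utc >= FirstRecordTimeUTC
                      Where Cand.time_close_utc <= LastRecordTimeUTC
                      Order By Cand.id
                      Select Cand).ToList()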
The compiler platform is set to 'Any CPU'. The system is as follows:
Win 10 64-bit, VS 2017, 16 GB RAM, Ryzen 5 3600
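Since the ~3.2 GB ceiling made me wonder whether the process is silently running 32-bit despite the 'Any CPU' setting, a quick diagnostic like this at startup should confirm the actual bitness (just a sketch):

' Prints False for the first line if the process is running 32-bit,
' e.g. when 'Prefer 32-bit' is ticked in the project's Build settings.
Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}")
Console.WriteLine($"64-bit OS: {Environment.Is64BitOperatingSystem}")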
Any thoughts on what I am doing wrong would be much appreciated.