
After a lot of searching, I have unfortunately not found a solution to this problem under .NET 6.0.

My application runs on an x64 Linux server, the dictionary contains just 3,192,915 elements, and I get an OutOfMemoryException.

I have tried gcAllowVeryLargeObjects, but it seems to be ignored.

x.runtimeconfig.json:

```json
{
  "runtimeOptions": {
    "tfm": "net6.0",
    "framework": {
      "name": "Microsoft.NETCore.App",
      "version": "6.0.0"
    },
    "configProperties": {
      "System.Reflection.Metadata.MetadataUpdater.IsSupported": false,
      "System.Reflection.NullabilityInfoContext.IsSupported": true
    },
    "gcAllowVeryLargeObjects": {
      "enabled": true
    }
  }
}
```
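For reference: as far as I can tell, the runtime only reads custom settings from the configProperties section, so a top-level "gcAllowVeryLargeObjects" block like the one above is presumably being silently ignored. If I've read the GC configuration docs correctly, the placement would be something like the following (the System.GC.* key name is my assumption based on how the other GC knobs are spelled; the environment variable DOTNET_gcAllowVeryLargeObjects=1 / COMPlus_gcAllowVeryLargeObjects=1 is the other documented route):

```json
{
  "runtimeOptions": {
    "configProperties": {
      "System.GC.AllowVeryLargeObjects": true
    }
  }
}
```

That said, this option is enabled by default on 64-bit .NET anyway, and ~3.2 million dictionary entries are nowhere near the 2 GB single-array limit unless the values are very large structs, so the exception may simply mean the process is genuinely running out of memory (e.g. a container or cgroup limit) rather than hitting the large-object cap.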

Denys
  • Out of curiosity, what is your use case for having so many items in a dictionary? – ProgrammingLlama Oct 28 '22 at 08:27
  • Having 3 million entries in a dictionary isn't automatically a problem - it depends on how much memory those entries take up... – Jon Skeet Oct 28 '22 at 08:34
  • I initialize some transactions of a blockchain, and before I put everything in the database I process 3 months of transactions. – Denys Oct 28 '22 at 08:40
  • Do you actually need 3 million items in a dictionary all at once? Can you do it in a streaming way? I ask because if you're having problems with 3 million now, and that amount grows considerably in the future, you might hit the same wall again even after solving it today. If you can redesign now, it might be better. – ProgrammingLlama Oct 28 '22 at 08:43
  • @ProgrammingLlama What do you mean by a streaming way? I wanted O(1) lookup of transactions by their ID: process the 3 months first, then use the database for finding and changing keys. – Denys Oct 28 '22 at 08:48
  • I also delete keys once a transaction is marked as "deleted", because it later gets a new ID and I no longer need the old entry, so I already keep the dictionary trimmed. – Denys Oct 28 '22 at 08:51
  • Streaming, as in: item comes in, process, item goes out. I just meant that if you fix this so you can store 5 million records in a dictionary, what happens when you need to store 6 million, etc.? – ProgrammingLlama Oct 28 '22 at 08:57
  • My image of what you're doing (maybe it's wrong) is that you're loading all of your existing transactions from the database into a dictionary, and then pulling the new ones, going through, matching them up, and then performing any necessary updates to the database. In such a scenario (assuming SQL Server), I would be inclined to batch transactions (say 100 at a time) into a MERGE INTO query, and do the updates in SQL. I don't know what your specific use case is and if that kind of thing is possible here though. – ProgrammingLlama Oct 28 '22 at 09:02
  • Ah, I see. I currently go block by block and process the transactions (who sent what, where, for how long, etc.), and when an ID is marked as "removed" I drop it from the dictionary again. So currently >3m entries exist, but as long as no action is performed on them they are relatively uninteresting, so blocks that haven't been touched for a while could be evicted from the dictionary and swapped out to the database (roughly the idea sketched below). I think there is no way around the database either way; it is local on the server, so that shouldn't really be a problem, since the ID is indexed. – Denys Oct 28 '22 at 09:09
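A minimal sketch of the eviction/batching idea from these comments, under stated assumptions: Transaction, OnBlock, FlushToDatabase, and both constants are hypothetical placeholder names, not from the question, and the flush stands in for a batched upsert (e.g. the parameterized MERGE INTO ProgrammingLlama suggested):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    // Hypothetical transaction shape; LastTouchedBlock tracks recency.
    record Transaction(string Id, long LastTouchedBlock);

    const int EvictAfterBlocks = 1000; // assumption: staleness threshold
    const int BatchSize = 100;         // "say 100 at a time" from the comments

    static readonly Dictionary<string, Transaction> hot = new();
    static readonly List<Transaction> pendingFlush = new();

    static void OnBlock(long blockNo, IEnumerable<Transaction> added, IEnumerable<string> removedIds)
    {
        foreach (var tx in added)
            hot[tx.Id] = tx;   // O(1) lookup by ID while the entry is "hot"

        foreach (var id in removedIds)
            hot.Remove(id);    // it gets a new ID later, so drop the old entry

        // Entries untouched for a while are uninteresting in memory:
        // queue them for the database and drop them from the dictionary.
        foreach (var stale in hot.Values
                     .Where(t => blockNo - t.LastTouchedBlock > EvictAfterBlocks)
                     .ToList())
        {
            hot.Remove(stale.Id);
            pendingFlush.Add(stale);
            if (pendingFlush.Count >= BatchSize)
                FlushToDatabase();
        }
    }

    static void FlushToDatabase()
    {
        // Placeholder: in real code, a batched upsert (e.g. MERGE INTO) goes here.
        Console.WriteLine($"Flushed {pendingFlush.Count} transactions");
        pendingFlush.Clear();
    }

    static void Main()
    {
        // Tiny usage example with fabricated data.
        OnBlock(1, new[] { new Transaction("tx-1", 1), new Transaction("tx-2", 1) }, Array.Empty<string>());
        OnBlock(2000, Array.Empty<Transaction>(), new[] { "tx-1" }); // tx-2 is now stale and gets evicted
        FlushToDatabase(); // flush the final partial batch
    }
}
```

Lookups that miss `hot` would then fall back to the indexed ID column in the database, keeping the common case O(1) while bounding the dictionary's memory.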

0 Answers