
How to deserialize a JSON response of around 2 GB to an object?

I am getting an `OutOfMemoryException` after adding 100000 records to a `List`.

Please let me know if there is an alternative way to return around 900000 records in a list object or string.

var response = httpClient.PostAsync(url, content);
List<object> lstObj = new List<object>();
var serializer = new JsonSerializer();

using (var responseStream = response.Result.Content.ReadAsStreamAsync().Result)
{
    using (var textReader = new StreamReader(responseStream))
    {
        using (var jsonReader = new JsonTextReader(textReader))
        {
            while (jsonReader.Read())
            {
                object o = serializer.Deserialize<object>(jsonReader);
                lstObj.Add(o);
            }
        }
    }
}
Sathish Guru V
phanish
  • What are you going to do with those objects? You should not load them all in memory. Why not process them when read, one by one? – Oguz Ozgul Apr 09 '20 at 14:33
  • I use the list of objects to pass it on to the next step i.e. formatting the content – phanish Apr 09 '20 at 14:51
  • 1) What line of code throws the `OutOfMemoryException`? 2) What does your JSON look like? Is it an array of objects? If you look at [this answer](https://stackoverflow.com/a/43747641) to [How to parse huge JSON file as stream in Json.NET?](https://stackoverflow.com/q/43747477) you only want to deserialize when `reader.TokenType == JsonToken.StartObject`. Also, by deserializing to `object` you are actually deserializing to `JToken` which requires a lot of memory. Can you deserialize to a typed data model -- i.e. a POCO? That will save lots of memory on property names. – dbc Apr 09 '20 at 15:03
  • @dbc The `lstObj.Add(o);` line throws the out of memory exception, since the list is not able to add more than 100000 records. I am checking for `reader.TokenType == JsonToken.StartObject` in the code but still the same problem – phanish Apr 09 '20 at 15:16
  • If the `List lstObj` cannot add more than 100000 item references, it probably means you are running in 32 bit mode and memory is fragmented enough that the list capacity cannot be expanded to 200000 (contiguous) items even though you have not actually run out of memory. In such situations switching to a 64 bit build as suggested below makes sense. – dbc Apr 09 '20 at 15:21
  • @dbc switched to AnyCpu still the issue remains. – phanish Apr 09 '20 at 15:32
  • Executables built with `AnyCpu` don't always run as 64bit, see [this answer](https://stackoverflow.com/a/12066861). Try setting x64 explicitly. If you really are running in 64 bit mode more details about your JSON and configuration would help us to help you. – dbc Apr 09 '20 at 15:39
  • You could also try using a [`LinkedList`](https://learn.microsoft.com/en-us/dotnet/api/system.collections.generic.linkedlist-1?view=netframework-4.8) instead of a `List` since it does not need to allocate a large contiguous block of memory and copy everything when it needs to grow. – Brian Rogers Apr 10 '20 at 01:32
  • @BrianRogers I tried using a `LinkedList` instead of a `List`, but I am still getting the `OutOfMemoryException` – phanish Apr 10 '20 at 03:47
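Following the suggestion in the comments, a streaming approach deserializes one item at a time at each `JsonToken.StartObject` and processes it immediately instead of accumulating everything in a list, so each item can be garbage-collected after use. This is a minimal sketch using Json.NET; the `Record` POCO and `ProcessRecord` method are hypothetical placeholders for the actual data model and the formatting step:

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class Record          // hypothetical POCO matching the JSON objects
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class StreamingExample
{
    public static async Task ProcessResponseAsync(
        HttpClient httpClient, string url, HttpContent content)
    {
        var serializer = new JsonSerializer();

        using (var response = await httpClient.PostAsync(url, content))
        using (var responseStream = await response.Content.ReadAsStreamAsync())
        using (var textReader = new StreamReader(responseStream))
        using (var jsonReader = new JsonTextReader(textReader))
        {
            while (await jsonReader.ReadAsync())
            {
                // Only deserialize when positioned at the start of an object
                // inside the top-level array; each Record becomes eligible
                // for garbage collection once it has been processed.
                if (jsonReader.TokenType == JsonToken.StartObject)
                {
                    var record = serializer.Deserialize<Record>(jsonReader);
                    ProcessRecord(record);
                }
            }
        }
    }

    static void ProcessRecord(Record record)
    {
        // Hypothetical per-item step: format / write out the record here
        // instead of keeping it in a list.
    }
}
```

Deserializing to a typed `Record` rather than `object` also avoids materializing `JToken` trees, which store every property name per item and cost far more memory.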

1 Answer


If you really need to load 2 GB of data into memory, go to the project properties and select the target platform "AnyCpu", so the application can run as a 64-bit process and use more memory.
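As noted in the comments, `AnyCpu` alone does not guarantee a 64-bit process, and even a 64-bit process caps single objects at 2 GB by default. A sketch of the relevant settings, assuming a .NET Framework project file and App.config:

```xml
<!-- In the .csproj: force a true 64-bit build; with AnyCpu,
     Prefer32Bit can still make the app run as a 32-bit process. -->
<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```

```xml
<!-- In App.config: allow single objects (e.g. large arrays backing a List)
     bigger than 2 GB on 64-bit platforms (.NET Framework 4.5+). -->
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```

Even with these settings, streaming the response item by item, as suggested in the comments, remains the more robust fix.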