
I am fetching 300k+ records from the database into an object, and I am trying to serialize this heavy object as shown below:

List<User> allUsersList = businessProvider.GetAllUsers();
string json = JsonConvert.SerializeObject(allUsersList);

I am getting the following exception while serializing the list allUsersList:

Out of memory exception

I am using the Newtonsoft.Json assembly to serialize it.

kaarthick raman
  • I will deserialize it to display it in the UI. Should I try some other data structure, like List or Dictionary? – kaarthick raman Aug 17 '16 at 10:03
  • Why @GiladGreen? It's irrelevant. The issue is that the object, when serialised as a string, is too big. He needs to use a stream, up his thresholds, etc. – Liam Aug 17 '16 at 10:06
  • If the User class has a property of type User, then there will be a cycle, which makes Newtonsoft throw an exception. Also, if the list is too big (more than 10,000 users, say) it could throw such exceptions. Could you please show the code of the User class? – Tarek Abo ELkheir Aug 17 '16 at 10:08
  • If performance is not a concern, you can split the list, serialize one by one, and then combine it using persistent storage such as hard-drive. – hendryanw Aug 17 '16 at 10:08
  • Thanks for all your responses. @Hendry yes, I did try that, but it takes too much time to process, and performance is a concern – kaarthick raman Aug 17 '16 at 10:09
  • 300k is not only too much for JSON - it's also way too much for any user. – H H Aug 17 '16 at 10:10
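Following up on the comments suggesting a stream: instead of building the entire JSON document as one string with SerializeObject, Json.NET can serialize the list directly to a Stream, so the full payload never has to exist as a single string in memory. A minimal sketch (the User shape below is illustrative, since the real class is not shown in the question):

```csharp
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

// Illustrative stand-in for the real User class from the question.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class UserExport
{
    // Writes the list item by item into the given stream (a FileStream,
    // HttpResponse.OutputStream, etc.) without ever allocating the
    // whole JSON document as one string.
    public static void WriteUsersToStream(IEnumerable<User> users, Stream output)
    {
        var serializer = new JsonSerializer();
        using (var writer = new StreamWriter(output))
        using (var jsonWriter = new JsonTextWriter(writer))
        {
            serializer.Serialize(jsonWriter, users);
        }
    }
}
```

Note that this only removes the giant intermediate string; the database still has to materialize all 300k rows.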

1 Answer


Based on your comment

i will deserialize to display it in the UI

If you like your users, you won't make them scroll through 300k records in the UI. Instead, add searching, ordering, and paging functionality so that each request returns only a handful of relevant results. This will improve usability and, as a side effect, fix your server-side memory problem.
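A server-side paging sketch with LINQ's Skip/Take; the method and parameter names here are hypothetical, and the minimal User class stands in for the real one:

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative stand-in for the real User class.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class UserPaging
{
    // Returns one page of users; only pageSize rows are materialized.
    // A stable ordering (here by Id) is required for Skip/Take to be
    // deterministic across requests.
    public static IList<User> GetUsersPage(IQueryable<User> users, int pageNumber, int pageSize)
    {
        return users
            .OrderBy(u => u.Id)
            .Skip((pageNumber - 1) * pageSize)
            .Take(pageSize)
            .ToList();
    }
}
```

Against an Entity Framework IQueryable, Skip/Take translate to SQL, so only one page's worth of rows ever leaves the database.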

Even if you find a way to serialise such a long string in memory, think about the load on the database and on the web server. Imagine your site suddenly becomes popular and lots of users start hitting it with requests. If a single request uses that much memory, your site will collapse pretty quickly.

oleksii
  • I am using Sitefinity CMS with a custom membership provider that uses this data. As you said, the paging, sorting, and searching will be handled by Sitefinity; all I have to do is pass the records to the sitefinityuserlist from the API. – kaarthick raman Aug 17 '16 at 10:15
  • This doesn't answer the question – Liam Aug 17 '16 at 10:15
  • OK, so just limit the number of records you bring back. @Liam, yes it does: *This will improve usability and as a side effect fix your server side problem* – oleksii Aug 17 '16 at 10:16
  • So what do we do in case I have a necessity to pass all 300K+ from this action in a single call? – kaarthick raman Aug 17 '16 at 10:28
  • I would speak to whoever created this requirement and try convincing her/him that there is a better approach. If, and only if, it is something like an API call from a machine (so not a user), I'd find a way to serialise it and compress it so the response stays short. As others have suggested, there are streams. I'd also reconsider JSON as the format; there might be a better binary serialisation format for this, such as FlatBuffers. In my opinion, you are trying to solve the wrong problem. – oleksii Aug 17 '16 at 10:47
  • thanks for your suggestion. – kaarthick raman Aug 17 '16 at 11:09
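For the machine-to-machine case oleksii describes, streaming serialization can be combined with GZip compression so that the response stays small end to end. A sketch, assuming Json.NET and System.IO.Compression:

```csharp
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using Newtonsoft.Json;

public static class CompressedExport
{
    // Streams the items as gzipped JSON into the output stream;
    // nothing is ever held as one big string in memory.
    public static void WriteGzippedJson<T>(IEnumerable<T> items, Stream output)
    {
        using (var gzip = new GZipStream(output, CompressionLevel.Optimal))
        using (var writer = new StreamWriter(gzip))
        using (var jsonWriter = new JsonTextWriter(writer))
        {
            new JsonSerializer().Serialize(jsonWriter, items);
        }
    }
}
```

Repetitive JSON (the same property names on every record) compresses very well, so this can shrink the payload substantially; the client just wraps its read side in a GZipStream to decompress.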