
I'm building a console application / Windows service that consumes JSON from an API and deserializes it to objects. The service is driven by a timer that runs on an interval, so each time the timer elapses the API is called and the response JSON is deserialized and compared against objects in memory to see whether anything has changed (the source API raises no events).

The problem is that each time the JSON is deserialized, the application's memory usage increases unexpectedly.

Is this a structural problem, a misuse of Json.NET, or a memory leak in the Json.NET implementation?

Tried "using()" on JOjbects and JTokens but they're not disposable. Is there no garbage collection? Also considered removing the "response","enrollment", and "enrollment" levels from the json string manually via search or something but that seems like it should be unnecessary.

Json:

    {
       "response":{
          "code":"OK",
          "enrollments":{
             "enrollment":[
                {
                   "id":"enrollment_id",
                   ...,
                   "course":{...},
                   "user":{...}
                },
                {
                   "id":"enrollment_id",
                   ...,
                   "course":{...},
                   "user":{...}
                },
                ...
             ]
          }
       }
    }

Models:

    [Serializable]
    class Enrollment
    {
        // props ... { get; set; }

        // ... and some serializable objects below
        public Course Course { get; set; }
        public User User { get; set; }
    }

Json Handler:

    public static List<Enrollment> JToEnrollments(string json)
    {
        List<Enrollment> enrollments = new List<Enrollment>();
        JObject jo = JObject.Parse(json);
        JToken eList = jo["response"]["enrollments"]["enrollment"];
        string jString = "";
        foreach (var e in eList)
        {
            jString = JsonConvert.SerializeObject(e);
            Enrollment enrollment = JsonConvert.DeserializeObject<Enrollment>(jString);
            enrollments.Add(enrollment);
        }
        jo = null; // attempt at cleanup
        eList = null; // attempt at cleanup

        return enrollments;
    }

The desired result is a temporary List that gets compared to a List already held in memory; the in-memory List is updated if necessary and the temporary List is then discarded.
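For context, here is a minimal sketch of that compare-and-update step. It assumes `Enrollment` exposes an `Id` property (per the `"id"` field in the JSON); `UpdateFrom` is a hypothetical helper, not part of the actual model:

    using System.Collections.Generic;
    using System.Linq;

    // Illustrative only: Id is assumed to be the matching key, and UpdateFrom
    // is a hypothetical helper that copies any changed fields.
    static void SyncEnrollments(List<Enrollment> inMemory, List<Enrollment> temporary)
    {
        foreach (var fresh in temporary)
        {
            var existing = inMemory.FirstOrDefault(e => e.Id == fresh.Id);
            if (existing == null)
                inMemory.Add(fresh);        // new enrollment since the last poll
            else
                existing.UpdateFrom(fresh); // hypothetical: copy changed fields
        }
        // The temporary list simply goes out of scope afterwards; List<T> is not
        // IDisposable, so the GC reclaims it once nothing references it.
    }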

Note: looking at the Memory Usage tool, one of the larger increases in memory is a List containing things like Action and Newtonsoft.Json.Serialization.DynamicValueProvider.

  • Setting an object to null won't do anything. The references will go out of scope at the end of the method. You mentioned several things your code does, including a timer and an API call. Have you ruled those out? I'm also a little bit confused by the part where the enrollment is serialized and then deserialized on the next line. (But there's plenty of ways of using this stuff that I'm not familiar with.) – Scott Hannen Aug 07 '19 at 00:36
  • 1) How big is `jString`? If it's longer than 42,500 characters then it gets allocated on the [large object heap](https://stackoverflow.com/a/8953503/3744182), which isn't compacted by default. 2) What version of Json.NET are you using? There was a memory leak serializing `enum` values in 12.0.1 that was fixed in 12.0.2, see https://github.com/JamesNK/Newtonsoft.Json/issues/1991 – dbc Aug 07 '19 at 00:42
  • Json.net [caches](https://stackoverflow.com/q/33557737/3744182) type information, so memory use will permanently increase the *first* time you serialize a type. It shouldn't increase every time though. You're not using dynamic proxies or run-time code generation, are you? – dbc Aug 07 '19 at 00:45
  • Why are you serializing to a string then immediately deserializing? – mason Aug 07 '19 at 00:45
  • I agree that the serialize then deserialize seems like it's got to be the wrong approach. I'm putting together a test project to see if that can be improved. – Dogmabase Aug 07 '19 at 00:47
  • @mason - because they want to deserialize `e` to `Enrollment` but don't know the API [`e.ToObject()`](https://www.newtonsoft.com/json/help/html/M_Newtonsoft_Json_Linq_JToken_ToObject__1.htm), which is the preferred method. – dbc Aug 07 '19 at 00:47
  • What do you mean by 'increases unexpectedly'? What is the idle memory consumption, and what is the maximum you were able to hit just looping that code? Please provide repro code and the version of Json.NET you are using. – fenixil Aug 07 '19 at 00:47
  • Agree a [mcve] would help clarify things a lot. – dbc Aug 07 '19 at 00:48
  • Not using dynamic proxies or run-time code generation. Json.NET version is 12.0.2 – Dogmabase Aug 07 '19 at 00:50
  • Then try eliminating `jString` entirely and use [`e.ToObject()`](https://www.newtonsoft.com/json/help/html/M_Newtonsoft_Json_Linq_JToken_ToObject__1.htm), which is the preferred method. Or do `return eList.ToObject<List<Enrollment>>();`, which would be even simpler (see the sketch after this thread). – dbc Aug 07 '19 at 00:52
  • @dbc, The `eList.ToObject<List<Enrollment>>()` is much more elegant and seems to have reduced the memory usage quite a bit, thank you. Also, the JSON string is some 100k+ characters; next I'm going to look into (de)serializing from a stream as recommended in the Json.NET documentation. Edit: To clarify, do all strings over 42.5k characters go to the large object heap, or just `jString`s? – Dogmabase Aug 07 '19 at 01:20
  • *To clarify, do all strings over 42.5k characters go to large object heap or just jStrings?* -- see [Why Large Object Heap and why do we care?](https://stackoverflow.com/a/8953503/3744182): any object larger than 85,000 bytes goes on the large object heap, with a special exception for arrays of doubles. *next I'm going to look into (de)serializing from stream as recommended in the Json.NET documentation.* - yes, that makes sense. Beyond that, there are some tricks you can do to eliminate the intermediate `jo` representation. – dbc Aug 07 '19 at 02:28
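For reference, a minimal sketch of the handler rewritten along the lines dbc suggests above, using `ToObject<T>()` so the per-element `jString` round trip disappears (this is the change the asker reports reduced memory use):

    using System.Collections.Generic;
    using Newtonsoft.Json.Linq;

    public static List<Enrollment> JToEnrollments(string json)
    {
        JObject jo = JObject.Parse(json);
        // Map the JArray straight onto the model list; no intermediate strings.
        return jo["response"]["enrollments"]["enrollment"].ToObject<List<Enrollment>>();
    }

And a hedged sketch of the stream-based approach mentioned at the end of the thread, which avoids materializing the 100k+ character JSON string (and the intermediate `JObject`) altogether. The wrapper classes and method name here are hypothetical stand-ins for the `response`/`enrollments`/`enrollment` nesting:

    using System.Collections.Generic;
    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Newtonsoft.Json;

    // Hypothetical wrapper types mirroring the JSON nesting; Json.NET's default
    // case-insensitive matching maps "response" -> Response, and so on.
    class ApiResponse { public ResponseBody Response { get; set; } }
    class ResponseBody { public string Code { get; set; } public EnrollmentList Enrollments { get; set; } }
    class EnrollmentList { public List<Enrollment> Enrollment { get; set; } }

    static async Task<List<Enrollment>> FetchEnrollmentsAsync(HttpClient client, string url)
    {
        // Deserialize directly from the response stream so the full JSON string
        // never lands on the large object heap.
        using (var stream = await client.GetStreamAsync(url))
        using (var sr = new StreamReader(stream))
        using (var reader = new JsonTextReader(sr))
        {
            var serializer = new JsonSerializer();
            var result = serializer.Deserialize<ApiResponse>(reader);
            return result?.Response?.Enrollments?.Enrollment ?? new List<Enrollment>();
        }
    }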

0 Answers