
I'm working on a program that manages contact information using the Json.NET library.

My data is a JSON file on my local disk that stores information in the following form:

"id": "1",
  "forename": "Mikael",
  "surname": "Smith",
  "birth_date": "27/10/1995",
  "availability": true,
  "experience": "beginner",
  "advisers": [
    {
      "name": "Boris",
      "relation": "Mentor",
      "number": "123456"
    }

To save memory, I initially load only the forename fields into a ListBox, and when you click one of the names it loads that object like this:

// Read the whole file into memory and parse it into a JToken hierarchy
JObject full_json = (JObject)JsonConvert.DeserializeObject(File.ReadAllText(file_path, Encoding.UTF8));

// Find the record whose "id" matches the selected contact
var queried_json = (from a in full_json["ContactBook"]
                    where a["id"].ToString().Equals(id.ToString())
                    select a).ToList();

// Round-trip through a string to get a standalone JObject for that record
JObject person = JObject.Parse(queried_json[0].ToString());


Until today I have held only about 10-20 records in the file, so there was no noticeable impact on performance, but once I reach 100-200 records it could become more problematic.

The only way I have come up with to work with the file through Json.NET is to load it all into memory, query it, and then discard it when the query is done, which sounds a bit inefficient.

Should I stick with loading everything into memory, or should I query the file directly?

P.S.: The encoding has to be UTF-8 since I'm working with special characters.

OmerR
  • 200 records won't be problematic for memory either. Is there a reason you don't consider a database? SQLite can be embedded with the application as a file; there is no server component. – Crowcoder Mar 24 '18 at 11:13
  • Agree that 200 (or even 2000) records aren't a problem. But two tips: 1) avoid `File.ReadAllText()` and stream directly, as shown in [Can Json.NET serialize / deserialize to / from a stream?](https://stackoverflow.com/a/17788118) and https://www.newtonsoft.com/json/help/html/Performance.htm#MemoryUsage; 2) deserialize to an explicit model rather than a `JToken` hierarchy. This eliminates the need to retain all the property name strings in memory and may be more performant, as explained in item #6 of https://stackify.com/top-11-json-performance-usage-tips/ (a sketch of this approach follows the comments below). – dbc Mar 24 '18 at 16:37
  • Also, there's no need to ever convert to a string and re-parse. So `JObject.Parse(queried_json[0].ToString());` should be `(JObject)queried_json[0];` – dbc Mar 24 '18 at 16:44
  • @Crowcoder good question; sadly, the program is for someone from work who doesn't really understand (or isn't willing to understand) computers and is looking for something as simple as possible. – OmerR Apr 02 '18 at 19:35
  • @dbc thanks for the tips, you really shed some light on this for me. I dropped `File.ReadAllText()`, used a stream and the deserializer with a model, and it works way better. Thanks! – OmerR Apr 02 '18 at 19:37
  • It would literally be as simple as a file if you still want to consider it. It is zero configuration and zero maintenance. The user wouldn't even know where the data is (see the SQLite sketch below). – Crowcoder Apr 02 '18 at 20:23
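
For reference, here is a minimal sketch of the streaming, explicit-model approach dbc describes above. The class names (`Person`, `Adviser`, `ContactFile`, `ContactLoader`) and the assumption that the root object holds a `ContactBook` array are inferred from the JSON and code in the question; adjust them to the real schema.

using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using Newtonsoft.Json;

// Model classes mirroring the JSON shown in the question; the lowercase
// property names match the JSON keys, so no [JsonProperty] attributes are needed.
public class Adviser
{
    public string name { get; set; }
    public string relation { get; set; }
    public string number { get; set; }
}

public class Person
{
    public string id { get; set; }
    public string forename { get; set; }
    public string surname { get; set; }
    public string birth_date { get; set; }
    public bool availability { get; set; }
    public string experience { get; set; }
    public List<Adviser> advisers { get; set; }
}

public class ContactFile
{
    public List<Person> ContactBook { get; set; }
}

public static class ContactLoader
{
    // Stream the file through the serializer instead of File.ReadAllText(),
    // so the raw JSON text is never held in memory as one big string.
    public static ContactFile Load(string filePath)
    {
        using (var stream = File.OpenRead(filePath))
        using (var reader = new StreamReader(stream, Encoding.UTF8))
        using (var jsonReader = new JsonTextReader(reader))
        {
            return new JsonSerializer().Deserialize<ContactFile>(jsonReader);
        }
    }

    // Plain LINQ lookup on the typed model; no JObject.Parse round-trip needed.
    public static Person FindById(ContactFile contacts, string id)
    {
        return contacts.ContactBook.FirstOrDefault(p => p.id == id);
    }
}

With this in place, the ListBox can be filled from `contacts.ContactBook.Select(p => p.forename)`, and a click only needs a call to `FindById`.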

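For completeness, here is a rough sketch of the embedded SQLite option Crowcoder suggests. It assumes the Microsoft.Data.Sqlite package and a hypothetical `contacts` table with `id` and `forename` columns; none of these names come from the question itself.

using Microsoft.Data.Sqlite;

public static class ContactDb
{
    // Looks up a single contact's forename by id. The database is just a
    // single file next to the application; there is no server to install.
    public static string FindForenameById(string dbPath, string id)
    {
        using (var connection = new SqliteConnection($"Data Source={dbPath}"))
        {
            connection.Open();

            var command = connection.CreateCommand();
            command.CommandText = "SELECT forename FROM contacts WHERE id = $id";
            command.Parameters.AddWithValue("$id", id);

            // ExecuteScalar returns the first column of the first row, or null.
            return command.ExecuteScalar() as string;
        }
    }
}
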
0 Answers