I'm working on a program that deals with contact information using the Json.NET library.
My data is a JSON file on my local disk that stores information like the following:
"id": "1",
"forename": "Mikael",
"surname": "Smith",
"birth_date": "27/10/1995",
"availability": true,
"experience": "beginner",
"advisers": [
{
"name": "Boris",
"relation": "Mentor",
"number": "123456"
}
To save memory, what I do is initially load only the forename fields into a listbox, and when you click one of the names it loads the full object like this:
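The listbox population itself is just a pass over the forename fields, roughly like this (forename_listbox is a hypothetical WinForms ListBox name, used here only for illustration):

// Fill the listbox with forenames only; forename_listbox is a
// hypothetical control name, not from the snippet below.
JObject full_json = JObject.Parse(File.ReadAllText(file_path, Encoding.UTF8));
foreach (var contact in full_json["ContactBook"])
{
    forename_listbox.Items.Add((string)contact["forename"]);
}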
// Parse the whole file into memory as a JObject
JObject full_json = JObject.Parse(File.ReadAllText(file_path, Encoding.UTF8));
// Find the record whose id matches the clicked name
var queried_json = (from a in full_json["ContactBook"]
                    where a["id"].ToString().Equals(id.ToString())
                    select a).ToList();
// Cast the token directly instead of re-serializing and re-parsing it
JObject person = (JObject)queried_json[0];
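For reference, the same lookup collapses into a single FirstOrDefault call (with using System.Linq;), which skips both the intermediate list and the re-parse:

// Equivalent single-record lookup; returns null if no id matches
JObject person = (JObject)full_json["ContactBook"]
    .FirstOrDefault(a => (string)a["id"] == id.ToString());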
Until today I have only held about 10-20 records in the file, so there was no noticeable impact on performance, but once I reach 100-200 records it will become more of a problem.
The only way I have come up with for working with a file and Json.NET is loading it all into memory, querying it, and then discarding it when the query is done, which sounds inefficient.
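By "querying the file" I mean something like Json.NET's streaming JsonTextReader, reading one contact at a time instead of parsing the whole document. A rough sketch, assuming the ContactBook layout shown above:

using System.IO;
using System.Text;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Stream the file and materialize only the matching contact,
// never holding the whole document in memory at once.
static JObject FindContact(string file_path, string id)
{
    using (var reader = new JsonTextReader(
        new StreamReader(File.OpenRead(file_path), Encoding.UTF8)))
    {
        while (reader.Read())
        {
            // Each object directly inside the ContactBook array is one contact;
            // JObject.Load consumes that object (advisers included) and moves on.
            if (reader.TokenType == JsonToken.StartObject
                && reader.Path.StartsWith("ContactBook["))
            {
                var contact = JObject.Load(reader);
                if ((string)contact["id"] == id)
                    return contact;
            }
        }
    }
    return null; // no matching id found
}

That would keep memory usage flat per lookup, at the cost of re-scanning the file on every click.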
Should I stick with loading everything into memory, or is there a better way to query the file directly?
P.S.: the encoding has to be UTF-8 since I'm working with special characters.