I want to merge two nested dicts with each other; sometimes they have the same number of nested dicts, sometimes not.
By merging I mean I want to keep everything in my already existing .json file and only update the values that have changed in my example dict dict_01.
If a key or a whole nested dictionary does not yet exist (say 'name_03'), I want it to be added as a new dict inside the existing dict.
import json
from pprint import pprint
json_filepath = 'database.json'
dict_01 = {"database": {"name_01": {"name": "name_01",
                                    "count": 10,
                                    "size": "3"},
                        "name_03": {"name": "name_01",
                                    "count": 10,
                                    "size": "3"}}}
with open(json_filepath, 'r', encoding="utf-8") as f:
    data = json.load(f)

pprint(data)
data.update(dict_01)  # that doesn't do what I want it to do

with open(json_filepath, 'w', encoding="utf-8") as f:
    json.dump(data, f)

This seems to only replace my current "database" dict, so all the existing information is deleted. Not what I'm after.
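As far as I understand, dict.update only works at the top level: the key "database" exists in both dicts, so the whole nested value is swapped out rather than merged. A minimal, standalone demonstration (with made-up values) of what I think is happening:

old = {"database": {"name_01": {"count": 1, "size": "0.18"},
                    "name_02": {"count": 4}}}
new = {"database": {"name_01": {"count": 10}}}

old.update(new)  # the top-level key "database" is replaced wholesale
print(old)       # {'database': {'name_01': {'count': 10}}} -> "name_02" is gone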
Sample content of the database.json file:
{"database": {"name_01": {"count": 1,
"file_count": 1,
"folder_count": 0,
"hdd_master_name": "name_01_suffix",
"last_scanned": "14/04/20 15:55",
"name": "name_01",
"server_path": "root.txt",
"size": "0.18"},
"name_02": {"all_types_count": 4,
"file_count": 8,
"folder_count": 0,
"hdd_master_name": "name_02_suffix",
"last_scanned": "14/04/20 15:55",
"name": "name_02",
"server_path": "...",
"size": "50.34"}}}
The result I'm looking for:
{"database": {"name_01": {"count": 10,
"file_count": 1,
"folder_count": 0,
"hdd_master_name": "name_01_suffix",
"last_scanned": "14/04/20 15:55",
"name": "name_01",
"server_path": "root.txt",
"size": "3"},
"name_02": {"all_types_count": 4,
"file_count": 8,
"folder_count": 0,
"hdd_master_name": "name_02_suffix",
"last_scanned": "14/04/20 15:55",
"name": "name_02",
"server_path": "...",
"size": "50.34"},
"name_03": {"name": "name_01",
"count": 10,
"size": "3"}}}