
I have a problem. I have a list of dicts, and each dict contains nested dicts. I found pd.json_normalize(...), but I have many nested keys, e.g. contactEditor.

I found the questions Use json_normalize to normalize json with nested arrays and How to convert a nested dict, to a pandas dataframe, among others, but I don't know how to pass all the nested keys. When I ran the code snippet below, I got KeyError: 'contactSoldToParty'.

So how can I convert this list of dicts into a dataframe with all the nested keys flattened?

import pandas as pd

df = pd.json_normalize(my_data, all_nested_keys)

my_data = [
{'_id': 'orders/213123',
 'contactEditor': {'name': 'Max Power',
  'phone': '1234567',
  'email': 'max@power.com'},
 'contactSoldToParty': {'name': 'Max Not',
  'phone': '123456789',
  'email': 'maxnot@power.com'},
 'isCompleteDelivery': False,
 'metaData': {'dataOriginSystem': 'Goods',
  'dataOriginWasCreatedTime': '10:12:12',},
 'orderDate': '2021-02-22',
 'orderDateBuyer': '2021-02-22',
},
{'_id': 'orders/12323',
 'contactEditor': {'name': 'Max Power2',
  'phone': '1234567',
  'email': 'max@power.com'},
 'contactSoldToParty': {'name': 'Max Not',
  'phone': '123456789',
  'email': 'maxnot@power.com'},
 'isCompleteDelivery': False,
 'metaData': {'dataOriginSystem': 'Goods',
  'dataOriginWasCreatedTime': '10:12:12',},
 'orderDate': '2021-02-22',
 'orderDateBuyer': '2021-02-22',
 },
]
all_nested_keys = []
for key in my_data[0]:
    if isinstance(my_data[0][key], dict):
        all_nested_keys.append(key)

print(all_nested_keys)  # ['contactEditor', 'contactSoldToParty', 'metaData']
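
A note on the error (my assumption about the cause, based on how json_normalize documents its second argument, record_path): the list is interpreted as a single nested path rather than as independent keys, so pandas ends up looking for my_data[0]['contactEditor']['contactSoldToParty'], which does not exist. A minimal sketch reproducing that:

import pandas as pd

# The list is treated as one nested path, so pandas searches for
# 'contactSoldToParty' *inside* 'contactEditor' and raises KeyError.
try:
    pd.json_normalize(my_data, ['contactEditor', 'contactSoldToParty', 'metaData'])
except KeyError as exc:
    print(exc)  # prints: 'contactSoldToParty'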

What I want

id             contactEditor_name contactEditor_phone contactEditor_email ...
orders/213123  Max Power          ...                 ...                 ...
orders/12323   Max Power2         ...                 ...                 ...
    Given your example... does `pd.json_normalize(my_data, sep='_')` not work? (if you don't specify keys - it defaults to all (or up to N levels deep))... – Jon Clements May 05 '22 at 06:41
  • @JonClements thank you for your comment. Well, that worked. I was not clear about that. – Test May 05 '22 at 06:47
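
A minimal sketch of the approach from the comment above, using the example my_data defined in the question; renaming '_id' to 'id' is an assumption made only to match the wanted column layout:

import pandas as pd

# Flatten every nested dict; sep='_' joins parent and child key names,
# producing columns like 'contactEditor_name' and 'metaData_dataOriginSystem'.
df = pd.json_normalize(my_data, sep='_')

# Optional: rename '_id' to 'id' so the columns match the desired output.
df = df.rename(columns={'_id': 'id'})

print(df[['id', 'contactEditor_name', 'contactEditor_phone', 'contactEditor_email']])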
