Let's say we have a dictionary:
import numpy as np
d = {}
d["s0"] = 3
d["s1"] = np.int16(3)
d["s2"] = np.array("hello")
d["s3"] = np.array([2])
d["s4"] = np.linspace(0, 2, 3)
One way to save this dictionary is with json, which means converting the arrays to lists and serializing those. In that case there can be a loss of precision, and the NumPy dtypes (e.g. int16) are lost entirely.
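For illustration, a minimal sketch of that json route, using the dictionary d from above (the tolist() conversion is my assumption about how the serialization would be done):

import json
import numpy as np

# tolist() turns NumPy scalars and arrays into plain Python objects,
# discarding the dtype (int16 becomes a plain int, and so on).
serializable = {k: v.tolist() if isinstance(v, (np.ndarray, np.generic)) else v
                for k, v in d.items()}
with open("test.json", "w") as f:
    json.dump(serializable, f)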
Another way is to convert the dictionary into a pandas DataFrame and save that to HDF5:
import pandas as pd
df = pd.DataFrame(d)
store = pd.HDFStore("store.h5")
store["data"] = df
But this fails, because a DataFrame requires all of its columns to have the same length and my values do not. I get:
ValueError: arrays must all be same length
Yet a third way is to use deepdish:
import deepdish as dd
dd.io.save("test.h5", d)
The problem with this method is that it wants my keys to be strings, and it silently drops data without raising an error:
$ h5ls test.h5
s3 Dataset {1}
s4 Dataset {3}
Note that "s0", "s1" and "s2" were not saved to the file, and no error was reported. So what is the safest way to store a Python dictionary in an HDF5 file?
I don't want to use pickle, because a pickled file will be hard to read back in Fortran. This question is not a duplicate of this question, because I have shown above how those methods fail to store the needed data.
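For reference, a minimal sketch (using plain h5py, with the same dictionary d) of the per-key layout I am after; the string handling is an assumption on my part, since h5py does not accept NumPy's unicode 'U' dtype directly:

import numpy as np
import h5py

with h5py.File("test.h5", "w") as f:
    for key, value in d.items():
        if isinstance(value, np.ndarray) and value.dtype.kind == "U":
            # h5py rejects NumPy unicode arrays, so store a plain
            # Python str (written as a variable-length UTF-8 string).
            f[key] = str(value)
        else:
            # Python/NumPy scalars and numeric arrays map directly
            # to scalar and array datasets.
            f[key] = value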