The json module only knows how to handle the basic Python types that it maps to the basic JSON types: dict, list, str, int, float, bool, and NoneType.
But you can override the JSON encoder and decoder objects to add code that handles additional types however you want. This is documented, with some examples, but it isn't entirely trivial.
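For instance, here's a minimal sketch of the encoder side, assuming NumPy arrays are the only extra type you care about (the NumpyEncoder name is just illustrative):

import json
import numpy as np

class NumpyEncoder(json.JSONEncoder):
    def default(self, obj):
        # default() is only called for objects json doesn't already know how to serialize.
        if isinstance(obj, np.ndarray):
            return obj.tolist()
        return super().default(obj)

# json.dumps([np.arange(3)], cls=NumpyEncoder) -> '[[0, 1, 2]]'

The decoder side is trickier, because a plain JSON array doesn't say whether it started life as a list or a NumPy array, so you usually need some convention (like a wrapper dict) to round-trip things faithfully.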
But, more simply, if you know the structure of the data you're trying to save and load, you can just transform it on the fly. For example:
import json
import numpy as np

def save_stuff(f, list_of_arrays):
    # Convert each array to a plain (possibly nested) list so json can serialize it.
    list_of_lists = [arr.tolist() for arr in list_of_arrays]
    json.dump(list_of_lists, f)

def load_stuff(f):
    list_of_lists = json.load(f)
    # Rebuild the arrays from the loaded lists.
    list_of_arrays = [np.array(lst) for lst in list_of_lists]
    return list_of_arrays
Or you can go the opposite direction: convert the list of arrays to an array of arrays and use NumPy to save that.
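Here's a minimal sketch of that route, assuming all the arrays share the same shape so they can be stacked (the function names are just illustrative):

import numpy as np

def save_stuff_np(path, list_of_arrays):
    # Stack into one 2-D array and write it in NumPy's binary .npy format
    # (np.save appends '.npy' to the path if it isn't already there).
    np.save(path, np.stack(list_of_arrays))

def load_stuff_np(path):
    # np.load gives back the stacked array; split it back into a list of rows.
    return list(np.load(path))

If the arrays have different shapes, np.savez, which stores each array under its own key in a single file, is the usual alternative.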
Either way, this is less efficient, because you have to create those temporary lists. If you have gigantic arrays, or if you're doing this zillions of times, you will probably want to do the more complicated work of overriding the encoder and decoder.