JSON is a rigidly structured format, and Python's json
module, by design, won't try to coerce types it doesn't understand.
Check out this SO answer. While __dict__
might work in some cases, it's often not exactly what you want. One option is to write one or more subclasses of json.JSONEncoder
that override default() to turn your type or types into basic types that json.dump
can understand.
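For the first approach, here's a minimal sketch of subclassing json.JSONEncoder. The Point class is hypothetical, standing in for whatever type you need to serialize:

```python
import json

class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

class PointEncoder(json.JSONEncoder):
    def default(self, obj):
        # default() is only called for objects json can't handle natively
        if isinstance(obj, Point):
            return {"x": obj.x, "y": obj.y}
        # Defer to the base class, which raises TypeError for unknown types
        return super().default(obj)

print(json.dumps(Point(1, 2), cls=PointEncoder))
```

Passing the encoder via cls= keeps the call sites clean; plain ints, strings, lists, etc. still serialize as usual, and only your custom types hit default().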
Another option would be to write a parent class, e.g. JSONSerializable
and have these data types inherit it the way you'd use an interface in some other languages. Making it an abstract base class would make sense, but I doubt that's important to your situation. Define a method on your base class, e.g. def dictify(self)
, and either implement it if a default behavior makes sense, or have it raise NotImplementedError
.
Note that I'm not calling the method serialize
, because actual serialization will be handled by json.dump
.
from abc import ABC

class JSONSerializable(ABC):
    def dictify(self):
        raise NotImplementedError("Missing dictify implementation!")

class YourDataType(JSONSerializable):
    def __init__(self):
        self.something = None
        # etc etc

    def dictify(self):
        return {"something": self.something}

class YourIncompleteDataType(JSONSerializable):
    # No dictify(self) implementation
    pass
Example usage:
>>> valid = YourDataType()
>>> valid.something = "really something"
>>> valid.dictify()
{'something': 'really something'}
>>>
>>> invalid = YourIncompleteDataType()
>>> invalid.dictify()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 3, in dictify
NotImplementedError: Missing dictify implementation!
Basically, though: you do need to handle this yourself, possibly on a per-type basis, depending on how different your types are. It's just a matter of choosing whichever way of formatting your types for serialization best fits your use case.