I'd like to use pydantic for handling data (bidirectionally) between an API and a datastore, due to its nice support for several types I care about that are not natively JSON-serializable. It has better read/validation support than my current approach, but I also need to create JSON-serializable dict objects to write out.
from uuid import UUID, uuid4
from pydantic import BaseModel

class Model(BaseModel):
    the_id: UUID

instance = Model(the_id=uuid4())
print("1: %s" % instance.dict())
print("2: %s" % instance.json())
which prints:
1: {'the_id': UUID('4108356a-556e-484b-9447-07b56a664763')}
2: {"the_id": "4108356a-556e-484b-9447-07b56a664763"}
I'd like the following:
{"the_id": "4108356a-556e-484b-9447-07b56a664763"}  # i.e. a "JSON-compatible" dict
It appears that while pydantic has all the mappings, I can't find any way to use that serialization outside the standard JSON ~recursive encoder (json.dumps( ... default=pydantic_encoder)) in pydantic/main.py. But I'd prefer to stick to one library both for validating raw->obj (pydantic is great at this) and for obj->raw(dict), so that I don't have to manage multiple serialization mappings. I suppose I could implement something similar to the json usage of the encoder, but this seems like it should be a common use case?
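For reference, the closest workaround I've found is to round-trip through the JSON string, which does produce a JSON-compatible dict but wastes a serialize/parse cycle (a minimal sketch, assuming pydantic v1's .json() method):

```python
import json
from uuid import UUID, uuid4

from pydantic import BaseModel


class Model(BaseModel):
    the_id: UUID


instance = Model(the_id=uuid4())

# Serialize to a JSON string, then parse it back into a plain dict.
# Every value is now a JSON-native type (the UUID becomes a str).
json_ready = json.loads(instance.json())
print(json_ready)
```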
Other approaches, such as dataclasses (builtin) plus libraries such as dataclasses_jsonschema, provide this ~serialization to a JSON-ready dict, but again, I'm hoping to use pydantic for its more robust input validation while keeping things symmetrical.
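To illustrate what "reusing the encoder" could look like: a sketch that passes pydantic's own encoder as json.dumps's default, then parses back to a dict. This assumes pydantic v1, where pydantic_encoder is importable from pydantic.json; it uses the same mappings .json() uses internally:

```python
import json
from uuid import UUID, uuid4

from pydantic import BaseModel
from pydantic.json import pydantic_encoder


class Model(BaseModel):
    the_id: UUID


instance = Model(the_id=uuid4())

# .dict() keeps rich types (UUID); pydantic_encoder converts them to
# JSON-native types during dumps, and loads() gives back a plain dict.
json_ready = json.loads(json.dumps(instance.dict(), default=pydantic_encoder))
print(json_ready)
```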