I am using sqlalchemy (declarative) to map large amounts of json data to a db table. I have a model like this:

    from sqlalchemy import Column, String
    from sqlalchemy.ext.declarative import declarative_base

    Base = declarative_base()

    class Incident(Base):
        __tablename__ = 'incident'
        sys_id = Column(String(100), primary_key=True)
        state = Column(String)
        assigned_to = Column(String)

The data looks like this:

{
    "sys_id": "123456",
    "state": "Closed",
    "assigned_to": {
                "display_value": "Fred Bloggs",
                "link": "http://some.url"
            }
}

or sometimes like this:

{
    "sys_id": "123456",
    "state": "Unassigned",
    "assigned_to": ""
}

I get an error because the assigned_to field is sometimes a Python dictionary:

    sqlalchemy.exc.ProgrammingError: (pyodbc.ProgrammingError) ('Invalid parameter type. param-index=1 param-type=dict', 'HY105')

I have many fields in the same form as "assigned_to". All I want to do is convert them to strings. I am sure this is straightforward, but this is my first exposure to SQLAlchemy.
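One way to do this is to flatten each record before passing it to the model. A minimal sketch, assuming you want to keep the dict's "display_value" when present and that the helper names (flatten_value, flatten_record) are hypothetical:

```python
import json

def flatten_value(value):
    """Convert a dict value to a string so it can bind to a String column."""
    if isinstance(value, dict):
        # Prefer the human-readable value; fall back to the raw JSON text.
        return value.get("display_value") or json.dumps(value)
    return value

def flatten_record(record):
    """Return a copy of the record with every dict value flattened."""
    return {key: flatten_value(value) for key, value in record.items()}

data = {
    "sys_id": "123456",
    "state": "Closed",
    "assigned_to": {"display_value": "Fred Bloggs", "link": "http://some.url"},
}
flat = flatten_record(data)
print(flat["assigned_to"])  # → Fred Bloggs
```

After flattening, the record can be passed straight to the model, e.g. `session.add(Incident(**flatten_record(data)))`.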

Tino
  • Why not use a JSON column type, rather than String (assuming your RDBMS has such a type)? Then sqlalchemy will handle serialisation and deserialisation automatically? Or if you intend to pass only strings, preprocess the data to dump dicts to json? – snakecharmerb Aug 31 '20 at 08:11
  • looks like I can't use a JSON type: sqlalchemy.exc.CompileError: (in table 'incident', column 'u_contractual_region'): Compiler can't render element of type – Tino Aug 31 '20 at 08:15
  • 2
    Some of the answers to [this question](https://stackoverflow.com/q/4038314/5320906) may be useful. – snakecharmerb Aug 31 '20 at 08:21
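Following snakecharmerb's TypeDecorator suggestion, serialisation can also be pushed into the column type itself. A sketch, assuming dicts should be stored as JSON text (the class name StringifiedDict is hypothetical):

```python
import json

from sqlalchemy.types import String, TypeDecorator

class StringifiedDict(TypeDecorator):
    """A String column that JSON-serialises dict parameters on bind."""
    impl = String
    cache_ok = True  # required on SQLAlchemy 1.4+; harmless on older versions

    def process_bind_param(self, value, dialect):
        # Dicts are dumped to JSON text; strings and None pass through.
        if isinstance(value, dict):
            return json.dumps(value)
        return value
```

The model then declares `assigned_to = Column(StringifiedDict(255))` and both the dict and the empty-string shapes of the incoming data bind without error.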

0 Answers