You could dynamically load the entire set of models with something like the prototype below. It doesn't handle a lot of edge cases, like passing extra info to a table or building relationships.
I think the real problem comes when you add a column: you have to alter the table and potentially manipulate/migrate the existing data for it to make sense. Something like Alembic can attempt to auto-generate migrations, but most of the time you have to customize them or add parts that could not be detected automatically. So you could reload all the models from the CSV and then run a manually coded migration to update the actual data in the database, but I don't think it would really be worth it compared to just changing the model file by hand, since you still have to write the migration either way.
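To give a sense of what that hand-written part looks like, here's a rough sketch of a single Alembic revision that adds a column and then backfills the existing rows. The table/column names and the backfill value are made up for illustration:

# hypothetical Alembic revision, e.g. versions/1234_add_occupation.py
from alembic import op
import sqlalchemy as sa

revision = "1234"
down_revision = None

def upgrade():
    # The part autogenerate could produce: add the new column.
    op.add_column("customers", sa.Column("occupation", sa.String(), nullable=True))
    # The part it can't guess: make the existing data make sense.
    op.execute("UPDATE customers SET occupation = 'unknown' WHERE occupation IS NULL")

def downgrade():
    op.drop_column("customers", "occupation")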
Even if the models were stored in JSON and your migration altered both the database and the JSON, it would be a huge custom thing you'd always be working around. For example, this doesn't handle association tables that don't have a corresponding class. There would be so many edge cases. Interesting idea, though.
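To make that concrete, an association table in SQLAlchemy is often just a Table with a couple of foreign keys and no mapped class at all, so it wouldn't fit a one-row-per-class CSV scheme. A made-up example (the orders table here is hypothetical):

from sqlalchemy import Column, ForeignKey, MetaData, String, Table

metadata = MetaData()

# Pure association table: two foreign keys, no corresponding Python class.
customer_orders = Table(
    "customer_orders",
    metadata,
    Column("customer_id", String, ForeignKey("customers.customer_id"), primary_key=True),
    Column("order_id", String, ForeignKey("orders.order_id"), primary_key=True),
)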
There are some links at the end.
Anyway, this was my thought:
from sqlalchemy import Column, Integer, String, Table, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()
engine = create_engine("sqlite://", echo=False)

# Constrain allowed column types by filtering them through a whitelist.
allowed_types = {t.__name__: t for t in (String, Integer)}

# This is the structure you'd extract from the CSV.
models = [{
    'classname': 'Customer',
    'tablename': 'customers',
    'cols': [
        {'name': 'customer_id', 'type': 'String', 'extra_kwargs': {'primary_key': True}},
        {'name': 'full_name', 'type': 'String'},
        {'name': 'occupation', 'type': 'String'},
    ],
}]

# Create a class and table for each model, then shove the class
# into globals() so it can be referenced by name.
for model in models:
    cols = [
        Column(col['name'], allowed_types[col['type']], **col.get('extra_kwargs', {}))
        for col in model['cols']
    ]
    globals()[model['classname']] = type(model['classname'], (Base,), {
        '__table__': Table(model['tablename'], Base.metadata, *cols),
    })

Base.metadata.create_all(engine)

with Session(engine) as session:
    customer = Customer(customer_id='some-guy', full_name='Some Guy', occupation='Some Job')
    session.add(customer)
    session.commit()

    customers = session.query(Customer).all()
    assert len(customers) == 1
    assert customers[0].customer_id == 'some-guy'
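If you did want to go further, the hard-coded `models` list above could be parsed out of an actual CSV. This is only a sketch, and the CSV layout is an assumption (one row per column with classname, tablename, name, type and primary_key fields), not anything standard:

import csv
from collections import defaultdict

def load_models(path):
    """Group CSV rows (one per column) into the models structure used above."""
    grouped = defaultdict(lambda: {'cols': []})
    with open(path, newline='') as f:
        for row in csv.DictReader(f):
            model = grouped[row['classname']]
            model['classname'] = row['classname']
            model['tablename'] = row['tablename']
            col = {'name': row['name'], 'type': row['type']}
            if (row.get('primary_key') or '').lower() == 'true':
                col['extra_kwargs'] = {'primary_key': True}
            model['cols'].append(col)
    return list(grouped.values())

# models = load_models('models.csv')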
Links:
- alembic operations
- dynamic class generation
- type built-in