`automap_base` from `sqlalchemy.ext.automap` (`tableNamesDict` is a dict with only the Pandas tables):
```python
metadata = MetaData()
metadata.reflect(db.engine, only=list(tableNamesDict.values()))
Base = automap_base(metadata=metadata)
Base.prepare()
```
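To make the primary-key problem described below concrete, here is a minimal, self-contained sketch of the reflect-then-automap pattern against an in-memory SQLite database. The table names and columns (`user`, `log`, `user_id`) are hypothetical, not from the original app:

```python
from sqlalchemy import create_engine, text, MetaData
from sqlalchemy.ext.automap import automap_base

engine = create_engine("sqlite://")
with engine.begin() as conn:
    # 'user' has a primary key; 'log' deliberately does not
    conn.execute(text("CREATE TABLE user (user_id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text("CREATE TABLE log (message TEXT)"))

metadata = MetaData()
metadata.reflect(engine, only=["user", "log"])
Base = automap_base(metadata=metadata)
Base.prepare()

# automap silently skips tables without a primary key,
# so only 'user' ends up in Base.classes
print(sorted(Base.classes.keys()))
```

Running this shows that `log` never gets a mapped class, which is exactly the wall you hit with tables created by Pandas.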
This would have worked perfectly, except for one problem: automap requires the tables to have a primary key. OK, no problem, I'm sure Pandas' `to_sql` has a way to indicate the primary key... nope. This is where it gets a little hacky:
```python
for name, df in dfs.items():
    # use every column whose name contains 'id' as the primary key
    cols = [str(col) for col in df.columns if 'id' in col.lower()]
    # build a CREATE TABLE statement with those columns as the key
    schema = pd.io.sql.get_schema(df, name, con=db.engine, keys=cols)
    # drop the keyless table, recreate it with the key, then append the data
    db.engine.execute('DROP TABLE ' + name + ';')
    db.engine.execute(schema)
    df.to_sql(name, con=db.engine, index=False, if_exists='append')
```
I iterate through the dict of DataFrames, get a list of the columns to use for the primary key (i.e. those containing `id`), use `get_schema` to create the empty tables, then append each DataFrame to its table.
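The key trick above is that `get_schema` accepts a `keys` argument, which `to_sql` lacks. A small sketch, with a made-up `user` DataFrame, shows the schema it produces (no database connection is needed just to inspect the generated DDL):

```python
import pandas as pd

df = pd.DataFrame({"user_id": [1, 2], "name": ["ada", "bob"]})

# columns containing 'id' become the primary key
keys = [c for c in df.columns if "id" in c.lower()]

# with con=None, get_schema emits generic SQL, which is fine for inspection
schema = pd.io.sql.get_schema(df, "user", keys=keys)
print(schema)
```

The printed statement includes a `PRIMARY KEY` constraint on `user_id`, which is what makes the recreated table acceptable to automap.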
Now that you have the models, you can name and use them explicitly (e.g. `User = Base.classes.user`) with `session.query`, or create a dict of all the classes with something like this:
```python
alchemyClassDict = {}
for t in Base.classes.keys():
    alchemyClassDict[t] = Base.classes[t]
```
And query with:

```python
res = db.session.query(alchemyClassDict['user']).first()
```
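Putting it all together, here is an end-to-end runnable sketch of the dict-of-classes pattern and the query, using a plain SQLAlchemy `Session` against in-memory SQLite instead of the Flask-SQLAlchemy `db` object (the `user` table and its rows are hypothetical):

```python
from sqlalchemy import create_engine, text, MetaData
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE user (user_id INTEGER PRIMARY KEY, name TEXT)"))
    conn.execute(text("INSERT INTO user (user_id, name) VALUES (1, 'ada')"))

metadata = MetaData()
metadata.reflect(engine)
Base = automap_base(metadata=metadata)
Base.prepare()

# dict of mapped classes, keyed by table name
alchemyClassDict = {t: Base.classes[t] for t in Base.classes.keys()}

session = Session(engine)
res = session.query(alchemyClassDict["user"]).first()
print(res.user_id, res.name)
```

Because automap names attributes after the reflected columns, `res.user_id` and `res.name` work exactly like hand-written model attributes.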