We are upgrading a custom application from PHP to Python / Django / SQLAlchemy. The vendor database it is built on is huge: several thousand tables with hundreds of columns, multiple keys, and many layers of dependencies. The PHP application uses OCI and SQL to interact with the database. Queries can involve as many as 10 tables with complex join conditions, and there are hundreds of queries in the application. Creating models for all these tables doesn't seem practical. Is it possible to use SQLAlchemy without creating models for all these tables and just execute the SQL directly?
Another link: [Working with the DBAPI cursor directly](https://docs.sqlalchemy.org/en/14/core/connections.html#working-with-the-dbapi-cursor-directly) – Christopher Jones Sep 23 '22 at 05:00
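The DBAPI-cursor route linked above can be sketched as follows (a minimal illustration using an in-memory SQLite database; the connection URL and SQL are placeholders, not from the original post):

```python
from sqlalchemy import create_engine

# Placeholder URL; in this scenario it would be an Oracle DSN.
engine = create_engine("sqlite://")

# raw_connection() returns the underlying DBAPI connection,
# so legacy cursor-based code can run largely unchanged.
conn = engine.raw_connection()
try:
    cursor = conn.cursor()
    cursor.execute("SELECT 1")
    result = cursor.fetchall()
    print(result)  # [(1,)]
    cursor.close()
finally:
    conn.close()
```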
1 Answer
Maybe you want to take a look at [runtime reflection](https://docs.sqlalchemy.org/en/14/core/reflection.html). You can use `MetaData.reflect()` to introspect the existing schema and access the tables as a dict:
```python
from sqlalchemy import create_engine, MetaData

# create_engine() requires a connection URL; this one is a placeholder.
engine = create_engine("oracle+cx_oracle://user:password@host/dbname")

metadata_obj = MetaData()
metadata_obj.reflect(bind=engine)  # load table definitions from the database

# Reflected tables are available by name in metadata_obj.tables
users_table = metadata_obj.tables['users']
addresses_table = metadata_obj.tables['addresses']
```
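And if you just want to run the existing SQL as-is, you can skip reflection entirely and execute statements through `sqlalchemy.text()`. A minimal sketch, using an in-memory SQLite database and a hypothetical `users` table for illustration:

```python
from sqlalchemy import create_engine, text

# In-memory SQLite stands in for the real Oracle connection URL.
engine = create_engine("sqlite://")

with engine.begin() as conn:
    # Any SQL string can be sent by wrapping it in text().
    conn.execute(text("CREATE TABLE users (id INTEGER, name TEXT)"))
    conn.execute(
        text("INSERT INTO users (id, name) VALUES (:id, :name)"),
        [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}],
    )

with engine.connect() as conn:
    # Named bound parameters (:name) work across backends, including Oracle.
    rows = conn.execute(
        text("SELECT id, name FROM users WHERE name = :name"),
        {"name": "alice"},
    ).all()
    print(rows)  # [(1, 'alice')]
```

Bound parameters keep the queries safe from SQL injection without requiring any models or reflected metadata.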

bitflip