I need to export data from one DB instance/deployment to another (e.g. testing to production; same vendor and schema). Can I achieve this using SQLAlchemy? It's for a simple import/export feature for the end user.
The end user selects all the values (let's say Employees) and clicks Export. All the Employee records, along with their Department details (since Department is a foreign key), should be loaded into Python and then written to a suitable XML format. Once the user has "XMLized" the database rows, he can go to another machine, say production, and upload the XML, and I write it to the DB, overwriting Employees with the same name or inserting new Employees where they don't exist. First, though, I have to load the Departments, since they are the foreign keys, using the same UPSERT conditions I specify.
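For context, this is roughly what my current hand-rolled version of the XML round-trip and upsert looks like, using plain SQL and the standard library (simplified down to `name` and `dept` columns; all names here are just my example schema):

```python
import sqlite3
import xml.etree.ElementTree as ET

def rows_to_xml(rows):
    # Serialize (name, dept) tuples into a simple XML document
    root = ET.Element('employees')
    for name, dept in rows:
        emp = ET.SubElement(root, 'employee')
        emp.set('name', name)
        emp.set('dept', str(dept))
    return ET.tostring(root, encoding='unicode')

def upsert_from_xml(conn, xml_text):
    # Overwrite an employee with the same name, or insert a new one
    for emp in ET.fromstring(xml_text):
        name, dept = emp.get('name'), int(emp.get('dept'))
        cur = conn.execute(
            "UPDATE employees SET dept = ? WHERE name = ?", (dept, name))
        if cur.rowcount == 0:  # no row with that name yet -> insert
            conn.execute(
                "INSERT INTO employees (name, dept) VALUES (?, ?)",
                (name, dept))
```

This works, but it knows nothing about foreign keys, which is what I want SQLAlchemy to handle for me.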
In the example below I would like to supply the Employee table and have the related Department table discovered, loaded, and saved to XML as well. This is the tricky thing I am trying to achieve.
SQLAlchemy can make my code database-unaware. I have already scripted this using plain SQL and Python, but I need a higher-level paradigm where foreign keys are loaded automatically. Further, I might need to load some tables I specify that have a one-to-many relation with my original table, i.e. say an Employee's documents are stored in a table called Documents, with one Employee having more than one Doc.
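Is this the right direction? As far as I can tell, SQLAlchemy's table metadata exposes the foreign keys, so discovering the tables my table depends on could look something like this (`referenced_tables` is just my own helper name, not a library function):

```python
from sqlalchemy import MetaData, Table, Column, Integer, String, ForeignKey

meta = MetaData()
departments = Table('departments', meta,
    Column('department_id', Integer, primary_key=True),
    Column('department_name', String(60)))
employees = Table('employees', meta,
    Column('employee_id', Integer, primary_key=True),
    Column('employee_name', String(60)),
    Column('employee_dept', Integer, ForeignKey('departments.department_id')))

def referenced_tables(table):
    # Each ForeignKey's .column is the referenced Column;
    # .table on that gives the parent Table it lives in.
    return {fk.column.table for fk in table.foreign_keys}

print([t.name for t in referenced_tables(employees)])  # -> ['departments']
```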
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String, ForeignKey

engine = create_engine('sqlite:///:memory:')
meta = MetaData()

# departments must be defined so the foreign key below resolves
departments = Table('departments', meta,
    Column('department_id', Integer, primary_key=True),
    Column('department_name', String(60))
)

employees = Table('employees', meta,
    Column('employee_id', Integer, primary_key=True),
    Column('employee_name', String(60), nullable=False, key='name'),
    Column('employee_dept', Integer, ForeignKey("departments.department_id"))
)

meta.create_all(engine)  # was: sqlemployees.create(engine), a typo
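On the import side, I suppose something like `MetaData.sorted_tables`, which as far as I can tell returns tables in foreign-key dependency order, would let me write the Departments before the Employees (a sketch with the same example schema):

```python
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String, ForeignKey

engine = create_engine('sqlite:///:memory:')
meta = MetaData()
departments = Table('departments', meta,
    Column('department_id', Integer, primary_key=True),
    Column('department_name', String(60)))
employees = Table('employees', meta,
    Column('employee_id', Integer, primary_key=True),
    Column('employee_name', String(60)),
    Column('employee_dept', Integer, ForeignKey('departments.department_id')))
meta.create_all(engine)

# sorted_tables yields parents before children, so iterating it
# gives a safe insert order for the upsert step
print([t.name for t in meta.sorted_tables])  # -> ['departments', 'employees']
```

Is this the idiomatic way to do it, or is there a higher-level facility in SQLAlchemy for this kind of dependency-aware export/import?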