The following simple example describes my problem with my Postgres DB (although my question is more about SQLAlchemy than Postgres):
I have a table called detection with the following columns:
- id
- item
- price_in_cents
- shop_id
I have another table called item with the following columns (simplified ORM models for both tables are sketched after this list):
- id
- detection_id (foreign key to detection.id)
- price_in_dollar
I want to move the entire dataset for a certain shop from the detection table to the item table, while also performing an operation to convert cents to dollars (the example is theoretical; my real problem involves a different operation than cents to dollars).
In raw SQL I can use the following query:
INSERT INTO item (detection_id, price_in_dollar)
SELECT id AS detection_id,
       price_in_cents / 100 AS price_in_dollar
FROM detection
WHERE shop_id = {shop_id}
Is it possible to replicate this query using SQLAlchemy? Due to the volume of the data (potentially millions of rows) I do not want to first download the data, perform the operation locally, and then upload the result. An example that would work, but not in the way I want, would be:
q = session.query(Detection).filter(Detection.shop_id == shop_id)
for detection_record in q:
    session.add(Item(detection_id=detection_record.id,
                     price_in_dollar=detection_record.price_in_cents / 100))
session.commit()
This would, however, download all the data to my machine first instead of doing all the work in the DB itself, and thus behaves differently from my example query.