
I am working on a Flask application that maps an existing data model built in PostgreSQL. I have mapped the tables using automap and can query the objects properly:

Base = automap_base()
Base.prepare(db.engine, reflect=True)
Object1 = Base.classes.table1

I am now querying table1 and need to retrieve a time-series-like list. The object has a numeric value and a datetime stored in a PostgreSQL TIMESTAMP column.

When I run the query and try to export the result as JSON with jsonify

query = session.query(Object1.value, Object1.createtime).order_by(Object1.createtime.desc())
return jsonify(str(query.all()))

I get this result back:

"[(5543157, datetime.datetime(2021, 4, 3, 7, 29, 53)), (5543156, datetime.datetime(2021, 4, 3, 7, 29, 37))]"

How can I get back a Unix timestamp instead of the datetime.datetime format? I would like to avoid looping over every element to convert it, as the query might retrieve more than 1M rows.
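(For reference, the per-row conversion I would like to avoid looks like the sketch below, using datetime.timestamp(); the sample rows are the ones from the output above, and treating the naive datetimes as UTC is an assumption.)

```python
from datetime import datetime, timezone

# Sample rows as returned by query.all() above
rows = [(5543157, datetime(2021, 4, 3, 7, 29, 53)),
        (5543156, datetime(2021, 4, 3, 7, 29, 37))]

# Per-row conversion in Python -- exactly the loop I'd like to avoid.
# Treating the naive datetimes as UTC is an assumption here.
unix_rows = [(value, int(dt.replace(tzinfo=timezone.utc).timestamp()))
             for value, dt in rows]
print(unix_rows)  # [(5543157, 1617434993), (5543156, 1617434977)]
```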

Mik1893
1 Answer


Instead of selecting the column directly, select an expression that produces the value you want. In this case, date_part('epoch', <timestamp>) converts the timestamp to Unix epoch seconds on the PostgreSQL side, so no per-row conversion happens in Python.

from sqlalchemy import func
# ...
query = session.query(
    Object1.value,
    func.date_part('EPOCH', Object1.createtime),
).order_by(Object1.createtime.desc())
# ...

The SQLAlchemy side of this (I haven't used it myself in some time) is courtesy of this answer.
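A quick way to sanity-check the generated SQL without a live database is to compile the statement against the PostgreSQL dialect. A minimal sketch, using a hand-written declarative model as a stand-in for the automapped Object1 (the model below is an assumption, not the asker's actual schema):

```python
from sqlalchemy import Column, Integer, TIMESTAMP, func, select
from sqlalchemy.dialects import postgresql
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Table1(Base):
    # Hypothetical stand-in for the automapped table1
    __tablename__ = 'table1'
    id = Column(Integer, primary_key=True)
    value = Column(Integer)
    createtime = Column(TIMESTAMP)

# Same shape as the session.query() above, in 1.4+ select() style
stmt = select(Table1.value,
              func.date_part('EPOCH', Table1.createtime)
              ).order_by(Table1.createtime.desc())

sql = str(stmt.compile(dialect=postgresql.dialect()))
print(sql)
```

The compiled SQL should contain a date_part(...) call and ORDER BY table1.createtime DESC, confirming the epoch conversion is pushed down to PostgreSQL rather than done row by row in Python.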

AdamKG