
Suppose I have a table of measurements, each with a type column. I would like to group the measurements by their type and count how many fall under each type.

Using the ORM, I can think of doing types = [m.type for m in session.query(Measurement.type).distinct()] and then looping over types, doing a filter_by(type=t) and counting the results for each one, but I feel that this will thrash the db with one query per type.
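To make the concern concrete, here is a runnable sketch of that loop against a hypothetical minimal Measurement model (the model, table name, and SQLite engine are stand-ins for my real schema and Postgres). It issues one query for the distinct types plus one per type, i.e. the N+1 pattern I'm worried about:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Measurement(Base):  # hypothetical stand-in for my real model
    __tablename__ = "measurement"
    id = Column(Integer, primary_key=True)
    type = Column(String)

engine = create_engine("sqlite://")  # in-memory stand-in for Postgres
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Measurement(type=t) for t in ["a", "a", "b"]])
session.commit()

# One query for the distinct types...
types = [m.type for m in session.query(Measurement.type).distinct()]
# ...then one COUNT query per type: N+1 round trips total.
counts = [session.query(Measurement).filter_by(type=t).count() for t in types]
```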

I'm using Postgres 10. Is it acceptable to thrash the db like this and rely on it to cache the intermediate results? If not, should I use something like Redis to do the caching myself? The best option, IMO, would be to optimize the queries issued through SQLAlchemy, but I do not know how to do that.

My desired output (format not necessarily matching) is something like:

types = ['type 1', 'type 2', 'type 3', ...]
counts = [1, 2, 3, ...]
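Reading around, I suspect a GROUP BY through the ORM might produce this in a single round trip; here is my untested sketch against the same hypothetical minimal model as above (I'm not sure this is the idiomatic way):

```python
from sqlalchemy import Column, Integer, String, create_engine, func
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Measurement(Base):  # hypothetical stand-in for my real model
    __tablename__ = "measurement"
    id = Column(Integer, primary_key=True)
    type = Column(String)

engine = create_engine("sqlite://")  # in-memory stand-in for Postgres
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
session.add_all([Measurement(type=t) for t in ["type 1", "type 1", "type 2"]])
session.commit()

# One round trip: SELECT type, COUNT(id) ... GROUP BY type
rows = (
    session.query(Measurement.type, func.count(Measurement.id))
    .group_by(Measurement.type)
    .order_by(Measurement.type)
    .all()
)
types = [t for t, _ in rows]
counts = [c for _, c in rows]
```

Is something along these lines the right way to avoid the per-type loop entirely?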
Brandon Dube

0 Answers