
I have a 73 GB table, table1. Most of my queries fetch the latest 100 entries in the table that satisfy some condition. I am using Django as the backend for this web app. My query looks something like this:

table_object = table1.objects.filter(first_filter_field=6)
order_by = '-created_on'
table_object = table_object.filter(not_null_field__isnull=False).order_by(order_by)
table_object = table_object[offset:(offset + limit)]

I think this query takes too much time.

  1. How can I measure the time taken by this query? (in Postgres)
  2. How can I improve its performance? I just need the latest 100 entries that satisfy some condition (essentially an extra WHERE clause in the plain SQL query).

I am new to Django, so please be descriptive and give links to related material wherever possible.


1 Answer


1) You can use PostgreSQL's EXPLAIN ANALYZE, which reports the query plan along with the actual execution time. Refer to this post for more info. A sketch of running it from Django follows this list.
2) Set db_index=True on the fields you filter and order by; Django then creates a database index on those columns, which speeds up lookups. See the model sketch after the first example below.
Note - The primary key column is indexed by default. However, you can add db_index=True to more than one field.
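For 1), here is a minimal sketch of measuring the query from Django itself. The import path is assumed, and QuerySet.explain(analyze=True) requires Django 2.1+ on PostgreSQL:

from myapp.models import table1  # assumed import path

qs = (table1.objects
      .filter(first_filter_field=6, not_null_field__isnull=False)
      .order_by('-created_on')[:100])

# Option A: print the generated SQL, then run it in psql prefixed with
# EXPLAIN ANALYZE to see the plan and actual timings.
print(qs.query)

# Option B (Django 2.1+): have Django run EXPLAIN ANALYZE for you.
print(qs.explain(analyze=True))

You can also time queries directly in psql with \timing, or inspect connection.queries in a Django shell when DEBUG=True.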
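For 2), a hedged sketch of what the model might look like with indexes added; only the field names come from the question, the field types are guesses, and Meta.indexes requires Django 1.11+:

from django.db import models

class table1(models.Model):
    first_filter_field = models.IntegerField(db_index=True)
    not_null_field = models.CharField(max_length=100, null=True)
    created_on = models.DateTimeField()

    class Meta:
        indexes = [
            # Composite index matching the query's filter + ORDER BY, so
            # Postgres can read the newest matching rows straight off the
            # index instead of sorting a huge result set.
            models.Index(fields=['first_filter_field', '-created_on']),
        ]

After changing the model, generate and apply the migration with python manage.py makemigrations and python manage.py migrate; on a 73 GB table the index build itself can take a while.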
