
I have a PostgreSQL database table with nearly 700k records. I want a Python script that fetches the records one by one, at 5-second intervals, until the last record.

How do I go about it?

arilwan
  • Sample data and desired results would really help. For instance, do you mean 5 seconds since the last record? Or fixed 5-second intervals? – Gordon Linoff Apr 12 '19 at 11:53
  • Do you edit database between fetches? You can just iterate over `cursor.execute` and `sleep` in a loop – Adrian Krupa Apr 12 '19 at 12:02
  • I mean going through the records with a 5-second interval between fetches. The database is not edited; it is read-only. – arilwan Apr 12 '19 at 12:25

1 Answer


Do you need anything more than that?

from time import sleep

import psycopg2

# Connect and run the query; iterating over the cursor yields one row at a time.
conn = psycopg2.connect("dbname=test user=postgres")
cur = conn.cursor()
cur.execute("SELECT * FROM table")

for record in cur:
    # process record
    sleep(5)  # wait 5 seconds before moving on to the next record

cur.close()
conn.close()
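
A note on the design choice: with nearly 700k rows, a plain client-side cursor pulls the entire result set into memory before iteration starts. If that becomes a problem, psycopg2's named (server-side) cursors stream rows in batches instead. A minimal sketch, assuming the same connection string and table as above:

from time import sleep

import psycopg2

conn = psycopg2.connect("dbname=test user=postgres")

# A named cursor is held on the server, so rows are fetched in batches of
# itersize per round trip instead of being loaded all at once.
cur = conn.cursor(name="record_stream")
cur.itersize = 1000
cur.execute("SELECT * FROM table")

for record in cur:
    # process record
    sleep(5)

cur.close()
conn.close()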
Adrian Krupa
  • The whole goal is to read through the records and have the values sent over HTTP in JSON format. – arilwan Apr 12 '19 at 17:11
  • Can you please suggest some guides here: https://stackoverflow.com/questions/55737528/python-retrieving-postgresql-query-results-as-formatted-json-values – arilwan Apr 18 '19 at 19:20
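
Regarding the follow-up about sending the values over HTTP as JSON, here is a minimal sketch building on the answer above. The endpoint URL and the use of the requests library are assumptions for illustration, not part of the original question:

import json
from time import sleep

import psycopg2
import requests  # assumed HTTP client for the hypothetical endpoint below

conn = psycopg2.connect("dbname=test user=postgres")
cur = conn.cursor()
cur.execute("SELECT * FROM table")

# Column names from the DB-API cursor description, used as JSON keys.
columns = [desc[0] for desc in cur.description]

for record in cur:
    row = dict(zip(columns, record))
    # default=str covers values such as dates or Decimals that json cannot serialize natively
    payload = json.dumps(row, default=str)
    requests.post(
        "http://example.com/api/records",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    sleep(5)

cur.close()
conn.close()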