
I'm running a pretty big Postgres query. Is it possible to echo the results gradually, instead of waiting for the end of the query to echo all the results?

My code:

$sql1 = pg_query("select zhu_gid from indicateurs.note where zhu_gid not in (select gid from public.zh);");

while ($data = pg_fetch_array($sql1)) {
    echo $data['zhu_gid'] . "/";
}
Julien
  • I am not sure you can, but if you are using AJAX requests to get the results you can add LIMIT and OFFSET to the query to fetch batches of results, and execute it as many times as needed to get them all (see the sketch after these comments). Otherwise, in that specific case, PHP will wait for the end of the query before carrying on with the rest of the code. – ness Apr 18 '19 at 14:15
  • You can get PHP to flush the output one echo at a time if need be, but it still won't start echoing until the results have all been returned from the database. – ADyson Apr 18 '19 at 14:58
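A minimal sketch of the batching idea from the first comment, assuming an open connection $conn (pg_connect) and the table/column names from the question; the batch size, the ORDER BY column, and the flush() call are illustrative additions, not part of the original code:

<?php
// Sketch: fetch the result in batches with LIMIT / OFFSET and flush each batch.
// Assumes an open connection $conn; batch size is arbitrary.
$batchSize = 1000;
$offset    = 0;

do {
    $res = pg_query_params(
        $conn,
        'SELECT zhu_gid
         FROM   indicateurs.note
         WHERE  zhu_gid NOT IN (SELECT gid FROM public.zh)
         ORDER  BY zhu_gid          -- deterministic order is required for paging
         LIMIT  $1 OFFSET $2',
        array($batchSize, $offset)
    );

    $rows = pg_num_rows($res);
    while ($data = pg_fetch_array($res)) {
        echo $data['zhu_gid'] . "/";
    }
    flush();                        // push this batch to the client right away
    $offset += $batchSize;
} while ($rows === $batchSize);     // stop once a batch comes back short

Depending on your setup, output buffering may also require ob_flush() before flush() for the batches to actually reach the browser.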

1 Answer


As to your question: No. Postgres itself first builds the result set, then returns it to the client.

You could use a cursor for that (which still has to execute the query successfully first). Or multiple queries, partitioning the result with LIMIT / OFFSET or (preferably) deterministic WHERE conditions; you need to add a deterministic ORDER BY for those. But all of these pagination solutions have their specific difficulties.
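For illustration, a minimal sketch of the cursor approach in PHP, assuming an open connection $conn; the cursor name and fetch size are arbitrary:

<?php
// Sketch: read the result gradually through a server-side cursor.
pg_query($conn, 'BEGIN');  // a cursor (without WITH HOLD) only lives inside a transaction
pg_query($conn, "DECLARE zhu_cur CURSOR FOR
                 SELECT zhu_gid FROM indicateurs.note
                 WHERE  zhu_gid NOT IN (SELECT gid FROM public.zh)");

do {
    $res = pg_query($conn, 'FETCH 1000 FROM zhu_cur');  // next batch of rows
    while ($data = pg_fetch_array($res)) {
        echo $data['zhu_gid'] . "/";
    }
    flush();  // hand what we have to the client
} while (pg_num_rows($res) > 0);

pg_query($conn, 'CLOSE zhu_cur');
pg_query($conn, 'COMMIT');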

Aside: I'd not use NOT IN here. It behaves in surprising ways as soon as the subquery can return NULL values.

Consider instead:

SELECT n.zhu_gid
FROM   indicateurs.note n
LEFT   JOIN public.zh ON zh.gid = n.zhu_gid
WHERE  zh.gid IS NULL;
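A quick sketch of how that query drops into the snippet from the question (only the SQL string changes; this keeps the connection-less pg_query() call from the original):

$sql1 = pg_query("SELECT n.zhu_gid
                  FROM   indicateurs.note n
                  LEFT   JOIN public.zh ON zh.gid = n.zhu_gid
                  WHERE  zh.gid IS NULL;");

while ($data = pg_fetch_array($sql1)) {
    echo $data['zhu_gid'] . "/";
}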
Erwin Brandstetter