I have a query where I'm trying to get the total number of records matching the given filters and, in the same query, fetch the limited number of rows needed for pagination. Earlier I was doing this as a two-step process: first I calculated the number of records (based on Optimising Select number of rows in PostgreSql for multiple group by fields), and then I fetched the required rows in a separate query using the page size and offset.
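For reference, a minimal sketch of the earlier two-step approach (the exact queries differed; the filter list is abbreviated the same way as below):

-- Step 1: count the number of result groups
SELECT count(*) AS numrows
FROM (
    SELECT lower(column1), column2, column3, column4, column5, column6, column7, column8
    FROM tableName
    WHERE column8 in (SOME 50-100 sets)
      AND column_date >= '2013-09-01'
      AND column_date < '2013-09-30'
    GROUP BY lower(column1), column2, column3, column4, column5, column6, column7, column8
) AS grouped;

-- Step 2: fetch one page of results
SELECT lower(column1) AS column1, column2, column3, column4, column5, column6, column7, column8, sum(column9) AS column9, sum(column10) AS column10
FROM tableName
WHERE column8 in (SOME 50-100 sets)
  AND column_date >= '2013-09-01'
  AND column_date < '2013-09-30'
GROUP BY lower(column1), column2, column3, column4, column5, column6, column7, column8
ORDER BY column9 DESC
LIMIT 1000 OFFSET 0;

Here is the combined single query and its EXPLAIN output: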
EXPLAIN select count (1) OVER () AS numrows , lower(column1) as column1, column2, column3, column4, column5, column6, column7, column8, sum(column9) as column9, sum(column10) as column10
from tableName tablename
where
column8 in (SOME 50-100 sets)
and column_date >= '2013-09-01'
and column_date < '2013-09-30'
group by lower(column1), column2, column3, column4, column5, column6, column7, column8
ORDER BY column9 desc
LIMIT 1000 OFFSET 0
XN Limit (cost=1000134721702.61..1000134721702.63 rows=1000 width=67)
  ->  XN Merge (cost=1000134721702.61..1000135118514.84 rows=158724893 width=67)
        Merge Key: sum(column9)
        ->  XN Network (cost=1000134721702.61..1000135118514.84 rows=158724893 width=67)
              Send to leader
              ->  XN Sort (cost=1000134721702.61..1000135118514.84 rows=158724893 width=67)
                    Sort Key: sum(column9)
                    ->  XN Window (cost=107149638.61..113101822.10 rows=158724893 width=67)
                          ->  XN Network (cost=107149638.61..108340075.31 rows=158724893 width=67)
                                Send to slice 0
                                ->  XN HashAggregate (cost=107149638.61..108340075.31 rows=158724893 width=67)
                                      ->  XN Seq Scan on tableName tablename (cost=0.00..67468415.44 rows=1587248927 width=67)
                                            Filter: ((column_date < '2013-09-30'::date) AND (column_date >= '2013-09-01'::date) AND (column8 = ANY ('{SOME 50-100 sets}'::integer[])))
Version detail:
PostgreSQL 8.0.2 on i686-pc-linux-gnu, compiled by GCC gcc (GCC) 3.4.2 20041017 (Red Hat 3.4.2-6.fc3), Redshift 1.0.666
Thanks