What you appear to be looking for is asynchronous queries with pipelining. Queries are queued up on the connection, executed in the order they're sent, and their results are returned in the order of execution.
This is supported by the PostgreSQL protocol. Some client drivers, including libpq and PgJDBC, support asynchronous queries. However, the only driver I'm aware of that supports pipelining is PgJDBC, and it doesn't support pipelining for individual queries, only queries in a batch run via executeBatch.
To use libpq's async mechanism to do what you want, you'll need a pool of connections (since it can't pipeline on one connection), and you'll need to periodically check whether the query has completed on each connection. You can't fire and forget, and there's no callback mechanism for completions. Something like an epoll loop that checks for new pending input would do, and that sounds like a good fit for node.js's event-driven model.
There's been some discussion on adding a pipelined asynchronous mode to libpq, where queries return a future object, and you can check whether the result object is populated and/or wait on it, while also dispatching more queries. Nothing has been implemented yet AFAIK. You might want to get involved. It's not technically all that hard, and is mostly a matter of not having people who want it enough to help implement it. Speak up on pgsql-hackers.
"I was even wondering if by just writing PERFORM myProcedure() it would just work since perform expects no return value."
No. It still waits. It has to know if there's been an error or not, and subsequent code is also allowed to expect that any side-effects of the procedure are complete.
As for node-postgres: it looks like asynchronous callbacks are the default way to do things on node. It won't be pipelined, though.
So, I'd either use a pool of connections with libpq using async queries and an epoll loop, or I'd use node-postgres.