When is it important to use Client instead of Pool in node-postgres?
In the case of serverless functions, you should probably always use a Pool. However, sometimes you need to borrow ("check out") a dedicated Client from the pool, for example:
const myId = "20";
const client = await pool.connect(); // borrow ("check out") a client from the pool
try {
  const result = await client.query('SELECT * FROM users WHERE id = $1', [myId]);
} catch (e) {
  console.error(e); // handle (or rethrow) the error
} finally {
  client.release(); // always return the client to the pool
}
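A transaction is the classic case for this: BEGIN, the statements, and COMMIT must all run on the same connection, so they need a single checked-out client. A minimal sketch (the accounts table and values are made up for illustration):

const client = await pool.connect();
try {
  await client.query('BEGIN');
  await client.query('UPDATE accounts SET balance = balance - $1 WHERE id = $2', [100, 1]);
  await client.query('UPDATE accounts SET balance = balance + $1 WHERE id = $2', [100, 2]);
  await client.query('COMMIT');
} catch (e) {
  await client.query('ROLLBACK'); // undo the whole transaction on any error
  throw e;
} finally {
  client.release(); // return the client to the pool
}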
A Pool with a max size of 1 acts just like a single client. Still, using a pool from the start lets you run multiple SQL queries in parallel within the same executing function later on, since each client can only run one query at a time.
// JS will wait for query one before starting query two.
// If query two isn't affected by query one, why await sequentially?
const one = await client.query(/* ... */);
const two = await client.query(/* ... */);
// Looks like the requests can be concurrent...
// but with a single client (or a pool with `max: 1`), they still won't be.
const onePromise = client.query(/* ... */);
const twoPromise = client.query(/* ... */);
const [one, two] = await Promise.all([onePromise, twoPromise]);
// Concurrent requests using a Pool (with `max` > 1)
const onePromise = pool.query(/* ... */);
const twoPromise = pool.query(/* ... */);
const [one, two] = await Promise.all([onePromise, twoPromise]);
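This works because each pool.query() call checks out an idle client (opening a new connection if needed, up to `max`), so both queries can be in flight at once.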
Are there any issues with using a Pool with `max: 1` in serverless?
Yes, in some cases it might be slow. If you're running queries in parallel, a pool with `max: 1` or a single client has the same issue: you have to wait for the first query to finish before the second can start.
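A sketch of how the pool size is configured, assuming a DATABASE_URL environment variable; raising `max` above 1 is what lets the `Promise.all` pattern above actually run in parallel:

import pg from 'pg';

const pool = new pg.Pool({
  connectionString: process.env.DATABASE_URL, // assumed env var
  max: 1, // one client: queries are serialized, just like a single Client
  // max: 5, // five clients: up to five queries in flight at once
});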
When to limit serverless concurrency or pool size
In rare cases with huge demand, and therefore many serverless functions running simultaneously, you might exhaust Postgres's maximum client count ("The default is typically 100 connections.").
Even if you capped concurrency at 100 serverless functions, each function might open more than one client connection (if its pool size is > 1). This happens when you run multiple queries in parallel in the same function. So with max_connections=100 and, for example, 5 parallel queries per worker, you'll hit the limit with just 20 concurrent functions (20 × 5 = 100).
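As a back-of-the-envelope sketch (the numbers are illustrative):

// total connections ≈ concurrent functions × per-function pool size
const maxConnections = 100; // Postgres default max_connections
const poolMax = 5;          // parallel queries (pool size) per function
const safeConcurrency = Math.floor(maxConnections / poolMax); // 20 functions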
Do one of the following (or all of them):
- reduce the max pool size (might slow down your function)
- reduce the worker concurrency (will make your users/clients wait for a function to be freed)
- raise the max client connections on Postgres
- try connection pooling with Neon ("serverless" Postgres):
  "To overcome this limitation, Neon supports connection pooling using PgBouncer, allowing Neon to support up to 10000 concurrent connections."
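A hypothetical sketch of opting into Neon's pooling from node-postgres; the host names and the POOLED_DATABASE_URL variable below are made up, but the pooled endpoint is selected purely through the connection string:

// Hypothetical connection strings; the hosts below are made up.
// direct: postgres://user:pass@ep-example-123456.us-east-2.aws.neon.tech/neondb
// pooled: postgres://user:pass@ep-example-123456-pooler.us-east-2.aws.neon.tech/neondb
const pool = new pg.Pool({
  connectionString: process.env.POOLED_DATABASE_URL, // points at the pooled endpoint
});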
Useful resources