
I'm trying to insert hundreds/thousands of rows based on data selected from another table, but I'm getting a "too many clients" error.

I'm using the pg-promise library; here is my snippet:

function call(){
   for (let index = 0; index < 5; index++) {
        getPendingData().then((result) => {
            db.tx((t) => {
                let counter = 0;
                const queries = result.map((data) => {
                    counter++;
                    return db.none(`insert into test_data (id, isdeleted, parentid) values ('${uuidv1()}', 0, '${uuidv1()}x-${uuidv1()}' ) `);
                });
                return t.batch(queries);
            });
        });
   }
}


let  getPendingData = async () => {
     return db.task('getPendingData', async (t) => {
         return await t.any('select * from other_table');
     });
}


(call())

I've set the max pg connections to 100. Any clue how to solve this without increasing max connections?

  • Install [uuid-ossp](https://www.postgresql.org/docs/current/uuid-ossp.html), then run a single `insert into test_data (id, isdeleted, parentid) select uuid_generate_v1()::text, 0, uuid_generate_v1()::text || 'x-' || uuid_generate_v1()::text from other_table;` query. – Bergi Nov 28 '21 at 04:50
  • @Bergi I think the issue isn't coming from uuid, it's coming from the client connections. – Ronaldo Cristover Nov 28 '21 at 05:05
  • No, the issue is coming from your approach. Don't make hundreds or thousands of queries to the database (even in a transaction) if the database can do all the work in a single query. – Bergi Nov 28 '21 at 05:07
  • But I need to insert a lot of data based on selected queries, and those queries return thousands of rows. – Ronaldo Cristover Nov 28 '21 at 05:14
  • Just split the queries into chunks, run only one chunk's queries in parallel, and process the chunks sequentially. For instance, you can split an array of queries using lodash's `chunk` function. – Anatoly Nov 28 '21 at 10:13
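The chunking idea from the last comment can be sketched in plain Node.js (no lodash needed; the `chunk` helper below mimics lodash's `_.chunk`, and `runQuery` is a hypothetical stand-in for a single database call):

```javascript
// Split work into fixed-size chunks, run each chunk's items in parallel,
// and process the chunks themselves sequentially, so at most `size`
// operations are in flight at any moment.

// Same behaviour as lodash's _.chunk for this use case.
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// runQuery stands in for one insert; in real code it would hit the DB.
async function processInChunks(rows, size, runQuery) {
  const results = [];
  for (const group of chunk(rows, size)) {
    // Parallel within a chunk, sequential across chunks.
    results.push(...await Promise.all(group.map(runQuery)));
  }
  return results;
}
```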

1 Answer


There are too many problems in your code, unfortunately...

It is incorrect both in how you use promises, and in how you use pg-promise.

The issue with the promises is that you're supposed to chain them, i.e. use the result from db.tx (return db.tx(...)). You're not doing that, so you create a loose promise and, as a result, a loose connection associated with the transaction. Also, return await is an anti-pattern.
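A minimal sketch (plain Node.js, no database, function names are illustrative) of why a dropped promise is a problem:

```javascript
// A promise whose result is not returned becomes "loose": the caller
// cannot await it, and its errors are invisible to the caller.
function looseCall() {
  // The inner promise is created but never returned; the rejection is
  // swallowed here so it can't even crash the process.
  Promise.reject(new Error('boom')).catch(() => { /* swallowed */ });
}

function chainedCall() {
  // Returning the promise lets the caller await it and handle failures.
  return Promise.reject(new Error('boom'));
}

async function demo() {
  looseCall();            // resolves immediately; the error is invisible
  try {
    await chainedCall();  // the rejection propagates to the caller
  } catch (err) {
    return err.message;
  }
}
```

In the original code, each `db.tx(...)` result is dropped exactly like `looseCall` drops its promise, so the connections it holds are never accounted for.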

The issue with the pg-promise usage is that you're supposed to execute queries against the transaction/task context t that you create (as shown here). But you're executing each query against db, the root connection, instead, which spawns countless connection requests.

Also, creating a task to execute just one query doesn't make any sense.

If that's not bad enough, you're doing string concatenation for values, which is a strict no-no in query formatting: it leaves you open to SQL injection.

And lastly, multiple inserts should be executed as a single multi-row query, not as separate queries, which is a waste of performance - see multi-row inserts.
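Putting the points above together, here is a sketch of the corrected shape of the code. The `db` object is a tiny mock here so the control flow is runnable without a database; with the real pg-promise library, `db.tx` / `t.none` / `t.batch` follow the same call pattern:

```javascript
// Mock of the pg-promise pieces used below, for illustration only.
const executed = [];
const mockT = {
  // Queries go through the transaction context `t`, never through `db`.
  none: (query, values) => { executed.push({ query, values }); return Promise.resolve(null); },
  batch: (queries) => Promise.all(queries),
};
const db = {
  any: () => Promise.resolve([{ id: 1 }, { id: 2 }]), // stands in for the SELECT
  tx: (cb) => Promise.resolve(cb(mockT)),
};

async function call() {
  // No separate task for a single query; just run the SELECT.
  const pending = await db.any('select * from other_table');
  // Return the transaction promise so the caller can await it (no loose promises).
  // Better still: build one multi-row INSERT (e.g. with pgp.helpers.insert)
  // and execute it as a single query instead of batching many.
  return db.tx((t) => {
    const queries = pending.map((row) =>
      // Values are passed as parameters ($1, $2), not concatenated into SQL.
      t.none('insert into test_data(id, isdeleted, parentid) values($1, 0, $2)',
             [String(row.id), `${row.id}x`])
    );
    return t.batch(queries);
  });
}
```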

vitaly-t