
I have the following Node.js code that uses pg (https://github.com/brianc/node-postgres). My code to create a subscription for an employee looks like this:

    client.query(
      'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
      'VALUES ($1, $2, $3)', [
        datasetArr[0].subscription_guid,
        datasetArr[0].employer_guid,
        datasetArr[0].employee_guid
      ],
      function(err, result) {
        done();

        if (err) {
          set_response(500, err, res);
          logger.error('error running query', err);
          return console.error('error running query', err);
        }

        logger.info('subscription created');
        set_response(201);
      });

As you may have noticed, datasetArr is an array. I would like to create subscriptions for more than one employee at a time, but I would not like to loop through the array. Is there a way to do this out of the box with pg?

    Use whatever interface node offers to PostgreSQL's `COPY` command. – Richard Huxton Jun 03 '14 at 08:16
  • @RichardHuxton: As per http://www.postgresql.org/docs/9.1/static/sql-copy.html the `COPY` command works only with STDIN (csv/file upload). How do I get it to work with an array? – lonelymo Jun 03 '14 at 09:33
  • I don't know - that's why it's a comment not an answer. You'll need to read the documentation for the node-postgres library. – Richard Huxton Jun 03 '14 at 09:42
  • Wrap this all up in a transaction and execute the entire sequence at once: https://github.com/vitaly-t/pg-promise/wiki/Learn-by-Example#transactions – vitaly-t Jun 28 '15 at 00:42
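Following up on the last comment, pg-promise can also generate a single multi-row INSERT via its helpers namespace. A minimal sketch, assuming the question's subscriptions table; the connection string is a placeholder:

    const pgp = require('pg-promise')();
    const db = pgp('postgres://user:password@localhost:5432/mydb'); // placeholder connection string

    // Column set matching the question's subscriptions table
    const cs = new pgp.helpers.ColumnSet(
      ['subscription_guid', 'employer_guid', 'employee_guid'],
      { table: 'subscriptions' }
    );

    // helpers.insert generates one multi-row INSERT statement from the array
    const query = pgp.helpers.insert(datasetArr, cs);
    db.none(query)
      .then(() => set_response(201))
      .catch(err => set_response(500, err, res));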

6 Answers


I did a search for the same question but found no solution yet. With the async library it is very simple to run the query several times and do the necessary error handling.

Maybe this code variant helps (inserting 10,000 small JSON objects into an empty database took 6 seconds).

Christoph

function insertData(item, callback) {
  client.query('INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
      'VALUES ($1, $2, $3)', [
        item.subscription_guid,
        item.employer_guid,
        item.employee_guid
      ],
      function(err, result) {
        // return any err to the async.each iterator
        callback(err);
      })
}

async.each(datasetArr, insertData, function(err) {
  // Release the client back to the pg module
  done();
  if (err) {
    set_response(500, err, res);
    logger.error('error running query', err);
    return console.error('error running query', err);
  }
  logger.info('subscriptions created');
  set_response(201);
})
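Note that `async.each` starts all the inserts in parallel, which can overwhelm a single client or exhaust a pool for large arrays. A variant sketch using `async.eachLimit`; the limit of 10 is an arbitrary placeholder:

async.eachLimit(datasetArr, 10, insertData, function(err) {
  // Same completion handling as above, but at most 10 inserts are in flight at once
  done();
  if (err) {
    set_response(500, err, res);
    return logger.error('error running query', err);
  }
  set_response(201);
})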
TheFive

It looks to me like the best way is to use the PostgreSQL JSON functions:

client.query('INSERT INTO table (columns) ' +
    'SELECT m.* FROM json_populate_recordset(null::your_custom_type, $1) AS m',
    [JSON.stringify(your_json_object_array)], function(err, result) {
  if (err) {
    console.log(err);
  } else {
    console.log(result);
  }
});
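Applied to the question's `subscriptions` table, this could look as follows. Every table already has a matching row type, so `null::subscriptions` can stand in for a custom type; this is a sketch, not tested against the asker's schema:

client.query(
  'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
  'SELECT m.subscription_guid, m.employer_guid, m.employee_guid ' +
  'FROM json_populate_recordset(null::subscriptions, $1) AS m',
  [JSON.stringify(datasetArr)], function(err, result) {
    if (err) {
      return console.error('error running query', err);
    }
    console.log('inserted', result.rowCount, 'rows');
});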
Sergey Okatov

To do a bulk insert into PostgreSQL from Node.js, the better option is the `COPY` command provided by Postgres together with pg-copy-streams.

Code snippet from: https://gist.github.com/sairamkrish/477d20980611202f46a2d44648f7b14b

/*
  Pseudo code - to serve as a help guide.
*/
const copyFrom = require('pg-copy-streams').from;
const Readable = require('stream').Readable;
const { Pool } = require('pg');
const fs = require('fs');
const path = require('path');
const datasourcesConfigFilePath = path.join(__dirname, '..', '..', 'server', 'datasources.json');
const datasources = JSON.parse(fs.readFileSync(datasourcesConfigFilePath, 'utf8'));

const pool = new Pool({
    user: datasources.PG.user,
    host: datasources.PG.host,
    database: datasources.PG.database,
    password: datasources.PG.password,
    port: datasources.PG.port,
});

const bulkInsert = (employees) => {
  pool.connect().then(client => {
    let done = () => {
      client.release();
    };
    // COPY ... FROM STDIN expects tab-separated text rows by default
    var stream = client.query(copyFrom('COPY employee (name,age,salary) FROM STDIN'));
    var rs = new Readable();
    let currentIndex = 0;
    rs._read = function () {
      if (currentIndex === employees.length) {
        rs.push(null); // signal end of input
      } else {
        let employee = employees[currentIndex];
        rs.push(employee.name + '\t' + employee.age + '\t' + employee.salary + '\n');
        currentIndex = currentIndex + 1;
      }
    };
    let onError = strErr => {
      console.error('Something went wrong:', strErr);
      done();
    };
    rs.on('error', onError);
    stream.on('error', onError);
    stream.on('end', done); // note: newer pg-copy-streams versions may emit 'finish' instead of 'end'
    rs.pipe(stream);
  });
};

module.exports = { bulkInsert };
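One caveat with `COPY`'s default text format: tab, newline and backslash characters inside values must be escaped, or rows will be rejected or mis-parsed. A hypothetical helper, not part of the gist above:

// Hypothetical helper: escapes the characters that COPY's default text
// format treats specially, before a value is pushed onto the stream
function escapeCopyValue(value) {
  return String(value)
    .replace(/\\/g, '\\\\')
    .replace(/\t/g, '\\t')
    .replace(/\n/g, '\\n')
    .replace(/\r/g, '\\r');
}

// usage inside rs._read:
// rs.push([employee.name, employee.age, employee.salary].map(escapeCopyValue).join('\t') + '\n');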

Finer details are explained in the gist linked above.

Sairam Krish

Create your data structure as:

[ [val1,val2],[val1,val2] ...]

Then convert it into a string:

 JSON.stringify([['a','b'],['c']]).replace(/\[/g,"(").replace(/\]/g,")").replace(/"/g,'\'').slice(1,-1)

Append it to the query and you are done!

Agreed, it has string-parsing costs, but it's way cheaper than single inserts.
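Putting the pieces together, a minimal sketch; the table `t` and its columns are placeholders. Be aware that this interpolates values straight into the SQL, so it is only safe with trusted or properly escaped input:

// Placeholder table and columns; values are interpolated directly into the
// SQL string, so this is vulnerable to SQL injection with untrusted input
const rows = [['a', 'b'], ['c', 'd']];
const values = JSON.stringify(rows)
  .replace(/\[/g, '(')
  .replace(/\]/g, ')')
  .replace(/"/g, '\'')
  .slice(1, -1); // -> ('a','b'),('c','d')

client.query('INSERT INTO t (col1, col2) VALUES ' + values, function(err) {
  if (err) {
    return console.error('error running query', err);
  }
});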

davejal

You can use `json_to_recordset` to parse JSON inside PostgreSQL:

client.query(
  'SELECT col1, col2 ' +
  'FROM json_to_recordset($1) AS x("col1" int, "col2" VARCHAR(255));',
  [JSON.stringify(your_json_object_array)]
)

This is very similar to Sergey Okatov's answer, which uses `json_populate_recordset` instead.

I don't know what the difference between the two approaches is, but with this method the syntax is clearer when dealing with multiple columns.
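Applied to the question's `subscriptions` table, it might look like this; the `uuid` column types are an assumption about the asker's schema:

// Column types are assumed; adjust to the actual schema
client.query(
  'INSERT INTO subscriptions (subscription_guid, employer_guid, employee_guid) ' +
  'SELECT x.subscription_guid, x.employer_guid, x.employee_guid ' +
  'FROM json_to_recordset($1) AS x(subscription_guid uuid, employer_guid uuid, employee_guid uuid)',
  [JSON.stringify(datasetArr)]
)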

Madacol

Use an ORM, e.g. Objection.

Also, increase the connection pool size based on your DB server and the number of active connections you need. An example of tuning the pool follows the snippet below.

someMovie
  .$relatedQuery('actors')
  .insert([
    {firstName: 'Jennifer', lastName: 'Lawrence'},
    {firstName: 'Bradley', lastName: 'Cooper'}
  ])
  .then(function (actors) {
    console.log(actors[0].firstName);
    console.log(actors[1].firstName);
  });
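On the pool-size point: Objection runs on top of Knex, so the pool is configured there. A minimal sketch; the connection details and pool bounds are placeholders:

const Knex = require('knex');
const { Model } = require('objection');

// Placeholder connection settings; tune pool.min/pool.max to your DB server
const knex = Knex({
  client: 'pg',
  connection: { host: 'localhost', database: 'mydb', user: 'me', password: 'secret' },
  pool: { min: 2, max: 20 }
});

Model.knex(knex); // give Objection models access to the pool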
Solano