
A single row can be inserted like this:

client.query("insert into tableName (name, email) values ($1, $2) ", ['john', 'john@gmail.com'], callBack)

This approach automatically escapes any special characters.

How do I insert multiple rows at once?

I need to implement this:

"insert into tableName (name, email) values ('john', 'john@gmail.com'), ('jane', 'jane@gmail.com')"

I could use JS string operations to compile such rows manually, but then I would need to escape special characters somehow.

stkvtflw
  • Is there a reason you cannot simply execute INSERT twice? – vitaly-t Jan 25 '16 at 10:32
  • According to the pg documentation, this approach is undesirable due to the performance decrease – stkvtflw Jan 25 '16 at 10:38
  • If executing 2 inserts instead of 1 will hurt the performance of your application, then `node-postgres` isn't for you at all. But I believe you are looking at it the wrong way, trying to optimize where you shouldn't. This library can insert 10,000 records in under 1 second easily. – vitaly-t Jan 25 '16 at 10:47
  • was the answer satisfactory in your case? If so, please accept it. – vitaly-t Apr 05 '16 at 20:11

8 Answers


Use pg-format as shown below:

var format = require('pg-format');

var values = [
  [7, 'john22', 'john22@gmail.com', '9999999922'],
  [6, 'testvk', 'testvk@gmail.com', '88888888888']
];

// %L formats the nested array as a comma-separated list of escaped row literals
client.query(format('INSERT INTO users (id, name, email, phone) VALUES %L', values), [], (err, result) => {
  console.log(err);
  console.log(result);
});
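As a side note that goes beyond the answer above, pg-format can also escape identifiers with %I (values are escaped with %L), which helps if the table or column names themselves are dynamic. A small sketch, with hypothetical names:

// Continues the snippet above: %I escapes identifiers, %L escapes literal values
var sql = format(
  'INSERT INTO %I (%I, %I) VALUES %L',
  'users', 'name', 'email',
  [['john', 'john@gmail.com'], ['jane', 'jane@gmail.com']]
);
client.query(sql, [], (err, result) => {
  console.log(err, result);
});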
web.dev.etc

Another way, using PostgreSQL JSON functions:

client.query('INSERT INTO table (columns) ' +
  'SELECT m.* FROM json_populate_recordset(null::your_custom_type, $1) AS m',
  [JSON.stringify(your_json_object_array)], function(err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
    }
});
Sergey Okatov
  • You'd probably use `NULL::table` for the type, though you'll need to `SELECT` the same individual columns that are listed in the `INSERT`. – Bergi Dec 05 '20 at 16:45
  • You don't have to define a custom type for the call to the `json_populate_recordset()` function; instead you can specify the name of the table into which you are inserting as the type. For instance, when inserting into a table named `foo`, the function call might look as follows: `json_populate_recordset(null::foo)`. – Neoheurist Feb 08 '22 at 02:27
  • @Neoheurist you are right in the case when the JSON structure corresponds to foo. If the JSON structure differs, then you need to define your_custom_type. – Sergey Okatov Feb 08 '22 at 06:13
  • You can avoid declaring custom types by instead using `SELECT col_1, col_2 FROM json_to_recordset($1) AS x("col_1" INTEGER, "col_2" VARCHAR)` – Madacol Jun 23 '23 at 14:32
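A minimal sketch of the `json_to_recordset` variant mentioned in the last comment, which avoids declaring a custom type; the `users` table and its `name`/`email` columns are assumptions for illustration, not from the answer:

// Sketch only: assumes a "users" table with text columns "name" and "email"
var people = [
  { name: 'john', email: 'john@gmail.com' },
  { name: 'jane', email: 'jane@gmail.com' }
];

client.query(
  'INSERT INTO users (name, email) ' +
  'SELECT x.name, x.email FROM json_to_recordset($1) AS x(name text, email text)',
  [JSON.stringify(people)],
  function (err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
    }
  }
);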
7

Following this article: Performance Boost from the pg-promise library, and its suggested approach:

// Concatenates an array of objects or arrays of values, according to the template,
// to use with insert queries. Can be used either as a class type or as a function.
//
// template = formatting template string
// data = array of either objects or arrays of values
function Inserts(template, data) {
    if (!(this instanceof Inserts)) {
        return new Inserts(template, data);
    }
    this.rawType = true;
    this.toPostgres = function () {
        return data.map(d=>'(' + pgp.as.format(template, d) + ')').join(',');
    };
}

An example of using it, exactly as in your case:

var users = [['John', 23], ['Mike', 30], ['David', 18]];

db.none('INSERT INTO Users(name, age) VALUES $1', Inserts('$1, $2', users))
    .then(data=> {
        // OK, all records have been inserted
    })
    .catch(error=> {
        // Error, no records inserted
    });

And it will work with an array of objects as well:

var users = [{name: 'John', age: 23}, {name: 'Mike', age: 30}, {name: 'David', age: 18}];
  
db.none('INSERT INTO Users(name, age) VALUES $1', Inserts('${name}, ${age}', users))
    .then(data=> {
        // OK, all records have been inserted
    })
    .catch(error=> {
        // Error, no records inserted
    });

UPDATE-1

For a high-performance approach via a single INSERT query see Multi-row insert with pg-promise.
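For reference, a minimal sketch of that multi-row approach using the helpers namespace of the current pg-promise API; the connection string and the users table/columns are assumptions for illustration:

const pgp = require('pg-promise')();
const db = pgp('postgres://user:pass@localhost:5432/mydb'); // hypothetical connection

// Describe the columns once; a ColumnSet can be reused for any number of inserts
const cs = new pgp.helpers.ColumnSet(['name', 'age'], {table: 'users'});

const users = [{name: 'John', age: 23}, {name: 'Mike', age: 30}, {name: 'David', age: 18}];

// Generates a single multi-row INSERT string with all values properly escaped
const insert = pgp.helpers.insert(users, cs);
// e.g. INSERT INTO "users"("name","age") VALUES('John',23),('Mike',30),('David',18)

db.none(insert)
    .then(() => {
        // OK, all records have been inserted
    })
    .catch(error => {
        // Error, no records inserted
    });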

UPDATE-2

The information here is quite old now; see the latest syntax for Custom Type Formatting. What used to be _rawDBType is now rawType, and formatDBType was renamed to toPostgres.

vitaly-t

You are going to have to generate the query dynamically. Although possible, this is risky and could easily lead to SQL injection vulnerabilities if you do it wrong. It's also easy to end up with off-by-one errors between the indexes of your parameters in the query and the parameters you're passing in.

That being said, here is an example of how you could write this, assuming you have an array of users that looks like {name: string, email: string}:

// node-postgres uses numbered placeholders ($1, $2, ...), generated here per row
client.query(
  `INSERT INTO table_name (name, email) VALUES ${users.map((_, i) => `($${i * 2 + 1}, $${i * 2 + 2})`).join(',')}`,
  users.reduce((params, u) => params.concat([u.name, u.email]), []),
  callBack,
)
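For illustration (my own example, not part of the answer), with two users the call above would send the following query text and flattened parameter array:

// users = [{name: 'john', email: 'john@gmail.com'}, {name: 'jane', email: 'jane@gmail.com'}]
// text:   INSERT INTO table_name (name, email) VALUES ($1, $2),($3, $4)
// values: ['john', 'john@gmail.com', 'jane', 'jane@gmail.com']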

An alternative approach is to use a library like @databases/pg (which I wrote):

await db.query(sql`
  INSERT INTO table_name (name, email)
  VALUES ${sql.join(users.map(u => sql`(${u.name}, ${u.email})`), ',')}
`)

@databases requires the query to be tagged with sql and uses that to ensure any user data you pass is always automatically escaped. This also lets you write the parameters inline, which I think makes the code much more readable.

ForbesLindesay

Simple usage of the pg module's built-in parameterized queries with flattened values. This might be what some are looking for; the helper below is just for convenience.

function makeValuesString(columnCount, rowCount) {
  // Builds "($1, $2), ($3, $4), ..." for rowCount rows of columnCount columns each
  return Array.from({ length: rowCount }, (_, i) =>
    `(${Array.from({ length: columnCount }, (_, j) => `$${i * columnCount + j + 1}`).join(', ')})`
  ).join(', ')
}

const columns = ['id', 'column1', 'column2', 'column3'] // column names
// `rows` is assumed to be an array of row-value arrays matching `columns`,
// e.g. [[1, 'a', 'b', 'c'], [2, 'd', 'e', 'f']]
const query = {
  text: `INSERT INTO tableName (${columns.join(', ')})
  VALUES ${makeValuesString(columns.length, rows.length)}`,
  values: rows.reduce((acc, cv) => {
    acc.push(...cv) // rowValues
    return acc
  }, [])
}

const result = await client.query(query)
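For illustration (my own example, not part of the answer), makeValuesString(2, 3) produces the placeholder list for three two-column rows:

console.log(makeValuesString(2, 3))
// -> ($1, $2), ($3, $4), ($5, $6)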


Using the npm module postgres (porsager/postgres), which has tagged template strings at its core:

https://github.com/porsager/postgres#multiple-inserts-in-one-query

const users = [{
  name: 'Murray',
  age: 68,
  garbage: 'ignore'
},
{
  name: 'Walter',
  age: 80,
  garbage: 'ignore'
}]

sql`insert into users ${ sql(users, 'name', 'age') }`

// Is translated to:
insert into users ("name", "age") values ($1, $2), ($3, $4)

// Here you can also omit column names which will use all object keys as columns
sql`insert into users ${ sql(users) }`

// Which results in:
insert into users ("name", "age", "garbage") values ($1, $2, $3), ($4, $5, $6)
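For completeness, the sql instance above comes from initializing the library; this setup is my assumption (the connection string is hypothetical) and is not shown in the answer:

const postgres = require('postgres')

// hypothetical connection string; options can also be passed as an object
const sql = postgres('postgres://user:pass@localhost:5432/mydb')

await sql`insert into users ${ sql(users, 'name', 'age') }`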

I just thought I'd post this since the library is brand new out of beta, and I've found it to have a better philosophy as an SQL library. IMHO it would be preferable to the other postgres/node libraries posted in the other answers.

700 Software
  • I strongly recommend that library from personal experience, and the gif is super helpful for developers to decide whether that syntax is appropriate for their situation. I found it on the [porsager/postgres](https://github.com/porsager/postgres) readme. I have no affiliation. Just trying to provide the most helpful answer possible. – 700 Software Sep 15 '22 at 21:22

I know I am late to the party, but what worked for me was a simple map.

I hope this helps someone looking for the same thing.

let sampleQuery = array.map(myRow =>
  `('${myRow.column_a}','${myRow.column_b}')`
)

let res = await pool.query(`INSERT INTO public.table(column_a, column_b) VALUES ${sampleQuery}`)
  • OP needs to parse the column values and escape special characters, as mentioned in the question, and that is why this is not a good solution. Furthermore, this is vulnerable to SQL injection. – Abdullah Oladipo May 20 '22 at 10:00
client.query("insert into tableName (name, email) values ($1, $2),($3, $4) ", ['john', 'john@gmail.com','john', 'john@gmail.com'], callBack)

Doesn't that help? Furthermore, you can manually generate a string for the query:

"insert into tableName (name, email) values ('" + var1 + "','" + var2 + "'),('" + var3 + "','" + var4 + "')"

If you read https://github.com/brianc/node-postgres/issues/530, you can see the same implementation.

Vyacheslav
  • This would work fine for a small number of inserts, but for hundreds of inserts, you need something in a loop. Furthermore, your second suggestion would be vulnerable to SQL injection. – Chris Watts Jun 04 '18 at 15:48