I've read several sources stating that COPY is faster than INSERT in PostgreSQL when performing bulk ingests. Sequelize.js has a bulkCreate() function for doing bulk INSERTs.

Is it possible for Sequelize to issue COPY instead of INSERT, and would you expect COPY to be faster? I am ingesting 10,000 to 1,000,000 rows at a time in each of 5 parallel processes, running for a total of 2 hours, and I hope some speedup can be achieved using COPY with Sequelize.

The actual database uses the TimescaleDB extension.
Example Code
// Setup omitted above; connection string is a placeholder.
const Sequelize = require('sequelize');
const sequelize = new Sequelize('postgres://user:pass@localhost:5432/mydb');

let Note = sequelize.define('notes', {
description: Sequelize.STRING
});
let notes = [
{ description: 'Tai chi in the morning' },
{ description: 'Visited friend' },
{ description: 'Went to cinema' },
{ description: 'Listened to music' },
{ description: 'Watched TV all day' },
  { description: 'Walked for an hour' },
];
Note.bulkCreate(notes, { logging: console.log }).then(() => {
console.log('notes created');
}).catch((err) => {
console.log(err);
}).finally(() => {
sequelize.close();
});
Generated SQL
INSERT INTO "notes" ("id","description") VALUES (DEFAULT,'Tai chi in the morning'),
(DEFAULT,'Visited friend'),(DEFAULT,'Went to cinema'),(DEFAULT,'Listened to music'),
(DEFAULT,'Watched TV all day'),(DEFAULT,'Walked for an hour') RETURNING *;
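Since bulkCreate() only ever emits multi-row INSERTs like the one above, I assume a COPY path would have to go through the underlying pg driver (e.g. the pg-copy-streams package on a raw connection) rather than through Sequelize itself. Independent of that plumbing, the rows would need to be serialized into COPY's tab-delimited text format. For reference, here is how I imagine that serialization step; this is my own untested sketch, and toCopyText is a hypothetical helper name:

```javascript
// Serialize rows into the text format expected by COPY ... FROM STDIN:
// tab-delimited columns, newline-terminated rows, NULL encoded as \N,
// and backslash/tab/newline/carriage-return escaped.
function toCopyText(rows, columns) {
  const escape = (value) => {
    if (value === null || value === undefined) return '\\N';
    return String(value)
      .replace(/\\/g, '\\\\') // escape backslashes first
      .replace(/\t/g, '\\t')
      .replace(/\n/g, '\\n')
      .replace(/\r/g, '\\r');
  };
  return rows
    .map((row) => columns.map((col) => escape(row[col])).join('\t'))
    .join('\n') + '\n';
}

const payload = toCopyText(
  [{ description: 'Tai chi in the morning' }, { description: null }],
  ['description']
);
// payload === 'Tai chi in the morning\n\\N\n'
```

The resulting string would then presumably be written to the stream returned by pg-copy-streams for a `COPY notes (description) FROM STDIN` statement; for batches of 100,000+ rows a streaming transform would avoid holding the whole payload in memory.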