
I have a TypeScript array of objects like this (the properties are made up, but the data is in the form listed below):

shipments = [
  {
    id: 1,
    name: 'test',
    age: 2,
    orderNumber: null
  },
  {
    id: 2,
    name: 'test2',
    age: 4,
    orderNumber: '1434'
  },
]

I need to write a PostgreSQL statement that takes that array and puts it into a table that has columns id, name, age, and orderNumber. I can't do any iteration on the data (that's why I'm trying to stuff an array I already have into one insert statement - because it's way faster than iteration). I need to take that array - without adding any kind of TypeScript manipulation to it - and use it in a PostgreSQL insert statement. Is this possible? To make clearer what I want to do, I want to take the shipments object from above and insert it similarly to what this insert statement would do.

INSERT INTO table (id, name, age, orderNumber) VALUES (1, 'test', 2, null), (2, 'test2', 4, '1434')

But more automated such as:

INSERT INTO table (variable_column_list) VALUES shipments_array_with_only_values_not_keys

I saw an example using json_to_recordset, but it wasn't working for me, so the use case may have been different.
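
For context, the json_to_recordset pattern I came across looked roughly like the sketch below (the table name, the column types, and the rawQuery bindings are my own placeholders, not working code):

// Rough sketch of the json_to_recordset approach: pass the whole array as a single
// JSON parameter and let Postgres expand it into rows (placeholder names throughout).
const sql = `
  INSERT INTO shipments_table (id, name, age, "orderNumber")
  SELECT id, name, age, "orderNumber"
  FROM json_to_recordset(?::json)
    AS t(id int, name text, age int, "orderNumber" text)`;
await Database.connection(tenant).rawQuery(sql, [JSON.stringify(shipments)]);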

This is what I am currently doing, using Adonis and multiInsert; however, that only allows 1000 records at a time. I was hoping to insert all the records in one Postgres statement.

await Database.connection(tenant).table(table).multiInsert(shipments)

Thanks in advance for the help!

Just Me

1 Answer


Are you sure you don't want to use an ORM / library for that?

You can generate the SQL from the array like this (not the best solution, just a quick one):

const getQuery = shipments => `INSERT INTO table (${Object.keys(shipments[0]).map(key => `"${key}"`).join(', ')})\nVALUES\n${shipments.map(row => `(${Object.values(row).map(value => value == null ? 'null' : typeof value === 'number' ? value : `'${value}'`).join(', ')})`).join(',\n')}`;
console.log(getQuery(shipments));

Output:

INSERT INTO table ("id", "name", "age", "orderNumber")
VALUES
(1, 'test', 2, null),
(2, 'test2', 4, '1434')

All records will be merged into one insert query, but:

  1. A large amount of data in one query is unreasonable and can cause crashes / freezes, so you still need to chunk the data somehow (this question might be useful):
for (let chunk of chunks) {
  await Database.rawQuery(getQuery(chunk));
}
  2. No dynamic structure here! Each array element should have the same structure, with the same set of keys in exactly the same order:
interface IShipment {
  id: number;
  name: string;
  age: number;
  orderNumber?: string;
} // or whatever
const shipments: IShipment[] = [ ... ] // always should be
  3. Interpolation depends on the type: in shipments.map I interpolate numbers and null(-ish) values as-is and quote everything else as a string, which is OK for this case but completely wrong in general. An object or array value, for example, should be properly serialized (e.g. JSON-stringified), not just converted to a string.
  4. You need to deal with possible SQL injection (see the SQL injection wiki page). For example, with the promise-mysql package you can use the pool.escape() method on your values to prevent injection (for Postgres, a parameterized alternative is sketched after this list):
Object.values(row).map(value => pool.escape(value)); // or somehow else
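
For Postgres specifically, a safer route than manual escaping is to let the driver bind the values. Below is a minimal sketch (buildParamInsert and the table name are hypothetical, and it assumes your Adonis version's Database.rawQuery accepts knex-style ? bindings): it builds one (?, ?, ...) placeholder group per row and passes every value as a binding, so quoting, type handling, and escaping are left to the driver.

// Sketch only: parameterized multi-row insert, assuming rawQuery(sql, bindings) is supported.
const buildParamInsert = (table: string, rows: IShipment[]) => {
  const columns = Object.keys(rows[0]); // same key set / order in every row (see point 2)
  // one "(?, ?, ...)" group per row
  const placeholders = rows.map(() => `(${columns.map(() => '?').join(', ')})`).join(', ');
  // flatten the values in the same column order as the placeholders
  const bindings = rows.flatMap(row => columns.map(col => row[col as keyof IShipment] ?? null));
  const sql = `INSERT INTO ${table} (${columns.map(c => `"${c}"`).join(', ')}) VALUES ${placeholders}`;
  return { sql, bindings };
};

// usage (table name is a placeholder):
const { sql, bindings } = buildParamInsert('shipments', chunk);
await Database.rawQuery(sql, bindings);

Keep in mind Postgres accepts at most 65535 bound parameters per statement, so chunking is still required either way.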

Conclusion: pushing all records in one statement is not the best idea, especially when hand-rolling the SQL on your own. I suggest chunking & inserting via the Adonis method you already use:

const shipments: IShipment[] = [ ... ] // any amount of records
const chunkSize = 1000; // adonis limit

const chunks: IShipment[][] = shipments.reduce((resultArray, item, index) => { 
  const chunkIndex = Math.floor(index / chunkSize);
  if(!resultArray[chunkIndex]) resultArray[chunkIndex] = [];
  resultArray[chunkIndex].push(item);
  return resultArray;
}, []); // split them into chunks

for (let chunk of chunks) {
  await Database.rawQuery(getQuery(chunk)); // insert chunks one-by-one
}
Xeelley
  • I already do what you say (chunk the data and use Adonis). Any .map or other iterative functions are what I need to avoid because they are literally 10+ times slower (so 10 hours to run versus 1, which is a huge difference). I just want to speed it up even more. Using a postgres query and pushing all records at once is for sure faster than going the ORM way, which when inserting values looks like it does it all at once, but under the hood it doesn't. I insert the same number of records elsewhere, it's just that the values are already the values only, versus an object of key/value. – Just Me Nov 16 '21 at 15:53
  • I appreciate your response. What I'm looking for is what my original post says, and only that, though. Thanks! – Just Me Nov 16 '21 at 16:03
  • @JustMe Agreed, but there is still a huge DB load for large data. And you still need to deal with invalid interpolations and injections on your own. I'll update my answer a little bit later w/ a `json_to_query` solution. – Xeelley Nov 16 '21 at 16:19
  • @Xeelley Awesome. Thanks so much! – Just Me Nov 16 '21 at 17:29