
I have a Postgres function that accepts an array of a custom composite type. I got it working by generating one huge SQL text query, but past 100,000 array items it fails. I am not sure whether the limit is in Node or in Postgres.

I know that in Npgsql, the .NET Postgres driver, you can pass arrays in as parameters. Is it possible to do that in node-postgres as well?
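
Here is a sketch of the kind of call I am after. `my_func` is a placeholder; this pattern already works in node-postgres for arrays of simple types, and I would like the same for my composite type:

```js
const { Pool } = require('pg');
const pool = new Pool();

// Pass the array as a bind parameter instead of splicing values
// into the SQL text; the cast tells Postgres the element type.
const ids = [1, 2, 3];
pool.query('SELECT my_func($1::int[])', [ids])
  .then(res => console.log(res.rows))
  .finally(() => pool.end());
```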

Jeff

1 Answer


AFAIK, Node.js limits each process to just under 2 GB of memory. Depending on the size of your objects, an array that large could be hitting that memory ceiling.

If you're inserting many items at once, it is usually more efficient and reliable to stream your objects into the database with the COPY command than to run a bulk INSERT built from one large array. A sketch of that approach is below.
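
A minimal sketch using the `pg-copy-streams` package; the `items` table, its columns, and the row shape are assumptions for illustration:

```js
const { Pool } = require('pg');
const { from: copyFrom } = require('pg-copy-streams');
const { Readable } = require('stream');
const { pipeline } = require('stream/promises');

async function bulkLoad(rows) {
  const pool = new Pool();
  const client = await pool.connect();
  try {
    // COPY ... FROM STDIN reads tab-separated text by default.
    const dest = client.query(copyFrom('COPY items (id, name) FROM STDIN'));
    // Assumes values contain no tabs/newlines; real code should escape them.
    const src = Readable.from(rows.map(r => `${r.id}\t${r.name}\n`));
    await pipeline(src, dest);
  } finally {
    client.release();
    await pool.end();
  }
}
```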

Connor
  • What I am really looking for, though, is the ability to pass native arrays to Postgres without having to generate SQL. It's possible in .NET, so I am curious why it's not in node-pg. – Jeff Nov 29 '18 at 20:15
  • Offhand, I don't know of a simplified way to pass an Array as a parameter to a PL/pgSQL function from Node.js without matching the syntax PostgreSQL expects. There may be packages that handle it for you on npm, but I don't know what they're called. This question and its answers might be relevant: https://stackoverflow.com/questions/13328016/ As to why it's not in `node-postgres`, perhaps it's not a common enough use case? For bulk inserts, `pg-format` accepts Array and Object structures to build the `VALUES` clause for you, but that's not compatible with your function approach. – Connor Nov 29 '18 at 21:20
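
For reference, a minimal sketch of the `pg-format` bulk-insert pattern mentioned in the comment above (the table and column names are hypothetical):

```js
const format = require('pg-format');
const { Pool } = require('pg');

async function bulkInsert(rows) {
  // rows is an array of arrays, e.g. [[1, 'foo'], [2, 'bar']];
  // %L expands it into a quoted, escaped VALUES list.
  const sql = format('INSERT INTO items (id, name) VALUES %L', rows);
  const pool = new Pool();
  await pool.query(sql);
  await pool.end();
}
```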