
Oracle has the ability to do bulk inserts by passing arrays as bind variables. The database then does a separate row insert for each member of the array:

http://www.oracle.com/technetwork/issue-archive/2009/09-sep/o59odpnet-085168.html

Thus if I have an array:

string[] arr = { "1", "2", "3" };

And I pass this as a bind to my SQL:

insert into my_table(my_col) values (:arr)

I end up with 3 rows in the table.

Is there a way to do this in PostgreSQL without modifying the SQL? (i.e. I don't want to use the COPY command, an explicit multi-row insert, etc.)

John Fedak

4 Answers


The nearest thing you can use is:

insert into my_table(my_col) select unnest(:arr)
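
For example, here is a minimal PgJDBC sketch of that approach (it assumes my_col is a text column and uses JDBC's positional ? placeholder in place of the named :arr bind):

    import java.sql.Array;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class UnnestInsertExample {
        // conn is an open PgJDBC connection; "text" is assumed to match my_col's type.
        static void insertAll(Connection conn, String[] arr) throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(
                    "insert into my_table(my_col) select unnest(?)")) {
                Array pgArray = conn.createArrayOf("text", arr);
                ps.setArray(1, pgArray);
                ps.executeUpdate();   // one row per array element
                pgArray.free();
            }
        }
    }

The array is still bound as a single parameter, but unnest() expands it server-side, so you get one inserted row per element without changing the shape of the data on the client.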
Roman Tkachuk

PgJDBC supports COPY, and that's about your best option. I know it's not what you want, and it's frustrating that you have to use a different row representation, but it's about the best you'll get.

That said, you will find that if you prepare a statement and then use addBatch and executeBatch, you'll get pretty solid performance; sufficiently so that it's not usually worth caring about using COPY. See Statement.executeBatch. You can build "array bind" on top of that with a trivial function that's a few lines long. It's not as good as server-side array binding, but it will do pretty well.
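
As a rough sketch of that trivial helper (the name arrayBindInsert is mine, not a PgJDBC API, and it assumes a single string parameter per row):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class BatchInsertExample {
        // Hypothetical helper emulating Oracle-style array binding with a JDBC batch:
        // one queued INSERT per element, all executed by a single executeBatch() call.
        // PgJDBC can additionally rewrite such batches into multi-row inserts when the
        // connection parameter reWriteBatchedInserts=true is set.
        static int[] arrayBindInsert(Connection conn, String sql, String[] values)
                throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (String v : values) {
                    ps.setString(1, v);
                    ps.addBatch();
                }
                return ps.executeBatch();
            }
        }
    }

Calling it as arrayBindInsert(conn, "insert into my_table(my_col) values (?)", new String[] { "1", "2", "3" }) keeps the original single-row statement, with ? standing in for :arr.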

Craig Ringer

I'm not sure which language you're targeting, but in Java, for example, this is possible using Connection.createArrayOf().

Related question / answer: error setting java String[] to postgres prepared statement

  • This link appears to be describing passing an array as a parameter, not the equivalent of the Oracle ArrayBind (i.e. you would end up with a single row with an array value in a column, not the multiple-row expansion that the ArrayBind functionality provides). – John Fedak Sep 02 '17 at 15:29
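
To make that distinction concrete, a plain setArray bind looks roughly like the sketch below; the array-typed table my_array_table is hypothetical, and the result is one row containing the whole array rather than one row per element:

    import java.sql.Array;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class ArrayColumnExample {
        // Hypothetical table: create table my_array_table (my_cols text[]);
        // This inserts ONE row whose column holds the whole array,
        // not one row per array element.
        static void insertWholeArray(Connection conn, String[] arr) throws SQLException {
            try (PreparedStatement ps = conn.prepareStatement(
                    "insert into my_array_table(my_cols) values (?)")) {
                Array pgArray = conn.createArrayOf("text", arr);
                ps.setArray(1, pgArray);
                ps.executeUpdate();
                pgArray.free();
            }
        }
    }

Combining the same setArray call with the select unnest(?) form from the first answer is what turns it into a multi-row insert.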

No, you cannot do that in PostgreSQL.

You'll either have to use a multi-row INSERT or a COPY statement.
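
If COPY does become acceptable, PgJDBC exposes it through its CopyManager API. A minimal sketch (table and column names taken from the question; the text-format payload is built naively, with no escaping of special characters):

    import java.io.StringReader;
    import java.sql.Connection;
    import org.postgresql.PGConnection;
    import org.postgresql.copy.CopyManager;

    public class CopyInExample {
        // Streams the values through COPY ... FROM STDIN in text format.
        // Assumes my_col is text and the values contain no tabs, newlines or
        // backslashes (real code should escape them or use CSV format).
        static long copyRows(Connection conn, String[] values) throws Exception {
            CopyManager copy = conn.unwrap(PGConnection.class).getCopyAPI();
            StringBuilder payload = new StringBuilder();
            for (String v : values) {
                payload.append(v).append('\n');   // one line per row
            }
            return copy.copyIn("COPY my_table (my_col) FROM STDIN",
                               new StringReader(payload.toString()));
        }
    }

A multi-row INSERT, by contrast, means changing the SQL itself to something like insert into my_table(my_col) values ('1'), ('2'), ('3'), which is exactly what the question wants to avoid.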

Laurenz Albe