Here's a plpgsql function for Postgres 9.6. It tries to INSERT a row, and if the row is actually inserted (i.e. the INSERT wasn't silently skipped due to a key conflict), it runs a few more commands.
CREATE FUNCTION foo(int, text, text)
RETURNS void AS
$$
BEGIN
    -- ON CONFLICT DO NOTHING turns a key conflict into a silent no-op
    INSERT INTO table1 (id, val1, val2) VALUES ($1, $2, $3) ON CONFLICT DO NOTHING;
    -- FOUND is true only if the row above was actually inserted
    IF FOUND THEN
        INSERT INTO table2 (table1_id, val1) VALUES ($1, $2);
        UPDATE table3 SET (val2, time) = ($3, now()) WHERE table1_id = $1;
    END IF;
END
$$ LANGUAGE plpgsql;
This function processes a single record, but how could you modify it to process a batch of thousands of records?
I found an answer that suggests making each of the three function arguments an array (my understanding of that approach is sketched just below). But is there a way to do it where the arguments I pass in more closely represent how the records look in my application?
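Here's roughly what I take the array-based version to be. This is my own sketch (foo_arrays is a name I made up), and I haven't worked out how the per-row IF FOUND logic would carry over:

CREATE FUNCTION foo_arrays(ids int[], val1s text[], val2s text[])
RETURNS void AS
$$
BEGIN
    -- unnest() zips the three parallel arrays back into rows
    INSERT INTO table1 (id, val1, val2)
    SELECT u.id, u.val1, u.val2
    FROM unnest(ids, val1s, val2s) AS u(id, val1, val2)
    ON CONFLICT DO NOTHING;
    -- ...the per-row table2/table3 handling would still need solving...
END
$$ LANGUAGE plpgsql;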
For example, the ideal solution would be for my application code to call select foo($1), where the parameter $1 is a JSON array of objects, and each inner object is a record to be inserted:
[
{ "id": "1", "val1": "1-val1", "val2": "1-val2" },
{ "id": "2", "val1": "2-val1", "val2": "2-val2" },
{ "id": "3", "val1": "3-val1", "val2": "3-val2" },
{ "id": "4", "val1": "4-val1", "val2": "4-val2" }
]
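From skimming the docs, I'm guessing jsonb_to_recordset is the relevant function, and that the batch version might start out something like this (a rough sketch I haven't gotten working end to end):

CREATE FUNCTION foo(records jsonb)
RETURNS void AS
$$
BEGIN
    -- jsonb_to_recordset() should expand the array of objects into rows,
    -- given a column definition list mapping each key onto a type
    INSERT INTO table1 (id, val1, val2)
    SELECT r.id, r.val1, r.val2
    FROM jsonb_to_recordset(records) AS r(id int, val1 text, val2 text)
    ON CONFLICT DO NOTHING;
    -- ...then the table2/table3 statements would need to run only for
    -- the rows that were actually inserted...
END
$$ LANGUAGE plpgsql;

The application would then call it as select foo($1) with the whole JSON array bound to $1.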
The second-best option would be for my application code to call select foo($1, $2, $3, $4), where each parameter is a JSON object corresponding to one record to be inserted:
{ "id": "1", "val1": "1-val1", "val2": "1-val2" } // This would be $1
{ "id": "2", "val1": "2-val1", "val2": "2-val2" } // This would be $2
I'm looking at the various JSON functions offered by Postgres here, and they seem relevant, but I can't figure out exactly which one to use. Is what I'm looking to do even possible? Would using JSON arrays instead of JSON objects anywhere make this possible?
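The closest I've gotten is looping over the array with jsonb_array_elements and calling the original single-row function per element, which works but seems to defeat the purpose of batching (foo_loop is just a name I made up):

CREATE FUNCTION foo_loop(records jsonb)
RETURNS void AS
$$
DECLARE
    rec jsonb;
BEGIN
    -- jsonb_array_elements() yields one jsonb object per array element
    FOR rec IN SELECT * FROM jsonb_array_elements(records)
    LOOP
        -- ->> extracts each field as text; id needs a cast back to int
        PERFORM foo((rec->>'id')::int, rec->>'val1', rec->>'val2');
    END LOOP;
END
$$ LANGUAGE plpgsql;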