
We've normalised our rather large table to separate store names and product details, something like:

CREATE TABLE store_products
(
    id BIGSERIAL PRIMARY KEY,
    storeid INTEGER REFERENCES store(id),
    productid BIGINT REFERENCES product_details(id),
    producturlid BIGINT REFERENCES product_url(id)
);

We get a dump of a store's products in JSON format, and we iterate through each item and insert it into Postgres via a Delphi console app. It does this concurrently (4 threads insert and work through the JSON array of products).

It first checks whether the store name exists in the store table; if not, it inserts it and caches the returned id. It does the same for the product (by name) and the product_url (by url).

Is there a Postgres way of inserting data into this table (and the foreign tables) in one go, without having to do all these checks one by one? I had a look at CTEs (WITH), but I'm unsure whether that's the solution to this problem.
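For context, a data-modifying CTE can collapse the per-table lookups into a single statement. This is only a sketch: it assumes UNIQUE constraints on store(name), product_details(name), and product_url(url) so that ON CONFLICT has a target, and the literal values are placeholders for one JSON item.

```sql
-- Sketch: insert one product row and its parents in one statement.
-- Assumes UNIQUE constraints on store(name), product_details(name), product_url(url).
WITH s AS (
    INSERT INTO store (name)
    VALUES ('Acme Store')
    ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name  -- no-op update so RETURNING fires on conflict
    RETURNING id
), p AS (
    INSERT INTO product_details (name)
    VALUES ('Widget')
    ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
    RETURNING id
), u AS (
    INSERT INTO product_url (url)
    VALUES ('https://example.com/widget')
    ON CONFLICT (url) DO UPDATE SET url = EXCLUDED.url
    RETURNING id
)
INSERT INTO store_products (storeid, productid, producturlid)
SELECT s.id, p.id, u.id
FROM s, p, u;
```

Note the no-op DO UPDATE: with DO NOTHING, RETURNING yields no row when the parent already exists, so the final INSERT would insert nothing. The no-op update does write to the row (and takes a lock), which is the usual trade-off with this pattern.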

The back-and-forth nature of the code, all within a transaction, becomes heavy.

There are some older SO posts that are similar, but the solutions don't seem viable. The one I'm looking at seems to be this one using UPSERT and CTEs.


1 Answer


After quite a bit of playing around, I settled on the answer from this Stack Overflow post.

The answer is "Function with UPSERT" in Postgres 9.5 by @Erwin Brandstetter. What a great way of doing it, so impressive.
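The pattern referred to is a select-or-insert function wrapped in a loop to handle concurrent writers. A minimal sketch for the store table (function and column names are illustrative, and a UNIQUE constraint on store(name) is assumed):

```sql
-- Sketch: return the id for a store name, inserting it if missing.
-- Safe under concurrency: if another transaction inserts the same name
-- between our SELECT and INSERT, ON CONFLICT DO NOTHING returns no row
-- and the loop re-runs the SELECT.
CREATE OR REPLACE FUNCTION f_store_id(_name text, OUT _id bigint)
  LANGUAGE plpgsql AS
$$
BEGIN
   LOOP
      SELECT id FROM store WHERE name = _name
      INTO _id;

      EXIT WHEN FOUND;

      INSERT INTO store (name)
      VALUES (_name)
      ON CONFLICT (name) DO NOTHING
      RETURNING id
      INTO _id;

      EXIT WHEN FOUND;   -- a concurrent insert won the race: loop and re-select
   END LOOP;
END
$$;
```

Unlike the no-op DO UPDATE trick, this never writes to an existing row, which keeps bloat and locking down when most lookups hit existing stores. One such function per lookup table (store, product_details, product_url) replaces the per-thread caching logic in the client.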
