Here's what works already, but it uses a loop (I am updating the nickname and slug fields on the remote table for each row in a local table):
DECLARE
    row_ record;
    rdbname_ varchar;
    ....
    /* select from local */
    FOR row_ IN
        SELECT rdbname, objectvalue1 AS keyhash, cvalue1 AS slug, cvalue2 AS nickname
          FROM bme_tag
         WHERE rdbname = rdbname_
           AND tagtype = 'NAME'
           AND wkseq = 0
    LOOP
        /* update remote */
        PERFORM dblink_exec('sysdb',
            format(
                'update bme_usergroup
                    set nickname = %L
                      , slug = %L
                  where rdbname = %L
                    and wkseq = 0
                    and keyhash = %L'
                , row_.nickname, row_.slug, row_.rdbname, row_.keyhash)
        );
    END LOOP;
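For context, 'sysdb' is a named dblink connection that gets opened earlier in the function, along these lines (the connection string here is just a placeholder):

/* opened once before the loop; real connection string omitted */
PERFORM dblink_connect('sysdb', 'dbname=sysdb');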
Now, what I would like to do instead is a bulk UPDATE (remote) FROM (local):
PERFORM dblink_exec('sysdb',
    'update (remote) bme_usergroup
        set nickname = bme_tag.cvalue2, slug = bme_tag.cvalue1
       from (local) bme_tag s
      where bme_usergroup.rdbname = %L
        and bme_usergroup.wkseq = 0
        and bme_usergroup.keyhash = s.keyhash
        and bme_usergroup.rdbname = s.rdbname'
);
I've gotten this far by looking at various solutions (postgresql: INSERT INTO ... (SELECT * ...)), and I know how to separate the remote and local tables of the query in the context of SELECT, DELETE, and even INSERT/SELECT. I can also do that direct UPDATE with bind variables. But how about UPDATE ... FROM?
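For example, the SELECT case looks roughly like this, joining the local table to rows pulled through dblink() (column types are guesses here):

SELECT s.objectvalue1 AS keyhash, r.nickname, r.slug
  FROM bme_tag s
  JOIN dblink('sysdb',
              'select keyhash, nickname, slug from bme_usergroup where wkseq = 0')
       AS r(keyhash varchar, nickname varchar, slug varchar)
    ON r.keyhash = s.objectvalue1
 WHERE s.tagtype = 'NAME'
   AND s.wkseq = 0;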
If it's not possible, should I look into Postgres's FOREIGN TABLE or something similar?
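In case it helps clarify what I mean by FOREIGN TABLE: my understanding is that a postgres_fdw setup roughly like the following (server name, user mapping, and column types are placeholders/guesses) would let me write the bulk UPDATE ... FROM as an ordinary statement:

CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER sysdb_srv
    FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (dbname 'sysdb');    -- same Postgres server, different database

CREATE USER MAPPING FOR CURRENT_USER
    SERVER sysdb_srv
    OPTIONS (user 'myuser', password 'mypassword');

CREATE FOREIGN TABLE bme_usergroup_remote (
    rdbname  varchar,
    wkseq    integer,
    keyhash  varchar,
    nickname varchar,
    slug     varchar
) SERVER sysdb_srv OPTIONS (table_name 'bme_usergroup');

-- the loop would then collapse to a plain UPDATE ... FROM
UPDATE bme_usergroup_remote u
   SET nickname = s.cvalue2,
       slug     = s.cvalue1
  FROM bme_tag s
 WHERE u.rdbname = s.rdbname
   AND u.wkseq = 0
   AND u.keyhash = s.objectvalue1
   AND s.tagtype = 'NAME'
   AND s.wkseq = 0;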
The local and remote db are both on the same Postgres server. One additional bit of information, if it matters, is that either database may be dropped and restored separately from the other, and I'd prefer a lightweight solution that doesn't take a lot of configuration each time to reestablish communication.