PostgreSQL version is 9.0.
I have to optimize a plpgsql function. The idea is to run through all documents and test whether the related rows with id_tipo_cifra 901, 902, 905 and 907 already exist in table webdte.doc_tip_cifra.
If they do not exist yet, insert null rows (zero values) so that a later validation passes. It is ridiculously slow at the moment, even if I only use one of the 4 conditions and run it against half the number of rows. Does anybody have an idea for a performance boost?
CREATE OR REPLACE FUNCTION webdte.addtagobligatoriosventa(idlibro bigint)
  RETURNS character AS
$BODY$
DECLARE
    id_documento bigint;
    validador1   integer;
    validador2   integer;
    validador3   integer;
    validador4   integer;
BEGIN
    -- Loop over all documents of the libro.
    -- NOTE: the documents table webdte.documento and its id_libro column are assumed
    -- names here; adjust to the real schema.
    FOR id_documento IN
        SELECT id_doc FROM webdte.documento WHERE id_libro = idlibro
    LOOP
        -- One flag per required id_tipo_cifra: 1 if the row exists, 0 if it is missing.
        -- coalesce covers documents that have no doc_tip_cifra rows at all.
        SELECT INTO validador1, validador2, validador3, validador4
               coalesce(max(CASE id_tipo_cifra WHEN 901 THEN 1 ELSE 0 END), 0)
             , coalesce(max(CASE id_tipo_cifra WHEN 902 THEN 1 ELSE 0 END), 0)
             , coalesce(max(CASE id_tipo_cifra WHEN 905 THEN 1 ELSE 0 END), 0)
             , coalesce(max(CASE id_tipo_cifra WHEN 907 THEN 1 ELSE 0 END), 0)
        FROM   webdte.doc_tip_cifra
        WHERE  id_doc = id_documento;

        IF validador1 = 0 THEN
            INSERT INTO webdte.doc_tip_cifra (id_doc, id_tipo_cifra, tasa_imp, val_imp)
            VALUES (id_documento, 901, 0, 0);
        END IF;
        IF validador2 = 0 THEN
            INSERT INTO webdte.doc_tip_cifra (id_doc, id_tipo_cifra, tasa_imp, val_imp)
            VALUES (id_documento, 902, 0, 0);
        END IF;
        IF validador3 = 0 THEN
            INSERT INTO webdte.doc_tip_cifra (id_doc, id_tipo_cifra, tasa_imp, val_imp)
            VALUES (id_documento, 905, 0, 0);
        END IF;
        IF validador4 = 0 THEN
            INSERT INTO webdte.doc_tip_cifra (id_doc, id_tipo_cifra, tasa_imp, val_imp)
            VALUES (id_documento, 907, 0, 0);
        END IF;
    END LOOP;

    RETURN '1';
END;
$BODY$
LANGUAGE plpgsql VOLATILE;
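Or could the whole loop be replaced by one set-based INSERT ... SELECT? Something like this as the function body (again, webdte.documento with an id_libro column is only my guess at where the documents live):

INSERT INTO webdte.doc_tip_cifra (id_doc, id_tipo_cifra, tasa_imp, val_imp)
SELECT d.id_doc, t.id_tipo_cifra, 0, 0
FROM   webdte.documento d                                       -- assumed documents table
CROSS  JOIN (VALUES (901), (902), (905), (907)) AS t (id_tipo_cifra)
WHERE  d.id_libro = idlibro                                     -- function parameter
AND    NOT EXISTS (
          SELECT 1
          FROM   webdte.doc_tip_cifra dtc
          WHERE  dtc.id_doc        = d.id_doc
          AND    dtc.id_tipo_cifra = t.id_tipo_cifra
       );

I suppose an index on doc_tip_cifra (id_doc, id_tipo_cifra) would help the existence check in either version.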
Or maybe it is better to use an insert trigger that inserts the 4 null rows into doc_tip_cifra
on every document insert, and so avoid this stupid expensive loop over all documents with 4 tests per document?
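Something like this is what I have in mind (just a sketch; the trigger and function names and the webdte.documento table are placeholders):

CREATE OR REPLACE FUNCTION webdte.add_default_cifras()
  RETURNS trigger AS
$BODY$
BEGIN
    -- Create the 4 placeholder rows as soon as a document is inserted.
    INSERT INTO webdte.doc_tip_cifra (id_doc, id_tipo_cifra, tasa_imp, val_imp)
    VALUES (NEW.id_doc, 901, 0, 0)
         , (NEW.id_doc, 902, 0, 0)
         , (NEW.id_doc, 905, 0, 0)
         , (NEW.id_doc, 907, 0, 0);
    RETURN NEW;
END;
$BODY$
LANGUAGE plpgsql;

CREATE TRIGGER trg_add_default_cifras
AFTER INSERT ON webdte.documento      -- assumed documents table
FOR EACH ROW EXECUTE PROCEDURE webdte.add_default_cifras();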
What do you think?