
I'm running an ETL process on a table with 100 million records, extracting the data from Teradata and loading it into Oracle. The process is taking too long and I'd like to know if there is any way to improve the performance of my query with some tuning.
MERGE INTO TABLE_A TB USING (
  select t.COLUMN_A as COLUMN_A_OLD
  from TABLE_B t
  left outer join STAGE s
    on s.COLUMN_B = t.COLUMN_B
   and s.COLUMN_C = t.COLUMN_C
   and s.COLUMN_D = t.COLUMN_D
   and s.COLUMN_E = t.COLUMN_E
   -- note: the first argument of to_date below repeats the format mask;
   -- it looks like a placeholder where an actual date literal belongs
   and s.COLUMN_F = to_date('yyyy-mm-dd 00:00:00','yyyy-mm-dd hh24:mi:ss')
  where t.COLUMN_F = to_date('2100-12-31 00:00:00','yyyy-mm-dd hh24:mi:ss')
    and s.COLUMN_C is null
) stg
on ( stg.COLUMN_A_OLD = TB.COLUMN_A )
WHEN MATCHED THEN
UPDATE SET
  TB.COLUMN_F = sysdate,
  TB.COLUMN_G = $$PARAMETER,
  TB.COLUMN_H = sysdate;
commit;


Thanks.

Tom AL
  • Can you start by posting the query plan, approximate data volumes, available indexes, and what "too long" means to you in this context? Does `table_a` have 100M rows? `table_b`? `staging`? All of the above? Are you running this query once, or many times? It seems weird that you'd do an ETL process that just updates two columns to `sysdate` and a third column to, I assume, a static value you're passing in. – Justin Cave Nov 16 '21 at 19:54
  • If all you are doing is an update then why are you using merge? Seems like overkill to me – NickW Nov 16 '21 at 20:57 (a sketch of the equivalent UPDATE follows these comments)
  • Check the minimum information to post [here](https://stackoverflow.com/a/34975420/4808122) if you expect *meaningful* feedback. Also clarify whether all the tables are in the same Oracle database, or whether some of them are in Teradata... – Marmite Bomber Nov 16 '21 at 21:04
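
Following up on NickW's comment: a MERGE that only ever fires its WHEN MATCHED branch can be written as a plain UPDATE. A minimal sketch of that rewrite, assuming COLUMN_A identifies the rows to change and noting that the `left outer join ... s.COLUMN_C is null` pattern in the question is an anti-join, expressed here as NOT EXISTS:

-- Sketch only: the LEFT JOIN / IS NULL anti-join becomes NOT EXISTS
-- (equivalent because matched STAGE rows must have a non-null COLUMN_C);
-- the COLUMN_F literal is the same placeholder used in the question.
UPDATE TABLE_A tb
SET tb.COLUMN_F = sysdate,
    tb.COLUMN_G = $$PARAMETER,
    tb.COLUMN_H = sysdate
WHERE tb.COLUMN_A IN (
  select t.COLUMN_A
  from TABLE_B t
  where t.COLUMN_F = to_date('2100-12-31 00:00:00','yyyy-mm-dd hh24:mi:ss')
  and not exists (
    select 1
    from STAGE s
    where s.COLUMN_B = t.COLUMN_B
      and s.COLUMN_C = t.COLUMN_C
      and s.COLUMN_D = t.COLUMN_D
      and s.COLUMN_E = t.COLUMN_E
      and s.COLUMN_F = to_date('yyyy-mm-dd 00:00:00','yyyy-mm-dd hh24:mi:ss')
  )
);

A side benefit of the IN form: it tolerates duplicate COLUMN_A values in the subquery, which a MERGE would reject with ORA-30926 ("unable to get a stable set of rows").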

1 Answer


Have you tried something like this?


alter session enable parallel dml;

merge /*+ parallel(10) */ into ….

The PARALLEL hint will enable read-parallelism, but to also enable write-parallelism you will need to either run the above ALTER SESSION command or use the hint /*+ ENABLE_PARALLEL_DML */.
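
For instance, here is a sketch of the statement-level variant applied to the MERGE from the question. No ALTER SESSION is needed with this form (Oracle 12c and later); the degree of 10 is only an example and should be tuned to your CPU and I/O capacity:

-- Hint-only variant: ENABLE_PARALLEL_DML turns on write parallelism
-- for this statement, PARALLEL(10) sets the degree.
merge /*+ enable_parallel_dml parallel(10) */ into TABLE_A TB using (
  select t.COLUMN_A as COLUMN_A_OLD
  from TABLE_B t
  left outer join STAGE s
    on s.COLUMN_B = t.COLUMN_B
   and s.COLUMN_C = t.COLUMN_C
   and s.COLUMN_D = t.COLUMN_D
   and s.COLUMN_E = t.COLUMN_E
   and s.COLUMN_F = to_date('yyyy-mm-dd 00:00:00','yyyy-mm-dd hh24:mi:ss')  -- placeholder, as in the question
  where t.COLUMN_F = to_date('2100-12-31 00:00:00','yyyy-mm-dd hh24:mi:ss')
    and s.COLUMN_C is null
) stg
on ( stg.COLUMN_A_OLD = TB.COLUMN_A )
WHEN MATCHED THEN
UPDATE SET
  TB.COLUMN_F = sysdate,
  TB.COLUMN_G = $$PARAMETER,
  TB.COLUMN_H = sysdate;
commit;

One way to check that write parallelism actually kicked in is to look at the execution plan: with parallel DML enabled, the DML step (MERGE) appears below the PX COORDINATOR rather than above it.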

Beefstu