
I have one SQL statement that fetches a large number of rows and inserts them all into the database in a single operation.

INSERT INTO Summary (Name, StartDate, EndDate, InitialSize, Completed, Remaining)
SELECT (Status.Type + ' ' + Status.Generic) AS Name,
       DATEADD(day, -28, Status.Pat2) AS StartDate,
       DATEADD(day, 1, Status.Pat2) AS EndDate,
       COUNT(*) AS InitialSize,
       SUM(CASE WHEN Status.sw = 'Y' THEN 1 ELSE 0 END) AS Completed,
       SUM(CASE WHEN Status.sw = 'Y' THEN 0 ELSE 1 END) AS Remaining
FROM .......

jdbcTemplate.batchUpdate(sql);

How can I make this fault tolerant, meaning that if there is an exception due to a duplicate key (on column Name), the insert continues without the duplicate row?

Thank you

angus
  • which RDBMS.....? – Marmite Bomber Sep 04 '20 at 08:33
  • I am using SQL Server – angus Sep 04 '20 at 12:28
  • 1
    There are elegant solutions (or workaround if you want) in several RDBMS. In MySQL you would use [INSERT IGNORE](https://www.mysqltutorial.org/mysql-insert-ignore/), in Oracle [`insert /*+ ignore_row_on_dupkey_index (...`](https://blogs.oracle.com/sql/how-to-skip-duplicate-key-errors-ora-00001-in-oracle-database) but in SQL Server you would have to use either subquery to eliminate the existing keys or switch to the `MERGE` statement - see the discussion [here](https://stackoverflow.com/q/10703752/4808122) – Marmite Bomber Sep 04 '20 at 12:53
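
A minimal sketch of the subquery workaround for SQL Server, based on Marmite Bomber's comment: wrap the original SELECT and filter out rows whose Name already exists in Summary, so no duplicate key ever reaches the insert. The column names are taken from the question, the FROM clause is left abbreviated as in the original, and a unique key on Summary.Name is assumed.

INSERT INTO Summary (Name, StartDate, EndDate, InitialSize, Completed, Remaining)
SELECT src.Name, src.StartDate, src.EndDate, src.InitialSize, src.Completed, src.Remaining
FROM (
    SELECT (Status.Type + ' ' + Status.Generic)             AS Name,
           DATEADD(day, -28, Status.Pat2)                   AS StartDate,
           DATEADD(day, 1, Status.Pat2)                     AS EndDate,
           COUNT(*)                                         AS InitialSize,
           SUM(CASE WHEN Status.sw = 'Y' THEN 1 ELSE 0 END) AS Completed,
           SUM(CASE WHEN Status.sw = 'Y' THEN 0 ELSE 1 END) AS Remaining
    FROM .......   -- same FROM / GROUP BY as the original query
) AS src
-- skip rows whose Name is already present instead of failing the whole statement
WHERE NOT EXISTS (SELECT 1 FROM Summary s WHERE s.Name = src.Name);

The jdbcTemplate.batchUpdate(sql) call stays as it is; only the SQL text changes. A MERGE statement with a single WHEN NOT MATCHED THEN INSERT clause is the other option mentioned in the linked discussion. Note that the NOT EXISTS filter only skips keys that already exist in Summary when the statement runs; duplicates produced by the SELECT itself would still fail.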
