1

Is there a mechanism similar to Oracle PL/SQL's SAVE EXCEPTIONS in Microsoft T-SQL? Currently I am doing the update using a cursor and it is extremely slow.

The description of SAVE EXCEPTIONS from Oracle's site:

SAVE EXCEPTIONS allows an UPDATE, INSERT, or DELETE statement to continue executing after it issues an exception. When the statement finishes, an error is issued to signal that at least one exception occurred. Exceptions are collected into an array that you can examine using %BULK_EXCEPTIONS after the statement has executed.

Link to the SAVE EXCEPTIONS definition: http://download.oracle.com/docs/cd/E11882_01/timesten.112/e13076/sqlexamples.htm#TTPLS364

cfeduke
Phil
  • If you are talking relational databases, where maintaining referential integrity is important, I can't see how this is a desirable feature. – StingyJack Dec 06 '10 at 16:49
  • +1 @StingyJack - Sort of throws atomicity out the window... – JNK Dec 06 '10 at 16:56
  • @StingyJack - I am guessing that this ties into deferred constraint checking and as long as **what is committed** validates against the constraints there is no problem. If that is the case then this is not possible in SQL Server. The best solution would probably be to rewrite your cursor based code in a set based way though. – Martin Smith Dec 06 '10 at 17:08

3 Answers

2

If you are importing a large number of records, use an SSIS package and send the failed rows to an exception table. If you can't use SSIS for some reason, consider cleaning your data before trying to insert it, so that you have no failed rows. For instance, delete any records that have a null where you are required to have a value, null out bad dates, etc.

If you are coming from Oracle, you need to stop using cursors and use set-based logic instead. SQL Server does not perform well with cursors.
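
For example, a row-by-row cursor update can usually be collapsed into a single set-based statement. A minimal sketch, assuming hypothetical dbo.Prices and dbo.StagingPrices tables (the DELETE stands in for the "clean your data first" step above):

-- Scrub staging rows that would violate a NOT NULL constraint
DELETE FROM dbo.StagingPrices
WHERE NewPrice IS NULL;

-- One set-based UPDATE replaces the entire cursor loop
UPDATE p
SET    p.UnitPrice = s.NewPrice
FROM   dbo.Prices AS p
       INNER JOIN dbo.StagingPrices AS s
           ON s.ProductID = p.ProductID;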

HLGEM
  • Thank you HLGEM. Edit (pressed enter too quick): I did not even think of SSIS as I was updating records and was unaware I could update records using it. – Phil Dec 06 '10 at 18:41
1

I think the closest you could come to simulating this behavior would be to disable/enable (with check) the constraints. The downside with this approach is that the bad data is now in your table and you can't enable the constraints until it's cleaned up. You'd need to decide if this is an acceptable risk in your particular case.

ALTER TABLE YourTable NOCHECK CONSTRAINT ALL

/* Perform your DML operations */

ALTER TABLE YourTable WITH CHECK CHECK CONSTRAINT ALL

/* Deal with any errors that are thrown: 
   'The ALTER TABLE statement conflicted with the CHECK constraint ...' 
   clean up the bad data then enable constraints again */
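
To find the offending rows before re-enabling the constraints, DBCC CHECKCONSTRAINTS can be run while they are still disabled; it reports a WHERE clause identifying each violating row (a sketch, reusing the hypothetical YourTable):

-- Lists every constraint violation in the table, including violations
-- of currently disabled constraints, with a WHERE clause that
-- locates the offending rows
DBCC CHECKCONSTRAINTS ('YourTable') WITH ALL_CONSTRAINTS;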
Joe Stefanelli
  • Thanks for your answer. I appreciate you including the downside as it would definitely not be an acceptable risk! Thanks again! – Phil Dec 06 '10 at 18:44
1

Not sure exactly what kind of exceptions you are expecting. Some more detail along this line might be helpful.

I don't believe there is anything equivalent in MS SQL to what you are describing. A few ideas to do something somewhat similar:

  • You can use a TRY ... CATCH in SQL, but that's going to fail the whole batch if something goes wrong, not just the problematic rows.
  • An SSIS bulk insert task can be configured to have a separate path for "failed" rows, which you can then treat however you want.
  • If you are talking about unique index duplicates (insert all these rows, and if any are dups then just ignore them, but don't fail the whole batch), then you can declare the unique index with the IGNORE_DUP_KEY option (see this SO question); a sketch follows this list.
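
A minimal sketch of the IGNORE_DUP_KEY behavior from the last bullet, using a hypothetical dbo.Customers table:

CREATE TABLE dbo.Customers
(
    CustomerID int NOT NULL,
    Name       varchar(100) NOT NULL
);

-- Duplicate keys are discarded with a warning instead of failing the batch
CREATE UNIQUE INDEX UX_Customers_CustomerID
    ON dbo.Customers (CustomerID)
    WITH (IGNORE_DUP_KEY = ON);

-- Only the first CustomerID = 1 row is inserted; the statement still
-- succeeds and reports 'Duplicate key was ignored.'
INSERT INTO dbo.Customers (CustomerID, Name)
VALUES (1, 'Alice'),
       (1, 'Alice again'),
       (2, 'Bob');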

Anything further, you'd probably need to be more explicit about what kinds of errors you imagine encountering.

BradC
  • Thanks Brad. I am going to go the SSIS route that both you and HLGEM suggested. I appreciate the suggestions and help! – Phil Dec 06 '10 at 18:43