
I have a query for updating and inserting a large amount of data:

update [TableA]
set ColumnA1 = 'ValueA1'

DECLARE @MyCursor CURSOR;
DECLARE @MyField char(16);
BEGIN
    SET @MyCursor = CURSOR FOR
    select  ColumnA1 from [TableA]
        Where ColumnA2 = ValueA2  

    OPEN @MyCursor 
    FETCH NEXT FROM @MyCursor 
    INTO @MyField

    WHILE @@FETCH_STATUS = 0
    BEGIN
      Insert Into TableB(ColumnB1, ColumnB2, ColumnB3, ColumnB4, ColumnB5, ColumnB6)
      Values(@MyField, ValueB1, ValueB2, ValueB3, ValueB4, ValueB5)

      -- Fetch the next row *after* processing the current one; fetching first
      -- skips the first row and inserts the last row twice.
      FETCH NEXT FROM @MyCursor
      INTO @MyField
    END;

    CLOSE @MyCursor ;
    DEALLOCATE @MyCursor;
END;

As you can see, first I update TableA and then I use the updated TableA records to insert some log information into TableB. I want to run the whole SQL command as a transaction. I'm afraid that using BEGIN TRANSACTION will lock the entire SQL Server database. I already use Entity Framework, and I don't know how it handles this sort of command without locking SQL Server.

Seyed Morteza Mousavi

1 Answer


As per Hamlet's comment, there is no "whole database" lock. The closest equivalent would probably be putting the database into single-user mode.

However, holding extensive (exclusive) locks on rows or tables of highly contended data will impair the performance of other queries that concurrently attempt to read committed data, as well as of other writers.

For example, since the update below potentially changes all rows, it will almost certainly escalate to a table lock (TABLOCK) on TableA, so if you wrap your entire posted SQL in a transaction, TableA will also be locked for the duration of the cursor.

UPDATE [TableA]
SET ColumnA1 = 'ValueA1';

It appears that the cursor itself is unnecessary and undesirable - wherever possible, replace RBAR ("row by agonizing row") approaches with set-based ones. The insert below should be equivalent:

INSERT INTO TableB(ColumnB1, ColumnB2, ColumnB3, ColumnB4, ColumnB5, ColumnB6)
SELECT ColumnA1, ValueB1, ValueB2, ValueB3, ValueB4, ValueB5
FROM [TableA]
WHERE ColumnA2 = ValueA2;
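Putting the update and the set-based insert together, a minimal transactional sketch could look like the following (TRY/CATCH and THROW added for rollback handling; table, column, and value names are the placeholders from your question):

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE [TableA]
    SET ColumnA1 = 'ValueA1';

    -- Log the updated rows in one set-based statement instead of a cursor
    INSERT INTO TableB(ColumnB1, ColumnB2, ColumnB3, ColumnB4, ColumnB5, ColumnB6)
    SELECT ColumnA1, ValueB1, ValueB2, ValueB3, ValueB4, ValueB5
    FROM [TableA]
    WHERE ColumnA2 = ValueA2;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW; -- re-raise the original error to the caller
END CATCH;
```

Note that locks taken by the UPDATE are held until the COMMIT, which is exactly the contention trade-off described above.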

To answer your final question: Entity Framework wraps the .SaveChanges() call in a transaction by default - up until that point, all changes made to the DbContext are tracked in memory only.

IMO, ORMs like Entity Framework aren't ideal for large bulk/batch operations such as the SQL you've presented here, given that you have little control over the resulting SQL.

If you are going to move this bulk job into .NET, can I suggest:

  • Your mass update can be done via a basic SqlCommand
  • Bulk inserts can be done efficiently using SqlBulkCopy

You'll be able to control a transaction over the above with a shared SqlConnection (SqlBulkCopy has a constructor overload that accepts a SqlConnection and an external SqlTransaction).
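A rough sketch of that shape (hedged: connectionString, the TableB column layout, and the logRows DataTable are assumptions you'd substitute with your own):

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumed to exist: connectionString, and logRows - a DataTable shaped like TableB
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    {
        try
        {
            // 1. Mass update via a plain SqlCommand, enlisted in the transaction
            using (var update = new SqlCommand(
                "UPDATE TableA SET ColumnA1 = 'ValueA1'", conn, tran))
            {
                update.ExecuteNonQuery();
            }

            // 2. Bulk insert the log rows on the same connection + transaction
            using (var bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
            {
                bulk.DestinationTableName = "TableB";
                bulk.WriteToServer(logRows);
            }

            tran.Commit();
        }
        catch
        {
            tran.Rollback();
            throw;
        }
    }
}
```

Because both operations share the one SqlTransaction, either both commit or both roll back.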

As an aside, if your requirement didn't demand that the entire sequence of operations run under one transaction, you could consider batching your updates and inserts into batches of fewer than 5,000 rows per commit, to avoid lock escalation issues.
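For instance, a batched update could look like this (sketch only - the WHERE clause assumes not-yet-updated rows can be identified by their current ColumnA1 value, which may not hold for your data):

```sql
WHILE 1 = 1
BEGIN
    -- Each iteration commits its own implicit transaction,
    -- keeping the lock count below the escalation threshold
    UPDATE TOP (4000) [TableA]
    SET ColumnA1 = 'ValueA1'
    WHERE ColumnA1 <> 'ValueA1' OR ColumnA1 IS NULL;

    IF @@ROWCOUNT = 0
        BREAK; -- nothing left to update
END;
```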

StuartLC
  • But I don't want my table to be locked either! – Seyed Morteza Mousavi Jul 22 '15 at 07:19
  • @SeyedMortezaMousavi, in addition to the post you also can look at the Isolation Levels. – Hamlet Hakobyan Jul 22 '15 at 07:37
  • @SeyedMortezaMousavi If you want a transaction around all your work, locks are inevitable. You do, however, have the choice to move to [concurrent Snapshot isolation](http://www.brentozar.com/archive/2013/01/implementing-snapshot-or-read-committed-snapshot-isolation-in-sql-server-a-guide/) (and hence from pessimistic to optimistic locking) – StuartLC Jul 22 '15 at 12:00