I have a C# project which connects to a SQL Server (T-SQL) database. The project runs multiple sequential UPDATE statements on a single table, e.g.:

    private void updateRows() {
        string update1 = "UPDATE table WITH (ROWLOCK) SET ... WHERE ...;";
        string update2 = "UPDATE table WITH (ROWLOCK) SET ... WHERE ...;";
        string update3 = "UPDATE table WITH (ROWLOCK) SET ... WHERE ...;";

        // execute updates one after the other
    }

Ideally, these statements will be batched to avoid making multiple round-trips to/from the database:

string update = @"
    UPDATE table WITH (ROWLOCK) SET ... WHERE ...;
    GO
    UPDATE table WITH (ROWLOCK) SET ... WHERE ...;
    GO
    UPDATE table WITH (ROWLOCK) SET ... WHERE ...;
    GO
";

My question is: if the statements are batched, does this increase the chance of deadlock errors occurring due to table scans?

Since there is less time between each statement, I imagine batching could increase the chance of deadlocks: one update could take a row or page lock that has not yet been released by the time the next statement executes. If the statements were not batched, there would be more time between updates for row or page locks to be released, and therefore less chance of deadlocks occurring.
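On the batching itself, here is a sketch of sending all three updates in one round-trip, inside one explicit transaction. Note that GO is a separator understood by client tools such as sqlcmd and SSMS, not by SQL Server itself, so it has to be dropped from a batch sent via SqlCommand. The connection string, the table name, and the elided SET/WHERE clauses are placeholders.

```csharp
using System.Data.SqlClient;

// Hypothetical helper; "table" and the elided clauses are placeholders.
private void UpdateRowsBatched(string connectionString)
{
    const string batch = @"
        UPDATE table WITH (ROWLOCK) SET ... WHERE ...;
        UPDATE table WITH (ROWLOCK) SET ... WHERE ...;
        UPDATE table WITH (ROWLOCK) SET ... WHERE ...;";

    using (var con = new SqlConnection(connectionString))
    {
        con.Open();
        using (var tx = con.BeginTransaction())
        using (var cmd = new SqlCommand(batch, con, tx))
        {
            cmd.ExecuteNonQuery(); // one round-trip for all three statements
            tx.Commit();           // locks taken by the updates are held until here
        }
    }
}
```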

user2181948
  • why are you using a ROWLOCK hint? where is the enclosing transaction? Can I suggest you explain the actual problem you are trying to solve. You may have an XY problem... – Mitch Wheat Apr 20 '18 at 01:19
  • @MitchWheat In the past, the sequential updates caused page locks and therefore deadlocks when later updates were executed. – user2181948 Apr 20 '18 at 01:22
  • so you've fixed the symptoms, not the cause.... – Mitch Wheat Apr 20 '18 at 01:22
  • The "GO" is not part of the SQL language, and not needed to put sequential statements together in a C# command. – Joel Coehoorn Apr 20 '18 at 01:23
  • I strongly recommend you explain the original problem in more detail: it's likely that you don't have the appropriate indexes defined. – Mitch Wheat Apr 20 '18 at 01:25
  • @MitchWheat The original problem is in this question: https://stackoverflow.com/questions/48980416/deadlock-error-on-same-table-with-two-update-sql-statements - deadlocks occurring due to sequential, separate update statements – user2181948 Apr 20 '18 at 01:28
  • and not a single answer is correct! Why is there even a need for a separate "Update product dimensions"? – Mitch Wheat Apr 20 '18 at 01:30
  • @MitchWheat I understand that ROWLOCK does not solve the problem as it is merely a suggestion for the database, and is not completely enforced. Fixing the deadlock errors require core changes in the project logic and not just adding a ROWLOCK hint. – user2181948 Apr 20 '18 at 01:53
  • "Fixing the deadlock errors require core changes in the project logic and not just adding a ROWLOCK hint." - so what's stopping you making those changes? – Mitch Wheat Apr 20 '18 at 02:23
  • @MitchWheat Nothing, in fact they're already in progress. I just thought I could ask a question on SO that might be tangentially related to the work already in progress, in case it may be affected, however that doesn't seem to be the case without being rebuked. – user2181948 Apr 20 '18 at 02:39
  • rebuked? I'm getting to the root of the problem, instead of helping you with some inappropriate duct tape! – Mitch Wheat Apr 20 '18 at 02:39
  • @MitchWheat "so what's stopping you from making those changes?" came off as a curt response. I apologise. The fact is that those changes are in progress, but this question I thought was at least tangentially related, and therefore thought I should ask anyway. – user2181948 Apr 20 '18 at 02:50

1 Answer


I guess you're not going to like my answer, but here are my 2 cents; let me try to explain.

  1. First, your ROWLOCK hint might not work: you might end up with a table lock if your DDL doesn't allow SQL Server to apply row locking in your transaction.
  2. SQL Server likes set operations; it performs great when you update large datasets in one go.
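Point 2 can be illustrated with a sketch: instead of issuing several sequential single-purpose updates, fold them into one set-based statement. All table and column names below are invented for illustration.

```csharp
// Hypothetical: one set-based UPDATE replacing three sequential updates.
// dbo.Products, Kind and Status are placeholder names, not a real schema.
const string setBasedUpdate = @"
    UPDATE p
    SET    p.Status = CASE WHEN p.Kind = 1 THEN 'A'
                           WHEN p.Kind = 2 THEN 'B'
                           ELSE 'C' END
    FROM   dbo.Products AS p
    WHERE  p.Kind IN (1, 2, 3);";
```

SQL Server touches the table once, takes its locks once, and releases them when the statement (or its enclosing transaction) completes, instead of interleaving lock acquisition across multiple statements.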

I had a similar issue: I needed to update large volumes of user transactions, but I had no spare IO in the system, so I ended up using an 'ETL-like' update.

In C# I'm using a bulk insert to get all the data into the database in one go. Here is my method.

protected void BulkImport(DataTable table, string tableName)
{
    if (!CanConnect)
        return;

    var options = SqlBulkCopyOptions.FireTriggers | SqlBulkCopyOptions.CheckConstraints |
                  SqlBulkCopyOptions.UseInternalTransaction;
    using (var bulkCopy = new SqlBulkCopy(_con.ConnectionString, options))
    {
        bulkCopy.DestinationTableName = tableName;
        bulkCopy.BulkCopyTimeout = 30;
        try
        {
            // DataTable is not thread safe: hold the lock while writing and clearing
            lock (table)
            {
                bulkCopy.WriteToServer(table);
                table.Rows.Clear();
                table.AcceptChanges();
            }
        }
        catch (Exception ex)
        {
            var msg = $"Error: Failed the writing to {tableName}, the error: {ex.Message}";
            Logger?.Enqueue(msg);
            try
            {
                // Re-wrap the exception with the contextual message for the logger
                var wrapped = Activator.CreateInstance(ex.GetType(), new object[] { $"{msg}, {ex.Message}", ex });
                Logger?.Enqueue(wrapped as Exception);
            }
            catch
            {
                Logger?.Enqueue(ex);
            }
        }
        finally
        {
            bulkCopy.Close();
        }
    }
}

Please note that DataTable is not thread safe; you need to lock the DataTable when interacting with it (inserting rows, clearing the table).

Then I dump the data in a staging table and use a merge statement to bring the data into the database where I need it.
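The staging-table-plus-MERGE step might look roughly like this; dbo.Transactions, dbo.StagingTransactions, and their columns are hypothetical names, not my actual schema.

```csharp
using System.Data.SqlClient;

// Hypothetical: merge the bulk-inserted staging rows into the target table
// in a single set-based operation.
const string mergeSql = @"
    MERGE INTO dbo.Transactions AS target
    USING dbo.StagingTransactions AS source
        ON target.Id = source.Id
    WHEN MATCHED THEN
        UPDATE SET target.Amount = source.Amount
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Amount) VALUES (source.Id, source.Amount);";

using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(mergeSql, con))
{
    con.Open();
    cmd.ExecuteNonQuery(); // one statement instead of thousands of row updates
}
```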

I process 100k+ records per second across 50 or so tables and have had no performance or deadlock issues so far.

halfer
Walter Verhoeven