
I've just run a query from a WinForms app that updates one field for 2k records, though it'll usually be more. The only problem is that the app just stops responding: it takes around 30 seconds to update 10 records, and that's just not good enough. How am I screwing up?

The code:

using (SqlConnection conx = GetConnection())
{
    conx.Open();
    using (SqlCommand cmd = new SqlCommand())
    {
        cmd.Connection = conx;

        int counter = 0;
        foreach (DataRow row in mainResultsTable.Rows)
        {
            // controlResult, thisSerial and col are set up elsewhere
            if (controlResult)
            {
                thisSerial = row[2].ToString();
                col = "[someColumn]";
            }
            cmd.CommandText = "UPDATE SomeTable$ SET [ToBeUpdated] = '" + DateTime.Now.ToString() + "' WHERE SomeColumn ='" + thisSerial + "'";
            cmd.ExecuteNonQuery();
            counter++;
            if (counter == 10)
            {
                MessageBox.Show("10"); // crude timing marker
            }
        }
    }
}

PS. The test isn't super-accurate; I just waited for the MessageBox to pop up.

What I finally did was create a temporary table on SQL Server, import the upload data into it, do the whole update in a single batch, and delete the temp table once the operation is done. Is this advisable?
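Roughly, the batch version looks like this (a sketch only: #Upload, the NVARCHAR(50) sizing, and the use of SqlBulkCopy to load the serials are illustrative choices, not settled code):

using (SqlConnection conx = GetConnection())
{
    conx.Open();

    // Session-scoped temp table; SQL Server drops it when the connection closes.
    using (SqlCommand cmd = conx.CreateCommand())
    {
        cmd.CommandText = "CREATE TABLE #Upload (Serial NVARCHAR(50))";
        cmd.ExecuteNonQuery();
    }

    // Copy the serials from the results table into the temp table in one shot.
    var serials = new DataTable();
    serials.Columns.Add("Serial", typeof(string));
    foreach (DataRow row in mainResultsTable.Rows)
        serials.Rows.Add(row[2].ToString());

    using (var bulk = new SqlBulkCopy(conx))
    {
        bulk.DestinationTableName = "#Upload";
        bulk.WriteToServer(serials);
    }

    // One set-based UPDATE instead of one round trip per row.
    using (SqlCommand cmd = conx.CreateCommand())
    {
        cmd.CommandText =
            "UPDATE t SET t.[ToBeUpdated] = GETDATE() " +
            "FROM [SomeTable$] t JOIN #Upload u ON t.SomeColumn = u.Serial; " +
            "DROP TABLE #Upload;";
        cmd.ExecuteNonQuery();
    }
}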

Dani
    "it takes around 30 seconds to update 10 records" - something very wrong with that performance, even factoring in you should be updating in a single batch. – Mitch Wheat Jul 25 '11 at 10:15
  • Have you tried only creating one SqlCommand and just changing the "Text" property every time? – Jethro Jul 25 '11 at 10:16
  • @Mitch Wheat, yup, hence the question. – Dani Jul 25 '11 at 10:18
  • @Jethro, going to try doing that now – Dani Jul 25 '11 at 10:18
  • @Dani: What I'm trying to say is that even if you fix it and update in a batch, those numbers suggest you have other problems as well. – Mitch Wheat Jul 25 '11 at 10:19
  • @Mitch Wheat, this is a secondary query; the first one compares 20k records to 800k and returns the matched values, which are fed into a DataSet from which the updates are made. Think it's not disposing properly? – Dani Jul 25 '11 at 10:22

4 Answers


You would do better, if possible, to run a single SQL statement that updates all of your records at once, which will be significantly faster. If you can define such a query, with parameters, it should perform satisfactorily.

Alternatively, consider whether you can move the work onto another thread, depending on whether you need it to finish before you proceed.
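For the single-call approach, something along these lines might do (a sketch only; it borrows the table, column, and DataTable names from the question, and the parameter names are made up):

// Collect the serials once, then update them all in a single round trip.
var serials = new List<string>();
foreach (DataRow row in mainResultsTable.Rows)
    serials.Add(row[2].ToString());

using (SqlConnection conx = GetConnection())
using (SqlCommand cmd = conx.CreateCommand())
{
    conx.Open();

    // One parameter per serial: @s0, @s1, ... Note SQL Server caps a batch
    // at roughly 2100 parameters, so very large lists need to be chunked.
    var names = new List<string>();
    for (int i = 0; i < serials.Count; i++)
    {
        names.Add("@s" + i);
        cmd.Parameters.AddWithValue("@s" + i, serials[i]);
    }
    cmd.Parameters.AddWithValue("@now", DateTime.Now);

    cmd.CommandText = "UPDATE [SomeTable$] SET [ToBeUpdated] = @now" +
                      " WHERE SomeColumn IN (" + string.Join(",", names.ToArray()) + ")";
    cmd.ExecuteNonQuery();
}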

Schroedingers Cat

There are a few things that could be done, off the top of my head:

  1. Check that your database server is healthy. Maybe some process is stuck at 100% CPU, and your database is slowed down.

  2. Check your database for locks. Issuing 10 updates should not take more than a fraction of a second unless the table holds rows in the order of billions.

  3. Check whether the table is indexed on the SomeColumn column. This can dramatically increase performance.

  4. Try running the same 10 statements from SQL Server Management Studio to see how long they take, to isolate which side the problem is on (client code vs. database server).

  5. Try to send fewer UPDATE statements: if mainResultsTable has 1,000,000 rows, you'll send 1,000,000 UPDATE statements. If you use SQL Server 2008, you can use a stored procedure with table-valued parameters; otherwise you can build an XML parameter and parse it on the server. You could also do the IN-clause trick from Peter's answer, but I'd rather use a TVP or XML (see the sketch after this list).

  6. Try calling Application.DoEvents() to keep the UI responsive (this is usually the wrong approach, but it's sometimes a quick and effective fix).

  7. Use a BackgroundWorker to move the whole updating process onto another thread (see the sketch at the end of this answer).
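A sketch of option 5 with a table-valued parameter; the type and procedure names (dbo.SerialList, dbo.UpdateSerials) are made up and have to be created on the server once:

// One-time server-side setup (illustrative names):
//   CREATE TYPE dbo.SerialList AS TABLE (Serial NVARCHAR(50));
//   CREATE PROCEDURE dbo.UpdateSerials @serials dbo.SerialList READONLY AS
//     UPDATE t SET t.[ToBeUpdated] = GETDATE()
//     FROM [SomeTable$] t JOIN @serials s ON t.SomeColumn = s.Serial;

var tvp = new DataTable();
tvp.Columns.Add("Serial", typeof(string));
foreach (DataRow row in mainResultsTable.Rows)
    tvp.Rows.Add(row[2].ToString());

using (SqlConnection conx = GetConnection())
using (SqlCommand cmd = new SqlCommand("dbo.UpdateSerials", conx))
{
    conx.Open();
    cmd.CommandType = CommandType.StoredProcedure;

    SqlParameter p = cmd.Parameters.AddWithValue("@serials", tvp);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.SerialList";

    cmd.ExecuteNonQuery();   // one round trip; the update itself is set-based
}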

Aside from that, the code in the question appears to be open to second-order SQL injection if the thisSerial value is a varchar.
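And a sketch of option 7, assuming the database work from the question is wrapped in an illustrative RunUpdates() method:

var worker = new BackgroundWorker();
worker.DoWork += (s, e) => RunUpdates();   // runs on a thread-pool thread
worker.RunWorkerCompleted += (s, e) =>
{
    // Back on the UI thread, so it is safe to touch controls here.
    if (e.Error != null)
        MessageBox.Show(e.Error.Message);
    else
        MessageBox.Show("Update finished");
};
worker.RunWorkerAsync();   // returns immediately; the UI stays responsive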

SWeko

Fetch all serials into a list:

var serials = new List<string>();
foreach (DataRow row in mainResultsTable.Rows)
    serials.Add(row[2].ToString());

Update all rows with one SQL query:

"UPDATE SomeTable SET [ToBeUpdated] = '" + DateTime.Now.ToString() + "' WHERE SomeColumn IN ('" + String.Join("','", serials) + "')";
Peter
  • this approach has its limits - see http://stackoverflow.com/questions/1869753/maximum-size-for-a-sql-server-query-in-clause-is-there-a-better-approach-close – Yahia Jul 25 '11 at 10:19
  • @Yahia - In this case the author would be fine with only 2,000 rows. – Security Hound Jul 25 '11 at 11:11

You must have an index on SomeColumn. Otherwise the UPDATE has to scan the table end-to-end to find the rows that need to be updated.
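For example (the index name is illustrative):

CREATE INDEX IX_SomeTable_SomeColumn ON [SomeTable$] (SomeColumn);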

But approaching the problem by iterating the 2k rows on the client side and applying each update individually is fundamentally flawed. For one, it will perform poorly. More importantly, it is likely incorrect: the rows you're updating are 'live' and may change after you looked at them. It is much better to do the update as a single set-oriented operation on the server side. How to do that depends on a number of factors we cannot guess simply by looking at your post. I recommend you read up on Table-Valued Parameters in SQL Server 2008.

Remus Rusanu