
We have C# code that updates a SQL Server database table based on flags in a structure.

public struct stSRK
{ 
    public string Sno;
    public string SClaimID;
    public bool SType;
    public string SText; 
}

Data in the structure can go up to 5,000-10,000 records at a time. Currently we use the following C# code to update the database, and it makes 'n' database calls, one per entry in the struct list. It would help if there were an alternative that does a bulk update using the same struct. Please see the method below.

public int UpdateClaims(List<stSRK> _lt, string strClaimType)
{
    try
    {
        int iCount = 0;
        for (int i = 0; i < _lt.Count; i++)
        {
            if (_lt[i].SType == true)
            {
                if (_lt[i].SText == "A")
                {
                    iCount += Convert.ToInt16(SQL.ExecuteSQL(
                        "UPDATE table SET ClaimType = '" + strClaimType + "' " +
                        "WHERE (Sno = '" + _lt[i].Sno + "' AND ClaimID = '" + _lt[i].SClaimID + "')"));
                }
                else
                {
                    iCount += Convert.ToInt16(SQL.ExecuteSQL(
                        "UPDATE table SET ClaimType = '" + strClaimType + "' " +
                        "WHERE (Sno = '" + _lt[i].Sno + "' AND ClaimID = '" + _lt[i].SClaimID + "')"));
                }
            }
            else
            {
                if (_lt[i].SText == "B")
                {
                    iCount += Convert.ToInt16(SQL.ExecuteSQL(
                        "UPDATE table SET ClaimType = '" + strClaimType + "' " +
                        "WHERE (Sno = '" + _lt[i].Sno + "' AND ClaimID = '" + _lt[i].SClaimID + "')"));
                }
                else
                {
                    iCount += Convert.ToInt16(SQL.ExecuteSQL(
                        "UPDATE table SET ClaimType = '" + strClaimType + "' " +
                        "WHERE (Sno = '" + _lt[i].Sno + "' AND ClaimID = '" + _lt[i].SClaimID + "')"));
                }
            }
        }
        return iCount;
    }
    catch (Exception e)
    {
        throw new Exception(e.Message);
    }
}

3 Answers


Your query seems to be the same for all cases.

Can't you rewrite your code to build just one SQL statement?

Something like this:

public int UpdateClaims(List<stSRK> _lt, string strClaimType)
{
    try
    {
        string allSno = String.Join(",", _lt.Select(l => l.Sno).ToArray());
        string allClaimID = String.Join(",", _lt.Select(l => l.SClaimID).ToArray());

        // Warning: NOT production code, SQL injection hazard!
        return Convert.ToInt16(SQL.ExecuteSQL("UPDATE table SET ClaimType = '" + strClaimType + @"' WHERE 
                        (Sno IN (" + allSno + ") AND ClaimID IN (" + allClaimID + "))"));
    }
    catch (Exception e)
    {
        // This is not a good idea, as this way you lose all InnerException
        // information about the exception. You might just want to remove
        // the try-catch altogether if you simply rethrow the exception.
        throw new Exception(e.Message);
    }
}
  • Worth noting: this approach does not scale very well to larger updates - ideally should be a hybrid approach where you do batches of 1,000 records (or whatever makes sense for the types of records/environment you are dealing with). If you try the above with 200,000 records, SQL server will likely choke horribly on the query parsing. – Tao Sep 18 '13 at 12:10
  • Also, there may be a logical flaw in your proposed where clause: If I want to delete a record where `Sno=1 AND ClaimID=2`, and I want to delete another record where `Sno=2 AND ClaimID=1`, I will also accidentally be deleting records where `Sno=1 AND ClaimID=1` or `Sno=2 AND ClaimID=2`. – Tao Sep 18 '13 at 12:12
  • Thanks Tao, you are very right and I should've mentioned this in my answer. I jumped to the probably wrong assumption that they are always unique. It is up to the OP to put this into context as I have no clue what Sno means. – Peter Sep 18 '13 at 12:58
  • That has SQL injection written all over it. While it is just a refactoring of the OP's code, I would really suggest that your own answer parameterize the query. Answers have a way of being used by others as gospel. – Eric J. Nov 13 '13 at 00:59
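Following up on the two comments above, here is a minimal sketch (not the OP's method, and not production code) of how the same single-statement idea could be parameterized and batched. It pairs Sno and ClaimID in the WHERE clause so unrelated combinations are not matched, passes values as parameters instead of concatenating them, and limits each round trip to a configurable batch size. The table name, column names and the batch size of 500 are assumptions carried over from the question.

// Sketch only: assumes the method lives in a class with the usual using directives
// (System, System.Collections.Generic, System.Data.SqlClient, System.Linq, System.Text).
public static int UpdateClaimsBatched(SqlConnection conn, List<stSRK> items,
                                      string claimType, int batchSize = 500)
{
    int total = 0;
    for (int offset = 0; offset < items.Count; offset += batchSize)
    {
        List<stSRK> batch = items.Skip(offset).Take(batchSize).ToList();
        StringBuilder sql = new StringBuilder("UPDATE [table] SET ClaimType = @claimType WHERE ");

        using (SqlCommand cmd = new SqlCommand())
        {
            cmd.Connection = conn;
            cmd.Parameters.AddWithValue("@claimType", claimType);

            for (int i = 0; i < batch.Count; i++)
            {
                if (i > 0) sql.Append(" OR ");
                // Key each row on its (Sno, ClaimID) pair, as suggested in the comments above.
                sql.Append("(Sno = @s" + i + " AND ClaimID = @c" + i + ")");
                cmd.Parameters.AddWithValue("@s" + i, batch[i].Sno);
                cmd.Parameters.AddWithValue("@c" + i, batch[i].SClaimID);
            }

            cmd.CommandText = sql.ToString();
            total += cmd.ExecuteNonQuery();
        }
    }
    return total;
}

A batch size of 500 keeps the statement well under SQL Server's parameter limit (around 2,100 parameters per command); adjust it to whatever makes sense for your environment.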

My code:

DataTable tblDetails = new DataTable("tblPlanDetail");

tblDetails.Columns.Add("row1").DataType = typeof(Int32);
tblDetails.Columns.Add("row2").DataType = typeof(DateTime);
tblDetails.Columns.Add("row3").DataType = typeof(String);
tblDetails.Columns.Add("row4").DataType = typeof(Int32);

// Build all the rows in memory first.
for (int i = 0; i < table.Rows.Count; i++)
{
    for (int j = 1; j <= DateTime.DaysInMonth(monthYear.Year, monthYear.Month); j++)
    {
        DataRow row = tblDetails.NewRow();

        DateTime dayOfMonth = new DateTime(monthYear.Year, monthYear.Month, j);
        row["row1"] = idPlan;
        row["row2"] = dayOfMonth;
        row["row3"] = table.Rows[i][0];
        row["row4"] = Int32.Parse((string)table.Rows[i][j]);

        tblDetails.Rows.Add(row);
    }
}

try
{
    // Write the whole DataTable to the server in one bulk operation.
    SqlBulkCopy sqlbulk = new SqlBulkCopy(connection, SqlBulkCopyOptions.KeepIdentity, transaction);
    sqlbulk.ColumnMappings.Add("row1", "row1");
    sqlbulk.ColumnMappings.Add("row2", "row2");
    sqlbulk.ColumnMappings.Add("row3", "row3");
    sqlbulk.ColumnMappings.Add("row4", "row4");

    sqlbulk.DestinationTableName = "tblPlanDescription";
    sqlbulk.BatchSize = 2;
    sqlbulk.WriteToServer(tblDetails);
    transaction.Commit();
}
catch (Exception exp)
{
    transaction.Rollback();
}
finally
{
    transaction.Dispose();
    connection.Close();
}

That is a bulk insert. You can insert into a temp table, then execute a stored procedure that creates the necessary indexes and updates the necessary data on the server side.
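For illustration, here is a minimal sketch of that temp-table approach applied to the structures from the question (it reuses _lt and strClaimType from the question's method, and needs System.Data and System.Data.SqlClient). The staging table, the target table dbo.Claims and the column sizes are assumptions, not tested code.

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1. Create a temporary staging table on this connection.
    using (SqlCommand create = new SqlCommand(
        "CREATE TABLE #ClaimStaging (Sno varchar(50), ClaimID varchar(50), ClaimType varchar(10))", conn))
    {
        create.ExecuteNonQuery();
    }

    // 2. Bulk-copy the in-memory rows into the staging table.
    DataTable staging = new DataTable();
    staging.Columns.Add("Sno", typeof(string));
    staging.Columns.Add("ClaimID", typeof(string));
    staging.Columns.Add("ClaimType", typeof(string));
    foreach (stSRK item in _lt)
        staging.Rows.Add(item.Sno, item.SClaimID, strClaimType);

    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "#ClaimStaging";
        bulk.WriteToServer(staging);
    }

    // 3. One set-based UPDATE instead of thousands of single-row statements.
    using (SqlCommand update = new SqlCommand(
        "UPDATE c SET c.ClaimType = s.ClaimType " +
        "FROM dbo.Claims c JOIN #ClaimStaging s ON s.Sno = c.Sno AND s.ClaimID = c.ClaimID", conn))
    {
        update.ExecuteNonQuery();
    }
}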

Have a look here.


Try using LINQ queries for the mass update. That will improve the performance of the application.

You can find help with mass-update LINQ queries here:

How to run a mass update/delete query in Linq?

http://msdn.microsoft.com/en-us/library/bb399339.aspx

Hope this will help you.
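For context, the kind of LINQ to SQL update being suggested looks roughly like the sketch below. ClaimsDataContext and the Claim entity are hypothetical names, not from the question, and _lt and strClaimType come from the question's method. As the comments that follow point out, SubmitChanges still issues one UPDATE statement per modified row, which is why this tends to be slower than a set-based or bulk approach for thousands of rows.

// Hypothetical LINQ to SQL sketch; ClaimsDataContext and Claim are illustrative names only.
using (ClaimsDataContext db = new ClaimsDataContext())
{
    List<string> keys = _lt.Select(x => x.SClaimID).ToList();

    foreach (Claim row in db.Claims.Where(c => keys.Contains(c.ClaimID)))
        row.ClaimType = strClaimType;   // changes are tracked per entity

    // SubmitChanges sends one UPDATE per modified row - the per-record
    // behaviour discussed in the comments below.
    db.SubmitChanges();
}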

  • NO! LINQ is slower (sometimes extremely slower) than ADO.NET for bulk operations. – Eric J. Mar 11 '13 at 05:05
  • @EricJ. Is it? I thought LINQ was faster. I also use LINQ for 25,000 to 30,000 record insertions, updates and deletions. I observed it makes the operation faster, around 20 records per second, compared to ADO.NET – Freelancer Mar 11 '13 at 05:07
  • @Freelancer Yes, **Bulk Operations** on SQL data can be much, much faster than the equivalent Linq. Linq inserts/updates are done record by record, where a bulk operation can insert/update thousands of records in a single operation. – Corey Mar 11 '13 at 06:15
  • @Freelancer: Do you mean Linq-to-SQL or Linq-to-Entities (EF)? I have not used Linq-to-SQL, but Linq-to-Entities is *way* slower in my use case. Google *entity framework slow* and you will see many posts on the issue. – Eric J. Mar 11 '13 at 18:05
  • @EricJ.: EF is slower in pretty much *all* cases – Ed S. Nov 13 '13 at 00:37
  • @EdS.: There's certainly overhead associated with the EF abstraction layers. Then again, *some* programmers write some pretty bad DB access code. I would bet that there are *some* cases where *some* programmers will benefit from EF in terms of performance. In many common cases EF will be slower than what a good programmer would write, but trivially so. Then there are cases like mass update where EF's performance issues compared to hand-written code really matter. – Eric J. Nov 13 '13 at 00:53
  • @EricJ.: Yeah, I'm just bitter. I'm a systems guy, so not a lot of DB experience, and I've had to pull EF out of two programs due to performance issues. It's my own fault really; I used it for something it was never intended for. I do question some design choices (namely its "chatty" nature and massive memory consumption), but now I only use it for trivial db access patterns, where it works great. – Ed S. Nov 13 '13 at 07:17