
I have a list of 70,000 custom objects in C# that each hold four values. I need to insert all of these into a SQL Server table, if they don't already exist in a different table (same DB). I currently have a single SProc that inserts an individual record, with a foreach loop calling the SProc for each item in my list. Inserting all 70,000 takes about 14 minutes, which feels very slow to me, so is there a better approach?

The rough SProc is:

CREATE Procedure [dbo].[up_UpdateOverrideTable](
    @ItemId varchar(100),
    @ItemValue1 decimal(8,5),
    @ItemValue2 decimal(8,5),
    @Source varchar(2)
)
AS
BEGIN
    DECLARE @LocItemId varchar(100)
    SET @LocItemId = @ItemId
    DECLARE @LocItemValue1 decimal(8,5)
    SET @LocItemValue1 = @ItemValue1
    DECLARE @LocItemValue2 decimal(8,5)
    SET @LocItemValue2 = @ItemValue2
    DECLARE @LocSource varchar(2)
    SET @LocSource = @Source

    DELETE FROM OverrideTable WHERE ItemId = @LocItemId

    IF EXISTS (SELECT ItemId FROM InitialTable WHERE ItemId = @LocItemId)
        INSERT INTO OverrideTable VALUES (@LocItemId, @LocItemValue1, @LocItemValue2, @LocSource)

END
GO

The C# that calls this is also below:

using (SqlConnection conn = GetNewConnection()){
    conn.Open();
    using (var tran = conn.BeginTransaction()){
        using (SqlCommand cmd = new SqlCommand("up_UpdateOverrideTable", conn, tran)){
            cmd.CommandType = CommandType.StoredProcedure;

            try{
                foreach (var item in overrides){
                    cmd.Parameters.Clear();
                    // add parameters here 

                    var response = cmd.ExecuteNonQuery();

                    completedInserts.Add(new OverrideItem(item.Id, Convert.ToBoolean(response)));
                }
            }
            catch (Exception ex){
                tran.Rollback();
                throw;
            }

            tran.Commit();
            conn.Close();
        }
    }
}

Is there something I'm missing here, or is this the best I can hope for? I'm wondering whether creating a DataTable in C# would do the trick?

edparry
  • have you tried `SqlBulkCopy`? –  Jul 25 '18 at 08:38
  • There are some optimizations possible: 1) don't `delete` and `reinsert` but use `update`. 2) don't use the select in a foreach but try to rewrite it as a single select call. 3) use `bulk insert` if possible – Stefan Jul 25 '18 at 08:41
  • You might also consider passing a table valued parameter to your procedure (see the sketch after these comments) – pinkfloydx33 Jul 25 '18 at 08:42
  • You can [pass a collection to SP](https://blogs.msdn.microsoft.com/kebab/2014/07/16/passing-arrays-key-value-pairs-and-other-collections-to-sql-stored-procedures-using-table-valued-parameters/) and do the loop there – Renatas M. Jul 25 '18 at 08:42
  • do refer to https://stackoverflow.com/questions/24877240/fastest-way-to-insert-1-million-rows-in-sql-server – Squirrel Jul 25 '18 at 08:45
  • Possible duplicate of [Fastest way to insert 1 million rows in SQL Server](https://stackoverflow.com/questions/24877240/fastest-way-to-insert-1-million-rows-in-sql-server) – Ian Kemp Jul 25 '18 at 08:47
  • SqlBulkCopy performs inserts, but I notice you mention that you may already have the records. You could bulk copy to a staging or temp table, then use set-based SQL to insert only the records that don't exist, then throw away the others – Cato Jul 25 '18 at 09:09
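
A couple of the comments above suggest table-valued parameters; here is a minimal sketch of that approach, assuming a user-defined table type. The type name `dbo.OverrideTableType`, the set-based procedure name `up_UpdateOverrideTableBulk`, and the item property names are illustrative assumptions, not part of the original question:

// Assumed one-time T-SQL setup (not part of the original question):
//   CREATE TYPE dbo.OverrideTableType AS TABLE
//       (ItemId varchar(100), ItemValue1 decimal(8,5), ItemValue2 decimal(8,5), Source varchar(2));

var table = new DataTable();
table.Columns.Add("ItemId", typeof(string));
table.Columns.Add("ItemValue1", typeof(decimal));
table.Columns.Add("ItemValue2", typeof(decimal));
table.Columns.Add("Source", typeof(string));

foreach (var item in overrides)
    table.Rows.Add(item.Id, item.Value1, item.Value2, item.Source); // property names are assumptions

using (SqlConnection conn = GetNewConnection())
using (var cmd = new SqlCommand("up_UpdateOverrideTableBulk", conn)) // hypothetical set-based SProc
{
    cmd.CommandType = CommandType.StoredProcedure;

    var tvp = cmd.Parameters.AddWithValue("@Items", table);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.OverrideTableType";

    conn.Open();
    cmd.ExecuteNonQuery(); // the procedure can then do one set-based DELETE and one INSERT ... SELECT
}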

2 Answers


> I'm wondering whether creating a DataTable in C# would do the trick?

Yes, using a `DataTable` in C# is a step towards speeding up the operation. You then need to use it as the data source for `SqlBulkCopy`, as Matt suggested in the comments.

DataTable source = null; // your data source as a DataTable, with columns matching the target table
using (var bulkCopy = new SqlBulkCopy("your connection string"))
{
    bulkCopy.DestinationTableName = "your target table name";
    await bulkCopy.WriteToServerAsync(source);
}
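
Note that pointing `SqlBulkCopy` straight at OverrideTable would skip the question's existence check against InitialTable. One option, along the lines of Cato's comment, is to bulk copy into a staging table and then apply the check in a single set-based statement. A rough sketch; the staging table `dbo.OverrideStaging` and the OverrideTable column names are assumptions:

using (SqlConnection conn = GetNewConnection())
{
    conn.Open();

    // 1) Bulk copy all rows into an (assumed) empty staging table
    using (var bulkCopy = new SqlBulkCopy(conn))
    {
        bulkCopy.DestinationTableName = "dbo.OverrideStaging";
        bulkCopy.WriteToServer(source);
    }

    // 2) One set-based pass replaces the per-row DELETE + IF EXISTS logic
    var sql = @"
        DELETE FROM OverrideTable
        WHERE ItemId IN (SELECT ItemId FROM OverrideStaging);

        INSERT INTO OverrideTable (ItemId, ItemValue1, ItemValue2, Source)
        SELECT s.ItemId, s.ItemValue1, s.ItemValue2, s.Source
        FROM OverrideStaging s
        WHERE EXISTS (SELECT 1 FROM InitialTable i WHERE i.ItemId = s.ItemId);

        TRUNCATE TABLE OverrideStaging;";

    using (var cmd = new SqlCommand(sql, conn))
        cmd.ExecuteNonQuery();
}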
Aly Elhaddad
  • SqlBulkCopy looks to be the way forward, thanks. Using this method we're down to a few seconds per 10,000 records, which is much more like what I expected. – edparry Jul 25 '18 at 13:32

You have a few SQL statements in the procedure which can slow down its response:

  1. DELETE FROM OverrideTable WHERE ItemId = @LocItemId
  2. IF EXISTS (SELECT ItemId FROM InitialTable WHERE ItemId = @LocItemId) (make sure you have an index on ItemId)

Unless this optimization is done at the database level, the C# code can't do much.
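
If that index is missing, a one-time setup along the following lines could help; the index names (and running the DDL from C#) are illustrative assumptions:

// Hypothetical one-time setup so the EXISTS check and the DELETE can seek instead of scan
using (SqlConnection conn = GetNewConnection())
using (var cmd = new SqlCommand(@"
    CREATE NONCLUSTERED INDEX IX_InitialTable_ItemId ON dbo.InitialTable (ItemId);
    CREATE NONCLUSTERED INDEX IX_OverrideTable_ItemId ON dbo.OverrideTable (ItemId);", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}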

And regarding the C# code, I think you should create the command inside the foreach loop.

try{
    foreach (var item in overrides)
    {
        using (SqlCommand cmd = new SqlCommand("up_UpdateOverrideTable", conn, tran))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            // add parameters here

            var response = cmd.ExecuteNonQuery();

            completedInserts.Add(new OverrideItem(item.Id, Convert.ToBoolean(response)));
        }
    }
}
catch (Exception ex)
{
    tran.Rollback();
    throw;
}

tran.Commit();
conn.Close();
Saad Shaikh