
We call the SQL Server stored procedure mySP to insert rows into the database (the loop executes the stored procedure about 300,000 times).

ALTER PROCEDURE [dbo].[mySP]
    @ThePath VARCHAR(255),
    @TheID INT
AS
    INSERT INTO myTbl (TheTime, ThePath, TheID)
    VALUES (GETDATE(), @ThePath, @TheID)

This is the code to execute stored procedure mySP:

using (SqlConnection con = new SqlConnection(connectionString))
{
     // ...
     foreach (.....)  //about 300,000 times
     {
         SqlCommand MyCmd = new SqlCommand("mySP", con);
         MyCmd.CommandType = CommandType.StoredProcedure;
         MyCmd.Parameters.Add(new SqlParameter("ThePath", sMyPath));
         MyCmd.Parameters.Add(new SqlParameter("TheID", sMyID));

         iRow = MyCmd.ExecuteNonQuery();

         MyCmd.Dispose();
         MyCmd = null;
     }
}

How can I do this in batches of, say, 5,000 inserts, committing the transaction after each batch?

Thank you

faujong
  • why don't you do it over the SP? do a loop inside the SP with the batch number you want. – RoMEoMusTDiE Jan 30 '19 at 21:46
  • Hi, I need to do the loop in the application to get the information that I need to insert into the database. – faujong Jan 30 '19 at 21:54
  • Batching means writing a multi-statement SQL batch. Looping isn't needed. Batching multiple inserts doesn't make them faster either; it saves the network roundtrips. *Bulk inserts*, on the other hand, do improve performance by using minimal logging. You can bulk insert data with the `bcp` command, the `BULK INSERT` statement, or the SqlBulkCopy class – Panagiotis Kanavos Jan 31 '19 at 10:41
  • SqlBulkCopy can accept a DataTable or DataReader as source, which means you can feed it the results of other database queries – Panagiotis Kanavos Jan 31 '19 at 10:41

2 Answers


When inserting thousands of rows at a time, you are better off using something like .NET's own SqlBulkCopy. This means you populate a DataTable first and then use SqlBulkCopy to write it to the database. See https://sqlbulkcopy-tutorial.net/ for good examples. One piece of advice: experiment with the BatchSize property, although leaving it at its default is generally good enough. Don't forget to use a transaction as well.

Example:

using (SqlConnection connection = new SqlConnection("server=(local);database=MyDatabase;integrated security=SSPI"))
{
    connection.Open();

    DirectoryInfo directory = new DirectoryInfo(@"C:\Temp\");

    var files = directory.GetFiles("*.dat");

    foreach (var file in files)
    {
        if (file.Length > 0)
        {
            string fileName = Path.GetFileNameWithoutExtension(file.FullName);

            using (SqlCommand cmd = new SqlCommand("dbo.uspStagingGetTableStructure", connection))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@tableName", fileName);

                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Read the table structure (field names, types, positions) here
                    }
                }
            }

            //
            // Prepare Bulkcopy target (datatable)
            //

            string target = "MyBulkCopyTarget";

            DataTable dataTable = new DataTable(target);
            SqlBulkCopy bulkcopy = new SqlBulkCopy(connection);
            bulkcopy.DestinationTableName = target;
            bulkcopy.BulkCopyTimeout = 600;

            //
            // Map columns source -> target
            //

            //foreach (var dataField in dataFields)
            //{
            //    if (dataField.sqlDbType == SqlDbType.Int)
            //        dataTable.Columns.Add(dataField.fieldName, typeof(int));
            //    else
            //        dataTable.Columns.Add(dataField.fieldName, typeof(string));
            //    bulkcopy.ColumnMappings.Add(dataField.fieldName, dataField.fieldName);
            //}

            //
            // Populate Bulkcopy target (datatable)
            //

            string line = string.Empty;
            using (StreamReader reader = file.OpenText())
            {
                while ((line = reader.ReadLine()) != null)
                {
                    DataRow row = dataTable.NewRow();

                    //
                    // Use proper data types
                    //

                    //foreach (var dataField in dataFields)
                    //{
                    //    if (dataField.sqlDbType == SqlDbType.Int)
                    //        row[dataField.fieldName] = Convert.ToInt32(line.Substring(dataField.fieldStartPos, dataField.fieldLength));
                    //    else
                    //        row[dataField.fieldName] = line.Substring(dataField.fieldStartPos, dataField.fieldLength);
                    //}

                    dataTable.Rows.Add(row);
                }
            }

            //
            // Push datatable to server
            //

            bulkcopy.WriteToServer(dataTable);
        }
    }
}
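To address the original question of committing every 5,000 rows: SqlBulkCopy's `BatchSize` controls how many rows go to the server per round trip, and `SqlBulkCopyOptions.UseInternalTransaction` makes each batch commit separately, which keeps each transaction (and the log) small. A minimal sketch, assuming the question's `myTbl` schema and a hypothetical connection string:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopySketch
{
    static void Main()
    {
        // Hypothetical connection string
        string connectionString = "server=(local);database=MyDatabase;integrated security=SSPI";

        // Build the rows in memory first (this replaces the per-row ExecuteNonQuery loop)
        DataTable table = new DataTable("myTbl");
        table.Columns.Add("TheTime", typeof(DateTime));
        table.Columns.Add("ThePath", typeof(string));
        table.Columns.Add("TheID", typeof(int));

        for (int i = 0; i < 300000; i++)   // stands in for the application's foreach loop
            table.Rows.Add(DateTime.Now, @"C:\Temp\file" + i, i);

        using (SqlConnection con = new SqlConnection(connectionString))
        {
            con.Open();

            // Each 5,000-row batch is committed in its own internal transaction
            using (SqlBulkCopy bulk = new SqlBulkCopy(
                con, SqlBulkCopyOptions.UseInternalTransaction, null))
            {
                bulk.DestinationTableName = "dbo.myTbl";
                bulk.BatchSize = 5000;
                bulk.WriteToServer(table);
            }
        }
    }
}
```

If you instead want everything committed as one unit, pass an external SqlTransaction as the third constructor argument and commit it after `WriteToServer`; note that an external transaction cannot be combined with `UseInternalTransaction`.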
Thailo
  • Hi, in my case, since I do a foreach loop in the application to get the input data, how do I do the SqlBulkCopy? Instead of executing the stored procedure mySP inside the loop, do I populate the DataTable inside the loop, then SqlBulkCopy the DataTable? – faujong Jan 30 '19 at 22:43
  • Yes, please see the example I added. – Thailo Jan 31 '19 at 10:36
  • No - don't use or recommend [addwithvalue](http://www.dbdelta.com/addwithvalue-is-evil/) – SMor Jan 31 '19 at 14:37
  • Thank you Thailo. I will try that. – faujong Jan 31 '19 at 15:12

Create a user-defined table type in SQL Server and pass a table-valued parameter to the stored procedure. Please look below:

How to pass table value parameters to stored procedure from .net code

https://www.c-sharpcorner.com/UploadFile/ff2f08/table-value-parameter-use-with-C-Sharp/

Once you pass the table variable to the stored procedure, you can insert all the data in a single INSERT ... SELECT statement.
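A minimal sketch of the idea, using the question's `myTbl` schema; the type name `dbo.MyTblType` and the procedure name `dbo.mySP_Batch` are hypothetical:

```sql
-- User-defined table type matching the rows to insert
CREATE TYPE dbo.MyTblType AS TABLE
(
    ThePath VARCHAR(255),
    TheID   INT
);
GO

CREATE PROCEDURE dbo.mySP_Batch
    @Rows dbo.MyTblType READONLY
AS
    INSERT INTO myTbl (TheTime, ThePath, TheID)
    SELECT GETDATE(), ThePath, TheID
    FROM @Rows;
```

On the .NET side, the table is passed as a single structured parameter:

```csharp
// Fill a DataTable whose columns match dbo.MyTblType
DataTable rows = new DataTable();
rows.Columns.Add("ThePath", typeof(string));
rows.Columns.Add("TheID", typeof(int));
// ... add up to 5,000 rows per call inside the application loop ...

using (SqlConnection con = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.mySP_Batch", con))
{
    con.Open();
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter p = cmd.Parameters.Add("@Rows", SqlDbType.Structured);
    p.TypeName = "dbo.MyTblType";
    p.Value = rows;
    cmd.ExecuteNonQuery(); // one round trip inserts the whole batch
}
```

Calling the procedure once per 5,000-row DataTable gives you one implicit transaction (and one commit) per batch.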

Maddy
  • Hi Maddy/marc_s: with a table-valued parameter, do we insert a number of records and then commit (like with SqlBulkCopy)? My concern is that I don't want to fill up the database transaction log, so I would like to insert say 5,000 records, then commit, then the next 5,000 records, then commit, etc. – faujong Jan 31 '19 at 15:11
  • Yes, we can insert any number of records as per your need. Create a loop over 5,000-row chunks in the data table and call the stored proc for each chunk. – Maddy Jan 31 '19 at 17:18