
I need help speeding up my app, please. The application reads about 53,000 lines of delimited records, parses each row, and then sends each row to the database to be written. So far I have identified the database end as the major bottleneck, and I would appreciate help tuning it. Currently it takes about 20 minutes to process all 53,000 records (190 fields each), and I would like to significantly reduce that, starting with the code that sends the data to the database.

I use Enterprise Library 5 (taking advantage of its connection pooling) to connect to the database like so:

    internal void SaveItem(String connString)
    {
        try
        {
            ImportDataAccessor dbacess = new ImportDataAccessor(connString);

            foreach (ItemDetail item in itemEZdetails)
            {
                if (dbacess.SaveProduct(RecordID, item))
                {
                    updatecounter++;
                }
            }
            successfulsaves = dbacess.RowsProcessed;
            updatecounter = dbacess.TotalRows;
        }
        catch (Exception ex)
        {
            throw; // rethrow without resetting the stack trace
        }
    }


    public bool SaveProduct(String recordUID, ItemDetail item)
    {
        DbCommand insertCommand = db.GetStoredProcCommand("IDW_spEZViewRecordImport");
        db.AddInParameter(insertCommand, "SessionUID", DbType.String, recordUID);
        // the other 189 parameters go here

        int rowsAffected = db.ExecuteNonQuery(insertCommand);

        String returnval = String.Empty;
        if (!(db.GetParameterValue(insertCommand, "ReturnVal") == DBNull.Value))
            returnval = (String)db.GetParameterValue(insertCommand, "ReturnVal");
        if (returnval == "60")
            RowsProcessed++;

        return rowsAffected > 0;
    }

How do I achieve this with the code I have now? Thanks in advance.

Kobojunkie
  • Have a look here http://stackoverflow.com/questions/7070011/writing-large-number-of-records-bulk-insert-to-access-in-net-c – tranceporter Oct 17 '12 at 16:26

2 Answers


Use SqlBulkCopy, as it is very fast when inserting multiple rows:

    DataTable newProducts = MakeTable();

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName =
            "dbo.BulkCopyDemoMatchingColumns";

        try
        {
            // Write from the source to the destination.
            bulkCopy.WriteToServer(newProducts);
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);
        }
    }

You just need to pass it a DataTable with matching column names, and voila!
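For your case, that means building the DataTable from the parsed records instead of calling the stored procedure per row. A rough sketch, assuming your column names mirror the destination table (the names `SessionUID` and `Field1` below are placeholders, and `itemEZdetails`/`RecordID` are the fields from your posted code):

```csharp
using System.Data;

// Hypothetical sketch: collect all parsed rows into one DataTable,
// then hand the whole thing to SqlBulkCopy in a single WriteToServer call.
DataTable MakeTable()
{
    var table = new DataTable();
    table.Columns.Add("SessionUID", typeof(string));
    table.Columns.Add("Field1", typeof(string));
    // ... add the remaining columns so they match the destination table ...

    foreach (ItemDetail item in itemEZdetails)
    {
        DataRow row = table.NewRow();
        row["SessionUID"] = RecordID;
        row["Field1"] = item.Field1; // placeholder property name
        // ... populate the remaining 189 fields ...
        table.Rows.Add(row);
    }
    return table;
}
```

One round trip for all 53,000 rows is typically orders of magnitude faster than 53,000 stored-procedure calls, because you pay the network and logging overhead once per batch instead of once per row.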

System Down

You'll probably get better performance by using a multi-row INSERT command in your stored procedure, something like this:

Insert into ItemType (ItemTypeName)
VALUES
('TYPE1'),
('TYPE2'),
('TYPE3')
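If you stay with plain commands instead of SqlBulkCopy, you can batch rows client-side and issue one parameterized multi-row INSERT per round trip. A hedged sketch (reusing the example `ItemType`/`ItemTypeName` names above; `connString` and `names` are assumed to be your connection string and parsed values):

```csharp
using System.Data.SqlClient;
using System.Linq;
using System.Text;

// Sketch: one INSERT per batch instead of one command per row.
// SQL Server caps a VALUES list at 1000 row constructors, so batch below that.
const int BatchSize = 500;

using (var conn = new SqlConnection(connString))
{
    conn.Open();
    for (int start = 0; start < names.Count; start += BatchSize)
    {
        var batch = names.Skip(start).Take(BatchSize).ToList();
        var sql = new StringBuilder("INSERT INTO ItemType (ItemTypeName) VALUES ");
        using (var cmd = new SqlCommand { Connection = conn })
        {
            for (int i = 0; i < batch.Count; i++)
            {
                if (i > 0) sql.Append(',');
                sql.Append("(@p").Append(i).Append(')');
                cmd.Parameters.AddWithValue("@p" + i, batch[i]);
            }
            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery();
        }
    }
}
```

With 190 fields per row you would hit SQL Server's 2,100-parameter-per-command limit quickly, so keep batches small or prefer SqlBulkCopy for wide rows.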
Mohsen Afshin