
We are trying to insert 20,000 or more records using the SqlBulkCopy.WriteToServer method, but it takes 140 seconds to insert them into the table.

Is there any way to improve this time with SqlBulkCopy, given that it is the standard approach for bulk insertion?

Calling code:

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(dConn)) 
{
    bulkCopy.ColumnMappings.Add("SamplingFileId", "SamplingFileId");
    /*... 74 more columns ...*/
    bulkCopy.BulkCopyTimeout = 500;
    // Assign destination table name
    bulkCopy.DestinationTableName = "cvr_ReviewSamplingLoans";
    bulkCopy.BatchSize = 5000; 
    bulkCopy.WriteToServer(dtLoansToUpload);
}

Sql Table Definition:

CREATE TABLE [dbo].[ReviewSamplingLoans](
[Id] [int] IDENTITY(1000,1) NOT NULL,
[SamplingFileId] [int] NOT NULL,
[ReviewId] [int] NOT NULL,
[LoanType] [nvarchar](10) NOT NULL,
[LoanNumber] [nvarchar](50) NOT NULL,
[BorrowerName] [nvarchar](150) NOT NULL,
[OriginalUPB] [money] NOT NULL,
[CurrentUPB] [money] NOT NULL,
[OriginalMonthlyPayment] [money] NULL,
[CurrentMonthlyP&I] [money] NULL,
[PaymentFrequency] [nvarchar](2) NULL,
[OriginalNoteRate] [float] NULL,
[CurrentInterestRate] [float] NULL,
[OriginationDate] [date] NULL,
[MaturityDate] [date] NULL,
[FirstPaymentDate] [date] NULL,
[InterestPaidThruDate] [date] NULL,
[NextPaymentDate] [date] NULL,
[DelinquencyStatus] [nvarchar](10) NULL,
[LatePaymentCount] [int] NULL,
[StatedTerm] [int] NULL,
[OriginalAmortizationTerm] [int] NULL,
[NoteType] [nvarchar](10) NULL,
[RateType] [nvarchar](10) NULL,
[LoanProductType] [nvarchar](10) NULL,
[BalloonDueDate] [date] NULL,
[BalloonPaymentAmount] [money] NULL,
[PropertyCity] [nvarchar](50) NULL,
[PropertyState] [nchar](2) NULL,
[PropertyZipcode] [nvarchar](10) NULL,
[PropertyType] [nvarchar](10) NULL,
[PropertyTypeDescription] [nvarchar](50) NULL,
[LienPosition] [tinyint] NULL,
[OriginalLTV] [float] NULL,
[CurrentLTV] [float] NULL,
[SeniorLienBalance] [money] NULL,
[OriginalCommitmentAmount] [money] NULL,
[CurrentCommitmentAmount] [money] NULL,
[OriginalCombinedLTV] [float] NULL,
[CurrentCombinedLTV] [float] NULL,
[OriginalPropertyValue] [money] NULL,
[CurrentPropertyValue] [money] NULL,
[AppraisalDate] [date] NULL,
[AppraisalType] [nvarchar](10) NULL,
[PMI] [nchar](1) NULL,
[LoanPurpose] [nchar](1) NULL,
[OccupancyType] [nchar](1) NULL,
[DocumentationType] [nvarchar](10) NULL,
[OriginalFICO] [smallint] NULL,
[CurrentFICO] [smallint] NULL,
[LastFICOUpdatedDate] [date] NULL,
[DebtToIncomeRatio] [float] NULL,
[InterestOnlyPeriod] [smallint] NULL,
[Modication] [nchar](1) NULL,
[Foreclosure] [nchar](1) NULL,
[ArmProductType] [nvarchar](10) NULL,
[ArmMargin] [float] NULL,
[ArmIndexType] [nvarchar](10) NULL,
[FirstRateAdjustmentDate] [date] NULL,
[NextRateAdjustmentDate] [date] NULL,
[LifeFloor] [float] NULL,
[LifeCeiling] [float] NULL,
[InitialRateCap] [float] NULL,
[PeriodicRateCap] [float] NULL,
[InternalRiskRating] [nvarchar](10) NULL,
[CurrentDebtServiceRatio] [float] NULL,
[CurrentNetOperIncome] [float] NULL,
[OriginalDebtServiceRatio] [float] NULL,
[OriginalNetOperIncome] [float] NULL,
[CurrentOccupancy] [float] NULL,
[PropertyTotalSquareFootage] [int] NULL,
[UnitCount] [smallint] NULL,
[PrepaymentPenalty] [nchar](1) NULL,
[PrepaymentPenaltyType] [int] NULL,
[PrepaymentPenaltyTerm] [smallint] NULL,
CONSTRAINT [PK_ReviewSamplingLoans] PRIMARY KEY CLUSTERED 
(
    [Id] ASC
)WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

ALTER TABLE [dbo].[ReviewSamplingLoans] WITH NOCHECK ADD CONSTRAINT [FK_ReviewSamplingLoans_ReviewCollateralTypes] FOREIGN KEY([LoanType])
REFERENCES [dbo].[ReviewCollateralTypes] ([Code])
GO

ALTER TABLE [dbo].[ReviewSamplingLoans] CHECK CONSTRAINT [FK_ReviewSamplingLoans_ReviewCollateralTypes]
GO

ALTER TABLE [dbo].[ReviewSamplingLoans]  WITH NOCHECK ADD  CONSTRAINT [FK_ReviewSamplingLoans_ReviewSamplingFiles] FOREIGN KEY([ReviewId])
REFERENCES [dbo].[Reviews] ([Id])
GO

ALTER TABLE [dbo].[ReviewSamplingLoans] CHECK CONSTRAINT [FK_ReviewSamplingLoans_ReviewSamplingFiles]
GO

Regarding size: the Excel file contains 20,000 rows, which are validated, added to a DataTable, and then inserted into the table with SqlBulkCopy.

TT.
  • http://blogs.msdn.com/b/yingqin/archive/2014/02/28/preliminary-findings-to-get-good-performance-of-sqlbulkcopy.aspx – Tim Schmelter Sep 11 '15 at 11:16
  • 1
    Without any specific information it's hard to respond. *How* are you using SqlBulkCopy? What are you passing, a DataTable, DataReader? How big are the records? Fast or slow network? Are you using transactions? Does the target table have triggers? What is the recovery model of the database? – Panagiotis Kanavos Sep 11 '15 at 11:25
  • we are passing datatable to SqlBulkCopy.WriteToServer, minimum record count is 20000. Network is relatively good and we are not using transactions. – Vamshi Chilukuri Sep 11 '15 at 11:29
  • Target table does not contain triggers. – Vamshi Chilukuri Sep 11 '15 at 11:31
  • OK, let's start with the basics 1) show us the full SQL definition of the table you are inserting into, 2) what is the throughput of your network, 3) how large (avg) is each record? – RBarryYoung Sep 11 '15 at 11:36
  • Oh and show us the code calling `SqlBulkCopy`. – RBarryYoung Sep 11 '15 at 11:37
  • network throughput is 10mbps, destination table contains 75 columns. – Vamshi Chilukuri Sep 11 '15 at 11:47
  • using (SqlBulkCopy bulkCopy = new SqlBulkCopy(dConn)) { bulkCopy.ColumnMappings.Add("SamplingFileId", "SamplingFileId"); bulkCopy.BulkCopyTimeout = 500; //Assign Destination Table Name bulkCopy.DestinationTableName = "cvr_ReviewSamplingLoans"; bulkCopy.BatchSize = 5000; bulkCopy.WriteToServer(dtLoansToUpload); } – Vamshi Chilukuri Sep 11 '15 at 11:49
  • above code is not complete, we have columnmappings for 75 columns.. – Vamshi Chilukuri Sep 11 '15 at 11:50
  • Again, we need the full SQL definition of the table including any keys, indexes. Please edit your question and add that in. If you don't know how to get that, let us know and we can explain it. Also we need the average size *in bytes* of your records. – RBarryYoung Sep 11 '15 at 15:01
  • Hi, i have provided sql table definition and looking for answer. – Vamshi Chilukuri Sep 15 '15 at 07:26
  • See https://stackoverflow.com/questions/24200/whats-the-fastest-way-to-bulk-insert-a-lot-of-data-in-sql-server-c-client – TT. Dec 07 '17 at 10:40

1 Answer


SqlBulkCopy can run faster when you enable the TableLock option.

Have a look at the SqlBulkCopyOptions Enumeration.

Keep in mind that once the operation starts, no one else can use the table until it finishes. Microsoft states that this option improves performance when inserting large amounts of data.
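Applied to the question's code, the option is passed to the SqlBulkCopy constructor. A minimal sketch, reusing the question's dConn connection and dtLoansToUpload DataTable (the larger batch size is a suggestion, not part of the original code):

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
    dConn,
    SqlBulkCopyOptions.TableLock, // take a bulk-update table lock instead of row locks
    null))                        // no external transaction
{
    // ... add the 75 column mappings as in the question ...
    bulkCopy.DestinationTableName = "cvr_ReviewSamplingLoans";
    bulkCopy.BulkCopyTimeout = 500;
    bulkCopy.BatchSize = 20000;   // one batch for 20,000 rows avoids per-batch overhead
    bulkCopy.WriteToServer(dtLoansToUpload);
}

With TableLock, the server holds a bulk-update (BU) lock for the duration of the copy, which also helps SQL Server use minimal logging when the other prerequisites (recovery model, indexes) allow it.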

Tony