How can you get the IDs of the rows inserted with SqlBulkCopy? The IDs are needed to populate a child table. I downloaded the example from MSDN (http://msdn.microsoft.com/en-us/library/ac44f8yy(v=vs.110).aspx) and it works, but it does not show how to get the IDs. The program code is in C#.
- AFAIK, you can't. If you need that, and still need to use bulk insert, you should bulk insert into a temporary table, and then insert into the real table from that, using the OUTPUT clause of INSERT. – Lasse V. Karlsen Aug 08 '14 at 16:18
- Please don't tag spam; only tag your question with relevant languages (for you, SQL and C#). – eddie_cat Aug 08 '14 at 16:28
- Is there a specific reason you are using `SqlBulkCopy`? Does it have a particular feature that you need, or are you just trying to insert multiple records in a single call? – Solomon Rutzky Aug 08 '14 at 21:28
- @srutzky I'm just trying to insert several thousand records in a single call. – Feanon Aug 09 '14 at 05:53
- In that case either one of my answers would work. If you need a specific feature of `SqlBulkCopy` (something in `BulkCopyOptions`, etc.), then my first answer regarding the Trigger is appropriate. Otherwise my other answer about TVPs is the preferred method for multi-row operations (starting in SQL Server 2008). – Solomon Rutzky Aug 09 '14 at 17:32
4 Answers
It's not possible. You will have to use traditional INSERT statements with SCOPE_IDENTITY() or the OUTPUT clause to get the ID values - assuming the IDs are coming from IDENTITY column types.
SqlBulkCopy isn't meant for transactional inserts. It's meant to copy large amounts of data between servers.
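For illustration, here is a minimal ADO.NET sketch of the OUTPUT approach this answer describes; the table, columns, and connection string are hypothetical:

using System;
using System.Data;
using System.Data.SqlClient;

class InsertWithOutputDemo
{
    static void Main()
    {
        // Hypothetical table: dbo.Parent (ID INT IDENTITY PRIMARY KEY, Name NVARCHAR(50))
        const string sql = "INSERT INTO dbo.Parent (Name) OUTPUT INSERTED.ID VALUES (@Name);";

        using (SqlConnection conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true;"))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.Add("@Name", SqlDbType.NVarChar, 50).Value = "example";
            conn.Open();

            // The OUTPUT clause returns the new identity value as a one-row result set.
            int newId = (int)cmd.ExecuteScalar();
            Console.WriteLine("New ID: {0}", newId);
        }
    }
}

SCOPE_IDENTITY() works similarly (SELECT SCOPE_IDENTITY(); after the INSERT), but OUTPUT has the advantage of also scaling to multi-row statements.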

- You are making a bad assumption here regarding the nature of the question. The question regards inserting multiple rows at one time. That is a valid case for using `SqlBulkCopy`, provided that the OP is making use of features such as the ability to batch, etc. But there are still better ways than single-row INSERT with `SCOPE_IDENTITY()` to insert multiple rows. – Solomon Rutzky Aug 08 '14 at 21:31
- The OP has confirmed that they are inserting thousands of records at a time. This means that doing single-row INSERTs is the slowest method of accomplishing this task. While `SqlBulkCopy` _might_ not be necessary, it is unfair / incorrect to say that it is not valid. Passing the entire set of data in as a TVP, or at least as XML, and capturing all new IDs via the OUTPUT clause (my 2nd answer here), is the preferred method for bulk operations if you don't need any of the special features of `SqlBulkCopy`. – Solomon Rutzky Aug 12 '14 at 17:38
If you are inserting multiple rows but do not need any of the special features of SqlBulkCopy, you should consider using Table-Valued Parameters (TVPs). You can create a Stored Procedure that accepts a TVP and returns a result set. The body of the Stored Procedure would be something along the lines of:
CREATE PROCEDURE SchemaName.StoredProcName
(
@Rows SchemaName.TableTypeName READONLY
)
AS
SET NOCOUNT ON;
INSERT INTO SchemaName.ImportTable (Field1, ..., FieldN)
OUTPUT INSERTED.ID, INSERTED.IdentifyingField1, ..., INSERTED.IdentifyingFieldN
SELECT Field1, ..., FieldN
FROM @Rows;
Your app code would call the proc via SqlCommand.ExecuteReader(), and you would read the new IDs and the identifying fields back through the returned SqlDataReader.
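As a rough sketch (not from the original answer), the C# call might look like the following, assuming a table type SchemaName.TableTypeName exists with columns matching the proc above, and using a DataTable as the TVP value; only Field1 is shown:

using System;
using System.Data;
using System.Data.SqlClient;

class TvpCallDemo
{
    static void Main()
    {
        // Build a DataTable whose columns match SchemaName.TableTypeName.
        // (Hypothetical: only Field1 is shown; add the rest through FieldN.)
        DataTable rows = new DataTable();
        rows.Columns.Add("Field1", typeof(string));
        rows.Rows.Add("some value");

        using (SqlConnection conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true;"))
        using (SqlCommand cmd = new SqlCommand("SchemaName.StoredProcName", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;

            SqlParameter tvp = cmd.Parameters.AddWithValue("@Rows", rows);
            tvp.SqlDbType = SqlDbType.Structured;
            tvp.TypeName = "SchemaName.TableTypeName";

            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Column 0 = new ID; remaining columns = identifying fields
                    // echoed back by the OUTPUT clause.
                    Console.WriteLine("New ID: {0}", reader.GetInt32(0));
                }
            }
        }
    }
}

Note that a Structured parameter also accepts an IEnumerable<SqlDataRecord> or a DbDataReader as its value, which allows streaming the rows rather than buffering everything in a DataTable first.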
TVPs were introduced in SQL Server 2008. If you are using SQL Server 2005 you can still send in a batch of rows, but only as XML. However, even with XML you can still do the multiple-row insert and use the OUTPUT clause to capture the new IDs, as shown above.
I wrote an article several years ago showing a few different options for calling Stored Procs that have Table-Valued Parameters from C# code. You can find it here, Streaming Data Into SQL Server 2008 From an Application, though free registration is required to read articles on that site.
Another example can be found in this answer, How to pass a table-value parameter, which shows the creation of the Table Type, the Stored Procedure (no OUTPUT clause), and the C# code.

One way to have the IDs is to force them into your identity column yourself using:
SET IDENTITY_INSERT (Transact-SQL)
USE myDatabase
GO
SET IDENTITY_INSERT myDatabase.mySchema.MyTable ON
GO
...
SET IDENTITY_INSERT myDatabase.mySchema.MyTable OFF
GO
Those hardly apply to client-code processes, though, and I have never attempted such procedures directly from within client code (C#, VB, whatever). As @CamBruce stated, those are meant for batch processes and the like. Perhaps you would be best served by an INSERT ... SELECT statement, as stated in the reference in your question:
SqlBulkCopy.WriteToServer Method (DataRow[])
IMPORTANT:
If the source and destination tables are in the same SQL Server instance, it is easier and faster to use a Transact-SQL INSERT … SELECT statement to copy the data.
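Here is a minimal sketch of the staging-table variant mentioned in the comments on the question: bulk copy into a staging table, then an INSERT ... SELECT with an OUTPUT clause to capture the new IDs. All object names and the connection string are hypothetical:

using System;
using System.Data;
using System.Data.SqlClient;

class StagingCopyDemo
{
    static void Main()
    {
        // Hypothetical tables:
        //   dbo.ParentStaging (Name NVARCHAR(50) NOT NULL)  -- no identity column
        //   dbo.Parent        (ID INT IDENTITY PRIMARY KEY, Name NVARCHAR(50) NOT NULL)
        DataTable source = new DataTable();
        source.Columns.Add("Name", typeof(string));
        source.Rows.Add("row 1");
        source.Rows.Add("row 2");

        using (SqlConnection conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true;"))
        {
            conn.Open();

            // 1) Bulk copy the rows into the staging table.
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.ParentStaging";
                bulk.WriteToServer(source);
            }

            // 2) Move them into the real table, capturing the new IDs via OUTPUT.
            const string sql = "INSERT INTO dbo.Parent (Name) "
                             + "OUTPUT INSERTED.ID, INSERTED.Name "
                             + "SELECT Name FROM dbo.ParentStaging;";

            using (SqlCommand cmd = new SqlCommand(sql, conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("ID {0} -> {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}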

- Thanks for the answer! Is there also a way to insert a table of 5000 records, and then insert into a second table where one of the columns is the ID of a record from the first table (a foreign key), using the C# programming language? – Feanon Aug 09 '14 at 04:11
- Yes, of course! Simply deactivate the identity auto-incrementing feature on your ID, then launch your `INSERT ... SELECT` operation through regular ADO.NET. – Will Marcouiller Aug 11 '14 at 12:56
Assuming the following two statements are true of your particular situation:
- There is at least one field, if not more, of the input data that uniquely identifies each row (i.e. it could be used as a PK if you weren't going to use the IDENTITY field).
- The process of this app code calling SqlBulkCopy is single-threaded (i.e. only one import process will / can run at a time).
then there is an option. As others have stated, SqlBulkCopy is for batched / remote processing, so you don't have direct access to the session in which it actually does the INSERT. Hence, you need a fully independent mechanism to capture the values. Triggers are such a mechanism. You can have a trigger insert the new IDs along with the identifying field(s) into another table that you can read from after SqlBulkCopy finishes.
1. Create a table to temporarily hold the new IDs, something along the lines of the following. Please note that all of the fields should be NOT NULL, since any field that can contain NULLs could never be part of a PK and hence is not suitable for your needs here.
CREATE TABLE SchemaName.TempIDs
(
  ID INT NOT NULL,
  IdentifyingField1 DataType NOT NULL,
  ....
  IdentifyingFieldN DataType NOT NULL,
  CreatedDate DATETIME NOT NULL
    CONSTRAINT [DF_TempIDs_CreatedDate] DEFAULT (GETDATE())
);
2. Create a Trigger on your import table to insert the relevant data into the TempIDs table:
CREATE TRIGGER SchemaName.ImportTable_CaptureIDs
ON SchemaName.ImportTable
AFTER INSERT
AS
SET NOCOUNT ON;

INSERT INTO SchemaName.TempIDs (ID, IdentifyingField1, ..., IdentifyingFieldN)
  SELECT ID, IdentifyingField1, ..., IdentifyingFieldN
  FROM INSERTED;
3. Update your app code to run DELETE FROM SchemaName.TempIDs; at the beginning of the process. Of course, you need to consider that the process might fail, and depending on how your app code is written, if you can get the child data into memory again to insert into the related tables, then you won't want to call the delete until that is taken care of. In that case, you will want to first check whether any records are in SchemaName.TempIDs and, if so, process them and then delete them.
4. Update your app code to run SELECT ID, IdentifyingField1, ..., IdentifyingFieldN FROM SchemaName.TempIDs; after the SqlBulkCopy step. You would use the IdentifyingFieldX fields to match up to your collection in memory and graft the new ID field values into the appropriate records.
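As a sketch of what steps 3 and 4 could look like from the app side (the connection string and column names are assumptions), note one important detail: SqlBulkCopy does not fire triggers by default, so SqlBulkCopyOptions.FireTriggers must be specified for the capture trigger above to run at all.

using System;
using System.Data;
using System.Data.SqlClient;

class TriggerCaptureDemo
{
    static void Main()
    {
        // Hypothetical source data matching SchemaName.ImportTable;
        // only one data field and one identifying field are shown.
        DataTable source = new DataTable();
        source.Columns.Add("Field1", typeof(string));
        source.Columns.Add("IdentifyingField1", typeof(string));
        source.Rows.Add("some value", "unique-key-1");

        using (SqlConnection conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true;"))
        {
            conn.Open();

            // Step 3: clear any IDs left over from a previous run
            // (after handling them first, if that run failed part-way).
            using (SqlCommand clear = new SqlCommand("DELETE FROM SchemaName.TempIDs;", conn))
            {
                clear.ExecuteNonQuery();
            }

            // FireTriggers is required: SqlBulkCopy skips triggers by default.
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn, SqlBulkCopyOptions.FireTriggers, null))
            {
                bulk.DestinationTableName = "SchemaName.ImportTable";
                bulk.WriteToServer(source);
            }

            // Step 4: read back the new IDs and identifying fields.
            const string sql = "SELECT ID, IdentifyingField1 FROM SchemaName.TempIDs;";
            using (SqlCommand cmd = new SqlCommand(sql, conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Use IdentifyingField1 to locate the matching in-memory row
                    // and graft the new ID onto it.
                    Console.WriteLine("ID {0} -> {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}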

- This is more of an excessive workaround. There is no reason to create triggers. You'll probably be at a net loss for any performance gained by using `SqlBulkCopy` in a way it was not intended. – Cam Bruce Aug 08 '14 at 21:15
- @CamBruce: I appreciate the feedback but disagree. `SqlBulkCopy` is designed for efficient / bulk inserting of data, not purely from server to server as you state in your answer. If you look at the link in the question, he is passing in a `DataRow` array, so multiple rows from the app. This _is_ one of a few intended uses of `SqlBulkCopy`. A simple trigger that inserts into a table that has no contention on it will have very little impact. And since inserting a set is faster than single row inserts as you are suggesting, it is possible that your proposal is actually slower than mine. – Solomon Rutzky Aug 08 '14 at 21:22
- @CamBruce: now that the OP has verified that they are indeed inserting thousands of records at a time, it should be clear that the solution I proposed here (i.e. the Trigger) is far from "an excessive workaround". Inserting thousands of records from an app _is_ an acceptable and intended use of `SqlBulkCopy`, and the Trigger would be a negligible performance hit. In contrast, doing single-row inserts, even in a transaction, of thousands of records and using `SCOPE_IDENTITY()` would perform much slower / worse than this. – Solomon Rutzky Aug 12 '14 at 17:30