I need to bulk insert to SQL Server data from an Excel or CSV file. The data is close to 100k lines. I am using C#/.NET. In the Microsoft documentation I find this:

BULK INSERT Sales.Invoices
FROM '\\share\invoices\inv-2016-07-25.csv'
WITH (FORMAT = 'CSV'
  , FIRSTROW=2
  , FIELDQUOTE = '"'
  , FIELDTERMINATOR = ';'
  , ROWTERMINATOR = '0x0a');

BULK INSERT (Transact-SQL)

If I have an Excel file I can convert it to CSV and use this. This works great and it is very efficient, but my problem is that I do not have access to the file system on the SQL server. Has anyone an idea what is the best way to work around that or is there maybe a totally different approach that is better?

  • https://stackoverflow.com/questions/17405140/insert-excel-file-data-into-database-table-using-c-sharp – priya Jul 19 '20 at 18:10
  • To be able to `BULK INSERT` from a file, the SQL Server service account must have access to the location the file is on; if it doesn't it can't access it. You'll either need to place/copy the file to a location the Service Account has access to, give the Service Account access to the location the file(s) are currently in (may be unwise depending on your business' data privacy policies and how well privileged the account currently is), or create a new share where you can place files which need to be bulk loaded and give the Service Account access to that new share. – Thom A Jul 19 '20 at 18:11
  • 2
    Note you can also read directly from an xls(x) file, without converting it to a CSV using `OPENROWSET`. – Thom A Jul 19 '20 at 18:12
  • 2
    If you want to use C# to build a simple program to import your CSV file then you could use the SqlBulkCopy class with a common CSV File reader library – Steve Jul 19 '20 at 18:12
  • what about using just EF with list? – Iria Jul 19 '20 at 21:54

3 Answers


I see this problem as: a client wants to load data into one of the database tables.

  • If they need to automate this process, perform it frequently, or load more than one file at a time, I would streamline the process by asking the client to copy the CSV or Excel files into a server folder whenever they want to import them into the database.

Then I would write a folder-watching .NET application that reads the CSV or Excel files and imports the data into the database table. If it fails to load the data, it should move the file to an "error files" folder along with an error log. On success, the application would move the file to a "completed files" folder.
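A rough sketch of such a watcher, assuming FileSystemWatcher from the BCL; all folder paths are placeholders and the actual import routine is left as a stub:

```csharp
using System;
using System.IO;

class CsvFolderWatcher
{
    // Placeholder paths; in practice these would come from configuration.
    const string InboxDir = @"C:\import\inbox";
    const string DoneDir = @"C:\import\completed";
    const string ErrorDir = @"C:\import\error";

    static void Main()
    {
        using var watcher = new FileSystemWatcher(InboxDir, "*.csv");
        watcher.Created += (sender, e) =>
        {
            try
            {
                ImportCsv(e.FullPath); // stub: parse the file and bulk copy it
                File.Move(e.FullPath, Path.Combine(DoneDir, e.Name));
            }
            catch (Exception ex)
            {
                // On failure, move the file aside and write an error log next to it.
                File.Move(e.FullPath, Path.Combine(ErrorDir, e.Name));
                File.AppendAllText(Path.Combine(ErrorDir, e.Name + ".log"), ex.ToString());
            }
        };
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the process alive
    }

    // Hypothetical helper; e.g. load into a DataTable and use SqlBulkCopy.
    static void ImportCsv(string path) { }
}
```

One caveat: the Created event can fire while the uploader still holds the file open, so a production version would typically retry the open with a short delay.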

Advantages of this approach:

  • Multiple users can use this functionality to upload data to the server.
  • The user's machine doesn't require any kind of installation.
  • Access to the folder can be completely managed by AD.

Disadvantages:

  • This is an old approach to handling this kind of use case.
  • Users need to manually check the log or the server folder to know the status of their files.


You can try out SqlBulkCopy, which allows you to insert your data directly into the database by use of a DataTable.

More information can be found here: SqlBulkCopy.WriteToServerAsync.
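For example, a rough sketch of the async variant; the table name and connection string are placeholders, and the DataTable columns are assumed to already match the destination table:

```csharp
using System.Data;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient; // System.Data.SqlClient on .NET Framework

static async Task BulkInsertAsync(DataTable table, string connectionString)
{
    using var bulk = new SqlBulkCopy(connectionString)
    {
        DestinationTableName = "dbo.Table1", // placeholder
        BatchSize = 10_000
    };

    // Map by name so the column order in the DataTable doesn't matter.
    foreach (DataColumn c in table.Columns)
        bulk.ColumnMappings.Add(c.ColumnName, c.ColumnName);

    await bulk.WriteToServerAsync(table);
}
```

Explicit column mappings are optional, but without them SqlBulkCopy maps columns by ordinal position, which silently misloads data if the orders differ.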


Using SqlBulkCopy seems to be a very good solution, as explained here.

That answer suggests doing this:

using System.Data;
using System.Data.SqlClient;

var lines = System.IO.File.ReadAllLines(@"d:\data.txt");
if (lines.Length == 0) return;

// The first line holds the column names.
var columns = lines[0].Split(',');
var table = new DataTable();
foreach (var c in columns)
    table.Columns.Add(c);

// Remaining lines are data rows; iterate up to lines.Length, not
// lines.Length - 1, or the last row is skipped. Note that a plain
// Split(',') assumes no quoted fields containing commas.
for (int i = 1; i < lines.Length; i++)
    table.Rows.Add(lines[i].Split(','));

var connection = @"your connection string";
using (var sqlBulk = new SqlBulkCopy(connection))
{
    sqlBulk.DestinationTableName = "Table1";
    sqlBulk.WriteToServer(table);
}

This is pretty efficient: using this code I can insert 100,000 records in less than 5 seconds, which is totally acceptable performance. A direct bulk insert on the SQL server itself is of course faster and loads the data almost instantly, but when, as in my case, that is not possible, this is a decent alternative.
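If memory becomes a concern at larger volumes, SqlBulkCopy can also consume an IDataReader, so rows stream to the server instead of being materialized in a DataTable first. A sketch of that approach, assuming the third-party CsvHelper package (whose CsvDataReader implements IDataReader); the file path, table name, and connection string are taken from the answer above as placeholders:

```csharp
using System.Globalization;
using System.IO;
using CsvHelper;              // third-party package, assumed available
using Microsoft.Data.SqlClient;

// Stream rows from the CSV straight into the table, one batch at a time,
// without building a 100k-row DataTable in memory.
using var reader = new StreamReader(@"d:\data.txt");
using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);
using var dataReader = new CsvDataReader(csv);   // implements IDataReader
using var bulk = new SqlBulkCopy(@"your connection string")
{
    DestinationTableName = "Table1",
    EnableStreaming = true,
    BatchSize = 10_000
};
bulk.WriteToServer(dataReader);
```

CsvHelper also handles quoted fields and embedded commas, which the naive Split(',') above does not.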