
I want to read an Excel file that is 250 MB and has more than 300,000 rows. When I try to process it with an Excel Source in SSIS, it takes too much time and ultimately fails. Can anyone tell me how to split the big file into smaller ones, or suggest any other method that can help?

Hadi

2 Answers


SQLCMD is better because its timeout is MUCH longer. But then you have to save your Excel file as CSV, and I do not know if that's possible for you.

If you have a CSV file (e.g. BulkDataFile.csv) and you create a script file like this:

BULK INSERT dbo.TableForBulkData
FROM 'C:\BulkDataFile.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)

Then you can run SQLCMD -E -i import.sql (if you have saved the script as import.sql).
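
If you need to automate the Excel-to-CSV step, one option is a small C# console program. This is only a sketch: it assumes the open-source ExcelDataReader NuGet package, placeholder file paths C:\BulkDataFile.xlsx and C:\BulkDataFile.csv, and it does not quote fields that themselves contain commas.

using System;
using System.IO;
using System.Text;
using ExcelDataReader;

class ExcelToCsv
{
    static void Main()
    {
        // Required on .NET Core for the legacy code pages used by .xls files
        Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

        using var stream = File.Open(@"C:\BulkDataFile.xlsx", FileMode.Open, FileAccess.Read);
        using var reader = ExcelReaderFactory.CreateReader(stream);
        using var writer = new StreamWriter(@"C:\BulkDataFile.csv");

        // Forward-only read: only one row is held in memory at a time
        while (reader.Read())
        {
            var fields = new string[reader.FieldCount];
            for (int i = 0; i < reader.FieldCount; i++)
                fields[i] = Convert.ToString(reader.GetValue(i)) ?? "";
            writer.WriteLine(string.Join(",", fields));   // naive CSV: no quoting
        }
    }
}

The resulting CSV can then be loaded with the BULK INSERT script above.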

Luuk

The best way is to read the data in chunks; I have done this to read huge tables from SQL databases. But to read big Excel files, I implemented this in a C# application, not in SSIS, since it is more complicated. I will provide some links for more info:

If you need to do this using SSIS only, I can provide a detailed answer later, similar to what I provided in the first link above (SQLite).
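
As a rough illustration of the chunked approach described above, here is a sketch in C#. It assumes the ExcelDataReader and Microsoft.Data.SqlClient NuGet packages, a destination table dbo.TableForBulkData whose columns line up with the sheet, a first row that contains headers, and a placeholder connection string; it is not a definitive implementation.

using System;
using System.Data;
using System.IO;
using System.Text;
using ExcelDataReader;
using Microsoft.Data.SqlClient;

class ChunkedExcelImport
{
    const int ChunkSize = 10000;   // rows buffered per batch

    static void Main()
    {
        Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

        using var stream = File.Open(@"C:\BigFile.xlsx", FileMode.Open, FileAccess.Read);
        using var reader = ExcelReaderFactory.CreateReader(stream);

        // Buffer one chunk of rows at a time instead of the whole sheet
        var buffer = new DataTable();
        for (int i = 0; i < reader.FieldCount; i++)
            buffer.Columns.Add("Col" + i, typeof(string));

        reader.Read();   // assumption: first row is a header, so skip it

        while (reader.Read())
        {
            var row = buffer.NewRow();
            for (int i = 0; i < reader.FieldCount; i++)
                row[i] = Convert.ToString(reader.GetValue(i));
            buffer.Rows.Add(row);

            if (buffer.Rows.Count >= ChunkSize)
            {
                Flush(buffer);
                buffer.Clear();   // free the batch before reading further rows
            }
        }
        if (buffer.Rows.Count > 0)
            Flush(buffer);
    }

    static void Flush(DataTable batch)
    {
        // Placeholder connection string; adjust for your server and database
        using var bulk = new SqlBulkCopy("Server=.;Database=MyDb;Integrated Security=true;TrustServerCertificate=true");
        bulk.DestinationTableName = "dbo.TableForBulkData";
        bulk.WriteToServer(batch);
    }
}

Because only ChunkSize rows are held in memory at any moment, this keeps memory use flat regardless of how large the workbook is.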

SchemaMapper: C# schema mapping class library

Recently I started a new project on GitHub: a class library developed in C#. You can use it to import tabular data from Excel, Word, PowerPoint, text, CSV, HTML, JSON, and XML into a SQL Server table with a different schema definition, using a schema-mapping approach. It gives the ability to import big Excel files. Check it out at:

You can follow this Wiki page for a step-by-step guide:

Hadi