
I need to load multiple individual Excel files into multiple SQL Server tables in the same database using SSIS.

I have 61 files that need to be loaded into 61 tables in the same SQL Server database.

I created the tables in the SQL Server database.

I created an SSIS project with a Data Flow Task, a Foreach Loop container, and variables for connectivity, as shown in the picture below.

[screenshot: SSIS package overview]

The Data Flow Task contains:

[screenshot: Data Flow layout]

Excel Source:

[screenshot: Excel Source configuration]

OLE DB Destination:

[screenshot: OLE DB Destination configuration]

The problem is that this setup reads all the files in the folder but loads them into a single table in the database.

I need each file to be loaded into its own table in the SQL Server database.

Thanks for helping.

  • Thanks @marc_s for revising. – Yamur Jun 29 '15 at 12:24
  • Are all the files and tables the exact same column structure? – Tab Alleman Jun 29 '15 at 12:27
  • Yes, the exact same column structure for all 61 files – Yamur Jun 29 '15 at 12:33
  • You'll have to use a variable as the table name. However, it seems odd to want to have 61 tables with the same structure; it would be better to have one table for the rows, linked to a table which stores a row for each spreadsheet, linked via a foreign key relationship. – Steve Ford Jun 29 '15 at 12:45
  • Thank you @SteveFord, but could you explain more? I need to have individual tables for the files; this will help with future comparisons. – Yamur Jun 29 '15 at 12:53
  • See if this helps for setting your destination table to a variable: https://social.msdn.microsoft.com/forums/sqlserver/en-US/e244d142-d376-40d4-8ae9-91707f7c22a2/ole-db-destination-table-name-or-view-name-variable-fast-load – Tab Alleman Jun 29 '15 at 13:15
  • Thank you @TabAlleman, I followed the link but I did not get step number 3 (making it dynamic) to work. I already have the table names in the DB. I want to read each file into its existing table and copy the data. – Yamur Jun 29 '15 at 13:43
  • Inside your ForEachFile loop, use a script task to set the variable to the correct table name for that file. – Tab Alleman Jun 29 '15 at 14:32
  • @Yamur I would still suggest you consider having a Batch table (ID, FileName, UploadDateTime) which holds details of the individual files and a DataRows table (ID, BatchId, RowNumber, ColA, ColB...) which holds the rows of each spreadsheet. In your for loop you would fetch the next file, add a row to the Batch table, and then load the rows to the DataRows table (setting the BatchId to the ID of the row you just inserted into the Batch table). Comparisons are easy using SELECT * FROM DataRows DR1 INNER JOIN DataRows DR2 ON DR1.RowNumber = DR2.RowNumber WHERE DR1.BatchId = 1 AND DR2.BatchId = 2 etc. – Steve Ford Jun 29 '15 at 14:50
  • @TabAlleman How should I define the variable to connect to the database tables? – Yamur Jun 29 '15 at 18:00
  • @SteveFord, thank you, but this kind of work is not easy with only SELECT. After I integrate the data in SQL Server I need to build software for processing and calculation. – Yamur Jun 29 '15 at 18:55
  • @TabAlleman, I would appreciate it if you could answer the question in detail, with links, for this scenario. – Yamur Jun 29 '15 at 18:57
  • @Yamur if you choose to implement multiple tables, every time you want to add a new file you will have to create a new table. Also if you need to add an index to improve performance you will have to add an index to all tables. If you want to find all rows which meet a certain condition you will have to query 61 tables rather than just the one. This will rapidly become unsustainable. Please look into Database Normalisation. – Steve Ford Jun 30 '15 at 09:36
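To summarize the variable-table-name approach discussed in the comments above, here is a minimal sketch. It assumes the Foreach Loop container writes each file's name into a string variable named `User::FileName`, that each target table is named after its file, and that a second string variable `User::TableName` exists (all of these names are assumptions, not from the question):

```
In the OLE DB Destination, set "Data access mode" to
"Table name or view name variable" and select @[User::TableName].

On the User::TableName variable, set EvaluateAsExpression = True
and use an expression such as:

"[dbo].[" + REPLACE( @[User::FileName], ".xlsx", "" ) + "]"
```

With this in place, each iteration of the Foreach Loop re-evaluates the expression, so each Excel file is written to the table whose name matches the file name. If the file names do not match the table names, a Script Task inside the loop (as Tab Alleman suggests) can set `User::TableName` from a lookup instead.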
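For comparison, the single-table design Steve Ford describes in the comments could be sketched as follows. The column names `ColA`/`ColB` are placeholders, since the actual spreadsheet columns are not given in the question:

```sql
-- Hypothetical sketch of the normalized design suggested in the comments
CREATE TABLE Batch (
    ID             INT IDENTITY PRIMARY KEY,
    FileName       NVARCHAR(260) NOT NULL,
    UploadDateTime DATETIME NOT NULL DEFAULT GETDATE()
);

CREATE TABLE DataRows (
    ID        INT IDENTITY PRIMARY KEY,
    BatchId   INT NOT NULL REFERENCES Batch(ID),
    RowNumber INT NOT NULL,
    ColA      NVARCHAR(255),  -- placeholder spreadsheet columns
    ColB      NVARCHAR(255)
);

-- Comparing two files then becomes a join on RowNumber:
SELECT *
FROM DataRows DR1
INNER JOIN DataRows DR2 ON DR1.RowNumber = DR2.RowNumber
WHERE DR1.BatchId = 1 AND DR2.BatchId = 2;
```

This design needs only one Data Flow destination and no dynamic table names: the Foreach Loop inserts a `Batch` row per file and loads all rows into `DataRows` with the new `BatchId`.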

0 Answers