
I am trying to insert data from multiple CSV files in a single folder into a single table.

I can do a BULK INSERT for one file using the following code:

USE [dbname]
GO
BULK INSERT tablename
FROM 'path to csv files'
WITH
(
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = ',',  -- CSV field delimiter
    ROWTERMINATOR = '\n',   -- row terminator, shifts control to the next row
    TABLOCK
)
GO
jarlh

1 Answer


The first thing I would say is that I personally wouldn't do this in T-SQL; I'd use something better suited to the task, such as the SSIS Multiple Flat Files connection manager. It is possible in T-SQL, but it is not the most robust approach and it has limitations (such as restrictions on which directories can even be accessed).

-- Set the directory to check for files
DECLARE @Directory NVARCHAR(200) = N'C:\Your File Location\';

-- Create a table variable to store the files output from xp_dirtree
DECLARE @Files TABLE (FileName NVARCHAR(255), Depth INT, IsFile BIT);

-- Get the files from the folder
INSERT @Files(FileName, Depth, IsFile)
EXECUTE master..xp_dirtree @Directory, 1, 1;

-- Select the file names and use STRING_AGG to combine the BULK INSERT statements into a single string to execute
DECLARE @SQL NVARCHAR(MAX) = 
    (  SELECT   STRING_AGG(CONCAT('BULK INSERT YourTableName FROM ''', 
                              CONCAT(@Directory, f.FileName), '''  
                            WITH (
                                FIELDTERMINATOR = '','',
                                ROWTERMINATOR = ''\n'',
                                FIRSTROW = 2  -- skip the header row
                            )'), ';' + CHAR(10))
        FROM    @Files AS f
        WHERE   f.IsFile = 1
        AND     f.FileName LIKE '%.csv' -- Optionally limit files
    ); 

PRINT @SQL;
-- Uncomment the line below once you're happy with the printed SQL
--EXECUTE sp_executesql @SQL; 
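
For example, assuming the folder contains two hypothetical files named File1.csv and File2.csv, the printed SQL would look roughly like this:

BULK INSERT YourTableName FROM 'C:\Your File Location\File1.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);
BULK INSERT YourTableName FROM 'C:\Your File Location\File2.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
);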

Or, if you want a bit more control over error handling, you could use a cursor to iterate over the files:

-- Set the directory to check for files
DECLARE @Directory NVARCHAR(200) = N'C:\Import Location\';

-- Create a table variable to store the files output from xp_dirtree
DECLARE @Files TABLE (FileName NVARCHAR(255), Depth INT, IsFile BIT);

-- Get the files from the folder
INSERT @Files(FileName, Depth, IsFile)
EXECUTE master..xp_dirtree @Directory, 1, 1;

--declare a cursor to loop through the files, filtered for csv
DECLARE FileCursor CURSOR LOCAL STATIC FORWARD_ONLY READ_ONLY 
FOR
SELECT  FilePath = CONCAT(@Directory, f.FileName)
FROM    @Files AS f
WHERE   f.IsFile = 1
AND     f.FileName LIKE '%.csv'; -- Optionally limit files

OPEN FileCursor;
DECLARE @FilePath NVARCHAR(255);
FETCH FROM FileCursor INTO @FilePath;

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @FilePath;
    DECLARE @SQL NVARCHAR(MAX) = 
        CONCAT('BULK INSERT YourTableName FROM ''', @FilePath, '''  
                WITH (
                    FIELDTERMINATOR = '','',
                    ROWTERMINATOR = ''\n'',
                    FIRSTROW = 2  -- skip the header row
                )');
    BEGIN TRY
        EXECUTE sp_executesql @SQL; 
    END TRY
    BEGIN CATCH
        -- Handle the error, e.g. report the failing file and carry on with the next one
        PRINT CONCAT('Failed to import ', @FilePath, ': ', ERROR_MESSAGE());
    END CATCH 
    FETCH NEXT FROM FileCursor INTO @FilePath; 
END


CLOSE FileCursor;
DEALLOCATE FileCursor;
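
If you want to do more than just print the failures, one minimal sketch (assuming you add an extra @Errors table variable, which is not part of the original script) is to collect the failing files inside the CATCH block so you can review or re-run them afterwards:

-- Declare alongside the other variables, before opening the cursor
DECLARE @Errors TABLE (FilePath NVARCHAR(255), ErrorMessage NVARCHAR(4000));

-- Inside BEGIN CATCH ... END CATCH, record the file that failed
INSERT @Errors (FilePath, ErrorMessage)
VALUES (@FilePath, ERROR_MESSAGE());

-- After the loop has finished, review the failures
SELECT * FROM @Errors;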

As I say though, SSIS or another dedicated ETL tool is a better choice.

GarethD
  • I would mention `xp_dirtree` is an undocumented system extended stored proc so it may change or be removed without notice in a future SQL Server version or CU. – Dan Guzman Jan 23 '23 at 20:03