
I have some code like this that I use to do a BULK INSERT of a data file into a table, where the data file and table name are variables:

DECLARE @sql AS NVARCHAR(1000)
SET @sql = 'BULK INSERT ' + @tableName + ' FROM ''' + @filename + ''' WITH (CODEPAGE=''ACP'', FIELDTERMINATOR=''|'')'

EXEC (@sql)

This works fine for standard tables, but now I need to do the same sort of thing to load data into a temporary table (for example, #MyTable). But when I try this, I get the error:

Invalid Object Name: #MyTable

I think the problem is due to the fact that the BULK INSERT statement is constructed on the fly and then executed using EXEC, and that #MyTable is not accessible in the context of the EXEC call.
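
For concreteness, here's roughly what I'm attempting (the column definitions below are just placeholders; @filename is the same variable as above):

CREATE TABLE #MyTable (Field1 VARCHAR(100), Field2 VARCHAR(100))

DECLARE @sql AS NVARCHAR(1000)
SET @sql = 'BULK INSERT #MyTable FROM ''' + @filename + ''' WITH (CODEPAGE=''ACP'', FIELDTERMINATOR=''|'')'

EXEC (@sql)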

The reason that I need to construct the BULK INSERT statement like this is that I need to insert the filename into the statement, and this seems to be the only way to do that. So, it seems that I can either have a variable filename, or use a temporary table, but not both.

Is there another way of achieving this - perhaps by using OPENROWSET(BULK...)?


UPDATE: OK, so what I'm hearing is that BULK INSERT & temporary tables are not going to work for me. Thanks for the suggestions, but moving more of my code into the dynamic SQL part is not practical in my case.

Having tried OPENROWSET(BULK...), it seems that it suffers from the same problem: it cannot deal with a variable filename, so I'd need to construct the SQL statement dynamically as before (and thus would not be able to access the temp table).
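
For reference, this is the kind of thing I tried. The file path in OPENROWSET(BULK ...) has to be a string literal, so a variable filename still forces me into dynamic SQL (in my real case I'd also need a format file rather than SINGLE_CLOB, but the point stands; the path below is just an example):

SELECT BulkColumn
FROM OPENROWSET(BULK 'C:\MyData\MyFile.txt', SINGLE_CLOB) AS f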

So, that leaves me with only one option, which is to use a non-temp table and achieve process isolation in a different way (by ensuring that only one process can be using the tables at any one time - I can think of several ways to do that).
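
For example, one option would be to serialize access with an application lock (just a sketch; the resource name and timeout are made up):

BEGIN TRAN

DECLARE @lockResult INT
EXEC @lockResult = sp_getapplock @Resource = 'MyBulkLoad', @LockMode = 'Exclusive', @LockOwner = 'Transaction', @LockTimeout = 60000

IF @lockResult >= 0
BEGIN
    -- safe to TRUNCATE and BULK INSERT into the shared staging table here
    COMMIT TRAN   -- releases the lock
END
ELSE
    ROLLBACK TRAN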

It's annoying. It would have been much more convenient to do it the way I originally intended. Just one of those things that should be trivial, but ends up eating a whole day of your time...

Gary McGill
  • did you try static bulk insert into temp table? – Andrey Mar 04 '10 at 17:59
  • @Andrey: to be honest I hadn't, but I have now - and it works as I expected. – Gary McGill Mar 04 '10 at 18:07
  • I know this is already solved, but just to mention: you could write a SQLCLR stored proc to take in any of these parameters (e.g. filename) and spit out the results. At that point you just create the temp table, then `INSERT INTO #mytemp EXEC SQLCLRproc @filename='path.txt';`. And such a proc already exists in the [SQL#](http://www.SQLsharp.com/) library (that I created). The proc is called `File_SplitIntoFields` and allows for specifying field names and datatypes (not sure about `CODEPAGE='ACP'` though). I should mention that it is _not_ in the Free version, but it makes this very easy. – Solomon Rutzky Jan 15 '15 at 22:10

4 Answers


You could always construct the #temp table in dynamic SQL. For example, right now I guess you have been trying:

CREATE TABLE #tmp(a INT, b INT, c INT);

DECLARE @sql NVARCHAR(1000);

SET @sql = N'BULK INSERT #tmp ...' + @variables;

EXEC master.sys.sp_executesql @sql;

SELECT * FROM #tmp;

This makes it tougher to maintain (readability suffers), but it gets around the scoping issue. Note that anything which needs to read #tmp now has to happen inside the same dynamic batch (hence the final SELECT), since the table is dropped as soon as that inner scope ends:

DECLARE @sql NVARCHAR(MAX);

SET @sql = N'CREATE TABLE #tmp(a INT, b INT, c INT);

BULK INSERT #tmp ...' + @variables + ';

SELECT * FROM #tmp;';

EXEC master.sys.sp_executesql @sql;

EDIT 2012-01-12

In light of how my almost 2-year old answer was suddenly deemed incomplete and unacceptable, by someone whose answer was also incomplete, how about:

CREATE TABLE #outer(a INT, b INT, c INT);

DECLARE @sql NVARCHAR(MAX);

SET @sql = N'SET NOCOUNT ON; 

CREATE TABLE #inner(a INT, b INT, c INT);

BULK INSERT #inner ...' + @variables + ';

SELECT * FROM #inner;';

INSERT #outer EXEC master.sys.sp_executesql @sql;
Aaron Bertrand

It is possible to do everything you want. Aaron's answer was not quite complete.

His approach is correct, up to creating the temporary table in the inner query. Then, you need to insert the results into a table in the outer query.

The following code snippet reads the lines of a file into the table variable @Lines:

declare @fieldsep char(1) = ',';
declare @recordsep char(1) = char(10);

declare @Lines table (
    line varchar(8000)
);

declare @sql varchar(8000) = ' 
    create table #tmp (
        line varchar(8000)
    );

    bulk insert #tmp
        from '''+@filename+'''
        with (FirstRow = 1, FieldTerminator = '''+@fieldsep+''', RowTerminator = '''+@recordsep+''');

    select * from #tmp';

insert into @Lines
    exec(@sql);

select * from @Lines;
Gordon Linoff
  • you've cracked it! Well, nearly: you need to select from #tmp as the last statement in the dynamic SQL, but apart from that - success! Thanks. (For those that hope to use this solution on SQL Server 2005 and earlier, you'll find you need to make some syntax changes, but the principle is sound). – Gary McGill Jan 12 '12 at 11:27
  • 7
    Wow, the lack of the outer insert (given I had no idea what @tablename should be, and the goal was to insert into a #tmp table, not a table variable) made my answer incomplete? For a question from almost two years ago? Welcome to StackOverflow. Also Gary the answer still wasn't complete, but because it's newer you changed your accept? I'm so confused. – Aaron Bertrand Jan 12 '12 at 19:04
  • Hello, I have the same issue but it doesn't work for me. I am working with Azure SQL Server. – Lidou123 May 02 '19 at 13:26
  • Neither works for me, on SQL Server 2012 at least, the field separator is completely ignored. – ner0 May 25 '20 at 17:33

Sorry to dig up an old question, but here's a quicker solution in case someone stumbles onto this thread.

This bulk inserts a file of unknown width, with \n row terminators, into a temp table that is created outside of the EXEC statement.

DECLARE     @SQL VARCHAR(8000)

IF OBJECT_ID('TempDB..#BulkInsert') IS NOT NULL
BEGIN
    DROP TABLE #BulkInsert
END

CREATE TABLE #BulkInsert
(
    Line    VARCHAR(MAX)
)

SET @SQL = 'BULK INSERT #BulkInsert FROM ''##FILEPATH##'' WITH (ROWTERMINATOR = ''\n'')'
EXEC (@SQL)

SELECT * FROM #BulkInsert

As further evidence that dynamic SQL within an EXEC statement has access to temp tables created outside of the EXEC statement: http://sqlfiddle.com/#!3/d41d8/19343

DECLARE     @SQL VARCHAR(8000)

IF OBJECT_ID('TempDB..#BulkInsert') IS NOT NULL
BEGIN
    DROP TABLE #BulkInsert
END

CREATE TABLE #BulkInsert
(
    Line    VARCHAR(MAX)
)
INSERT INTO #BulkInsert
(
    Line
)
SELECT 1
UNION SELECT 2
UNION SELECT 3

SET @SQL = 'SELECT * FROM #BulkInsert'
EXEC (@SQL)

Further support, written for SQL Server 2000: http://technet.microsoft.com/en-us/library/aa175921(v=sql.80).aspx

The example at the bottom of that link:

DECLARE @cmd VARCHAR(1000), @ExecError INT
CREATE TABLE #ErrFile (ExecError INT)
SET @cmd = 'EXEC GetTableCount ' + 
'''pubs.dbo.authors''' + 
' INSERT #ErrFile VALUES(@@ERROR)'
EXEC(@cmd)
SET @ExecError = (SELECT * FROM #ErrFile)
SELECT @ExecError AS '@@ERROR'
deeg

http://msdn.microsoft.com/en-us/library/ms191503.aspx

I would advise creating a table with a unique name before bulk inserting.
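
Something along these lines, perhaps (a rough sketch - the table name prefix, column and file path are made up; as the comments below point out, any follow-up processing would also need to use the generated name, or copy the rows somewhere else first):

DECLARE @filename NVARCHAR(260)
SET @filename = N'C:\MyData\MyFile.txt'

DECLARE @stage SYSNAME
SET @stage = N'BulkStage_' + REPLACE(CONVERT(NVARCHAR(36), NEWID()), N'-', N'')

DECLARE @sql NVARCHAR(MAX)
SET @sql = N'CREATE TABLE dbo.' + QUOTENAME(@stage) + N' (Line VARCHAR(8000));
BULK INSERT dbo.' + QUOTENAME(@stage) + N' FROM ''' + @filename + N''' WITH (ROWTERMINATOR = ''\n'');'
EXEC (@sql)

-- ...copy the rows out of the uniquely named table here, then clean up...

SET @sql = N'DROP TABLE dbo.' + QUOTENAME(@stage) + N';'
EXEC (@sql)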

Andrey
  • Thanks, but I'm not sure what you're suggesting? If I create the table with a unique name that therefore isn't known till runtime, then I still have to dynamically construct the BULK INSERT statement? Wait - do you mean I should create a *non-temporary* table (which I can then drop later on)? Or a global temp table, I suppose. Hmmm... I guess that *might* work, but that would mean that all the *other* code that processes the table once the data has been inserted would *also* have to be dynamically-constructed in order to use the correct table name. That sounds like a nightmare... – Gary McGill Mar 04 '10 at 18:26
  • you answered your own question :) The "other code" would also have to be aware of the temp table. You can do this: create a non-temp table, bulk insert into it, copy from the non-temp table to a temp table, drop the non-temp table, have fun – Andrey Mar 04 '10 at 18:39
  • No, do not use a global temp table. All this will do is ensure that two different users running at roughly the same time will mix their results together (or get an error when they try to create a ##table that already exists). I have yet to find a real, practical use case for a global temp table. – Aaron Bertrand Mar 04 '10 at 18:59
  • I think there's a flaw in your plan... in order to "copy from nontemp to temp", I'd need to use dynamic SQL (so that I could insert the name of the nontemp table with the generated unique name) - and once again, the dynamic SQL would not be able to access the temp table. – Gary McGill Mar 05 '10 at 10:07
  • @Aaron: I only meant that if I was going to create a non-temp table with a unique name, then it could equally well be a global temp table *with a unique name*. The only benefit being that the table would be automatically dropped at some point, so there would be no risk of the database getting cluttered up with tables that I'd created but failed to drop explicitly. I agree that - on their own - global temp tables do not actually help solve the problem at hand. – Gary McGill Mar 05 '10 at 10:10