19

How can I specify an input sql file with a long query when using bcp? I tried using the -i option but it keeps complaining about a command-line error with no extra information. Is this possible?

Legend
  • Input SQL file? No. You can specify a data file and a format file. If you want to use SQL, perhaps bulk insert is a better choice. – Gordon Linoff May 13 '12 at 03:40
  • @GordonLinoff: Actually, I am trying to pull data from a table. Before this, I was doing a SELECT based on a string and supplying it to bcp but I have a longer query now. – Legend May 13 '12 at 03:49
  • You could create a view with the query, perhaps. – Gordon Linoff May 13 '12 at 03:53
  • @GordonLinoff: Unfortunately, I don't have sufficient permissions to create a view. – Legend May 13 '12 at 04:09
  • @Legend: You should be able to create a temporary procedure even if you don't have DDL rights: CREATE PROC #MYPROC AS SELECT * FROM MYTABLE, then run your newly created proc. The proc must return a persistent result; the best bet would be to dump the rows into a temp table and finish the proc with SELECT * FROM that temp table. – PollusB Jan 16 '17 at 19:32
  • If you have access to the SQL server, create a stored procedure and call the sproc via the BCP utility. – Fabio S. Jun 06 '18 at 15:52

9 Answers

12

I had this problem today and found a convenient workaround, at least in an ad-hoc situation.

Temporary tables can be created by any user with connect permissions. This means you can also create GLOBAL temporary tables.

Just run your query in Enterprise Manager (or sqlcmd, or whatever) using SELECT ... INTO with a global temporary table, e.g.

SELECT * 
INTO ##mytemptable
FROM SomeTable 
WHERE [massive where clause, for example] 

You can then use the temporary table in the BCP query with a simple

SELECT * FROM ##mytemptable

Then drop the temp table through Enterprise Manager once you're done:

DROP TABLE ##mytemptable
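Putting the three steps together as a shell sketch (server and table names here are placeholders, and the actual sqlcmd/bcp calls are commented out since they need a live SQL Server):

```shell
#!/bin/bash
SERVER="myserver"                         # placeholder server name
QUERY="SELECT * FROM ##mytemptable"       # the trivially short bcp query

# Step 1: materialize the long query into a global temp table
# sqlcmd -S "$SERVER" -E -Q "SELECT * INTO ##mytemptable FROM SomeTable WHERE ..."

# Step 2: export it with the short query
# bcp "$QUERY" queryout export.dat -S "$SERVER" -T -c

# Step 3: clean up
# sqlcmd -S "$SERVER" -E -Q "DROP TABLE ##mytemptable"

echo "would run: bcp \"$QUERY\" queryout export.dat -S $SERVER -T -c"
```

One caveat (see the note in a later answer): the global temp table only lives as long as the session that created it, so steps 1 and 2 need to overlap in time.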
LarryDavid
9

I fixed this another way.

I created a batch file which reads a query file and passes its content to the bcp command. See:

@ECHO off

SETLOCAL EnableDelayedExpansion

SET queryFile=%1
SET outFileName=%2

FOR /F "delims=" %%i IN (%queryFile%) DO SET join=!join! %%i

ECHO %join%

bcp "%join%" queryout %outFileName% /S.\SQLE_CAESAR /d /c /t"|" /T

The script takes two parameters:

  1. The filename containing the query;
  2. The filename to export the data to.

Run it from cmd like this: export-query.bat query.sql export.txt

I hope this helps.

Rafael Gomes Francisco
8

As far as I know, the BCP utility only supports Transact-SQL queries written directly on the command line. Ex:

bcp "SELECT Name FROM AdventureWorks.Sales.Currency" queryout Currency.Name.dat -T -c

According to its reference, the `-i` option:

Specifies the name of a response file, containing the responses to the command prompt questions for each data field when a bulk copy is being performed using interactive mode (-n, -c, -w, or -N not specified).

Notice that it differs from the sqlcmd Utility "-i" option:

Identifies the file that contains a batch of SQL statements or stored procedures. Multiple files may be specified that will be read and processed in order (...)

Thomas C. G. de Vilhena
  • +1 Thank you. Would you happen to know what to do when the `SELECT` command gets too long? – Legend May 13 '12 at 03:49
  • I never had to use long queries myself, but a quick Google search results in this link that may be helpful to you: [FIX: Bcp.exe with Long Query String Can Result in Assertion Failure](http://support.microsoft.com/kb/279180) – Thomas C. G. de Vilhena May 13 '12 at 04:25
  • 4
    When I've run into BCP out queries that were too long, we resorted to storing views on the source. Our BCP query can then simply be "select x,y,z from [VIEW]" – Jim Clouse Aug 26 '15 at 15:28
3

Try:

query=$(cat /file.sql)
export query
bcp "${query}" queryout /home/file.csv
  • `/opt/mssql-tools/bin/bcp: Argument list too long` for example with 83315393 bytes file. Maybe a bad option, if your file is really large. – rios0rios0 Sep 19 '20 at 00:30
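As the comment notes, passing the whole file as a single argument fails for very large files because it hits the kernel's per-exec argument-size limit. A minimal bash sketch (the filenames are made up) that checks the query size against ARG_MAX before deciding how to proceed:

```shell
#!/bin/bash
# Sample query file, standing in for your real /file.sql
printf 'SELECT *\nFROM SomeTable\nWHERE 1 = 1\n' > query.sql

query=$(cat query.sql)      # whole file into one shell variable
limit=$(getconf ARG_MAX)    # kernel limit on argument + environment size

if [ "${#query}" -lt "$limit" ]; then
  echo "query fits in one argument (${#query} bytes, limit ${limit})"
  # bcp "${query}" queryout /home/file.csv ...   # safe to pass directly
else
  echo "query too long for one argument; use a temp table or sqlcmd -i instead"
fi
```

Note that ARG_MAX covers the environment too, so in practice the usable headroom is somewhat smaller than the reported number.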
2

Multi-line queries can be given to bcp easily using powershell:

PS> $query = @'
    select * 
    from <table>
'@
PS> bcp $query queryout <outfile> -d <database> -T -S <server> -c
torbiak
1

I faced the same issue. This may not be a very good approach, but I did something like the following:

bcp "declare @query nvarchar(max)  set @query = (SELECT * FROM OPENROWSET(BULK 'F:\tasks\report_v2.sql', SINGLE_CLOB) AS Contents) exec sp_executesql @query" queryout %outFileName%  /c /C RAW -S . -U sa -P 123 -d blog /T

And note: if you use a global temp table, the global temp table is dropped automatically once the session that created it ends, so you can't use that approach in some situations.

mdora7
1

What really worked for me is this:

@ECHO off
setlocal enableextensions enabledelayedexpansion
SET "queryFile=%1"
SET "outFileName=%2"

SET RESULT=
FOR /F "delims=" %%i IN ('type %queryFile%') DO SET RESULT=!RESULT! %%i

echo %RESULT%

rem bcp "%RESULT%" queryout %outFileName% -t^ -r \n -T -k -c -d DB_NAME -S SERVER_NAME

`type file` is the Windows equivalent of `cat file` on Unix.
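The `FOR /F` loop in the batch file joins the file's lines with spaces. On Unix the same flattening can be done in one step with `tr` before handing the query to bcp (a sketch; `query.sql` is a stand-in filename):

```shell
#!/bin/bash
# Sample multi-line query file, standing in for the real one
printf 'SELECT Name\nFROM AdventureWorks.Sales.Currency\n' > query.sql

# Replace newlines with spaces so the query becomes a single-line string
query=$(tr '\n' ' ' < query.sql)
echo "$query"
# bcp "$query" queryout out.dat -T -c    # then pass it to bcp as one argument
```

This flattening is safe as long as the query contains no `--` line comments, which would swallow the rest of the flattened string.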

Pencilcheck
0

What I did with complex queries was create a stored procedure with the desired statement and call it from BCP:

bcp "exec db.schema.stored_procedure" queryout "c:\file.txt" -T -S localhost -t "|" -c

This worked great for me. Greetings!

Diego
0

I made my own script (called bulk.sh) to do this (not optimal, and not best practice... the script is ugly, but very functional).

#!/bin/bash
input="SQL_FILE.sql"
count=0
const=1000
lines=()
mkdir -p bulk
while IFS= read -r line
do
  lines+=("$line")
  count=$((count+1))
  check=$((count % const))
  if [[ $check -eq 0 ]]; then
     bulk="${lines[*]}"
     unset lines
     number=$(printf "%010d" $count)
     echo "$bulk" > "bulk/bulk${number}.sql"
  fi
done < "$input"

# write out any leftover lines that didn't fill a full chunk
if [[ ${#lines[@]} -gt 0 ]]; then
   number=$(printf "%010d" $count)
   echo "${lines[*]}" > "bulk/bulk${number}.sql"
fi

FILES="bulk/*"
for f in $FILES
do
  echo "Processing $f file..."
  sqlcmd -S SERVER -d DATABASE -U USER -P "PASSWORD" -i "$f"
  sleep 2s
done

You can try it with:

$ docker run -v /path/to/your/sql/file/folder:/backup -it mcr.microsoft.com/mssql-tools
$ bash bulk.sh
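The line-counting loop above can also be approximated with coreutils `split`, which handles the leftover lines automatically (a sketch using a generated stand-in input file; `-d` numeric suffixes are a GNU extension):

```shell
#!/bin/bash
# Generate a sample input of 2500 one-line statements (stand-in data)
mkdir -p bulk
seq 1 2500 | sed 's/^/SELECT /' > SQL_FILE.sql

# Split into zero-padded 1000-line chunks; the final chunk keeps the remainder
split -l 1000 -d -a 10 SQL_FILE.sql bulk/bulk

ls bulk
# each bulk/bulkNNNNNNNNNN file can then be fed to sqlcmd -i as in the loop above
```

Unlike the while-loop version, `split` copies lines verbatim, so whitespace and special characters inside the statements survive untouched.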
rios0rios0