
I am trying to upload the binary contents of a Zip file to my database. I used `Get-Content -Encoding Byte -ReadCount 0` to read the data into a variable, and I want to use that variable in an INSERT statement. Unfortunately, sqlcmd doesn't like the size of the variable and gives me this error:

Program 'SQLCMD.EXE' failed to run: The filename or extension is too long. At line:1 char:1

I have tried using the -Q option to run the query directly, and also -i to run a .sql file:


DECLARE @data varbinary(MAX)
SET @data = '$(data)'

INSERT INTO MyTable
(v1,v2,v3,v4,v5)
VALUES
(v1,v2,v3,v4,@data)

sqlcmd -S servername -E -i .\file.sql -v data = "$binarydata"

Is there a workaround for doing this?

techguy1029
  • `sqlcmd` is not the best tool for working with binary-typed data, for example, it defaults binary results to 4000 bytes unless you specify `-Y 0`, and Microsoft warns against doing so here: https://learn.microsoft.com/en-us/sql/tools/sqlcmd-utility?view=sql-server-2017 – Dai Aug 13 '19 at 20:10

1 Answer


In a SQL query/batch/.sql file, binary/varbinary/image literal data values must be in hexadecimal format with a 0x prefix:

INSERT INTO tableName ( binaryColum ) VALUES ( 0x1234567890ABCDEF )

I don't know what the maximum length of a binary literal is, but I suspect things might stop working, or be very slow, once the literal exceeds a few hundred kilobytes.
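For illustration, a rough PowerShell sketch of building such a hex literal from a file (the file path and the table/column names from the question are placeholders):

```powershell
# Read the raw bytes of the archive (path is illustrative).
$bytes = [System.IO.File]::ReadAllBytes('C:\temp\archive.zip')

# BitConverter renders bytes as "12-34-AB-..."; strip the dashes
# to get the bare hex digits for a 0x... literal.
$hex = [System.BitConverter]::ToString($bytes) -replace '-', ''

# Splice the literal into the INSERT statement (no quotes around 0x...).
$sql = "INSERT INTO MyTable (v1,v2,v3,v4,v5) VALUES (1,2,3,4, 0x$hex)"
```

Note that this still produces a very large statement text, which is exactly what sqlcmd is choking on, so it only helps for modestly sized files.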

I recommend using ADO.NET directly via PowerShell, which will also let you use binary parameter values (SqlParameter): How do you run a SQL Server query from PowerShell?
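A minimal sketch of that approach, assuming Windows PowerShell with `System.Data.SqlClient` available and using placeholder server/database names and literal values for the non-binary columns:

```powershell
# Read the file to upload.
$bytes = [System.IO.File]::ReadAllBytes('C:\temp\archive.zip')

# Connection string is an assumption - adjust server/database as needed.
$conn = New-Object System.Data.SqlClient.SqlConnection `
    'Server=servername;Database=MyDb;Integrated Security=true'
$conn.Open()
try {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'INSERT INTO MyTable (v1,v2,v3,v4,v5) VALUES (1,2,3,4,@data)'

    # Size -1 means varbinary(MAX); the bytes are sent as a parameter,
    # so no hex literal (and no statement-length limit) is involved.
    $null = $cmd.Parameters.Add('@data', [System.Data.SqlDbType]::VarBinary, -1)
    $cmd.Parameters['@data'].Value = $bytes

    [void]$cmd.ExecuteNonQuery()
}
finally {
    $conn.Dispose()
}
```

Because the binary data travels as a parameter value rather than inline SQL text, this handles arbitrarily large files without the command-line length problem.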

Dai