I'm executing the following command:
sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0
Here sqlfile.sql is a handful of DECLARE statements for variables, followed by a tree of nested SELECT ... FOR XML PATH('...') statements that ultimately generate one big XML string as output.
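For illustration only (the real queries are more involved and these table and column names are made up), the shape of sqlfile.sql is roughly:

DECLARE @Since date = '2024-01-01';  -- one of several variables

SELECT o.OrderId AS '@id',
       (SELECT i.Sku AS '@sku'
        FROM dbo.OrderItem i
        WHERE i.OrderId = o.OrderId
        FOR XML PATH('Item'), TYPE)
FROM dbo.[Order] o
WHERE o.OrderDate >= @Since
FOR XML PATH('Order'), ROOT('Orders');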
In a PowerShell command prompt, I pipe the output to a file:
sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0 | Out-File C:\path\to\output.xml
(Alternatively, I could use -o on the command itself; I have no strong preference either way. See the -o form below.)
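For reference, the -o form would be the same command, just letting sqlcmd write the file itself:

sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0 -o C:\path\to\output.xml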
The goal here is to have a single XML file that can then be loaded into another downstream process. This works well, save for one detail that currently has me stuck: the output (to the file) is auto-wrapped at 2,033 characters, which means it generates broken XML whenever (as in most cases) the wrap lands inside a tag:
<SomeTag>This is some content just to illustrate.</Som
eTag>
I've tried using -w in the sqlcmd arguments. In testing, it would successfully wrap at values lower than 2,033, but at any value higher than 2,033 (20000, for example, which would be more than wide enough) that same 2,033-character maximum remained.
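Concretely, one of those attempts (20000 is well within the documented -w range of 9 to 65535):

sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0 -w 20000 | Out-File C:\path\to\output.xml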
I also tried calling -replace before piping the output to the file, to strip the line breaks out, such as:
([the sqlcmd command above]) -replace "\r", "" | Out-File C:\path\to\output.xml
That included attempts with \r, \n, and \r\n, and with back-ticks instead of back-slashes, all to no avail. (In testing, -replace "KnownString", "TEST" did successfully replace the known string with TEST, so the replace mechanism itself is working.)
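Spelled out, the variants looked like the following; the back-tick form uses PowerShell's own escape characters for literal CR/LF rather than regex escapes:

(sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0) -replace "\n", "" | Out-File C:\path\to\output.xml
(sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0) -replace "\r\n", "" | Out-File C:\path\to\output.xml
(sqlcmd -i "\\path\to\sqlfile.sql" -S "ServerName" -d "DBName" -y 0) -replace "`r`n", "" | Out-File C:\path\to\output.xml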
Perhaps tunnel vision has taken over here. Am I missing something easier? The goal is a PowerShell command-line command that executes a .sql file and puts valid XML (it doesn't need to be pretty-printed, although that would be cool) into an .xml file.
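As a sanity check on any candidate fix: casting the file contents to [xml] in PowerShell throws a parse error on broken markup, so it makes a quick validity test:

[xml](Get-Content C:\path\to\output.xml -Raw)   # throws if the XML is not well-formed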
Is there something I can add/modify here which would break out of that 2,033-character width limit? Or perhaps a more creative approach to achieve the same goal?