I have a SQL Server table with a column X of type varbinary(max). It stores the GZIP-compressed bytes of an XML string. I can export the column records to a Windows native-format BCP file like this (server and login switches omitted):
bcp "select top 5 X from mytable" -N foo.bcp
I have a very large number of records. pyodbc is about 100x slower than BCP for this, so rather than query the binaries directly into Python (which I can do), I am trying to BCP them into a flat file first and parse that file.
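For reference, the direct-query route I am trying to avoid looks roughly like this (driver, server, and database names are placeholders for my real connection details):

import gzip
import pyodbc

# Slow path: pull the varbinary(max) values straight out of the table.
conn = pyodbc.connect(
    "DRIVER={SQL Server Native Client 11.0};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("select top 5 X from mytable")
for (blob,) in cursor:
    # blob comes back as bytes, so this decompresses directly
    xml_text = gzip.decompress(blob).decode("utf-8")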
I know there is a DECOMPRESS function in SQL Server 2016+, but I am unfortunately limited to SQL Server 2014, which does not have it. So calling DECOMPRESS on the server side before the BCP export is not an option for me. I have to get the data out of the native-format file into Python and then decompress it there.
Question: How do I extract the 5 rows from the file as separate binary objects B in Python 3, so that I can decompress each one into Unicode with
gzip.decompress(B).decode('utf-8')
?
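For what it's worth, my rough idea of the parsing is below. It assumes that, for a single varbinary(max) column exported with -N, each row is written as an 8-byte little-endian length prefix followed by that many raw bytes, with a prefix of all 0xFF bytes marking NULL. I have not confirmed that layout against the native format documentation, so treat it as a guess:

import gzip
import struct

def read_native_varbinary_rows(path):
    # Assumed layout: one varbinary(max) value per row, each stored as an
    # 8-byte little-endian length prefix followed by the raw bytes.
    # A prefix of 0xFFFFFFFFFFFFFFFF is assumed to mean NULL.
    with open(path, "rb") as f:
        while True:
            prefix = f.read(8)
            if len(prefix) < 8:      # end of file
                break
            (length,) = struct.unpack("<Q", prefix)
            if length == 0xFFFFFFFFFFFFFFFF:
                yield None           # NULL row
            else:
                yield f.read(length)

for blob in read_native_varbinary_rows("foo.bcp"):
    if blob is not None:
        print(gzip.decompress(blob).decode("utf-8"))

Is that length-prefix layout actually what bcp -N writes for varbinary(max), or does the file carry additional per-row metadata that I need to skip?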
Note: I am not trying to BCP out and then BCP back in again. I am trying to extract the varbinary records from the native-encoded BCP file using Python.