
I have a table in SQL Server with a `varbinary(MAX)` column that is filled using the SQL `COMPRESS` function applied to a JSON string containing UTF-8 characters, for example `{"id": 12, "title": "فروش"}`. In SQL, `CAST(DECOMPRESS(data) AS NVARCHAR(MAX))` returns the correct result.

In C# I use this code to decompress the data column:

public static string Unzip(byte[] bytes)
{
    using (var msi = new MemoryStream(bytes))
    using (var mso = new MemoryStream())
    {
        using (var gs = new GZipStream(msi, CompressionMode.Decompress))
        {
            gs.CopyTo(mso);
        }
        return Encoding.UTF8.GetString(mso.ToArray());
    }
}

But the result is an invalid string:

[screenshot: the decoded string appears garbled]

jarlh
Hamed Mahdizadeh

3 Answers

Score: 4

I believe your string is in "Unicode" format (UTF-16LE, which is what SQL Server uses for `NVARCHAR` data), but you are trying to decode it as UTF-8. Use `Encoding.Unicode` instead of `Encoding.UTF8`.
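A minimal sketch of the suggested fix, assuming the value was compressed from an `NVARCHAR` string (so the decompressed bytes are UTF-16LE). The only change to the question's method is the encoding; the `Main` method simulates `COMPRESS` locally so the round trip can be checked without a database:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class Demo
{
    // SQL Server's COMPRESS() produces a GZip stream; NVARCHAR input is
    // UTF-16LE, which .NET exposes as Encoding.Unicode.
    public static string Unzip(byte[] bytes)
    {
        using (var msi = new MemoryStream(bytes))
        using (var mso = new MemoryStream())
        {
            using (var gs = new GZipStream(msi, CompressionMode.Decompress))
            {
                gs.CopyTo(mso);
            }
            // Decode as UTF-16LE, not UTF-8.
            return Encoding.Unicode.GetString(mso.ToArray());
        }
    }

    static void Main()
    {
        // Simulate COMPRESS(N'...'): GZip over the UTF-16LE bytes.
        var original = "{\"id\": 12, \"title\": \"فروش\"}";
        byte[] compressed;
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
            {
                var utf16 = Encoding.Unicode.GetBytes(original);
                gz.Write(utf16, 0, utf16.Length);
            }
            compressed = ms.ToArray();
        }
        Console.WriteLine(Unzip(compressed) == original); // True
    }
}
```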

Morten Bork
Score: 0

`NVARCHAR` data is always stored as UTF-16 (what .NET calls "Unicode"), while `VARCHAR` is returned as UTF-8 only when the column uses a `_UTF8` collation. You can also use an explicit `COLLATE` clause to force an encoding:

SELECT CAST('abc' AS varchar(5)) COLLATE Latin1_General_100_CI_AS_SC_UTF8
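The same rule applies on the C# side when decoding the decompressed bytes. A small sketch (the sample string is an assumption) showing which `Encoding` matches which column type, and why mixing them up produces garbage like in the question:

```csharp
using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        var text = "فروش";

        // NVARCHAR data is UTF-16LE: decode with Encoding.Unicode.
        byte[] fromNvarchar = Encoding.Unicode.GetBytes(text);
        Console.WriteLine(Encoding.Unicode.GetString(fromNvarchar) == text); // True

        // VARCHAR under a *_UTF8 collation is UTF-8: decode with Encoding.UTF8.
        byte[] fromUtf8Varchar = Encoding.UTF8.GetBytes(text);
        Console.WriteLine(Encoding.UTF8.GetString(fromUtf8Varchar) == text); // True

        // Decoding UTF-16LE bytes as UTF-8 yields mojibake, not the original.
        Console.WriteLine(Encoding.UTF8.GetString(fromNvarchar) == text);    // False
    }
}
```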

Dave Black
Score: -1

In your `SELECT` you should use `DECOMPRESS`.

Source:

https://learn.microsoft.com/it-it/sql/t-sql/functions/decompress-transact-sql?view=sql-server-ver15

Marco Salerno
  • I use Entity Framework for fetching data. Is there a method in EF which does the same job? – Hamed Mahdizadeh Aug 11 '20 at 10:57
  • Nope, you can make a view which returns the result already decompressed and use it with EF. – Marco Salerno Aug 11 '20 at 10:59
  • Using `DECOMPRESS` would defeat the purpose of returning compressed data from SQL. This doesn't answer the OP's question, as they are trying to read a compressed stream, not decompress the stream and then read it. – Dave Black Jun 21 '22 at 16:29
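The view approach suggested in the comments could be sketched as follows (the table, view, and column names are hypothetical); Entity Framework is then mapped to the view instead of the base table:

```sql
-- Hypothetical view exposing the already-decompressed JSON as NVARCHAR,
-- so EF can map a plain string property instead of the varbinary column.
CREATE VIEW dbo.MyTable_Decompressed AS
SELECT id,
       CAST(DECOMPRESS(data) AS NVARCHAR(MAX)) AS json_text
FROM dbo.MyTable;
```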