I have some relatively large F# structures built from primitive types / arrays / lists / records / DUs, but no classes. The structures are fluid, so creating tables to match them would carry a very high maintenance cost. Serializing to / deserializing from JSON works like magic.
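For example, a toy shape like the following (the type names are made up, standing in for my real structures) already round-trips with Newtonsoft.Json and no custom converters:

```fsharp
open Newtonsoft.Json

// Toy stand-ins for my real structures: records, arrays, a DU, no classes.
type Status =
    | Active
    | Archived of reason: string

type Entry = { Id: int; Tags: string[] }
type Payload = { Name: string; Status: Status; Entries: Entry[] }

let original =
    { Name = "demo"
      Status = Archived "superseded"
      Entries = [| { Id = 1; Tags = [| "a"; "b" |] } |] }

let json = JsonConvert.SerializeObject original            // F# record / DU -> JSON
let back = JsonConvert.DeserializeObject<Payload> json     // JSON -> F# record / DU
```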
Unfortunately, the resulting JSON strings can be as large as 25-250 MB, so I want to compress them before storing them in the SQL Server database. This is effectively an archive system, so performance is not an issue, lookup is not an issue, and using the file system as storage would bring its own mess, which I want to avoid.
I am fine with the SQL Server database growing as large as it needs to. On the other hand, SQL Server's built-in compression does no good in this case due to some other issues.
What I need is as follows: take an in-memory string (about 25-250 MB of JSON) -> compress it into a binary format (not the GZipped Base64 format discussed in various answers on the web!) -> store it in a SQL Server varbinary(max) column, and then be able to do it all the way back. I was expecting something as simple as JsonConvert.SerializeObject / JsonConvert.DeserializeObject, but searching the web did not produce any one-liner results. I would appreciate any advice.
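To make the question concrete, this is the shape of round trip I am after; a minimal sketch using GZipStream from System.IO.Compression (the compress / decompress names are just mine):

```fsharp
open System.IO
open System.IO.Compression
open System.Text

/// JSON string -> raw GZip bytes, ready for a varbinary(max) parameter.
let compress (json: string) : byte[] =
    let bytes = Encoding.UTF8.GetBytes json
    use output = new MemoryStream()
    use gzip = new GZipStream(output, CompressionLevel.Optimal)
    gzip.Write(bytes, 0, bytes.Length)
    gzip.Close()                 // flush the gzip footer before reading the buffer
    output.ToArray()             // ToArray still works on a closed MemoryStream

/// varbinary(max) bytes -> the original JSON string.
let decompress (data: byte[]) : string =
    use input = new MemoryStream(data)
    use gzip = new GZipStream(input, CompressionMode.Decompress)
    use reader = new StreamReader(gzip, Encoding.UTF8)
    reader.ReadToEnd()
```

so that `json |> compress |> decompress = json`. (At 250 MB the intermediate byte[] is itself large; presumably one would stream the serializer output straight into the GZipStream to avoid the extra copy, but the basic round trip is the question.)

Storing and reading the bytes is the easy part; something like the following, where the Archive table is hypothetical:

```fsharp
open System.Data
open System.Data.SqlClient

/// Assumed table: CREATE TABLE Archive (Id INT IDENTITY PRIMARY KEY, Payload VARBINARY(MAX))
let store (connStr: string) (payload: byte[]) =
    use conn = new SqlConnection(connStr)
    conn.Open()
    use cmd = new SqlCommand("INSERT INTO Archive (Payload) VALUES (@p)", conn)
    cmd.Parameters.Add("@p", SqlDbType.VarBinary, -1).Value <- payload  // -1 = MAX
    cmd.ExecuteNonQuery() |> ignore
```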