Any advice would be appreciated. I've got a JSON value in my web application that I'd like to gzip and upload to S3 through a presigned URL.
I'm able to upload plain JSON successfully, but every attempt to gzip the JSON first and then upload it fails.
The three ways I've tried to build the gzipped JSON are:
// example json
const someJson = { testOne: 'a', testTwo: 'b' };
// Attempt one
const stringUtf16 = JSON.stringify(someJson);
const resultAsBinString = pako.gzip(stringUtf16);
// Attempt two
const stringUtf16 = JSON.stringify(someJson);
const resultAsBinString = pako.gzip(stringUtf16, { to: 'string' });
// Attempt three
const stringUtf16ThatWeNeedInUtf8 = JSON.stringify(someJson);
const stringUtf8 = unescape(encodeURIComponent(stringUtf16ThatWeNeedInUtf8));
const resultAsBinString = pako.gzip(stringUtf8);
For each attempt, I uploaded resultAsBinString through Angular's HTTP client, with the headers Content-Type: 'application/x-gzip' and Content-Encoding: 'gzip'.
But when the file is afterwards downloaded from S3 (if the upload doesn't fail outright, which it often does with a network error) and I try to unzip it with gzip or gunzip in the terminal, I get the error: 'not in gzip format'.
Sources I've tried to follow:
https://github.com/nodeca/pako/issues/55
https://github.com/nodeca/pako/blob/master/examples/browser.html