Is it possible to gzcompress data in PHP and then have Axios request it?

I've tried doing this but keep getting this error: "Malformed UTF-8 characters, possibly incorrectly encoded."

My Axios request looks like this:

axios({
    method: 'get',
    url: 'https://someapi.com/api/test',
    headers: { 'Content-Type': 'application/json', 'Accept-Encoding': 'gzip' }
})
.then(response => {
    response.data.forEach(el => {
        this.transactions.push(JSON.parse(el));
        this.transactionsFull = this.transactions;
    });
    this.loading = false;
    console.log(this.transactions);
})
.catch(e => {
    this.errors.push(e)
})
$result = openssl_decrypt($cipher_text, 'aes-256-gcm', $key, OPENSSL_RAW_DATA, $iv, $auth_tag);

$json = json_decode($result);
$channel = Channel::where('uuid', $json->payload->authentication->entityId)->first();
$gzencode = gzencode(json_encode(array('transaction' => $json, 'relation' => json_decode($channel))), 8);

Redis::lpush('transactions_gzencode', $gzencode);

$length = 0;
$transactions = Redis::lrange('transactions_gzencode', 0, -1);
foreach($transactions as $item) {
    $length += strlen($item);
}
header('Content-Encoding: gzip');
header('Content-Type: application/json');
header('Content-Length: ' . $length);
return $transactions;
Dally
  • Are you sure you want to do it exactly in PHP? Normally, it's done on the web-server side by appropriate settings (Nginx or Apache). – Ruslan Isay May 04 '19 at 16:19
  • Hi @Ruslan but wouldn't it make sense to send the data compressed as it's smaller in size and then uncompress it? – Dally May 04 '19 at 16:22
  • What you are saying is correct. But there is no actual "traffic" between PHP and web-server. Therefore there are no benefits to compressing data with PHP (maybe with some exceptions, but not sure if it's applicable for regular JS query to REST API). You can try something like this - https://php.net/manual/ru/function.gzcompress.php, but if you compress on the web-server level, you can also control gzipping for static assets (JS, CSS, HTML, images) and manage your compression policy in one place. – Ruslan Isay May 04 '19 at 16:50
  • gzcompress is exactly what I'm doing. What I'm doing is compressing large JSON objects and putting them into Redis. I'm then retrieving everything from Redis and send it to my VueJS front-end app. The problem I have is that Axios isn't decompressing the data. – Dally May 04 '19 at 16:54

1 Answer

I believe that Axios itself is not able to decompress gzip, but the browser should do it before Axios even touches the response. For the browser to do so, however, you must respond with the proper HTTP headers and a correctly formatted gzip body.

Note that to put compressed data in the HTTP response body you must use gzencode, which produces a complete gzip-format body, not gzcompress, which produces a zlib-format string, according to the PHP documentation.

Example PHP:

$compressed = gzencode(json_encode(['test' => 123]));
header('Content-Type: application/json');
header('Content-Encoding: gzip');
header('Content-Length: ' . strlen($compressed));
echo $compressed;

Example JS:

console.log(await (await fetch('/test')).json());
// {test: 123}

Edit

Since what you are trying to do is send an array of individually compressed items, you can output the data as a JSON-encoded array of base64-encoded binary compressed items.

Example of how to use pako.js to decompress the array of compressed transactions returned from the server:

PHP:

$transactions = ['first', 'second', 'third'];
echo json_encode(array_map('base64_encode', array_map('gzencode', $transactions)));

JS:

(async () => {
    const transactions = (await (await fetch('/test')).json())
        .map(atob)
        .map(blob => pako.inflate(blob, { to: 'string' }));

    console.log(transactions);
})();

Notice that this time I didn't include the gzip headers, because I am just sending a regular JSON-encoded array.

The downside of this approach is that there won't be much benefit to compressing the data, since it is converted to base64 before being sent to the client. The base64 step is necessary because otherwise json_encode would try to handle the binary data as a string, which would lead to string-encoding errors.

You can still compress the resulting JSON-encoded string before sending it to the client, as was done in the previous example, but I'm not sure the compression would still be good enough:

$compressedTransactions = array_map('gzencode', ['first', 'second', 'third']);

$compressed = gzencode(json_encode(array_map('base64_encode', $compressedTransactions)));
header('Content-Type: application/json');
header('Content-Encoding: gzip');
header('Content-Length: ' . strlen($compressed));
echo $compressed;
Thiago Barcala
  • Thanks mate, I'll try this now and then get back. – Dally May 05 '19 at 07:38
  • strlen doesn't work on the compressed data. I get this error, strlen() expects parameter 1 to be string, array given. – Dally May 05 '19 at 07:57
  • @Dally, can you write some of your php code to the question? It seems something else is wrong, cause the return type of gzencode is `string`: https://www.php.net/manual/en/function.gzencode.php. – Thiago Barcala May 05 '19 at 08:15
  • I wrote a small test and was able to request the data via AJAX. I updated the example in my answer. – Thiago Barcala May 05 '19 at 08:20
  • I've added more code to my original question mate. Is it because I have a multi-dimensional array? – Dally May 05 '19 at 08:25
  • I know what my issue was. Noob mistake, I had more than one array but now I have a new issue. net::ERR_CONTENT_DECODING_FAILED 500. I have updated my original code. – Dally May 05 '19 at 08:32
  • the problem is that concatenating gzipped data is not the same as gzipping concatenated data. You should consider storing uncompressed data in redis, and compressing it only before responding to the client, otherwise you will need to fetch from redis, decompress the individual items into an array of uncompressed data, then compress the array and send to client. – Thiago Barcala May 05 '19 at 08:34
  • I see what you mean mate. The reason I’ve compressed it into Redis is because I’m storing a lot of data. Fetching 30k rows from Redis and then outputting the result is taking about 10-11 seconds. I think it’s because I’m using the list data structure and LRANGE isn’t very efficient when fetching large amounts of data. – Dally May 05 '19 at 08:41
  • another option would be to receive the array of compressed data in the client and use a JS lib to uncompress, like https://github.com/nodeca/pako – Thiago Barcala May 05 '19 at 08:45
  • But would Axios throw an error when fetching compressed data? – Dally May 05 '19 at 08:49
  • I added a usage example to my answer. – Thiago Barcala May 05 '19 at 09:11