
The context behind this question is that I am taking an image buffer, compressing it with pngquant, and then piping the compressed image to the response. Something like:

// https://www.npmjs.com/package/pngquant
const PngQuant = require('pngquant');
const stream = require('stream'); // needed below for stream.PassThrough

// start with base64-encoded png image data:
var base64data = '.......';

// then create buffer from this, as per:
//   https://stackoverflow.com/a/28440633/4070848
//   https://stackoverflow.com/a/52257416/4070848
var imgBuffer = Buffer.from(base64data, 'base64');

// set up pngquant...
const optionsArr = [ ..... ];
const myPngQuanter = new PngQuant(optionsArr);

// convert buffer into stream, as per:
//   https://stackoverflow.com/a/16044400/4070848
var bufferStream = new stream.PassThrough();
bufferStream.end(imgBuffer);

// pipe the image buffer (stream) through pngquant (to compress it) and then to res...
bufferStream.pipe(myPngQuanter).pipe(res);

I want to determine the compression ratio achieved by the pngquant operation. I can easily find the starting size with:

const sizeBefore = imgBuffer.length;

I also need the size of the compressed stream. Furthermore, this information must be available before the stream is piped to the res destination because I need to add a header to res based on the compression stats.

To get sizeAfter I've tried the length-stream module, which lets you insert a listener into the pipe (between myPngQuanter and res) to report the length as data passes through. Whilst this does report the length of the compressed stream, the listener only fires after the data has already been piped on to res, which is too late to set any headers (see the sketch below). I've also tried stream-length, but cannot get it to work at all.
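
Roughly what I tried with length-stream (a sketch, assuming that module's listener-callback API as per its README):

// https://www.npmjs.com/package/length-stream
const lengthStream = require('length-stream');

var lstream = lengthStream(function (length) {
  // fires only after the last chunk has passed through,
  // so res has already started receiving the body by now
  console.log('sizeAfter:', length);
});

bufferStream.pipe(myPngQuanter).pipe(lstream).pipe(res);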

Any help appreciated.


1 Answer


Well, streams by their nature don't really carry length information (a stream can be infinite, e.g. opening /dev/random), so the easiest option I can see is to buffer the compressed output in a temporary buffer before sending it, as sketched below. It is unfortunate that pngquant doesn't have options for operating on buffers directly, but there is not much you can do about that, besides using a different package altogether.
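
A minimal hand-rolled version of that buffering idea, using only core stream events (assuming the same bufferStream, myPngQuanter and res as in the question):

const chunks = [];
bufferStream.pipe(myPngQuanter)
  .on('data', chunk => chunks.push(chunk))
  .on('end', () => {
    const compressedBuffer = Buffer.concat(chunks);
    // the full compressed size is known here, before anything is sent to res
    console.log(compressedBuffer.length);
    res.end(compressedBuffer);
  });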

2nd edit, since stream-buffers might not work:

There is a package called stream-to-array, which allows easy implementation of a stream-to-buffer conversion. Adapting the example from its README (using Buffer.isBuffer in place of the deprecated util.isBuffer), the code becomes:

const toArray = require('stream-to-array');

toArray(bufferStream.pipe(myPngQuanter))
.then(function (parts) {
  // parts may contain non-Buffer chunks, so normalise them first
  const buffers = parts
    .map(part => Buffer.isBuffer(part) ? part : Buffer.from(part));
  const compressedBuffer = Buffer.concat(buffers);
  console.log(compressedBuffer.length); // here is the size of the compressed data
  res.write(compressedBuffer);
  res.end();
});

Or alternatively with await, if you happen to be in an async context:

const toArray = require('stream-to-array');

const parts = await toArray(bufferStream.pipe(myPngQuanter));
const buffers = parts.map(part => Buffer.isBuffer(part) ? part : Buffer.from(part));
const compressedBuffer = Buffer.concat(buffers);
console.log(compressedBuffer.length); // here is the size of the compressed data
res.write(compressedBuffer);
res.end();
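
Either way, since the whole point is to add a header based on the compression stats, you can set it from compressedBuffer.length before writing the body, for example (the X-Compression-Ratio header name here is just illustrative, not anything standard):

// before res.write(compressedBuffer):
const ratio = sizeBefore / compressedBuffer.length;
res.setHeader('X-Compression-Ratio', ratio.toFixed(2)); // illustrative header name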
  • Nice thought. With this I get `Error [ERR_STREAM_CANNOT_PIPE]: Cannot pipe, not readable` when calling `compressedStream.pipe(res)` – drmrbrewer Jan 02 '19 at 15:08
  • Hmm, odd. Now I get `TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of type string or Buffer. Received type boolean` when calling `res.write(compressedBuffer)`. Indeed, the value of `compressedBuffer` is `false` at that point, and `compressedStream.size()` is `0`. – drmrbrewer Jan 02 '19 at 15:44
  • I also notice from the README of the `stream-buffers` package that it is "Not supposed to be a speed demon, it's more for tests/debugging or weird edge cases. It works with an internal buffer that it copies contents to/from/around". I wonder how much it really slows things down by using it rather than just streaming from end-to-end (like in my original code) without the need for the "size check" before the end. – drmrbrewer Jan 02 '19 at 16:04
  • That is quite strange. See if the latest edit helps, using a different package, in case `stream-buffers` really is that slow. Having said that, it is completely possible that `pngquant` internally does a full memory read at some point, and I honestly doubt your PNG images are anywhere near large enough for this to matter. – Aurel Bílý Jan 02 '19 at 19:15
  • 1
    I like it!! Works nicely! The only small change I made is completely unrelated to the core question, which is to use `res.send(compressedBuffer)` rather than `res.write(compressedBuffer)`... see https://stackoverflow.com/a/44693016/4070848. I like the fact that, with this solution, we do the size determination based on `buffer.length` for both 'before' and 'after'. Not sure if there is a more efficient / faster way of doing this, but certainly your solution works... thanks! – drmrbrewer Jan 02 '19 at 21:31