
I have some very large images (10,000 x 10,000 pixels) and I want to create small thumbnails for them. Unfortunately, I can't read them into memory using Imagick (the ImageMagick PHP extension) because I run out of memory.

On Android devices, when loading a Bitmap from disk, there is an option to downsample it as it is being loaded into memory. That lets me load a smaller version of the image into memory (by reading only every 2nd or every 4th pixel, etc.) and avoid the problem of gigantic files breaking everything.

I'm trying to find a way to do something like that with Imagick. I've found the setOption() function, which lets me pass options similar to the command-line parameters of the ImageMagick tools.

One of those options is sample:offset, which is documented as taking a geometry definition. One of the possible geometry forms is a percentage, like 25%, which the documentation says means "Height and width both scaled by specified percentage."

When I try to load a PNG that is 10988 x 5782 using the following code, the PNG is still loaded into memory at full size (my localhost can just barely handle it, but the server runs out of memory):

$im = new \Imagick();
$im->setOption('sample:offset', '25%');
$im->readImage($path);

Am I using setOption() correctly? Is this the correct option to set? Do I need a different value?

Here are some other formats I tried that also didn't seem to have any effect:

$im->setOption('sample:offset', '25%+0+0');
$im->setOption('sample:offset', '700x700'); // to scale it down to 700 pixels by 700 pixels
$im->setOption('sample:offset', '700x700+0+0');

I've been able to achieve a solution for JPEG files using $im->setOption('jpeg:size', $smallwidth.'x'.$smallheight); but I need to be able to handle PNG and other image types as well.

Kenny Wyland
  • Avoiding Imagick and generating the thumbnail through a command-line call to ImageMagick directly would be a sensible thing to do here. Source: I am the maintainer of Imagick, and knowing when not to use it is a good thing. – Danack Aug 12 '20 at 19:00
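
For reference, that command-line approach might look something like the sketch below. It assumes the convert binary is on the PATH (on ImageMagick 7 the entry point is magick), and the -limit values are only illustrative; they cap ImageMagick's own memory use so it falls back to a disk cache for huge images, while PHP's memory limit stays out of the picture.

// Resize in a separate process so PHP's memory limit is not involved.
$src  = '/path/to/huge.png';
$dest = '/path/to/thumb.png';

$cmd = sprintf(
    'convert -limit memory 256MiB -limit map 512MiB %s -thumbnail 700x700 %s',
    escapeshellarg($src),
    escapeshellarg($dest)
);

exec($cmd, $output, $status);
if ($status !== 0) {
    throw new \RuntimeException('convert failed: ' . implode("\n", $output));
}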

1 Answer


Shrink on load is a feature of libjpeg, the JPEG loading library. It's not a feature of ImageMagick.

It's possible because of the way that JPEG images are encoded: they are split into 8x8 pixel blocks and JPEG encodes the average for that block, plus a set of DCT coefficients. When you decode, you can get the 8x8 average, then only reconstruct enough of the block to make a 4x4 pixel version (for example). That's why libjpeg offers 1/8, 1/4 and 1/2 size decode. ImageMagick exposes this library feature with the jpeg:size option.
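
With Imagick that hint has to be set before readImage() so the decoder sees it. A minimal sketch (the 700x700 size and the path variables are just examples; libjpeg will decode at the smallest 1/2, 1/4 or 1/8 size that still covers the requested dimensions):

$im = new \Imagick();
$im->setOption('jpeg:size', '700x700'); // shrink-on-load hint, must be set before readImage()
$im->readImage($jpegPath);
$im->thumbnailImage(700, 700, true);    // finish the resize to the exact target size
$im->writeImage($thumbPath);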

PNG images are not encoded like this. They use a simple predictor and encode the prediction errors with zlib as a single huge datastream. You can't decode part of the image, and you can't decode at a lower resolution.

The only way to handle large PNG images is to stream them. You decode (for example) 10 scanlines of pixels, process them, write the output, then decode the next set of scanlines. ImageMagick does not work this way, so what you need is not possible.

As I posted in your previous question, I think you'll need a streaming image processing library, such as libvips.
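
With the php-vips binding (the jcupitt/vips composer package), the whole job is a couple of lines; a minimal sketch, assuming the package and the libvips library are installed:

use Jcupitt\Vips;

// thumbnail() streams the source where the format allows it, keeping memory
// use low: the full 10,000 x 10,000 image never has to sit in RAM at once.
$thumb = Vips\Image::thumbnail('/path/to/huge.png', 700);
$thumb->writeToFile('/path/to/thumb.png');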

jcupitt