I'm processing 5MB - 6MB images and reducing them to under 600KB each using PHP Imagick, usually 3,000 to 5,000 images at a time. However, the process is taking 8-12 hours to complete. I've tried two ways of handling this: 1) retrieving the images remotely using Guzzle Pool and storing them locally, then running the conversion, and 2) retrieving the images remotely into an Imagick object, processing them, then saving locally. Both methods take a huge amount of time to complete. The code below that converts and saves the images is the same for both methods, except for reading the image from file when I already have it saved locally.

$imagick = new Imagick();
$imagick->readImageBlob($source);             // $source holds the raw image bytes
$imagick->setImageFormat('jpg');              // output as JPEG
$imagick->setOption('jpeg:extent', '600kb');  // have ImageMagick search for a quality that fits under 600KB
$imagick->stripImage();                       // drop EXIF/ICC metadata
$imagick->writeImage($destination);           // $destination is the local output path

Wondering if there is something else I can do to speed things up.

rbruhn
  • Imagick is a bit slow and resource hungry... you could try, if you're just resizing images, to use the GD library instead - it's less sophisticated but faster. Actually, copying the images to a local machine and processing them with Imagick via a system call from Perl is quicker than PHP. – CD001 Apr 13 '17 at 15:15
  • As I understand it imagick's jpeg:extent tries different compression levels until it achieves the desired file size (see [this answer](http://stackoverflow.com/a/19639344/1400579)). Anything you can do to shortcut that process will help with speed. Are you only reducing file size or pixel dimensions as well? – DaveP Apr 13 '17 at 15:23
  • @CD001 - From what I've read, and some tests I've seen, using the command line doesn't offer any more speed than not. – rbruhn Apr 14 '17 at 15:47
  • @DaveP - Only file size. And thanks for using the word dimensions. Trying to do a search for file size brings up so many dimension discussions it's difficult to sift through them. – rbruhn Apr 14 '17 at 15:49

1 Answer

I have several suggestions, and you can probably combine them if you wish. It would have helped if you had specified your OS and the image types in your question...

First suggestion - process in parallel

Do them in parallel, with GNU Parallel. So instead of the following sequential code:

for f in *.png; do
    ./Process "$f"
done

you would do:

parallel ./Process {} ::: *.png
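
If the images are already on disk, Process could itself be a small script that applies the same jpeg:extent trick as your PHP code. This is only a sketch, assuming ImageMagick 7's magick CLI is installed and a resized/ output directory already exists:

#!/bin/bash
# Process: reduce one JPEG to under 600KB and strip its metadata
# Usage: ./Process input.jpg
in="$1"
magick "$in" -strip -define jpeg:extent=600kb "resized/$(basename "$in")"

You would then run:

parallel ./Process {} ::: *.jpg

which keeps one conversion per CPU core busy until the whole batch is done.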

Second suggestion - use "shrink on load"

If you are reducing the pixel dimensions of your images, e.g. from 5,000x3,000 pixels down to 800x600 pixels, and they are JPEGs, you can use "shrink on load". The jpeg:size hint lets the JPEG decoder decompress at a reduced resolution, which is much faster and uses far less memory than loading the full-size image and resizing it afterwards:

 magick -define jpeg:size=1600x1200 BigBoy.jpg -define jpeg:extent=600kb BiteSize.jpg

Third suggestion - consider vips

Consider using vips, which I suspect will be much faster and lighter on resources:

vipsthumbnail BigBoy.jpg -s 800x600 -o BiteSize.jpg

Add --vips-leak to the command above to see how much memory it used.
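
As far as I know, vipsthumbnail has no direct equivalent of jpeg:extent (it targets pixel dimensions rather than a file size), but you can append JPEG save options to the output filename to trade quality for size. This is a sketch only, with Q=70 as an assumed starting value you would need to tune:

vipsthumbnail BigBoy.jpg -s 800x600 -o BiteSize.jpg[Q=70,strip]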

Mark Setchell
  • This is on Ubuntu, soon to be CentOS. I'm only reducing the file size to 600K or under, not the dimensions. My questions would be: can "shrink on load" be done based on file size? Will GNU Parallel mean I have 3000-5000 processes running at once and killing my server? As for VIPS, we do use that for creating tiled images on another server. I've not found anything regarding just file size. Do you know of something? – rbruhn Apr 14 '17 at 15:57
  • The *"shrink on load"* only works for reducing pixel dimensions. With **GNU Parallel**, it will start one process per CPU core unless you say otherwise, so if you have an 8-core server, it will keep 8 processes running till all your jobs are done. Try tagging your question with `vips` and John (the author) @user894763 may be able to share his considerable knowledge and insights. – Mark Setchell Apr 14 '17 at 17:13