
Given a user-uploaded image, I need to create various thumbnails of it for display on a website. I'm using ImageMagick and trying to make Google PageSpeed happy. Unfortunately, no matter what quality value I specify in the convert command, PageSpeed is still able to suggest compressing the image even further.

Note that http://www.imagemagick.org/script/command-line-options.php#quality mentions:

For the JPEG ... image formats, quality is 1 (lowest image quality and highest compression) to 100 (best quality but least effective compression) ....

I even tested compressing the image with a quality of 1 (it produced an unusable image, though), and PageSpeed still suggests that I can optimize such an image by "losslessly compressing" it. I don't know how to compress an image any further using ImageMagick. Any suggestions?

Here's a quick way to test what I am talking about:

assert_options(ASSERT_BAIL, TRUE);

// TODO: specify valid image here
$input_filename = 'Dock.jpg';

assert(file_exists($input_filename));

$qualities = array('100', '75', '50', '25', '1');
$geometries = array('100x100', '250x250', '400x400');

foreach($qualities as $quality)
{
    echo("<h1>$quality</h1>");
    foreach ($geometries as $geometry)
    {
        $output_filename = "$geometry-$quality.jpg";

        $command = "convert -units PixelsPerInch -density 72x72 -quality $quality -resize $geometry $input_filename $output_filename";
        $output  = array();
        $return  = 0;
        exec($command, $output, $return);

        echo('<img src="' . $output_filename . '" />');

        assert(file_exists($output_filename));
        assert($output === array());
        assert($return === 0);
    }

    echo ('<br/>');
}
StackOverflowNewbie
  • I think it wants you to try a lossless format like PNG. Sometimes this can give you better compression, particularly for small images; but it really depends on the type of image your users are uploading. – Tim Fountain Oct 10 '10 at 10:27
  • 1
    PageSpeed is referring to JPG (as it actually provides a JPG that can be saved). Users are uploading photos. – StackOverflowNewbie Oct 10 '10 at 10:30
  • Hmm strange, so is the JPEG it provides smaller than your auto-created one? – Tim Fountain Oct 10 '10 at 10:34
  • @Tim, yes it would be. Try out PageSpeed for Firebug yourself to see what the OP means. I just asked a related question here: http://stackoverflow.com/questions/5451597/how-does-googles-page-speed-lossless-image-compression-work – Drew Noakes Mar 27 '11 at 19:08
  • In my experience, the only optimization Google Page Speed makes to JPEGs is to remove unnecessary metadata. While it's true this is unnecessary, in most images this accounts for relatively few bytes - not enough to make a difference in large images, and if you're using small images - you should be spriting. Generally Page Speed's JPEG advice and optimization is focusing on the wrong problem. – Chris Moschini Jun 26 '12 at 15:43

3 Answers

  • The JPEG may contain comments, thumbnails or metadata, which can be removed.
  • Sometimes it is possible to compress JPEG files more, while keeping the same quality. This is possible if the program which generated the image did not use the optimal algorithm or parameters to compress the image. By recompressing the same data, an optimizer may reduce the image size. This works by using specific Huffman tables for compression.

You may run jpegtran or jpegoptim on the generated file to reduce its size further.
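As a sketch of how those two tools are typically invoked (the filenames are placeholders, and the first line only creates a stand-in input using ImageMagick):

```shell
# create a sample JPEG to work on (requires ImageMagick; stands in for your real thumbnail)
convert -size 200x200 gradient:white-black thumbnail.jpg

# jpegtran: re-encode with optimized Huffman tables, dropping comments and metadata;
# it writes the result to stdout, so redirect it to a new file
jpegtran -optimize -copy none thumbnail.jpg > thumbnail-opt.jpg

# jpegoptim: optimize the file in place, stripping all comments and EXIF data
jpegoptim --strip-all thumbnail.jpg
```

Both are lossless: they recompress the already-quantized JPEG data, so the pixels do not change, only the encoding gets tighter.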

Sjoerd
  • I looked at the properties of the images generated by ImageMagick. It seems to have retained metadata. Do you know how I can remove this using ImageMagick? Also, it seems that ImageMagick uses Huffman tables. See the entry about JPEG at http://www.imagemagick.org/script/formats.php. Does this relieve me of the need to explore jpegtran or jpegoptim? – StackOverflowNewbie Oct 10 '10 at 10:56
  • mogrify -strip input.jpg seems to work. Not sure if it's the best approach, though. – StackOverflowNewbie Oct 10 '10 at 11:06
  • 3
    Every JPEG uses Huffman tables. Imagemagick probably always uses the same Huffman tables, whereas jpegoptim tries to find the best, custom Huffman table. – Sjoerd Oct 10 '10 at 11:11
  • mogrify -strip input.jpg seems to have satisfied PageSpeed. Thanks! – StackOverflowNewbie Oct 10 '10 at 12:04
  • 1
    @Sjoerd: according to ImageMagick's documentation (http://www.imagemagick.org/script/formats.php), "By default we compute optimal Huffman coding tables" for the JPEG format. – Joe Lencioni Sep 18 '12 at 19:36

To minimize the image sizes even more, you should remove all metadata. ImageMagick can do this if you add the -strip option to the command line.
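For example, a minimal sketch of the question's resize command with -strip added (filenames are placeholders; the first line just creates a stand-in input):

```shell
# create a sample input (requires ImageMagick; stands in for the uploaded photo)
convert -size 400x400 gradient:white-black input.jpg

# resize and strip all metadata (EXIF, color profiles, comments) in one step
convert input.jpg -strip -quality 75 -resize 250x250 output.jpg
```

Note that -strip is placed after the input filename, so it applies to the image being written out.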

Have you also considered putting your thumbnail images into your HTML as inlined base64-encoded data?

This can make your web page load much faster (even though the total size gets a bit larger), because it saves the browser from making a separate request for each image file referenced in the HTML code.

Your HTML code for such an image would look like this:

 <IMG SRC="data:image/png;base64,
         iVBORw0KGgoAAAANSUhEUgAAAM4AAABJAQMAAABPZIvnAAAABGdBTUEAALGPC/xh
         BQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAA
         OpgAABdwnLpRPAAAAAZQTFRFAAAA/wAAG/+NIgAAAAF0Uk5TAEDm2GYAAAABYktH
         RACIBR1IAAAACXBIWXMAAABIAAAASABGyWs+AAAB6ElEQVQ4y+3UQY7bIBQG4IeQ
         yqYaLhANV+iyi9FwpS69iGyiLuZYpepF6A1YskC8/uCA7SgZtVI3lcoiivkIxu/9
         MdH/8U+N6el2pk0oFyibWyr1Q3+PlO2NqJV+/BnRPMjcJ9zrfJ/U+zQ9oAvo+QGF
         d+npPqFQn++TXElkrEpEJhAtlTBR6dNHUuzIMhFnEhxAmJDkKxlmt7ATXDDJYcaE
         r4Txqtkl42VYSH+t9KrD9b5nxZeog/LWGVHprGInGWVQUTvjDWXca5KdsowqyGSc
         DrZRlGlQUl4kQwpUjiSS9gI9VdECZhHFQ2I+UE2CHJQfkNxTNKCl0RkURqlLowJK
         1h1p3sjc0CJD39D4BIqD7JvvpH/GAxl2/YSq9mtHSHknga7OKNOHKyEdaFC2Dh1w
         9VSJemBeGuHgMuh24EynK03YM1Lr83OjUle38aVSfTblT424rl4LhdglsUag5RB5
         uBJSJBIiELSzaAeIN0pUlEeZEMeClC4cBuH6mxOlgPjC3uLproUCWfy58WPN/MZR
         86ghc888yNdD0Tj8eAucasl2I5LqX19I7EmEjaYjSb9R/G1SYfQA7ZBuT5H6WwDt
         UAfK1BOJmh/eZnKLeKvZ/vA8qonCpj1h6djfbqvW620Tva36++MXUkNDlFREMVkA
         AAAldEVYdGRhdGU6Y3JlYXRlADIwMTItMDgtMjJUMDg6Mzc6NDUrMDI6MDBTUnmt
         AAAAJXRFWHRkYXRlOm1vZGlmeQAyMDEyLTA4LTIyVDA4OjM3OjQ1KzAyOjAwIg/B
         EQAAAA50RVh0bGFiZWwAImdvb2dsZSJdcbX4AAAAAElFTkSuQmCC"
  ALT="google" WIDTH=214  HEIGHT=57  VSPACE=5 HSPACE=5 BORDER=0 />

And you would create the base64 encoded image data like this:

base64  -i image.jpg  -o image.b64
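If you script this, here is a portable sketch that builds the complete data-URI <IMG> tag in one go (the filenames are placeholders, and the first line only creates a stand-in file; `tr` removes the line wrapping that some base64 implementations add):

```shell
# placeholder input; in practice this is your real JPEG thumbnail
printf 'stand-in' > image.jpg

# base64-encode without line breaks (portable across GNU and BSD base64)
DATA=$(base64 < image.jpg | tr -d '\n')

# emit the inline <img> tag
printf '<img src="data:image/jpeg;base64,%s" alt="thumbnail" />\n' "$DATA" > snippet.html
```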
Kurt Pfeifle
  • I'm not sure this is true. Referencing external files means the browser can use multiple processors, cores and network links to request the images (or any other resource). HTTP/1.1 has some overhead for a new request which may outweigh that benefit, but HTTP/2.0 or SPDY (available in many modern browsers) is able to use a single connection to make multiple requests at once. – JP. Jan 21 '14 at 15:39

Google performs those calculations based on its WebP image format (https://developers.google.com/speed/webp/).

Despite the performance gains, though, it is currently supported only by Chrome and Opera (http://caniuse.com/webp).
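If you do want to serve WebP to browsers that support it, a sketch using the cwebp encoder from Google's libwebp (the filenames are placeholders; the first line just creates a stand-in input with ImageMagick):

```shell
# create a sample JPEG (requires ImageMagick; stands in for a real photo)
convert -size 200x200 gradient:white-black photo.jpg

# encode to WebP at quality 80 (cwebp ships with Google's libwebp)
cwebp -q 80 photo.jpg -o photo.webp
```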

Greg Funtusov