
I want to compress a JPG image file with ImageMagick but can't get much of a difference in size. By default the output is bigger than the input. I don't know why, but after adding some +profile options and lowering the quality I can get a smaller file, but still close to the original size.

The input image is 255 kB and the processed image is 264 kB (using +profile to remove profiles and setting quality to 70%). Is there any way to compress that image to at least 150 kB? Is that possible? What ImageMagick options can I use?

Andrew Lott
Javis Perez
    Recompressing a JPEG will always result in a degraded image, even if it's larger. It would be better if you could start with the original before it was saved the first time. – Mark Ransom Aug 31 '11 at 18:55
  • I know, but unfortunately I don't have the original image; all I have is a big JPG file. I think I can get a good balance between quality and size. – Javis Perez Aug 31 '11 at 20:21

11 Answers


I always use:

  • quality of 85
  • progressive (interlaced) encoding, which I have verified compresses better
  • a very tiny Gaussian blur to optimize the size (0.05 or 0.5 radius, depending on the quality and size of the picture); this notably reduces the size of the JPEG
  • stripping any comments or EXIF metadata

In ImageMagick this would be:

convert -strip -interlace Plane -gaussian-blur 0.05 -quality 85% source.jpg result.jpg

or in the newer version:

magick source.jpg -strip -interlace Plane -gaussian-blur 0.05 -quality 85% result.jpg

Source.

From @Fordi in the comments (Don't forget to upvote him if you like this): If you dislike blurring, use -sampling-factor 4:2:0 instead. What this does is reduce the chroma channel's resolution to half, without messing with the luminance resolution that your eyes latch onto. If you want better fidelity in the conversion, you can get a slight improvement without an increase in filesize by specifying -define jpeg:dct-method=float - that is, use the more accurate floating point discrete cosine transform, rather than the default fast integer version.
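As a quick sanity check on why 4:2:0 helps, here is a small Python sketch (the `bytes_per_pixel` helper is my own illustration, not part of ImageMagick): halving the chroma resolution in both directions drops the raw pre-compression data to half of the 4:4:4 baseline.

```python
def bytes_per_pixel(j, a, b):
    """Raw YCbCr bytes per pixel for a J:a:b chroma subsampling scheme.

    j -- width of the reference block (conventionally 4)
    a -- chroma samples in the first row of the block
    b -- chroma samples in the second row of the block

    Luma (Y) is always sampled at full resolution (1 byte per pixel);
    each of the two chroma channels (Cb, Cr) contributes (a + b)
    samples per 2*j pixels.
    """
    chroma = 2 * (a + b) / (2 * j)  # both chroma channels together
    return 1 + chroma

full = bytes_per_pixel(4, 4, 4)  # no subsampling: 3.0 bytes/pixel
sub = bytes_per_pixel(4, 2, 0)   # 4:2:0 subsampling: 1.5 bytes/pixel
print(sub / full)                # 0.5, i.e. half the raw data before the DCT
```

The JPEG quantization and entropy-coding stages then compress this smaller starting payload, which is why the saving survives into the final file size.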

Felipe Buccioni
  • Thank you! This got the image down to 170 kB; now I can go and experiment with your options. I've also found the -define jpeg:extent=MAX_SIZE_IN_KB option, which really helps. Thank you! – Javis Perez Aug 31 '11 at 20:22
  • 45
    If you're doing a bunch of files, you can also do `mogrify -strip -interlace Plane -gaussian-blur 0.05 -quality 85% *.jpg`. Make sure you have a backup before running that command. It will write in place. – Richard Ayotte Mar 08 '13 at 20:03
  • 1
    I got really bad-looking images from this method. – Buttle Butkus Apr 05 '13 at 21:29
  • @ButtleButkus Blurred? Use less Gaussian blur. Is your image small? What kind of bad-looking is it? – Felipe Buccioni Apr 08 '13 at 21:35
  • 47
    I got very blurry images. It seems counterproductive to intentionally blur the image to save space. Wouldn't it make more sense to just use a lower quality %? The quality-changing process is pretty good at saving space while maintaining apparent image quality. I put a 0.05 gaussian blur on my image and it saved some space but looked like utter crap. I settled on using `mogrify -strip -quality 75% *.jpg`. Strip is great. 0 quality loss, and large space savings. And quality at 75% is barely distinguishable from 100%, but takes half the space. – Buttle Butkus Apr 12 '13 at 06:51
  • @ButtleButkus The bigger the images, the better the results you get with Gaussian blur; the parameters change between versions. The idea is to apply a very tiny Gaussian blur to save more space; the step is optional. Check http://www.imagemagick.org/script/command-line-options.php#gaussian-blur and if you have a better way, please edit the answer! – Felipe Buccioni Apr 16 '13 at 16:19
  • 1
    I am resizing larger images down to 280x280 or so, which is perhaps too small for gaussian blur to be useful. I will try it on larger images at some point. I will be surprised if it is more effective than just using the quality setting, but I suppose since you have 20 upvotes you must be on to something. I added my own upvote. Thanks. – Buttle Butkus Apr 17 '13 at 03:55
  • @ButtleButkus & Felipe: any suggestions for this Shiny app I have made: http://glimmer.rstudio.com/stla/ImageMagickCompression_single/ ? (I know it is slow on the server) – Stéphane Laurent Feb 11 '14 at 21:28
  • 110
    If you dislike blurring, use -sampling-factor 4:2:0 instead. What this does is reduce the chroma channel's resolution to half, without messing with the luminance resolution that your eyes latch onto. If you want better fidelity in the conversion, you can get a slight improvement without an increase in filesize by specifying -define jpeg:dct-method=float - that is, use the more accurate floating point discrete cosine transform, rather than the default fast integer version. – Fordi Jun 19 '14 at 12:47
  • 1
    it worked great for me. But the result image was a bit blurred; I just removed the gaussian blur option and it was all good – kabirbaidhya Sep 03 '14 at 10:44
  • 2
    @Fordi: I simply use quality 75% without gaussian-blur or -sampling-factor or jpeg:dct-method=float to get a smaller sized and still better image. Maybe we should stop trying to outsmart the program? – Tacaza Apr 04 '15 at 04:50
  • 3
    Don't blur the image. That is stupid. Use quality setting to reduce sizes. It is designed to optimally reduce size while maintaining visual similarity. Personally i use `-quality 40` and it drastically reduce the size of the image without any greatly noticeable artifacts. (of course depending on the type of picture) – Automatico Jun 13 '15 at 00:13
  • @Cort3z The quality parameter is related to the loss from the JPEG compression; we are talking about compressing while trying to preserve quality. If you use 40% on a thumbnail of an image you will get a poor image. – Felipe Buccioni Jun 13 '15 at 00:20
  • @FelipeAlcacibar Yes, but in my experience, blurring the image will be worse than simply lowering the quality. It very much depends on what the picture depicts, but by blurring you are assuming the picture has some attribute that makes it compress better with blur than with normal frequency reduction. My guess is that this assumption is wrong most of the time, and that the greatest "bang for the buck" is gained by simply lowering the quality setting. – Automatico Jun 13 '15 at 00:36
  • Can anyone suggest how to do the same thing with the Laravel 5 package intervention/image? https://github.com/Intervention/image – Tarunn Jul 10 '15 at 10:46
  • Your solution, using the sampling factor instead of blurring, results in quality and file size similar to Photoshop's JPEG "high" setting when exporting for web :) Thx! – electronix384128 Aug 31 '15 at 22:55
  • 2
    If the image is not expected to be loaded from a low-speed internet connection, then providing `-interlace Plane` option is completely unnecessary, if not bad. For rendering the image from hard disk, the interlaced image takes 2-3 times longer than the non-interlaced one. And there's no size difference. – John Jan 06 '16 at 20:44
  • @John You can find billions of articles on the internet about the advantages of progressive JPEG... – Felipe Buccioni Jan 06 '16 at 21:32
  • 2
    @FelipeAlcacibar All of the "advantages" that I found revolves around a web browser. But what if I just want to view it locally? Then it's nothing but computation overhead for me... There's absolutely no quality difference. – John Jan 07 '16 at 09:02
  • 1
    @John is completely right, I don't know of any advantages to progressive JPEGs outside of web browsers – forresthopkinsa Apr 12 '19 at 22:04
  • 1
    I suggest adding `-auto-orient` option to auto-rotate pictures, otherwise pictures often end up sideways. – JM Lord Dec 30 '20 at 19:48

I'm using the Google PageSpeed Insights image optimization guidelines, and for ImageMagick they recommend the following:

-sampling-factor 4:2:0
-strip
-quality 85 [it can vary; I use the range 60-80; a lower number here means a smaller file]
-interlace JPEG
-colorspace RGB

Command in ImageMagick:

convert image.jpg -sampling-factor 4:2:0 -strip -quality 85 -interlace JPEG -colorspace RGB image_converted.jpg

With these options I get up to 40% savings in JPEG size without much visible loss.
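If you drive ImageMagick from a script, the options above can be collected into an argv list. A minimal sketch in Python (the `pagespeed_convert_cmd` helper name and the file names are illustrative, not from the answer):

```python
import subprocess  # only needed if you actually run the command

def pagespeed_convert_cmd(src, dst, quality=85):
    """Build the argv list for the PageSpeed-style convert invocation
    described above. Run it with subprocess.run(cmd, check=True)
    if ImageMagick is installed."""
    return [
        "convert", src,
        "-sampling-factor", "4:2:0",
        "-strip",
        "-quality", str(quality),
        "-interlace", "JPEG",
        "-colorspace", "RGB",
        dst,
    ]

cmd = pagespeed_convert_cmd("image.jpg", "image_converted.jpg", quality=75)
print(" ".join(cmd))
```

Building the command as a list (rather than a shell string) avoids quoting problems with file names containing spaces.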

rogerdpack
irina
    sRGB is recommended by Google... why did you write RGB? Is there a specific reason? – collimarco Sep 22 '20 at 20:58
  • 1
    @collimarco The reason I found to use linear colospace (RGB) [is only temporarily, to resize](https://legacy.imagemagick.org/Usage/resize/#resize_colorspace) – Pablo Bianchi Feb 10 '21 at 00:04
  • 3
    The perfect command for expand on @irina's description, as `mogrify` does not create the target directory when you need to create it from a full directory. `SOURCE=./*.jpg DESTINE=$(mkdir -p ./resize; echo $_); mogrify -monitor -sampling-factor 4:2:0 -strip -interlace JPEG -colorspace sRGB -path $DESTINE -resize 1000 -compress JPEG -quality 80 $SOURCE` – jcarlosweb Mar 20 '21 at 23:59

For those using the Imagick class in PHP:

$im = new \Imagick('image.jpg');
$im->gaussianBlurImage(0.8, 10);      // blur
$im->setImageCompressionQuality(85);  // set compression quality to 85
$im->writeImage('image_out.jpg');

Once I needed to resize photos from a camera for developing:

  • Original file size: 2800 kB
  • Resolution: 3264x2448

Command:

mogrify -quality "97%" -resize 2048x2048 -filter Lanczos -interlace Plane -gaussian-blur 0.05 *.jpg

  • Resulting file size: 753 kB
  • Resolution: 2048x2048

and I can't see any changes in full screen on my 1920x1080 monitor. A 2048-pixel resolution is the best for developing 10 cm photos at a maximum quality of 360 dpi. I don't want to strip the metadata.

Edit: I noticed that I get even better results without blurring. Without blurring the file size is 50% of the original, but the quality is better (when zooming).
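As a rough sanity check of the print-size claim (the `pixels_needed` helper is mine, not from the answer): a 10 cm print edge at 360 dpi needs about 1417 px, so 2048 px leaves comfortable headroom.

```python
def pixels_needed(print_cm, dpi):
    """Pixels along one edge needed to print `print_cm` centimetres
    at `dpi` dots per inch (1 inch = 2.54 cm)."""
    inches = print_cm / 2.54
    return round(inches * dpi)

print(pixels_needed(10, 360))  # 1417 px for a 10 cm edge at 360 dpi
```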

tuipveus
  • You don't have to add "-filter Lanczos"; it's set by default: http://www.imagemagick.org/script/command-line-options.php#filter – Ilya Prokin Sep 22 '16 at 15:04
  • As has been said above, it's useless to lower the image resolution to make the file smaller; lower the JPEG quality instead! You can test it yourself: compare two images, one at 97% JPEG quality and another converted to, say, 68%, and you will have a hard time seeing any pixels differ even at 100% zoom! Stock camera and phone settings are ridiculously high just to make you want to buy new phones with bigger storage and bigger HDDs for your PCs... https://photo.stackexchange.com/questions/30243/what-quality-to-choose-when-converting-to-jpg – McVitas Apr 23 '17 at 15:54

@JavisPerez -- Is there any way to compress that image to 150kb at least? Is that possible? What ImageMagick options can I use?

See the following links where there is an option in ImageMagick to specify the desired output file size for writing to JPG files.

http://www.imagemagick.org/Usage/formats/#jpg_write http://www.imagemagick.org/script/command-line-options.php#define

-define jpeg:extent={size} As of IM v6.5.8-2 you can specify a maximum output filesize for the JPEG image. The size is specified with a suffix. For example "400kb".

convert image.jpg -define jpeg:extent=150kb result.jpg

You will lose some quality by decompressing and recompressing in addition to any loss due to lowering -quality value from the input.
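According to the linked docs, the encoder searches for the highest quality that still fits under the requested size. A rough Python sketch of such a search loop (this is an illustration with a made-up size model and my own `quality_for_target` helper, not ImageMagick's actual code):

```python
def quality_for_target(encode_size, target_kb, lo=1, hi=100):
    """Binary-search the highest JPEG quality whose encoded size fits
    under target_kb. encode_size(q) stands in for actually writing
    the file at quality q and measuring the result."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if encode_size(mid) <= target_kb:
            best = mid        # fits: try a higher quality
            lo = mid + 1
        else:
            hi = mid - 1      # too big: lower the quality
    return best

# Made-up linear size model, purely for demonstration;
# real JPEG sizes depend heavily on image content.
fake_size = lambda q: 50 + 2 * q
print(quality_for_target(fake_size, 150))  # 50
```

Binary search works here because, for a fixed image, encoded size grows monotonically with the quality setting.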

fmw42

I would add a useful side note and a general suggestion to minimize JPGs and PNGs.

First of all, ImageMagick reads (or rather "guesses"...) the input JPEG's compression level, so if you don't add -quality NN at all, the output should use the same level as the input. Sometimes that can be an important feature. Otherwise the default level is -quality 92 (see www.imagemagick.org).

The suggestion is about a really awesome free tool, ImageOptim, which also does batch processing.
You can get smaller JPGs (and PNGs as well, especially after using the free ImageAlpha [no batch processing] or the free Pngyu if you need batch processing).
What's more, these tools are available for Mac and Windows and as command-line tools (I suggest installing them using Brew and then searching the Brew formulas).

Steve

I added -adaptive-resize 60% to the suggested command, along with -quality 60%.

convert -strip -interlace Plane -gaussian-blur 0.05 -quality 60% -adaptive-resize 60% img_original.jpg img_resized.jpg

These were my results:

  • img_original.jpg = 13,913 KB
  • img_resized.jpg = 845 KB

I'm not sure how much that conversion degrades my image, but I honestly didn't think it looked like crap. It was a wide-angle panorama and I didn't care about close inspection.

gre_gor
C.shin

Here's a complete solution for those using Imagick in PHP:

$im = new \Imagick($filePath);
$im->setImageCompression(\Imagick::COMPRESSION_JPEG);
$im->setImageCompressionQuality(85);
$im->stripImage();
$im->setInterlaceScheme(\Imagick::INTERLACE_PLANE);

// Try a radius between 0 and 5. If you find that a radius of 5
// produces too-blurry pictures, decrease it toward 0 until you
// find a good balance between size and quality.
$im->gaussianBlurImage(0.05, 5);



// Include this part if you also want to specify a maximum size for the images

$size = $im->getImageGeometry();
$maxWidth = 1920;
$maxHeight = 1080;


// ----------
// |        |
// ----------
if($size['width'] >= $size['height']){
  if($size['width'] > $maxWidth){
    $im->resizeImage($maxWidth, 0, \Imagick::FILTER_LANCZOS, 1);
  }
}


// ------
// |    |
// |    |
// |    |
// |    |
// ------
else{
  if($size['height'] > $maxHeight){
    $im->resizeImage(0, $maxHeight, \Imagick::FILTER_LANCZOS, 1);
  }
}
emanuelbsilva

Did some experimenting myself here, and boy does that Gaussian blur make a nice difference. The final command I used (note that mogrify expects the options before the file list) was:

mogrify -sampling-factor 4:2:0 -strip -quality 88 -interlace Plane -define jpeg:dct-method=float -colorspace RGB -gaussian-blur 0.05 *

Without the Gaussian blur at 0.05 the result was around 261 KB; with it, around 171 KB for the image I was testing on. The visual difference on a 1440p monitor with a large, complex image is not noticeable until you zoom way, way in.

Pablo Bianchi

A very old but helpful answer.

I have to say that for seriously large photography, -gaussian-blur is not acceptable; adjusting the compression ratio works better.

Comparing below: 95% quality with -gaussian-blur 0.05 vs. 85% without blurring. Original: 17.5 MB (8 MP with much detail); 95% without blurring: 5 MB; 85% without blurring: 3036 KB; 95% with blurring: 3365 KB.

[Image: comparison between blurring and compression ratio]

Maybe a lower blur like 0.02 would work better.

butfly

If the image has big dimensions it is hard to get good results without resizing; below I use a 60 percent resize, which for most purposes doesn't destroy too much of the image.

I use this with good result for gray-scale images (I convert from PNG):

ls ./*.png | xargs -L1 -I {} convert {} -strip -interlace JPEG -sampling-factor 4:2:0 -adaptive-resize 60%   -gaussian-blur 0.05 -colorspace Gray -quality 20  {}.jpg

I use this for scanned B&W pages, to get them to gray-scale images (the extra arguments clean shadows from previous pages):

ls ./*.png | xargs -L1 -I {} convert {} -strip -interlace JPEG -sampling-factor 4:2:0 -adaptive-resize 60%   -gaussian-blur 0.05 -colorspace Gray -quality 20 -density 300 -fill white -fuzz 40% +opaque "#000000" -density 300 {}.jpg 

I use this for color images:

ls ./*.png | xargs -L1 -I {} convert {} -strip -interlace JPEG -sampling-factor 4:2:0 -adaptive-resize 60%   -gaussian-blur 0.05 -colorspace RGB -quality 20  {}.jpg 
Eduard Florinescu