39

I am using C# and want to save images in JPEG format. However, .NET compresses the images and the resulting quality is not good enough.

I want to save the files with their original quality and size. I am using the following code, but the compression and quality of the output are not like the original.

using System.Drawing;
using System.Drawing.Imaging;

Bitmap bm = (Bitmap)Image.FromFile(FilePath);

// Find the built-in JPEG encoder.
ImageCodecInfo[] codecs = ImageCodecInfo.GetImageEncoders();
ImageCodecInfo ici = null;
foreach (ImageCodecInfo codec in codecs)
{
    if (codec.MimeType == "image/jpeg")
    {
        ici = codec;
    }
}

// Ask for maximum quality (100).
EncoderParameters ep = new EncoderParameters(1);
ep.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, (long)100);
bm.Save("C:\\quality" + x.ToString() + ".jpg", ici, ep);

I am archiving studio photos, so quality and compression are very important. Thanks.

Bobrovsky
Baran

6 Answers

21

The JPEG encoder built into .NET (at least the default Windows one provided by Microsoft) is pretty bad:

http://b9dev.blogspot.com/2013/06/nets-built-in-jpeg-encoder-convenient.html

Partial Update

I'm now using an approach outlined here that uses ImageMagick for the resize and then jpegoptim for the final compression, with far better results. I realize that's a partial answer, but I'll expand on it once time allows.
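Roughly, that pipeline looks something like the sketch below. This is only an illustration: it assumes Magick.NET for the resize step and a jpegoptim executable on the PATH, and the paths, dimensions and quality values are placeholders.

using System.Diagnostics;
using ImageMagick;

// Resize with Magick.NET; the quality here is just a starting point,
// jpegoptim does the final optimization.
using (var image = new MagickImage(@"C:\photos\input.jpg"))
{
    image.Resize(1600, 1200);   // fits within 1600x1200, preserving aspect ratio
    image.Quality = 90;
    image.Write(@"C:\photos\resized.jpg");
}

// Run jpegoptim on the result; --max caps the quality, --strip-all drops metadata.
var psi = new ProcessStartInfo("jpegoptim", "--max=85 --strip-all \"C:\\photos\\resized.jpg\"")
{
    UseShellExecute = false
};
using (var proc = Process.Start(psi))
{
    proc.WaitForExit();
}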

Older Answer

ImageMagick is the best choice I've found so far. It performs relatively solid jpeg compression.

http://magick.codeplex.com/

It has a couple downsides:

  1. It's better but not perfect. In particular, its chroma subsampling is set to high detail at 90% quality or above, then jumps down to a lower-detail level, one that can introduce a lot of artifacts. If you want to ignore subsampling, this is actually pretty convenient. But if you wanted high-detail subsampling at, say, 50%, you have a larger challenge ahead (see the sketch after this list). It also still won't quite hit the quality/compression levels of Photoshop or Google PageSpeed.

  2. It has a special deployment burden on the server that's very easy to miss. It requires a Visual Studio 2008 SDK lib installed. This lib is available on any dev machine with Visual Studio on it, but then you hit the server for the first time and it implodes with an obscure error. It's one of those lurking gotchas most people won't have scripted/automated, and you'll trip over it during some future server migration.
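If you do need to pin the chroma subsampling yourself rather than rely on that 90% threshold, here is a minimal sketch. It assumes a current Magick.NET build that exposes ImageMagick's JPEG sampling-factor define through Settings.SetDefine, and the paths and quality value are placeholders.

using ImageMagick;

using (var image = new MagickImage(@"C:\photos\input.jpg"))
{
    // Force full-detail 4:4:4 chroma (i.e. no subsampling) even at a low quality setting.
    image.Settings.SetDefine(MagickFormat.Jpeg, "sampling-factor", "4:4:4");
    image.Quality = 50;
    image.Write(@"C:\photos\output.jpg");
}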

Oldest Answer

I dug around and came across a project to implement a C# JPEG encoder by translating a C project over:

http://www.codeproject.com/Articles/83225/A-Simple-JPEG-Encoder-in-C

which I've simplified slightly:

https://github.com/b9chris/ArpanJpegEncoder

It produces much higher quality JPEGs than the .NET built-in encoder, but it's still not as good as GIMP's or Photoshop's, and file sizes also tend to be larger.

BitMiracle's implementation is practically identical to the .Net built-in - same quality problems.

It's likely that just wrapping an existing open source implementation, like Google's jpeg_optimizer in PageSpeed Tools (seemingly libjpeg underneath), would be the most efficient option.

Update

ArpanJpegEncoder appears to have issues once it's deployed - maybe I need to increase the trust level of the code, or perhaps something else is going on. Locally it writes images fine, but once deployed I get a blank black image from it every time. I'll update if I determine the cause. Just a warning to others considering it.

Chris Moschini
  • +1 for mentioning the built-in .NET JPEG encoder is bad. It will still visibly reduce the quality even at quality 100; quality 100 produces a big file with low quality. Using this .NET built-in encoder, I find it best (i.e. reasonable file size for minimum quality loss) to use Encoder.Quality between 70 (for large images) and 90 (for small images). – Aximili Nov 19 '13 at 00:08
  • I am the creator of Magick.NET and I would like to inform you that I recently added a JpegOptimizer class that can be used to get the same size as Google PageSpeed. You might want to update the answer. – dlemstra Apr 27 '15 at 10:15
  • Great answer. I used the .NET library and the compression level damaged the quality of the image. Then I tried Magick, which was quite simple and gave much better quality after compressing the image. – Misha Zaslavsky May 10 '16 at 16:17
  • The link you provide for why the .NET JPEG encoder is bad is very old and outdated, and the image used in that blog to prove the point is not a good example. I made an application and tested Magick.NET against the built-in .NET encoder with a real-life photo (not a geometric one), and the results are almost the same for both. – Meysam Jul 27 '17 at 13:30
  • The image used is meant to stress the JPEG format, to illustrate flaws in encoders. A "real life photo" would often look similar at 100%, but the built-in .Net encoder will give you a much larger file in terms of file size. In addition, not all images or photos lack hard geometric edges, which is what JPEG is bad at and this test is meant to stress. Worst of all you can get great results at 60% in good encoders, and a mess in the built-in. – Chris Moschini Jul 30 '17 at 17:51
  • @dlemstra Thanks for the update! Unfortunately I still use jpegoptim.exe because it supports streams, so I can do the entire image pipeline in RAM, instead of having to write to disk or even more complex RAMdisk. Any chance jpegOptim/libjpeg could have its Compress method overloads expanded to take a Stream? https://github.com/dlemstra/Magick.NET/tree/master/Source/Magick.NET.Native – Chris Moschini Oct 19 '17 at 19:03
  • @ChrisMoschini The latest version of Magick.NET has support for compressing images from a stream. – dlemstra Nov 03 '17 at 21:04
15

It looks like you're setting the quality to 100%. That means that there will be no compression.

If you change the compression level (80, 50, etc.) and you're unsatisfied with the quality, you may want to try a different image library. LEADTools has a good (non-free) engine.

UPDATE: As a commenter mentioned, 100% quality still does not mean lossless compression when using JPEG. Loading the image, doing something to it, and then saving it again will ultimately result in image degradation. If you need to alter and save an image without losing any of the data you need to use a lossless format such as TIFF, PNG or BMP. I'd go with compressed TIFF (since it's still lossless even though it's compressed) or PNG.
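For the lossless route, a minimal sketch using the stock System.Drawing classes (the file paths are placeholders):

using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

using (var bm = (Bitmap)Image.FromFile(@"C:\photos\original.jpg"))
{
    // PNG is always lossless; no encoder parameters are needed.
    bm.Save(@"C:\photos\archive.png", ImageFormat.Png);

    // TIFF with LZW compression: still lossless, but smaller than an uncompressed TIFF.
    ImageCodecInfo tiffCodec = ImageCodecInfo.GetImageEncoders()
        .First(c => c.MimeType == "image/tiff");
    var ep = new EncoderParameters(1);
    ep.Param[0] = new EncoderParameter(Encoder.Compression, (long)EncoderValue.CompressionLZW);
    bm.Save(@"C:\photos\archive.tif", tiffCodec, ep);
}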

Michael Todd
  • Setting quality to 100% does not mean it's lossless, when it comes to .NET JPG encoding. I personally use the above mentioned LEADTools (FREE!) JPG encoder from my .NET code. It's an executable to which I provide an image source, launch the executable, and then read the result back into memory for further manipulation. – Andy Jan 06 '10 at 22:25
  • Not only that but the JPEG encoder built-in to .Net is just terrible. You can draw a yellow box on a blue background, set the encoder to jpeg 100%, and without resaving still get really awful results. http://brass9.com/test/netjpeg/comparison.png (I realize geometric figures aren't best in JPEG; imagine drawing a geometric shape with .Net on a loaded photograph and wanting to save it out; the photograph makes jpeg the best candidate, .Net produces unacceptably low quality even at 100%.) – Chris Moschini Jun 12 '13 at 04:13
6

Compression and quality are always a trade-off.

JPEGs are always going to be lossy.

You may want to consider using PNG and shrinking the files with PNGCrush or PNGauntlet.
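For example, a rough sketch that saves a PNG from .NET and then runs PNGCrush on it; this assumes a pngcrush executable on the PATH, the paths are placeholders, and -brute simply tries every compression strategy pngcrush knows.

using System.Diagnostics;
using System.Drawing;
using System.Drawing.Imaging;

// Save the image losslessly as PNG first.
using (var bm = (Bitmap)Image.FromFile(@"C:\photos\original.bmp"))
{
    bm.Save(@"C:\photos\out.png", ImageFormat.Png);
}

// Let PNGCrush rewrite it with the smallest result it can find.
var psi = new ProcessStartInfo("pngcrush", "-brute \"C:\\photos\\out.png\" \"C:\\photos\\crushed.png\"")
{
    UseShellExecute = false
};
using (var proc = Process.Start(psi))
{
    proc.WaitForExit();
}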

John Gietzen
2

Regarding how to set the JPEG compression level in .NET, this MSDN article covers everything: http://msdn.microsoft.com/en-us/library/bb882583.aspx

Regarding your question: usually you save the image a user uploads as a PNG, then use that PNG as the base from which to generate your JPGs at different sizes (and you put a watermark ONLY on the JPGs, never on the original PNG!). The advantage is that if you later change the image dimensions for your platform, you still have the original PNG saved and can re-compute any new image sizes from it.
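A minimal sketch of that PNG-master workflow with the stock System.Drawing classes; the paths, target width and quality value are placeholders, and the watermarking step is left out.

using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

// Load the lossless PNG master and render a resized copy from it.
using (var master = Image.FromFile(@"C:\photos\master.png"))
{
    int targetWidth = 1024;
    int targetHeight = master.Height * targetWidth / master.Width;

    using (var resized = new Bitmap(targetWidth, targetHeight))
    using (var g = Graphics.FromImage(resized))
    {
        g.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
        g.DrawImage(master, 0, 0, targetWidth, targetHeight);

        // Encode the resized copy as JPEG at the chosen quality.
        ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
            .First(c => c.MimeType == "image/jpeg");
        var ep = new EncoderParameters(1);
        ep.Param[0] = new EncoderParameter(Encoder.Quality, 85L);
        resized.Save(@"C:\photos\web_1024.jpg", jpegCodec, ep);
    }
}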

johngrinder
0

"It must save the file with its original quality and size"

That doesn't make a lot of sense. When you are using lossy compression you are going to lose some information by definition. The point of compressing an image is to reduce the file size. If you need high quality and JPEG isn't doing it for you, you may have to go with some type of lossless compression, but your file sizes will not be reduced by much. You could always try using the 'standard' library for compressing to JPEG (libjpeg) and see if that gives you any different results (I doubt it, but I don't know what .NET is using under the hood).

Ed S.
0

Compressing to the JPEG format by its very nature reduces quality. Perhaps you should look into file compression instead, such as #ziplib. You may be able to get reasonable compression over a group of files.

C. Ross
  • The last stage of the JPEG algorithm is already a form of lossless (entropy) compression - zipping a jpeg or many jpegs will generally not get you meaningful savings. – Chris Moschini Jun 12 '13 at 04:39