1

I compressed a PNG file using ImageOptim, with Zopfli as the compression method. ImageOptim guarantees that the compression is lossless. My PNG's size has indeed been reduced, but how can I decompress it back to the original file?

I read about Zopfli compression; it says it compresses PNG files using the Deflate algorithm. How can I inflate/decompress the compressed PNG back to the original file in Java?
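For reference, Java's standard library can inflate a zlib-wrapped Deflate stream via `java.util.zip.Inflater`. Note that a PNG is not a bare Deflate file (the compressed data lives inside its IDAT chunks), so this is only a minimal round-trip sketch of deflating and inflating raw bytes, not a PNG decoder:

```java
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class InflateDemo {
    // Inflates a zlib-wrapped Deflate stream back to its original bytes.
    // The caller must know (or bound) the uncompressed length.
    static byte[] inflate(byte[] compressed, int originalLength) throws DataFormatException {
        Inflater inflater = new Inflater();
        inflater.setInput(compressed);
        byte[] result = new byte[originalLength];
        int n = inflater.inflate(result);
        inflater.end();
        if (n != originalLength) throw new DataFormatException("unexpected length: " + n);
        return result;
    }

    public static void main(String[] args) throws DataFormatException {
        byte[] original = "hello, deflate".getBytes();
        // Compress first so we have something to inflate.
        Deflater deflater = new Deflater();
        deflater.setInput(original);
        deflater.finish();
        byte[] buf = new byte[256];
        int len = deflater.deflate(buf);
        deflater.end();
        byte[] compressed = Arrays.copyOf(buf, len);
        byte[] restored = inflate(compressed, original.length);
        System.out.println(new String(restored)); // prints "hello, deflate"
    }
}
```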

Hither Joe
  • 117
  • 2
  • 9

2 Answers

0

Zopfli does not re-optimize the already-compressed stream of PNG data. It recompresses the decompressed image data from scratch, so you cannot undo the process. If you insist on getting the exact same file as before the optimization, you would have to decompress to a bitmap and re-encode it to PNG using the original algorithm and settings.

A little more explanation: let's simplistically say you need to compress the string "1111111111". A first lossless compression function with a max buffer size of 5 bytes would output "5x1,5x1" (5 times 1, then 5 times 1 again); a second implementation of the same algorithm with a max buffer of 10 bytes outputs "10x1", which looks better. Both results are of course lossless, but most often you cannot get back to "5x1,5x1" from "10x1".
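The same idea holds for real Deflate: `java.util.zip.Deflater` at different compression levels can emit different compressed byte streams that all inflate back to exactly the same input. A small sketch (the input string is made up for illustration):

```java
import java.util.Arrays;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

public class LevelsDemo {
    // Compresses data with the given Deflate level (zlib wrapper).
    static byte[] deflate(byte[] data, int level) {
        Deflater d = new Deflater(level);
        d.setInput(data);
        d.finish();
        byte[] buf = new byte[data.length + 64];
        int len = d.deflate(buf);
        d.end();
        return Arrays.copyOf(buf, len);
    }

    // Inflates back to the known original length.
    static byte[] inflate(byte[] compressed, int originalLength) throws DataFormatException {
        Inflater inf = new Inflater();
        inf.setInput(compressed);
        byte[] out = new byte[originalLength];
        inf.inflate(out);
        inf.end();
        return out;
    }

    public static void main(String[] args) throws DataFormatException {
        byte[] input = "1111111111 and some less repetitive text 0123456789".getBytes();
        byte[] fast = deflate(input, Deflater.BEST_SPEED);
        byte[] best = deflate(input, Deflater.BEST_COMPRESSION);
        // The two compressed streams may differ byte-for-byte...
        System.out.println("identical streams: " + Arrays.equals(fast, best));
        // ...but both inflate back to exactly the same input (lossless).
        System.out.println(Arrays.equals(inflate(fast, input.length), input)); // true
        System.out.println(Arrays.equals(inflate(best, input.length), input)); // true
    }
}
```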

To verify that a compression is really lossless, you can simply decompress both files and compare the uncompressed versions. In your case, decompress both the original PNG and the optimized PNG to bitmaps and compare the bitmaps.
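In Java, that pixel-by-pixel comparison can be sketched with `ImageIO` and `BufferedImage` (the file paths here are placeholders for your original and optimized PNGs):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

public class PngPixelCompare {
    // Returns true when both images have identical dimensions and ARGB pixel values.
    static boolean pixelsEqual(BufferedImage a, BufferedImage b) {
        if (a.getWidth() != b.getWidth() || a.getHeight() != b.getHeight()) return false;
        for (int y = 0; y < a.getHeight(); y++)
            for (int x = 0; x < a.getWidth(); x++)
                if (a.getRGB(x, y) != b.getRGB(x, y)) return false;
        return true;
    }

    public static void main(String[] args) throws IOException {
        if (args.length < 2) {
            System.out.println("usage: PngPixelCompare original.png optimized.png");
            return;
        }
        // Pass the paths of the original and optimized PNGs as arguments.
        BufferedImage original = ImageIO.read(new File(args[0]));
        BufferedImage optimized = ImageIO.read(new File(args[1]));
        System.out.println(pixelsEqual(original, optimized) ? "lossless" : "pixels differ");
    }
}
```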

No need to program: you can use ImageMagick to compare the files.

Lossless Compression:

Command:

magick compare -metric mae anydesk00000.png anydesk00000.png diff.png

Output:

0 (0)

A value of 0 indicates zero difference, hence no loss.

Lossy Compression:

Command:

magick compare -metric mae anydesk00000.png anydesk00000.jpg diff.png

Output:

151.861 (0.00231724)
Bahram Ardalan
  • 280
  • 3
  • 11
  • Interesting suggestion; so I should decompress the compressed file to a bitmap and then encode the bitmap to PNG using the Zopfli algorithm? – Hither Joe Nov 08 '19 at 00:07
  • @HitherJoe Updated the answer with a simplistic example. Let me know if it helps you understand why you do not really need to do the rollback. – Bahram Ardalan Nov 08 '19 at 00:22
  • Many programs claim they perform lossless compression of PNG files. This is to test whether they are truly lossless as they claim. It compresses the PNG very well using Zopfli, but decompressing the compressed PNG file back to the original would certify the claim. I do understand what you're saying; what I want to do is inflate the file back. If you know how, kindly suggest it, please. – Hither Joe Nov 08 '19 at 00:34
  • @HitherJoe Sure; Updated the answer. – Bahram Ardalan Nov 08 '19 at 02:10
  • Zopfli *is* a compression algorithm, which does a very good job compressing data streams into the Deflate data compression format used by PNG, Zip, GZip and others. Zopfli itself is lossless (just does data compression, doesn't know of images etc.), but perhaps ImageOptim does some lossy processing ? – Zerte Nov 08 '19 at 06:00
  • @Zerte Agreed; Updated. Thanks. – Bahram Ardalan Nov 08 '19 at 06:45
0

You don't need to decompress it; it should be a usable PNG file as-is, unless you are doing something wrong. The image is not changed, only its binary representation on disk.

There is no way to get back to the original file. You could recompress it, but you would need to know the library used to create it, as well as the compression level and other settings (filters, deflate strategy, sliding window size, ... most PNG files are created with default values, but not all). Usually there is no need, but if you must keep the original file byte-for-byte (e.g. to match an MD5 checksum), you shouldn't use Zopfli.

Edit: there are two common definitions of lossless compression:

  1. Being able to recreate a bit-to-bit exact copy of the input file.
  2. Being able to recover the input file contents without changes. Bit exact reproduction of the input file is not a goal.

General file compression uses the first definition. If you zip a .jpg file you can get back exactly the same .jpg file.

Domain-specific lossless compression typically uses the second definition: if you create a PNG file from a JPEG, you won't be able to recover the original JPEG file (even if you know that the original was a JPEG image), but the original JPEG and the PNG will contain exactly the same image data, yet everybody calls PNG lossless.

If you consider a program that uses Zopfli to reduce PNG file size to be a "PNG file compressor", it is not lossless by the general definition, but it is by the domain-specific one. If the program claims to losslessly compress the image rather than the file, there is no ambiguity: it's the second definition. The same holds if the program claims to be a lossless optimizer.

ggf31416
  • 3,582
  • 1
  • 25
  • 26
  • Thanks, I do know that the compressed PNG is okay and shows no difference when viewed. But the claim is that the compression is lossless. So if it's truly lossless, I should be able to decompress it back to the same original bytes, shouldn't I? – Hither Joe Nov 08 '19 at 00:05
  • "Lossless" is regarding the "original data", the image data (pixel values) and metadata for a png file, the uncompressed files for a zip. There is no attempt to be able to recover the original png or zip files, which are already compressed with the Deflate algorithm. – ggf31416 Nov 08 '19 at 00:13
  • When you compress file with zip, you will be able to decompress it back to the original file, this is truly lossless. So if an algorithm claims it compresses file losslessly, it should be able or there should be a lib or a way to decompress it back to the original file. That is what I want to do. Since zopfli claims it's lossless, how can I decompress it back? – Hither Joe Nov 08 '19 at 00:37
  • Thanks for your clarification, I now understand your point. Though when I compressed a PNG file using Zopfli and compared the pixel data to the previous image, some of the pixels were different. For example {202,124,222,144}, after compressing with Zopfli (though it claims to be lossless), becomes something different like {111,232,043,225}; this is just an example. So that should mean it is a lossy compression, right? Since actual pixel data are replaced. – Hither Joe Nov 08 '19 at 01:38
  • That means you are doing something wrong, for example you are enabling reduction from 32-bit RGBA to 8-bit palette, the Zopfli algorithm has nothing to do with that. I would not expect more than 5% improvement in average from Zopfli and never more than 10%. – ggf31416 Nov 08 '19 at 01:46
  • I don't think I'm doing something wrong. I think it's ImageOptim. I optimize the file using ImageOptim with Zopfli as the compression method. It's slow, but it compresses the file. When I cross-check pixel by pixel with the original file, I find some pixels are not the same. So I think it's the ImageOptim software. – Hither Joe Nov 08 '19 at 01:55
  • Edit: I may have underestimated it a bit, [Jeff Atwood](https://blog.codinghorror.com/zopfli-optimization-literally-free-bandwidth/) claims that larger improvements (13% or even 27%) are somewhat common, but I think the reason is that most image editors don't even use the highest compression level in libpng by default. – ggf31416 Nov 08 '19 at 01:56