
I want to calculate or estimate the file size of a JPEG image after compressing it with a given Encoder.Quality value, without actually performing the compression first. Is there a formula for this?

Is there at least a rough estimate? I want to reduce the image file size to under 250 KB without doing iterative compression.

Is there a way to estimate the right Encoder.Quality value from the image's width, height, bit depth, DPI, etc.?

  • I think doing iterative compression is faster than any other calculation/estimation. However, you should compress to memory, not write files to disk (see the sketch after these comments). – wimh Feb 09 '16 at 09:32
  • Impossible to answer – you don't even mention what the algorithm is. In all but the most inefficient algorithms, the compression ratio depends on the data and can't be calculated. You can only make guesstimates based on the algorithm, the type of content (e.g., clip art or a photograph), and the quality level. – Panagiotis Kanavos Feb 09 '16 at 09:32
  • I have visited this link: http://stackoverflow.com/questions/2094064/how-to-resize-a-picture-to-a-specific-file-size – Naveen Bathina Feb 09 '16 at 10:55
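
Picking up on wimh's suggestion above, here is a minimal sketch of iterative in-memory compression, assuming the System.Drawing / Encoder.Quality API from the question (the CompressToTarget name and the 250 KB default are illustrative, not from the thread). Because the output size grows roughly monotonically with quality, a binary search over the 1–100 range needs at most about seven encodes:

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;

static class TargetSizeCompressor
{
    // Binary-search Encoder.Quality until the in-memory JPEG fits the
    // target size. Nothing touches the disk; each probe is one encode.
    public static byte[] CompressToTarget(Bitmap bmp, long targetBytes = 250_000)
    {
        ImageCodecInfo jpeg = ImageCodecInfo.GetImageEncoders()
            .First(c => c.FormatID == ImageFormat.Jpeg.Guid);

        long lo = 1, hi = 100;
        byte[] best = null;

        while (lo <= hi)
        {
            long quality = (lo + hi) / 2;
            using var ep = new EncoderParameters(1);
            ep.Param[0] = new EncoderParameter(Encoder.Quality, quality);

            using var ms = new MemoryStream();
            bmp.Save(ms, jpeg, ep);

            if (ms.Length <= targetBytes)
            {
                best = ms.ToArray(); // fits: try a higher quality
                lo = quality + 1;
            }
            else
            {
                hi = quality - 1;    // too big: lower the quality
            }
        }
        return best; // null if even quality 1 exceeds the target
    }
}
```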

2 Answers


Ultimately, you cannot know until you have performed the actual compression.

On the other hand, the (Shannon) entropy of the data is usually a good estimate of the compressed bitrate. For optimal accuracy, you would need to compute the discrete cosine transform (DCT) of the data and apply the quantization corresponding to the quality Q you want to use, since that is the domain JPEG actually encodes. Alternatively, you can apply the quantization in the original domain and then calculate the entropy, or evaluate just a portion of the image and extrapolate the compressed bitrate to the whole image.
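
If a rough byte-level figure is enough, here is a minimal sketch in C# of the entropy idea, assuming the System.Drawing types from the question (the EstimateCompressedBytes name is mine). It histograms the raw RGB samples and converts the Shannon entropy into an estimated byte count; since JPEG quantizes DCT coefficients rather than raw samples, real files at moderate quality will usually come out smaller than this estimate:

```csharp
using System;
using System.Drawing;

static class EntropyEstimator
{
    // Rough estimate of compressed size (in bytes) from the Shannon
    // entropy of the raw RGB samples. JPEG works on quantized DCT
    // coefficients, so treat this as a coarse upper-bound-style figure.
    public static long EstimateCompressedBytes(Bitmap bmp)
    {
        var counts = new long[256];
        long total = 0;

        // Histogram of all R, G and B sample values.
        for (int y = 0; y < bmp.Height; y++)
        {
            for (int x = 0; x < bmp.Width; x++)
            {
                Color c = bmp.GetPixel(x, y);
                counts[c.R]++; counts[c.G]++; counts[c.B]++;
                total += 3;
            }
        }

        // Shannon entropy in bits per sample: H = -sum p * log2(p).
        double entropyBits = 0.0;
        foreach (long count in counts)
        {
            if (count == 0) continue;
            double p = (double)count / total;
            entropyBits -= p * Math.Log(p, 2);
        }

        // samples * bits-per-sample / 8 => estimated bytes.
        return (long)(total * entropyBits / 8.0);
    }
}
```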

Finally, if the images you are compressing are more or less homogeneous --i.e., you obtain similar compression ratios for most images-- you can build a look-up table to get your estimates. If your images are not homogeneous, you could try to classify them (maybe based on the entropy, too?) and use a different look-up table for each class.
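
As a sketch of the look-up-table idea, again assuming the System.Drawing JPEG encoder (the QualityLookup class and the sample quality levels are illustrative): compress a set of representative images at a few quality levels, record the average compressed bytes per pixel, and predict new sizes from width × height:

```csharp
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Linq;

static class QualityLookup
{
    // Build a table of average compressed bytes per pixel at a few
    // quality levels, from a caller-supplied set of sample images.
    public static Dictionary<long, double> Build(IEnumerable<string> samplePaths)
    {
        ImageCodecInfo jpeg = ImageCodecInfo.GetImageEncoders()
            .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
        var paths = samplePaths.ToList();
        var table = new Dictionary<long, double>();

        foreach (long quality in new long[] { 30, 50, 70, 80, 90 })
        {
            double bppSum = 0;
            foreach (string path in paths)
            {
                using var bmp = new Bitmap(path);
                using var ep = new EncoderParameters(1);
                ep.Param[0] = new EncoderParameter(Encoder.Quality, quality);
                using var ms = new MemoryStream();
                bmp.Save(ms, jpeg, ep); // compress to memory only
                bppSum += (double)ms.Length / (bmp.Width * bmp.Height);
            }
            table[quality] = bppSum / paths.Count;
        }
        return table;
    }

    // Estimated size in bytes for a new image at the given quality.
    public static long Predict(Dictionary<long, double> table,
                               long quality, int width, int height)
        => (long)(table[quality] * width * height);
}
```

The prediction is only as good as the sample set is representative, which is why a per-class table helps for heterogeneous inputs.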

mhernandez
  • I agree that there is no accurate estimate, but my intention is to find at least a rough one. – Naveen Bathina Feb 12 '16 at 03:04
  • There you have two (probably rough) estimators: the one based on entropy and the one based on averaging compressed sizes. If you need help calculating the entropy, you can get some inspiration online (e.g., http://www.csharpprogramming.tips/2013/07/Data-Entropy.html). – mhernandez Feb 12 '16 at 09:22

I already compressed a folder that is 134 GB in size and contains 22 million images. After compressing it to a zip archive, the size is 84 GB. Compressing the folder took exactly 92 hours.

  • As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Community May 09 '23 at 11:07