
For any given file data size, I want to be able to resize (or compress) a UIImage to fit within that data limit. This question is NOT about how to resize, or how to check file sizes... it is about an algorithm for getting there in a performant way.

Searching here already, I found this thread, which talks about stepping down the image's JPEG quality with a linear or binary-search algorithm. This isn't very performant, taking dozens of seconds at best.
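
To make that concrete, here is roughly what that stepping approach looks like in modern Swift (jpegData(compressionQuality:) wraps UIImageJPEGRepresentation; the helper name, byte-limit parameter, and 8-iteration cap are my own assumptions):

    import UIKit

    // Binary search on JPEG quality, as I understand the linked thread.
    // Slow for large images: every iteration re-runs a full JPEG encode.
    func jpegData(for image: UIImage, under maxBytes: Int) -> Data? {
        var lo: CGFloat = 0.0
        var hi: CGFloat = 1.0
        var best: Data?
        for _ in 0..<8 {                   // 8 halvings narrow quality to ~0.004
            let mid = (lo + hi) / 2
            guard let data = image.jpegData(compressionQuality: mid) else { return best }
            if data.count <= maxBytes {
                best = data                // fits: try a higher quality
                lo = mid
            } else {
                hi = mid                   // too big: lower the quality
            }
        }
        return best
    }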

I am working on iOS, so images can be close to 10MB (from an iPhone 4S). My target, although variable, is currently 3145728 bytes (3MB).

I am currently using UIImageJPEGRepresentation to compress a little, but to get down to my target it appears I would have to lose a great deal of quality on such a large photo. Is there a relation between UIImage size and NSData size? Is there some function where I can say something like:

area * X = dataSize

...and solve for a scale factor so I can resize in one shot?
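
For illustration, here is the kind of one-shot function I am imagining, with X sampled from a small thumbnail. The thumbnail size, sampling quality, and helper name are all guesses on my part, and X varies with image content, so this could only ever be approximate:

    import UIKit

    // One-shot estimate built on the "area * X = dataSize" guess:
    // measure X (bytes per pixel at a fixed quality) on a small thumbnail,
    // then solve for the scale factor.
    func estimatedScale(for image: UIImage, targetBytes: Int, quality: CGFloat = 0.7) -> CGFloat? {
        // Render a small thumbnail (100pt on the long side) to sample X cheaply.
        let longSide = max(image.size.width, image.size.height)
        let thumbScale = 100 / longSide
        let thumbSize = CGSize(width: image.size.width * thumbScale,
                               height: image.size.height * thumbScale)
        UIGraphicsBeginImageContextWithOptions(thumbSize, true, 1)
        image.draw(in: CGRect(origin: .zero, size: thumbSize))
        let thumb = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        guard let data = thumb?.jpegData(compressionQuality: quality) else { return nil }
        let bytesPerPixel = CGFloat(data.count) / (thumbSize.width * thumbSize.height)

        // dataSize ~= bytesPerPixel * (scale * w) * (scale * h), so solve for scale.
        let scale = sqrt(CGFloat(targetBytes) / (bytesPerPixel * image.size.width * image.size.height))
        return min(scale, 1)   // never upscale
    }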

coneybeare
  • With JPEG compression, it's surely going to depend on the nature of the image: an image with a lot of high-frequency signal (i.e. sharp edges) is going to be bigger than one without. So it's going to be pretty hard to know in advance without actually performing the compression. – Jon Burgess Jun 25 '12 at 01:52
  • Is there a good ballpark estimate? What about PNG? – coneybeare Jun 25 '12 at 01:57
  • Even with PNG, it's going to depend on the nature of the image. I'm guessing with a 10MB image size you're dealing with photos, or natural images, and PNG is going to be pretty rubbish for that. The only format I can think of where you can know the size in advance is BMP, or similar uncompressed formats. But then of course you're losing the benefit of compression. – Jon Burgess Jun 25 '12 at 02:03
  • What about setting pre-calculated benchmarks using a very complex sample image at different sizes? You can get ballpark compression ratios for the different scenarios and use them as guides for what compression and resizing are needed. – Rog Jun 25 '12 at 02:34
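
A minimal sketch of the benchmark idea from the last comment above - the table values are placeholders, not real measurements, and the helper is hypothetical:

    import UIKit

    // Ratios measured offline on a worst-case (high-frequency) sample image,
    // mapping JPEG quality -> bytes per pixel. Placeholder numbers only.
    let worstCaseBytesPerPixel: [CGFloat: Double] = [
        1.0: 2.5, 0.8: 1.2, 0.6: 0.8, 0.4: 0.5
    ]

    // Pick the highest quality whose predicted size fits the byte budget.
    func quality(forPixelCount pixels: Double, maxBytes: Int) -> CGFloat? {
        worstCaseBytesPerPixel
            .sorted { $0.key > $1.key }
            .first { Double(maxBytes) >= pixels * $0.value }?
            .key
    }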

1 Answer


One idea I just had after looking at the thread you linked to: compressing a 10MB image is going to be relatively slow, so how about resizing to something much smaller (so that compression is much faster), then performing the compression algorithm from the link on that? The result can then be used as a guide to the quality needed for compressing the 10MB image. The idea is that the compression ratio should be similar for the same image, independent of size.

Let's say a 1000x1000-pixel image compressed is 10MB, and the target size is 3MB.

Then say a smaller 100x100-pixel version (for example), compressed at the same quality, is C MB. Perform the binary-search algorithm on the 100x100 image until its size equals C * (3/10), then use that compression quality on the 1000x1000 image to get a ~3MB image.
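
A rough Swift sketch of the above (the modern jpegData(compressionQuality:) wrapper stands in for UIImageJPEGRepresentation; the helper name, fallbacks, and 8-step search are my assumptions):

    import UIKit

    func estimatedQuality(toShrink image: UIImage,
                          from currentBytes: Int,
                          to targetBytes: Int) -> CGFloat {
        // 1. Render a cheap 100x100 proxy of the big image.
        let proxySize = CGSize(width: 100, height: 100)
        UIGraphicsBeginImageContextWithOptions(proxySize, true, 1)
        image.draw(in: CGRect(origin: .zero, size: proxySize))
        let proxy = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        guard let small = proxy,
              let full = small.jpegData(compressionQuality: 1.0) else { return 1.0 }

        // 2. C = proxy size at the original quality; scale the target the
        //    same way 10MB scales down to 3MB.
        let proxyTarget = Double(full.count) * Double(targetBytes) / Double(currentBytes)

        // 3. Binary-search quality on the proxy, where every encode is fast.
        var lo: CGFloat = 0, hi: CGFloat = 1
        for _ in 0..<8 {
            let mid = (lo + hi) / 2
            guard let data = small.jpegData(compressionQuality: mid) else { break }
            if Double(data.count) > proxyTarget { hi = mid } else { lo = mid }
        }
        // 4. Reuse this quality on the full-size image to land near targetBytes.
        return lo
    }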

Note: I have no idea how well this will work - it's just a suggestion. What size to pick for the smaller image (I've used 100x100) is also just a guess and something that would need to be experimented with.

Jon Burgess
  • How should I determine which size to shrink it to? If I could semi-accurately gauge that, the iterations wouldn't be necessary, right? – coneybeare Jun 25 '12 at 02:14
  • In setting this up, it seems that the compression of JPG is quite good. Previously, I had been using no compression at 1.0. In testing and setting this up, I found that dropping to 0.9 resulted in a 54% file size drop. 0.8 => 60%, 0.7 => 66%, 0.6 => 77% smaller file size. This is all with no perceivable quality loss. Even Photoshop has the setting for High JPEG Quality at 60, so I think I will set it at 0.6 and forget it. Marking this answer as correct as it led me to find the right answer for my situation. – coneybeare Jun 25 '12 at 03:03
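
For completeness, the set-and-forget approach from that final comment might look like this minimal sketch (again using the modern Swift wrapper for UIImageJPEGRepresentation):

    import UIKit

    // Fixed quality, no search: 0.6 is roughly Photoshop's "High" JPEG
    // setting and showed no perceivable quality loss in the tests above.
    func compressedData(for image: UIImage) -> Data? {
        return image.jpegData(compressionQuality: 0.6)
    }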