For any given file data size, I want to be able to resize (or compress) a `UIImage` to fit within that data limit. This question is NOT about how to resize, or how to check file sizes... it is about an algorithm for doing this in a performant way.
Searching here already, I found this thread, which talks about stepping down the JPEG quality in a linear or binary fashion. That isn't very performant, taking dozens of seconds at best.
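For reference, here is a minimal sketch of that stepping approach as I understand it (the helper name is mine, not from the thread): a binary search over the `compressionQuality` argument of `UIImageJPEGRepresentation`. It is slow because every probe re-encodes the full-resolution image.

```swift
import UIKit

// Binary-search the JPEG quality until the encoded data fits under `limit`.
// Every iteration re-encodes the full-size image, which is the bottleneck.
func jpegData(for image: UIImage, underLimit limit: Int) -> Data? {
    var lo: CGFloat = 0.0, hi: CGFloat = 1.0
    var best: Data?
    for _ in 0..<6 {                            // 6 probes narrow quality to ~0.016
        let q = (lo + hi) / 2
        guard let data = UIImageJPEGRepresentation(image, q) else { break }
        if data.count <= limit {
            best = data                         // fits: try a higher quality
            lo = q
        } else {
            hi = q                              // too big: try a lower quality
        }
    }
    return best
}
```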
I am working on iOS, so images can be close to 10 MB (from an iPhone 4S). My target, although variable, is currently 3145728 bytes (3 MB).
I am currently using `UIImageJPEGRepresentation` to compress a little, but to get to my low target it appears I would have to give up a lot of quality for such a large photo. Is there a relation between `UIImage` size and `NSData` size? Is there some function where I can say something like:

`area * X = dataSize`

...and solve for a scale factor, so I can resize in one shot?
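To make the idea concrete, here is a minimal sketch of that one-shot approach (the function name and the 0.8 quality are mine), under the assumption that JPEG byte size is roughly proportional to pixel area at a fixed quality, i.e. `dataSize ≈ X * area`, so scaling both dimensions by `s` scales the data size by roughly `s²`:

```swift
import UIKit

// Encode once at full size to measure bytes-per-area, then solve
// s² * actualBytes = targetBytes  =>  s = sqrt(target / actual)
// and resize + re-encode exactly once.
func resizeToFit(_ image: UIImage, targetBytes: Int, quality: CGFloat = 0.8) -> Data? {
    guard let full = UIImageJPEGRepresentation(image, quality) else { return nil }
    if full.count <= targetBytes { return full }

    let s = sqrt(CGFloat(targetBytes) / CGFloat(full.count))
    let newSize = CGSize(width: image.size.width * s, height: image.size.height * s)

    UIGraphicsBeginImageContextWithOptions(newSize, true, 1.0)
    image.draw(in: CGRect(origin: .zero, size: newSize))
    let scaled = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return scaled.flatMap { UIImageJPEGRepresentation($0, quality) }
}
```

Because the area-to-bytes relation is only approximate (it varies with image content), the result can overshoot slightly; one extra probe at a marginally smaller scale would make the limit hard.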