
I am scaling an image to fit within a 320x480 limit, so w <= 320 and h <= 640. But after converting the result to data and back to an image, I get an image twice as big as it was originally. Why does this happen, and how can I avoid it? Should I use a 160x320 size to get the required dimensions?

// Resize the image into a new bitmap context
let s = CGSize(width: w, height: h)
UIGraphicsBeginImageContextWithOptions(s, false, 0.0)
image.draw(in: CGRect(x: 0, y: 0, width: w, height: h))
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

// Round-trip through JPEG data
let data2 = UIImageJPEGRepresentation(newImage!, 1.0)!
let image3 = UIImage(data: data2)

(lldb) po newImage!.size
▿ (320.0, 213.5)
  - width : 320.0
  - height : 213.5

(lldb) po image3!.size
▿ (640.0, 427.0)
  - width : 640.0
  - height : 427.0
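For reference, here is a minimal sketch of the same resize with an explicit scale of 1.0, assuming the doubling comes from the screen scale: passing 0.0 as the scale argument to UIGraphicsBeginImageContextWithOptions uses the main screen's scale (2.0 on Retina devices), so the rendered bitmap is 640x427 pixels while UIImage.size reports 320x213.5 points. JPEG data carries no scale information, so UIImage(data:) comes back at scale 1.0 and reports the pixel dimensions. The resizeTo helper name below is hypothetical, not from the original code.

import UIKit

// Hypothetical helper: renders into a 1x bitmap so the point size and the
// pixel size stay equal, even after encoding to JPEG and decoding again.
func resizeTo(_ image: UIImage, size: CGSize) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(size, false, 1.0)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: size))
    return UIGraphicsGetImageFromCurrentImageContext()
}

Alternatively, decoding with UIImage(data: data2, scale: newImage!.scale) would restore the 320x213.5 point size while keeping the 2x pixel data.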
