
I am using a piece of code from this link - Resize UIImage by keeping Aspect ratio and width - and it works perfectly, but I am wondering if it can be altered to preserve the hard edges of the pixels. I want to double the size of the image and keep the pixel edges sharp.

class func resizeImage(image: UIImage, newHeight: CGFloat) -> UIImage {

    // Scale to the requested height, keeping the aspect ratio
    let scale = newHeight / image.size.height
    let newWidth = image.size.width * scale
    UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight))
    // Draws with the context's default interpolation, which smooths the pixels
    image.drawInRect(CGRectMake(0, 0, newWidth, newHeight))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage
}

[Image: what I want it to do]

[Image: what it does]

In Photoshop there is nearest-neighbour interpolation when resizing. Is there something like that in iOS?

impo

3 Answers


Inspired by the accepted answer, updated to Swift 5.

Swift 5

let image = UIImage(named: "Foo")!
let scale: CGFloat = 2.0

let newSize = image.size.applying(CGAffineTransform(scaleX: scale, y: scale))
UIGraphicsBeginImageContextWithOptions(newSize, false, UIScreen.main.scale)
let context = UIGraphicsGetCurrentContext()!
// Disable interpolation so the scaled-up pixels keep hard edges (nearest-neighbour style)
context.interpolationQuality = .none
let newRect = CGRect(origin: .zero, size: newSize)
image.draw(in: newRect)

let newImage = UIImage(cgImage: context.makeImage()!)
UIGraphicsEndImageContext()
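
For reuse, the same idea can be wrapped in a small extension. This is only a sketch: the method name and the UIGraphicsImageRenderer-based setup are illustrative, not taken from the accepted answer.

import UIKit

extension UIImage {
    // Scales the image with interpolation turned off, so pixel edges stay hard.
    func resizedNearestNeighbor(scale: CGFloat) -> UIImage {
        let newSize = CGSize(width: size.width * scale, height: size.height * scale)
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1 // work in pixels rather than screen points
        let renderer = UIGraphicsImageRenderer(size: newSize, format: format)
        return renderer.image { ctx in
            // .none is the nearest-neighbour-style setting: no smoothing between pixels
            ctx.cgContext.interpolationQuality = .none
            self.draw(in: CGRect(origin: .zero, size: newSize))
        }
    }
}

// let doubled = UIImage(named: "Foo")!.resizedNearestNeighbor(scale: 2.0)
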
Jayden Irwin

Did a bit more digging and found the answer:

https://stackoverflow.com/a/25430447/4196903

but where it has

CGContextSetInterpolationQuality(context, kCGInterpolationHigh)

instead write

CGContextSetInterpolationQuality(context, CGInterpolationQuality.None)
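
For completeness, folded into the resize function from the question it would look roughly like this (modern Swift syntax; the linked answer's exact code differs slightly):

class func resizeImage(image: UIImage, newHeight: CGFloat) -> UIImage? {
    let scale = newHeight / image.size.height
    let newSize = CGSize(width: image.size.width * scale, height: newHeight)

    UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
    defer { UIGraphicsEndImageContext() }

    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    // The one change that matters here: no interpolation, so pixels stay sharp.
    context.interpolationQuality = .none

    image.draw(in: CGRect(origin: .zero, size: newSize))
    return UIGraphicsGetImageFromCurrentImageContext()
}
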
impo

You need to use the CISampler class (which is only available in iOS 9 and later) and create your own custom image processing filter for it, I think.

You can find more information here and here too
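
If all you need is nearest-neighbour scaling rather than a fully custom filter, newer iOS versions also expose this directly on CIImage. A rough sketch, assuming iOS 13+; the function name here is just for illustration:

import UIKit
import CoreImage

func upscaleKeepingHardEdges(_ image: UIImage, scale: CGFloat) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // samplingNearest() makes the following transform use nearest-neighbour sampling,
    // so the scaled-up pixels keep their hard edges instead of being smoothed.
    let scaled = input
        .samplingNearest()
        .transformed(by: CGAffineTransform(scaleX: scale, y: scale))

    let context = CIContext()
    guard let cgImage = context.createCGImage(scaled, from: scaled.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}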

Mihir Mehta