
I'm trying to resize a CVPixelBuffer to a size of 128x128. I'm working with one that is 750x750. I'm currently using the CVPixelBuffer to create a new CGImage, which I resize and then convert back into a CVPixelBuffer. Here is my code:

func getImageFromSampleBuffer (buffer:CMSampleBuffer) -> UIImage? {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        let imageRect = CGRect(x: 0, y: 0, width: 128, height: 128)

        if let image = context.createCGImage(ciImage, from: imageRect) {
            let t = CIImage(cgImage: image)
            let new = t.applying(transformation)
            context.render(new, to: pixelBuffer)

            return UIImage(cgImage: image, scale: UIScreen.main.scale, orientation: .right)
        }

    }

    return nil
}

I've also tried scaling the CIImage then converting it:

let t = CIImage(cgImage: image)
let transformation = CGAffineTransform(scaleX: 1, y: 2)
let new = t.applying(transformation)
context.render(new, to: pixelBuffer)

But that didn't work either.

Any help is appreciated. Thanks!

enjoysturtles
  • Please look for `vImageScale_ARGB8888` here: https://stackoverflow.com/a/10063006/2567725 – olha Jun 12 '17 at 22:12
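
(For reference, the Accelerate route that comment points to would look roughly like the sketch below. This is a minimal, untested sketch that assumes a 32BGRA source pixel buffer; the vImage `ARGB8888` functions operate on any 4-channel, 8-bit interleaved layout, and error handling is kept to a minimum.)

import Accelerate
import CoreVideo

func resizePixelBuffer(_ src: CVPixelBuffer, width: Int, height: Int) -> CVPixelBuffer? {
    // Create a destination buffer with the same pixel format as the source.
    var maybeDst: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     CVPixelBufferGetPixelFormatType(src), nil, &maybeDst)
    guard status == kCVReturnSuccess, let dst = maybeDst else { return nil }

    CVPixelBufferLockBaseAddress(src, .readOnly)
    CVPixelBufferLockBaseAddress(dst, [])
    defer {
        CVPixelBufferUnlockBaseAddress(dst, [])
        CVPixelBufferUnlockBaseAddress(src, .readOnly)
    }

    guard let srcBase = CVPixelBufferGetBaseAddress(src),
          let dstBase = CVPixelBufferGetBaseAddress(dst) else { return nil }

    // Wrap both pixel buffers in vImage_Buffer descriptors.
    var srcBuffer = vImage_Buffer(data: srcBase,
                                  height: vImagePixelCount(CVPixelBufferGetHeight(src)),
                                  width: vImagePixelCount(CVPixelBufferGetWidth(src)),
                                  rowBytes: CVPixelBufferGetBytesPerRow(src))
    var dstBuffer = vImage_Buffer(data: dstBase,
                                  height: vImagePixelCount(height),
                                  width: vImagePixelCount(width),
                                  rowBytes: CVPixelBufferGetBytesPerRow(dst))

    // High-quality Lanczos resampling; returns kvImageNoError on success.
    let error = vImageScale_ARGB8888(&srcBuffer, &dstBuffer, nil,
                                     vImage_Flags(kvImageHighQualityResampling))
    return error == kvImageNoError ? dst : nil
}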

2 Answers


There's no need for pixel buffer rendering and the like. Just transform the original CIImage and crop it to size. Cropping is necessary if your source and destination dimensions aren't proportional.

func getImageFromSampleBuffer(buffer: CMSampleBuffer) -> UIImage? {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

        let srcWidth = ciImage.extent.width
        let srcHeight = ciImage.extent.height

        let dstWidth: CGFloat = 128
        let dstHeight: CGFloat = 128

        // Use the larger scale factor so the image fills the 128x128 target;
        // min(scaleX, scaleY) would leave empty space whenever the source isn't square.
        let scaleX = dstWidth / srcWidth
        let scaleY = dstHeight / srcHeight
        let scale = max(scaleX, scaleY)

        // Scale, then crop the overflow down to the destination rect.
        let transform = CGAffineTransform(scaleX: scale, y: scale)
        let output = ciImage.transformed(by: transform).cropped(to: CGRect(x: 0, y: 0, width: dstWidth, height: dstHeight))

        return UIImage(ciImage: output)
    }

    return nil
}
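
If you ultimately need a 128x128 CVPixelBuffer (for example, as input to a Core ML model) rather than a UIImage, you can render the scaled and cropped image into a fresh buffer instead of wrapping it. A rough sketch, continuing from `output` above and assuming a BGRA destination:

var resized: CVPixelBuffer?
CVPixelBufferCreate(kCFAllocatorDefault, 128, 128, kCVPixelFormatType_32BGRA, nil, &resized)
if let resized = resized {
    // Draw the 128x128 CIImage into the newly created pixel buffer.
    CIContext().render(output, to: resized)
}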
Aleksey Gureiev

Try this

func getImageFromSampleBuffer (buffer:CMSampleBuffer) -> UIImage? {
    if let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let resizedCIImage = ciImage.applying(CGAffineTransform(scaleX: 128.0 / 750.0, y: 128.0 / 750.0))

        let context = CIContext()
        if let image = context.createCGImage(resizedCIImage, from: resizedCIImage.extent) {
            return UIImage(cgImage: image)
        } 
    }
    return nil
}

Here I assume the pixel buffer is square with a size of 750x750; you can change it to work with other aspect ratios and sizes.
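
For example, one way to generalize it (an untested sketch) is to derive the scale factors from the buffer's actual extent instead of hardcoding 750:

import UIKit
import CoreMedia
import CoreImage

func getImageFromSampleBuffer(buffer: CMSampleBuffer, targetSize: CGSize = CGSize(width: 128, height: 128)) -> UIImage? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

    // Scale each axis independently so any source size ends up at targetSize.
    let transform = CGAffineTransform(scaleX: targetSize.width / ciImage.extent.width,
                                      y: targetSize.height / ciImage.extent.height)
    let resizedCIImage = ciImage.applying(transform)

    let context = CIContext()
    if let cgImage = context.createCGImage(resizedCIImage, from: resizedCIImage.extent) {
        return UIImage(cgImage: cgImage)
    }
    return nil
}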

Tiko