
I am trying to create a UIImage from a CMSampleBuffer. The following code works fine for sample buffers from the rear camera, but not for the front-facing camera: when the front-facing camera is used, the CGContext fails to initialise, i.e. the CGContext initializer returns nil. I suspect I need to pass the right bitmap info, but there are so many combinations.

func convert(buffer: CMSampleBuffer) -> UIImage? {
   guard let imageBuffer = CMSampleBufferGetImageBuffer(buffer) else { return nil }

   CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags.readOnly)
   // Unlock on every exit path, including the early return below
   // (the original code skipped the unlock when makeImage() failed).
   defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags.readOnly) }

   let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
   let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
   let width = CVPixelBufferGetWidth(imageBuffer)
   let height = CVPixelBufferGetHeight(imageBuffer)

   let colorSpace = CGColorSpaceCreateDeviceRGB()
   var bitmapInfo: UInt32 = CGBitmapInfo.byteOrder32Little.rawValue
   bitmapInfo |= CGImageAlphaInfo.premultipliedFirst.rawValue & CGBitmapInfo.alphaInfoMask.rawValue

   // Returns nil for front-camera buffers.
   let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo)

   guard let quartzImage = context?.makeImage() else { return nil }
   return UIImage(cgImage: quartzImage)
}

The failure is accompanied by the following error: "CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst."
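A quick way to diagnose this is to log the buffer's pixel format before assuming it is BGRA (as the comment thread below also suggests). The sketch below is illustrative: the error's "at least 2560" is consistent with a 640-pixel-wide buffer at 4 bytes per pixel, while a bi-planar YUV buffer reports a much smaller row stride.

```swift
import CoreVideo

// Sketch: inspect the buffer's pixel format before wrapping it in a CGContext.
// For a bi-planar 4:2:0 YUV buffer, CVPixelBufferGetBytesPerRow reports roughly
// one byte per pixel (the luma plane's stride), not width * 4, which is why
// CGBitmapContextCreate complains "should be at least 2560" (640 px * 4 bytes).
func logPixelFormat(of imageBuffer: CVPixelBuffer) {
    let format = CVPixelBufferGetPixelFormatType(imageBuffer)
    switch format {
    case kCVPixelFormatType_32BGRA:
        print("32BGRA - a byteOrder32Little/premultipliedFirst CGContext can wrap it")
    case kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
         kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange:
        print("bi-planar YUV - must be converted before CGContext can consume it")
    default:
        // Decode the FourCC code for any other format.
        let chars = (0..<4).map { Character(UnicodeScalar((format >> (8 * (3 - $0))) & 0xFF)!) }
        print("other format: \(String(chars))")
    }
}
```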

  • Are there any errors thrown, or does it just silently fail? – brandonscript Feb 26 '18 at 03:13
  • I added the error in the post. – Loc Nguyen Feb 26 '18 at 03:21
  • what's the format of your image buffer's data ? Are you sure it contains ARGB ? – Valérian Feb 26 '18 at 08:20
  • Possibly this? https://stackoverflow.com/a/24124943/1214800 – brandonscript Feb 26 '18 at 17:08
  • @Valérian That is my question, I don't know what the format is. The buffer comes from `AVCaptureVideoDataOutputSampleBufferDelegate`'s `captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)`. The format I specified works fine for the rear camera buffer, so I assumed it would be the same for the front camera, but that isn't the case. – Loc Nguyen Feb 26 '18 at 22:24
  • Would this work? https://stackoverflow.com/a/42711035/7831758 – adamfowlerphoto Feb 27 '18 at 08:52
  • `CVPixelBufferGetPixelFormatType(imageBuffer)` gives you the format – Valérian Feb 27 '18 at 09:19
  • Thanks @Valérian, how do I use the returned `OSType` value to create a UIImage? – Loc Nguyen Feb 28 '18 at 05:31
  • @Spads, that does give me back an image but for a few specific reasons, I need to go this route. – Loc Nguyen Feb 28 '18 at 05:44
  • My intuition is that the `imageBuffer` doesn't contain RGB but YUV in which case you'll need to convert first. What's the `OSType` that was returned ? – Valérian Feb 28 '18 at 08:42
  • The returned type is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. I found a solution on how to convert this buffer to UIImage. Thanks for guiding me through this. My question now is: how to (if possible) convert a kCVPixelFormatType_32BGRA CMSampleBuffer to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. – Loc Nguyen Mar 01 '18 at 00:56
  • Either with `VTPixelTransferSession` or using `vImage` (which is what the pixel transfer session uses under the hood) – Valérian Mar 07 '18 at 17:18
  • `VTPixelTransferSession` is not available on iOS. Do you mind sharing a snippet for converting the buffer using vImage? – Loc Nguyen Mar 08 '18 at 01:14
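For the conversion direction asked about in the last comments (kCVPixelFormatType_32BGRA into kCVPixelFormatType_420YpCbCr8BiPlanarFullRange), a rough vImage sketch follows. This is not a drop-in implementation: the function name is illustrative, the caller is assumed to have created a destination pixel buffer with matching dimensions and to have locked both buffers' base addresses, and error returns from the generate-conversion call are ignored for brevity.

```swift
import Accelerate
import CoreVideo

// Rough sketch: convert a locked kCVPixelFormatType_32BGRA pixel buffer into a
// locked kCVPixelFormatType_420YpCbCr8BiPlanarFullRange buffer using vImage.
// Assumption: `destination` already exists with matching dimensions and both
// buffers' base addresses are locked by the caller.
func convertBGRAToBiPlanarYUV(source: CVPixelBuffer, destination: CVPixelBuffer) -> vImage_Error {
    // Full-range (0...255) YpCbCr encoding; a video-range buffer would use
    // the 16...235 (Yp) / 16...240 (CbCr) biases instead.
    var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 0, CbCr_bias: 128,
                                             YpRangeMax: 255, CbCrRangeMax: 255,
                                             YpMax: 255, YpMin: 0,
                                             CbCrMax: 255, CbCrMin: 0)
    var info = vImage_ARGBToYpCbCr()
    vImageConvert_ARGBToYpCbCr_GenerateConversion(kvImage_ARGBToYpCbCrMatrix_ITU_R_601_4,
                                                  &pixelRange, &info,
                                                  kvImageARGB8888, kvImage420Yp8_CbCr8,
                                                  vImage_Flags(kvImageNoFlags))

    var src = vImage_Buffer(data: CVPixelBufferGetBaseAddress(source),
                            height: vImagePixelCount(CVPixelBufferGetHeight(source)),
                            width: vImagePixelCount(CVPixelBufferGetWidth(source)),
                            rowBytes: CVPixelBufferGetBytesPerRow(source))
    // Plane 0 is the full-resolution luma (Yp) plane; plane 1 is the
    // half-resolution interleaved CbCr plane.
    var yp = vImage_Buffer(data: CVPixelBufferGetBaseAddressOfPlane(destination, 0),
                           height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(destination, 0)),
                           width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(destination, 0)),
                           rowBytes: CVPixelBufferGetBytesPerRowOfPlane(destination, 0))
    var cbcr = vImage_Buffer(data: CVPixelBufferGetBaseAddressOfPlane(destination, 1),
                             height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(destination, 1)),
                             width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(destination, 1)),
                             rowBytes: CVPixelBufferGetBytesPerRowOfPlane(destination, 1))

    // The converter expects ARGB channel order; a permute map reorders the
    // BGRA source channels on the fly (dest ARGB channel i <- src channel map[i]).
    let permuteMap: [UInt8] = [3, 2, 1, 0]  // BGRA -> ARGB
    return vImageConvert_ARGB8888To420Yp8_CbCr8(&src, &yp, &cbcr, &info,
                                                permuteMap, vImage_Flags(kvImageNoFlags))
}
```

The same machinery runs in the opposite direction via `vImageConvert_420Yp8_CbCr8ToARGB8888` with a conversion generated by `vImageConvert_YpCbCrToARGB_GenerateConversion`.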

0 Answers