
This is a fairly popular question, but I have not found a solution for my issue. I am capturing frames from the front camera of an iPhone this way:

func captureOutput(captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection) {
    let uiImage = imageFromSampleBufferDep(sampleBuffer)
    ...
    UIImageWriteToSavedPhotosAlbum(uiImage!, self, "image:didFinishSavingWithError:contextInfo:", nil)
    ...
}

func imageFromSampleBufferDep(sampleBuffer: CMSampleBuffer) -> UIImage? {
    let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(imageBuffer, 0)
    let address = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer) // observed: 1924
    let width = CVPixelBufferGetWidth(imageBuffer) // observed: 1280
    let height = CVPixelBufferGetHeight(imageBuffer) // observed: 720

    let colorSpace = CGColorSpaceCreateDeviceRGB()

    let context = CGBitmapContextCreate(address, width, height, 8, bytesPerRow, colorSpace, CGImageAlphaInfo.NoneSkipFirst.rawValue)
    let imageRef = CGBitmapContextCreateImage(context)

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0)
    var resultImage : UIImage?
    if context != nil {
        resultImage = UIImage(CGImage: imageRef!)
    }

    return resultImage
}

And I get this error:

<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 5120 for 8 integer bits/component, 3 components, kCGImageAlphaNoneSkipFirst.

I tried setting bytesPerRow to 5120 directly, but in that case I got a series of gray and inverted pictures (attached).

How can I fix this issue?

(screenshot: gray, inverted output frames)

albertpod
  • You are assuming that the pixel buffer you are getting is single-planar and in the RGB color space. The content from the camera will not be in this format. First you'll need to get the pixel format type using CVPixelBufferGetPixelFormatType. You might also look at using CVPixelBufferGetPlaneCount to find out the number of planes. Also, to get better color matching you should consider using `CGColorSpaceRef colorSpace; colorSpace = (CGColorSpaceRef)CVBufferGetAttachment(pixBuffer, kCVImageBufferCGColorSpaceKey, NULL);` – SheffieldKevin Mar 01 '16 at 13:40
  • Could you give me another hint? I did these things: `var pixelBuffer : CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!; var planesNum = CVPixelBufferGetPlaneCount(pixelBuffer); let colorSpace = CVBufferGetAttachment(pixelBuffer, kCVImageBufferCGColorSpaceKey, nil) as! CGColorSpaceRef` And colorSpace returns nil in this case. Am I right that I then have to swap imageBuffer for pixelBuffer? – albertpod Mar 01 '16 at 14:42
  • And what's the purpose of the number of planes? Where must it be used? – albertpod Mar 01 '16 at 14:57
  • The output from your camera is quite likely to be one of the YUV formats where each component is in its own plane, unlike RGB where all color components are in the same memory block [YUV Wikipedia](https://en.wikipedia.org/wiki/YUV). I would check the result of CVPixelBufferGetPixelFormatType to see what you get. Also see this [stack overflow question](http://stackoverflow.com/questions/4085474/how-to-get-the-y-component-from-cmsamplebuffer-resulted-from-the-avcapturesessio). The sort of value you'll get back from CVPixelBufferGetPixelFormatType is kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange – SheffieldKevin Mar 01 '16 at 15:23
  • Thanks! The returned value of CVPixelBufferGetPixelFormatType is 875704438, which is equal to `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange`. So you are definitely right. – albertpod Mar 01 '16 at 15:32
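
The diagnostic steps from the comments above can be sketched as follows (Swift 2 syntax, to match the question; the helper name `inspectSampleBuffer` is hypothetical):

```swift
import CoreMedia
import CoreVideo

// Sketch: inspect the pixel buffer before assuming it is a single-plane
// RGB buffer, as the comments suggest.
func inspectSampleBuffer(sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // 875704438 == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange ('420v'),
    // the camera's default output: bi-planar YUV, not RGB.
    let formatType = CVPixelBufferGetPixelFormatType(pixelBuffer)

    // For '420v' this is 2: plane 0 holds luma (Y), plane 1 holds interleaved
    // chroma (CbCr) at half resolution, which is why an RGB bitmap context
    // built over plane 0 produces gray images.
    let planeCount = CVPixelBufferGetPlaneCount(pixelBuffer)

    print("format: \(formatType), planes: \(planeCount)")
}
```

This also explains the observed bytesPerRow of 1924: it is the row stride of the luma plane (1280 plus padding), not the 5120 (1280 × 4) an RGB context expects.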

1 Answer


In fact, I made a silly mistake which led to the error above.

Following this question, convert CMSampleBufferRef to UIImage, I did this:

var videoCaptureOutput = AVCaptureVideoDataOutput()

// Force the camera to deliver single-plane 32-bit BGRA frames instead of the
// default bi-planar YUV ('420v'), so the RGB bitmap-context conversion is valid.
videoCaptureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey:Int(kCVPixelFormatType_32BGRA)]
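
With the output forced to 32BGRA, the buffer is a single interleaved plane and a conversion along the lines of the question's `imageFromSampleBufferDep` becomes valid. A sketch of the matching conversion (Swift 2 syntax, same era as the question; the bitmap-info flags for BGRA are the one substantive change from the original function):

```swift
import UIKit
import CoreMedia

// Sketch: convert a kCVPixelFormatType_32BGRA sample buffer to a UIImage.
func imageFromBGRABuffer(sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(imageBuffer, 0)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, 0) }

    // Single plane now, so the plain base address is fine.
    let address = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer) // width * 4, plus any padding
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    // BGRA in memory = 32-bit little-endian with alpha first.
    let bitmapInfo = CGBitmapInfo.ByteOrder32Little.rawValue |
                     CGImageAlphaInfo.PremultipliedFirst.rawValue

    guard let context = CGBitmapContextCreate(address, width, height, 8,
                                              bytesPerRow, colorSpace, bitmapInfo)
        else { return nil }
    guard let imageRef = CGBitmapContextCreateImage(context) else { return nil }

    return UIImage(CGImage: imageRef)
}
```

Note the CGBitmapInfo flags: using `NoneSkipFirst` alone, as in the question, would read the bytes in big-endian order and produce swapped channels for a BGRA buffer.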
albertpod