
I'm trying to use Cocoa to grab images from a webcam. I'm able to get the image in RGBA format using QTKit and the didOutputVideoFrame delegate call, converting the CVImageBuffer to a CIImage and then to an NSBitmapImageRep.
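
For reference, the conversion I have now looks roughly like this (trimmed down from my QTCaptureDecompressedVideoOutput delegate; the names are my own):

    #import <QTKit/QTKit.h>
    #import <QuartzCore/QuartzCore.h>

    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        // Wrap the frame in a CIImage, then pull an RGBA bitmap rep out of it.
        CIImage *ciImage = [CIImage imageWithCVImageBuffer:videoFrame];
        NSBitmapImageRep *rep =
            [[[NSBitmapImageRep alloc] initWithCIImage:ciImage] autorelease];
        // ... display rep ...
    }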

I know my camera grabs natively in YUV. What I want is to get the YUV data directly from the CVImageBuffer and process the YUV frame before displaying it.

My question is: How can I get the YUV data from the CVImageBuffer?

Thanks.

jslap

2 Answers


You might be able to create a CIImage from the buffer using +[CIImage imageWithCVBuffer:] and then render that CIImage into a CGBitmapContext of the desired pixel format.

Note: I have not tested this solution.
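
Untested, but roughly what I have in mind (videoFrame being the CVImageBufferRef from your delegate callback; note that CGBitmapContext only handles RGB-family formats, so this gets you a known RGBA layout rather than raw YUV):

    #import <QuartzCore/QuartzCore.h>
    #import <CoreVideo/CoreVideo.h>

    CIImage *image = [CIImage imageWithCVBuffer:videoFrame];

    size_t width  = CVPixelBufferGetWidth(videoFrame);
    size_t height = CVPixelBufferGetHeight(videoFrame);

    // 8 bits per component, 4 bytes per pixel, premultiplied RGBA.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef cgContext =
        CGBitmapContextCreate(NULL, width, height, 8, 4 * width,
                              colorSpace, kCGImageAlphaPremultipliedLast);

    CIContext *ciContext = [CIContext contextWithCGContext:cgContext
                                                   options:nil];
    [ciContext drawImage:image
                  inRect:CGRectMake(0, 0, width, height)
                fromRect:[image extent]];

    // Tightly packed pixels in the format requested above.
    void *pixels = CGBitmapContextGetData(cgContext);
    // ... process pixels ...

    CGContextRelease(cgContext);
    CGColorSpaceRelease(colorSpace);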

Barry Wark

I asked what I thought was a different question, but it turned out to have the same answer as this one: raw data from CVImageBuffer without rendering?
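
The short version, for anyone landing here: lock the pixel buffer's base address and read the bytes directly, no rendering involved. A minimal sketch, assuming the camera hands you packed 4:2:2 (kCVPixelFormatType_422YpCbCr8, '2vuy'); planar formats need CVPixelBufferGetBaseAddressOfPlane instead:

    #import <CoreVideo/CoreVideo.h>

    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)videoFrame;

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height        = CVPixelBufferGetHeight(pixelBuffer);
    OSType format        = CVPixelBufferGetPixelFormatType(pixelBuffer);

    if (format == kCVPixelFormatType_422YpCbCr8) {
        // '2vuy' layout: Cb Y'0 Cr Y'1 for each pair of pixels.
        // YUV processing goes here, over baseAddress[0 .. bytesPerRow * height).
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

You may also need to set pixelBufferAttributes on the QTCaptureDecompressedVideoOutput to request a YUV pixel format type; otherwise QTKit may hand you frames it has already converted, so check the format type before assuming YUV.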

jab