
I am new to iOS development and I need to capture camera frames and convert them to YUV420 in order to perform H.264 encoding. In my Android application I used libyuv to do the conversions, and now I want to replicate the same thing in iOS. I was wondering what options I have, and how do I receive the camera callback with the frame data in iOS?

Nav Nav

2 Answers


Maybe you need to use FFmpeg in your project; see this question (it's not about iOS programming, but it is possible to compile FFmpeg and link it into an iOS app): convert H264 video to raw YUV format

P.S. Also try compiling libyuv for the iOS architecture; it may be easier to use it instead of FFmpeg :)

https://code.google.com/p/libyuv/wiki/GettingStarted
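
For the conversion step itself, a hypothetical sketch of feeding a camera buffer to libyuv is shown below. It assumes libyuv has been built for iOS and that NV12ToI420 from libyuv/convert.h is exposed to Swift through a bridging header, that the camera delivers bi-planar (NV12-style) buffers, and that the caller allocates the destination planes; the function name convertToI420 is made up for the example:

```swift
import CoreVideo

// Hypothetical sketch: converts a bi-planar (NV12-style) camera buffer to I420 with libyuv.
// Assumes NV12ToI420 is visible to Swift via a bridging header that includes libyuv/convert.h.
func convertToI420(_ pixelBuffer: CVPixelBuffer,
                   dstY: UnsafeMutablePointer<UInt8>,
                   dstU: UnsafeMutablePointer<UInt8>,
                   dstV: UnsafeMutablePointer<UInt8>) -> Bool {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let width = Int32(CVPixelBufferGetWidth(pixelBuffer))
    let height = Int32(CVPixelBufferGetHeight(pixelBuffer))

    guard let srcY = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0),
          let srcUV = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1) else { return false }

    let strideY = Int32(CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0))
    let strideUV = Int32(CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1))
    let chromaStride = (width + 1) / 2

    // NV12 (Y plane + interleaved CbCr plane) -> I420 (separate Y, U, V planes).
    let status = NV12ToI420(srcY.assumingMemoryBound(to: UInt8.self), strideY,
                            srcUV.assumingMemoryBound(to: UInt8.self), strideUV,
                            dstY, width,
                            dstU, chromaStride,
                            dstV, chromaStride,
                            width, height)
    return status == 0
}
```

The destination buffers would typically be width*height bytes for Y and ((width+1)/2) * ((height+1)/2) bytes for each chroma plane.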

Alexander Tkachenko

When capturing raw image data from the camera, you can specify the pixel format of the output buffer in the videoSettings property of the AVCaptureVideoDataOutput of your capture session.

Pixel formats such as kCVPixelFormatType_420YpCbCr8Planar or kCVPixelFormatType_420YpCbCr8PlanarFullRange will probably give you a good starting point for additional processing to get to the exact format that you need.

I also recommend that you look into Apple's "AV Foundation Programming Guide", especially the chapter about Still and Video Media Capture.
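
Putting that together, a minimal Swift sketch of the capture setup might look like the following. The class and queue names are placeholders; note that on most iOS devices the camera advertises the bi-planar 4:2:0 formats, so the sketch requests kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, which is an assumption on my part, so swap in whichever format your output actually supports:

```swift
import AVFoundation

// Minimal sketch of a capture pipeline that delivers 4:2:0 YpCbCr frames to a delegate.
// Add canAddInput/canAddOutput and error handling as needed.
final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frame.queue")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        // Request a 4:2:0 pixel format via videoSettings, as described above.
        output.videoSettings = [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        ]
        output.setSampleBufferDelegate(self, queue: frameQueue)
        session.addOutput(output)

        session.startRunning()
    }

    // Called once per captured frame; the pixel buffer holds the raw Y and CbCr planes.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        // Read the planes here (CVPixelBufferGetBaseAddressOfPlane) and hand them to the encoder.
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
    }
}
```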

Artal