
I have to make an iPhone project that can process video data in real time. The app has to be able to recognize the color of an object in the video frame. After looking for information related to video processing in iOS, I found that I can use the AVFoundation framework to achieve this task, but I don't know which AVFoundation APIs or functions can do this kind of video processing.

Can anyone suggest which functions to use to get image frames or raw image data out of a video stream in real time?

I'd appreciate it if you could give me some example code.

Thank you very much for helping me...

  • See http://stackoverflow.com/questions/5156872/how-to-apply-filters-to-avcapturevideopreviewlayer/5158856#5158856 – Matt J. Feb 03 '12 at 05:16

1 Answer

First of all, you can make use of the AVAsset class by initializing it with your video file URL.

You can then use an AVAssetReader object to obtain the media data of that asset. This lets you obtain video frames, which you can read using an AVAssetReaderVideoCompositionOutput object. Accessing the RGB channel data from these frames is a matter of working with the CGImage classes and their methods.
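
Here is a minimal Swift sketch of that flow, assuming a local file referenced by a hypothetical videoURL, and using AVAssetReaderTrackOutput as a simpler stand-in for AVAssetReaderVideoCompositionOutput (it needs no video composition):

    import AVFoundation
    import CoreMedia
    import CoreVideo

    // Hypothetical local file URL; substitute your own video file.
    let videoURL = URL(fileURLWithPath: "/path/to/video.mov")

    // 1. Wrap the file in an asset and pick its video track.
    let asset = AVURLAsset(url: videoURL)
    guard let videoTrack = asset.tracks(withMediaType: .video).first else {
        fatalError("No video track found")
    }

    // 2. Create a reader that decodes frames as 32-bit BGRA pixel buffers.
    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: settings)
    reader.add(output)
    guard reader.startReading() else {
        fatalError("Could not start reading: \(String(describing: reader.error))")
    }

    // 3. Pull frames one by one and inspect their raw pixel data.
    while let sampleBuffer = output.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { continue }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        if let base = CVPixelBufferGetBaseAddress(pixelBuffer) {
            let bytes = base.assumingMemoryBound(to: UInt8.self)
            // BGRA layout: bytes 0-3 hold the blue, green, red, alpha of pixel (0, 0).
            let (b, g, r) = (bytes[0], bytes[1], bytes[2])
            print("Top-left pixel RGB: \(r), \(g), \(b)")
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
    }

For a live camera feed, as the question asks, the analogous route is an AVCaptureSession with an AVCaptureVideoDataOutput, whose sample buffer delegate hands you the same kind of CMSampleBuffer for each captured frame.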

Hope this helps you to get started
