I would like to read a movie file frame by frame, run some image processing algorithms on each frame, and display the resultant frames sequentially in an iOS app. There are mechanisms for doing this with the live camera feed, such as using an AVCaptureSession, and I would like to do something similar with movies already saved to disk.
I am trying to do this with an AVAssetReader and AVAssetReaderTrackOutput; however, the documentation clearly states that "AVAssetReader is not intended for use with real-time sources, and its performance is not guaranteed for real-time operations".
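For reference, here is roughly what my current attempt looks like. This is only a sketch; `processFrame` is a placeholder for my image-processing step, and error handling is abbreviated.

```swift
import AVFoundation

// Read decoded video frames from a movie file on disk.
// processFrame(_:) is a hypothetical hook for the image-processing step.
func readFrames(from url: URL) throws {
    let asset = AVAsset(url: url)
    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    // Ask for BGRA pixel buffers, the same format commonly used
    // with AVCaptureVideoDataOutput for vision processing.
    let outputSettings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let trackOutput = AVAssetReaderTrackOutput(track: videoTrack,
                                               outputSettings: outputSettings)
    reader.add(trackOutput)
    reader.startReading()

    // Pull sample buffers one at a time, as they decode.
    while reader.status == .reading,
          let sampleBuffer = trackOutput.copyNextSampleBuffer(),
          let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        processFrame(pixelBuffer) // hypothetical processing/display hook
    }
}
```

This reads frames as fast as the reader can decode them, which is part of my concern: nothing here paces the frames to the movie's actual frame rate.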
So my question is: what is the right way to read movie frames in real-time?
Edit:
To further clarify what I mean by "real-time", consider that it is possible to capture frames from the camera feed, run some computer vision algorithms on each frame (e.g. object detection, filtering), and display the filtered frames at a reasonable frame rate (30-60 FPS). I would like to do the exact same thing, except that the input source is a video already saved on disk.
I don't want to read the entire movie, process the whole thing, and display the result only once the entire processing pipeline is finished (I would consider that non-real-time). I want the processing to be done frame by frame, and in order to do that the file has to be read frame by frame in real time.
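To make the intent concrete, the behavior I'm after looks roughly like the sketch below: pull one sample buffer at a time and delay each frame until its presentation timestamp, so processing and display happen at the movie's native rate. This assumes decoding and processing can keep up, and `processAndDisplay` is a hypothetical stand-in for my pipeline.

```swift
import AVFoundation
import QuartzCore

// Hedged sketch: pace decoded frames to their presentation timestamps
// so each frame is processed and shown at (approximately) real time.
func playProcessed(reader: AVAssetReader, trackOutput: AVAssetReaderTrackOutput) {
    let startTime = CACurrentMediaTime()
    while reader.status == .reading,
          let sample = trackOutput.copyNextSampleBuffer() {
        // When this frame is due, relative to the start of playback.
        let pts = CMSampleBufferGetPresentationTimeStamp(sample).seconds
        let delay = (startTime + pts) - CACurrentMediaTime()
        if delay > 0 {
            Thread.sleep(forTimeInterval: delay) // run this loop off the main thread
        }
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
            processAndDisplay(pixelBuffer) // hypothetical vision + display step
        }
    }
}
```

Given the warning in the AVAssetReader documentation, my question is whether this pull-and-pace pattern is acceptable in practice, or whether there is a more appropriate API for it.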