
I am using GPUImage's GPUImageVideoCamera initWithSessionPreset:cameraPosition: to display video from the rear-facing camera of an iOS device (targeting iOS 7). The video is filtered and displayed on a GPUImageView. The session preset will not exceed AVCaptureSessionPreset640x480.

At any given moment in the app, I need to recall the past 5 seconds of unfiltered video captured from the rear-facing camera and instantly play this back on another (or the same) GPUImageView.

I can access each CMSampleBufferRef via GPUImageVideoCamera's willOutputSampleBuffer: delegate callback, which passes the buffer through from the underlying AVCaptureVideoDataOutput, but I'm not sure how to get the most recent frames into memory efficiently so that they can be played back instantly and seamlessly.
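
For reference, here's roughly how the capture side is wired up (a minimal sketch under ARC; FrameRingBuffer is my own hypothetical frame store, sketched below):

```objc
#import "GPUImage.h"
#import "FrameRingBuffer.h" // hypothetical frame store, sketched below

@interface RecorderController : NSObject <GPUImageVideoCameraDelegate>
@property (nonatomic, strong) GPUImageVideoCamera *camera;
@property (nonatomic, strong) FrameRingBuffer *ringBuffer;
@end

@implementation RecorderController

- (id)init
{
    if ((self = [super init])) {
        _camera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                      cameraPosition:AVCaptureDevicePositionBack];
        _camera.delegate = self; // unfiltered sample buffers arrive via willOutputSampleBuffer:
        _ringBuffer = [[FrameRingBuffer alloc] initWithCapacity:150]; // ~5 s at 30 fps
        [_camera startCameraCapture];
    }
    return self;
}

// GPUImageVideoCameraDelegate: called with the raw buffer for every frame, before filtering.
- (void)willOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    [self.ringBuffer addSampleBuffer:sampleBuffer];
}

@end
```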

I believe the solution is a circular buffer, using something like TPCircularBuffer, but I'm not sure that will work with a video stream (a sketch of what I'm considering is below). I also wanted to reference the unanswered questions Buffering CMSampleBufferRef into a CFArray and Hold multiple Frames in Memory before sending them to AVAssetWriter, as they closely resembled my original plan of attack until I started researching this.
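
To make the plan concrete, here's a minimal sketch of the kind of ring I have in mind, backed by an NSMutableArray rather than TPCircularBuffer. FrameRingBuffer is my own hypothetical class. It deep-copies each frame's bytes instead of retaining the CMSampleBufferRef itself, because the capture session recycles a small pool of buffers and holding ~150 of them would stall the camera; it also assumes BGRA output (GPUImage's default 4:2:0 YUV delivery would need both planes copied):

```objc
#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Fixed-capacity ring of recent frames, oldest first once full.
// Note: addSampleBuffer: runs on the camera's processing queue, so reads during
// capture should be guarded with a lock or serial queue.
@interface FrameRingBuffer : NSObject
- (id)initWithCapacity:(NSUInteger)capacity;
- (void)addSampleBuffer:(CMSampleBufferRef)sampleBuffer;
- (NSData *)frameAtOffset:(NSUInteger)framesAgo size:(CGSize *)outSize; // 0 = newest
@end

@implementation FrameRingBuffer
{
    NSMutableArray *_frames; // NSData payloads
    NSUInteger _capacity;
    size_t _height;
    size_t _bytesPerRow;
}

- (id)initWithCapacity:(NSUInteger)capacity
{
    if ((self = [super init])) {
        _capacity = capacity;
        _frames = [[NSMutableArray alloc] initWithCapacity:capacity];
    }
    return self;
}

- (void)addSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return;

    // Deep-copy the pixel bytes; assumes a non-planar format such as
    // kCVPixelFormatType_32BGRA.
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    _height = CVPixelBufferGetHeight(pixelBuffer);
    _bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    NSData *frameCopy = [NSData dataWithBytes:CVPixelBufferGetBaseAddress(pixelBuffer)
                                       length:_bytesPerRow * _height];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    if ([_frames count] == _capacity) {
        [_frames removeObjectAtIndex:0]; // drop the oldest frame
    }
    [_frames addObject:frameCopy];
}

- (NSData *)frameAtOffset:(NSUInteger)framesAgo size:(CGSize *)outSize
{
    if (framesAgo >= [_frames count]) return nil;
    if (outSize != NULL) {
        // Rows may be padded, so derive width from bytesPerRow (4 bytes per BGRA pixel).
        *outSize = CGSizeMake(_bytesPerRow / 4, _height);
    }
    return [_frames objectAtIndex:[_frames count] - 1 - framesAgo];
}

@end
```

For scale: at 640x480 BGRA that's roughly 1.2 MB per frame, so 5 seconds at 30 fps is on the order of 180 MB; keeping the native 4:2:0 planes instead would cut that by more than half.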

  • Take a look at GPUImageBuffer. The idea is that it maintains an internal ring buffer of textures, although I only use this for delaying frames at present. I don't yet have this rigged up to give arbitrary access to each of the stored frames, but you might be able to modify this buffer to do what you want. – Brad Larson Dec 10 '13 at 02:30
  • Thanks Brad! After looking at GPUImageBuffer I think I will try NSMutableArray after all, though I am curious about [this possible restriction](http://stackoverflow.com/questions/6227130/hold-multiple-frames-in-memory-before-sending-them-to-avassetwriter#comment7287022_6236582) and about getting the raw-data 420v frames back onto a GPUImageView (rough playback sketch below). – Plywood Dec 10 '13 at 08:12
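
For completeness, here's the rough playback sketch referenced in the comment above. It steps the stored BGRA bytes back onto a GPUImageView through GPUImageRawDataInput, driven by a CADisplayLink; FramePlayer and FrameRingBuffer are my own hypothetical classes, while the GPUImageRawDataInput calls come from GPUImage's headers:

```objc
#import <QuartzCore/QuartzCore.h>
#import "GPUImage.h"
#import "FrameRingBuffer.h" // hypothetical frame store from the sketch above

// Replays the buffered frames, oldest to newest, onto a GPUImageView at ~30 fps.
@interface FramePlayer : NSObject
- (id)initWithView:(GPUImageView *)view ringBuffer:(FrameRingBuffer *)ringBuffer;
- (void)start;
@end

@implementation FramePlayer
{
    GPUImageView *_view;
    FrameRingBuffer *_ringBuffer;
    GPUImageRawDataInput *_rawInput;
    CADisplayLink *_displayLink;
    NSUInteger _cursor; // offset in frames behind the newest; counts down to 0
}

- (id)initWithView:(GPUImageView *)view ringBuffer:(FrameRingBuffer *)ringBuffer
{
    if ((self = [super init])) {
        _view = view;
        _ringBuffer = ringBuffer;
    }
    return self;
}

- (void)start
{
    _cursor = 149; // oldest frame of ~5 s at 30 fps, assuming the ring is full
    _displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(tick:)];
    _displayLink.frameInterval = 2; // ~30 fps on a 60 Hz display
    [_displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link
{
    CGSize size;
    NSData *frame = [_ringBuffer frameAtOffset:_cursor size:&size];
    if (frame == nil) { [_displayLink invalidate]; return; }

    if (_rawInput == nil) {
        // First frame: create the raw-data source and attach the playback view.
        _rawInput = [[GPUImageRawDataInput alloc] initWithBytes:(GLubyte *)[frame bytes]
                                                           size:size
                                                    pixelFormat:GPUPixelFormatBGRA];
        [_rawInput addTarget:_view];
    } else {
        [_rawInput updateDataFromBytes:(GLubyte *)[frame bytes] size:size];
    }
    [_rawInput processData]; // pushes the frame through to the GPUImageView

    if (_cursor == 0) { [_displayLink invalidate]; } // reached the newest frame
    else { _cursor--; }
}

@end
```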

0 Answers