
I was writing an app that does some real-time video processing using an AVCaptureSession with an AVCaptureVideoDataOutput as output and an AVCaptureDeviceInput with the camera as input. Then the requirements changed, and now I need to save the video file as it comes in, and process the video afterwards (it no longer needs to be in real-time).

Is it possible to either a) attach an AVCaptureMovieFileOutput and an AVCaptureVideoDataOutput to the same AVCaptureSession? Trying it and preliminary searches both suggest this isn't possible, but someone here might know a way.

b) Record to a file using AVCaptureMovieFileOutput and then use the file as an input to the AVCaptureSession instead of the camera? This would allow me to reuse all the code from before. I haven't been able to find a way to use a file as an input to an AVCaptureSession, though.

If neither of these methods are possible, what is the best method to save and process a video on iOS (either simultaneously or sequentially)?

Drew

2 Answers


I know this post is nearly three years old, but it pointed me in the right direction, so I figured I'd post an answer describing what I did.

In order to "capture" video from a file, I used the usual AVPlayer setup, e.g.:

https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html

Then I used the AVPlayerItem addOutput method to add an AVPlayerItemVideoOutput instance, as in the GreenScreen demo:

https://github.com/alokc83/iOS-Example-Collections/blob/master/WWDC_2012_SourceCode/iOS/517%20-%20Real-Time%20Media%20Effects%20and%20Processing%20during%20Playback/AVGreenScreenPlayer/AVGreenScreenPlayer/GSPlayerView.m

But without the filtering, and instead of using CVDisplayLink, I just used a timer with an interval of 33ms and called hasNewPixelBufferForItemTime and copyPixelBufferForItemTime with the AVPlayerItem's currentTime value. This was enough to "capture" at around 25fps, which was sufficient for my purposes.

I achieved this using Delphi; however, the above should provide enough clues to do the same in Objective-C, Swift, C#, or whatever supported language you prefer.
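For illustration, a minimal Swift sketch of the same polling approach might look like the following. The FileFrameGrabber class name and the process(_:) hook are illustrative, not taken from any demo code; the 33ms timer interval mirrors what was described above.

    import AVFoundation

    final class FileFrameGrabber {
        private let player: AVPlayer
        private let playerItem: AVPlayerItem
        private let videoOutput: AVPlayerItemVideoOutput
        private var timer: Timer?

        init(url: URL) {
            playerItem = AVPlayerItem(url: url)
            // Ask for BGRA buffers, which are convenient for most processing code.
            videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
            ])
            playerItem.add(videoOutput)
            player = AVPlayer(playerItem: playerItem)
        }

        func start() {
            player.play()
            // Poll roughly every 33 ms instead of using a display link.
            timer = Timer.scheduledTimer(withTimeInterval: 0.033, repeats: true) { [weak self] _ in
                guard let self = self else { return }
                let time = self.playerItem.currentTime()
                guard self.videoOutput.hasNewPixelBuffer(forItemTime: time),
                      let buffer = self.videoOutput.copyPixelBuffer(forItemTime: time,
                                                                    itemTimeForDisplay: nil) else {
                    return
                }
                self.process(buffer)
            }
        }

        func stop() {
            timer?.invalidate()
            player.pause()
        }

        private func process(_ pixelBuffer: CVPixelBuffer) {
            // Hand the frame to the same processing code you used for live camera frames.
        }
    }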

Dave Nottage

Here is another post that suggests a possible scenario:

How do i save a video (mp4 format) using AVCaptureVideoDataOutput?

Basically, use the AVCaptureVideoDataOutputSampleBufferDelegate method:

captureOutput:didOutputSampleBuffer:fromConnection:

to process the frame as you'd like. You can then use AVAssetWriter to actually store the data.
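A rough Swift sketch of that pattern, assuming a fixed output size and omitting error handling (the FrameRecorder class name and the finish(completion:) method are illustrative), could look like this. You would set an instance of it as the sample buffer delegate on your AVCaptureVideoDataOutput:

    import AVFoundation

    final class FrameRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        private let writer: AVAssetWriter
        private let writerInput: AVAssetWriterInput
        private var sessionStarted = false

        init(outputURL: URL, width: Int, height: Int) throws {
            writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
            writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: width,
                AVVideoHeightKey: height
            ])
            // Frames arrive from a live capture session, so keep the writer input in real-time mode.
            writerInput.expectsMediaDataInRealTime = true
            writer.add(writerInput)
            writer.startWriting()
            super.init()
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            if !sessionStarted {
                writer.startSession(atSourceTime: timestamp)
                sessionStarted = true
            }

            // Process the frame here as needed (e.g. hand the pixel buffer to your
            // existing real-time processing code) ...

            // ... then append the same sample buffer so the video is also saved to disk.
            if writerInput.isReadyForMoreMediaData {
                writerInput.append(sampleBuffer)
            }
        }

        func finish(completion: @escaping () -> Void) {
            writerInput.markAsFinished()
            writer.finishWriting(completionHandler: completion)
        }
    }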

Regarding your point a), you are correct: you cannot use AVCaptureMovieFileOutput and AVCaptureVideoDataOutput simultaneously.

joelg