We have been trying to create real-time videos on iOS, but have run into many frustrating problems with AVAssetWriter, like this one, where the error claims media is being appended after a session has ended, even though our code does not appear to do that.
Upon reading the Apple docs more carefully, it appears that AVAssetWriter is not intended for real-time processing:
> Note: The asset reader and writer classes are not intended to be used for real-time processing. In fact, an asset reader cannot even be used for reading from a real-time source like an HTTP live stream. However, if you are using an asset writer with a real-time data source, such as an AVCaptureOutput object, set the expectsMediaDataInRealTime property of your asset writer's inputs to YES. Setting this property to YES for a non-real-time data source will result in your files not being interleaved properly.
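For reference, here is a minimal sketch of what following that note looks like in code. The function names and output settings are our own assumptions, not from Apple's docs; the relevant parts are setting `expectsMediaDataInRealTime = true` before writing starts, and guarding `append(_:)` so nothing is appended after the session ends (which may be the source of the error above):

```swift
import AVFoundation

// Sketch (hypothetical helper names): configure an AVAssetWriter for a
// real-time capture source, setting expectsMediaDataInRealTime per the note.
func makeRealTimeWriter(outputURL: URL) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let videoSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1280,   // assumed dimensions for illustration
        AVVideoHeightKey: 720
    ]
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
    videoInput.expectsMediaDataInRealTime = true  // required for AVCaptureOutput sources

    if writer.canAdd(videoInput) {
        writer.add(videoInput)
    }
    return (writer, videoInput)
}

// Called from the AVCaptureVideoDataOutputSampleBufferDelegate callback (sketch):
// start the session at the first buffer's timestamp, then append only while the
// writer is still .writing and the input is ready -- never after endSession or
// markAsFinished, which triggers the "appended after session ends" error.
func append(_ sampleBuffer: CMSampleBuffer,
            to writer: AVAssetWriter,
            input: AVAssetWriterInput) {
    if writer.status == .unknown {
        writer.startWriting()
        writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    if writer.status == .writing, input.isReadyForMoreMediaData {
        input.append(sampleBuffer)
    }
}
```

Even with this in place, though, the docs' framing raises the question below.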
If not AVAssetWriter, how are you supposed to capture input from the front camera and produce a video in real time, with different overlays/watermarks appearing at different points in the video?