
I'm looking into making my Swift iOS app record a video and play it back on the same screen with 30 seconds delay.

I've been using an official example to record a video. Then I added a button that triggers playback of self.movieFileOutput?.outputFileURL using AVPlayer in a separate view on the screen. It's close to what I want, but it stops once playback reaches the end of the file written to disk so far, and does not resume when the next buffered chunk is written.

I could stop the video recording every 30 seconds and save the URL for each file so I can play it back but that means that there would be interruptions in video capture and playback.

How can I make video recording never stop and playback always be on the screen with any delay I want?

I've seen a similar question, and all the answers pointed at the AVFoundation docs. I couldn't find how to make AVFoundation write predictable chunks of video from memory to disk while recording.

Maklaus
  • Possible duplicate of [Record video and play video at the same time](https://stackoverflow.com/questions/7707427/record-video-and-play-video-at-the-same-time) –  Aug 26 '17 at 23:36

1 Answer


You can achieve what you want by recording 30-second chunks of video, then enqueueing them onto an AVQueuePlayer for seamless playback. Recording the video chunks would be very easy with AVCaptureFileOutput on macOS, but sadly, on iOS you cannot create new chunks without dropping frames, so you have to use the wordier, lower-level AVAssetWriter API:

import UIKit
import AVFoundation

// TODO: delete old videos
// TODO: audio

class ViewController: UIViewController {
    // capture
    let captureSession = AVCaptureSession()

    // playback
    let player = AVQueuePlayer()
    var playerLayer: AVPlayerLayer! = nil

    // output. sadly not AVCaptureMovieFileOutput
    var assetWriter: AVAssetWriter! = nil
    var assetWriterInput: AVAssetWriterInput! = nil

    var chunkNumber = 0
    var chunkStartTime: CMTime! = nil
    var chunkOutputURL: URL! = nil

    override func viewDidLoad() {
        super.viewDidLoad()

        playerLayer = AVPlayerLayer(player: player)
        view.layer.addSublayer(playerLayer)

        // inputs
        let videoCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        let videoInput = try! AVCaptureDeviceInput(device: videoCaptureDevice)
        captureSession.addInput(videoInput)

        // outputs
        // iOS AVCaptureFileOutput/AVCaptureMovieFileOutput still don't support dynamically
        // switching files (?) so we have to re-implement with AVAssetWriter
        let videoOutput = AVCaptureVideoDataOutput()
        // TODO: probably something else
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
        captureSession.addOutput(videoOutput)

        captureSession.startRunning()
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        playerLayer.frame = view.layer.bounds
    }

    func createWriterInput(for presentationTimeStamp: CMTime) {
        let fileManager = FileManager.default
        chunkOutputURL = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("chunk\(chunkNumber).mov")
        try? fileManager.removeItem(at: chunkOutputURL)

        assetWriter = try! AVAssetWriter(outputURL: chunkOutputURL, fileType: AVFileTypeQuickTimeMovie)
        // TODO: get dimensions from image CMSampleBufferGetImageBuffer(sampleBuffer)
        let outputSettings: [String: Any] = [AVVideoCodecKey:AVVideoCodecH264, AVVideoWidthKey: 1920, AVVideoHeightKey: 1080]
        assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
        assetWriterInput.expectsMediaDataInRealTime = true
        assetWriter.add(assetWriterInput)

        chunkNumber += 1
        chunkStartTime = presentationTimeStamp

        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: chunkStartTime)
    }
}

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        let presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

        if assetWriter == nil {
            createWriterInput(for: presentationTimeStamp)
        } else {
            let chunkDuration = CMTimeGetSeconds(CMTimeSubtract(presentationTimeStamp, chunkStartTime))

            if chunkDuration > 30 {
                assetWriter.endSession(atSourceTime: presentationTimeStamp)

                // make a copy, as finishWriting is asynchronous
                let newChunkURL = chunkOutputURL!
                let chunkAssetWriter = assetWriter!

                chunkAssetWriter.finishWriting {
                    print("finishWriting says: \(chunkAssetWriter.status.rawValue, chunkAssetWriter.error)")
                    print("queuing \(newChunkURL)")
                    self.player.insert(AVPlayerItem(url: newChunkURL), after: nil)
                    self.player.play()
                }
                createWriterInput(for: presentationTimeStamp)
            }
        }

        if !assetWriterInput.append(sampleBuffer) {
            print("append says NO: \(assetWriter.status.rawValue, assetWriter.error)")
        }
    }
}
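One of the TODOs above, deleting old videos, can be handled with plain FileManager. A minimal sketch, assuming the same chunk\(n).mov naming used in createWriterInput (the deleteStaleChunks name and keepCount parameter are illustrative, not part of the answer's code):

```swift
import Foundation

// Hypothetical helper (not from the answer above): removes chunk files
// that are more than `keepCount` chunks behind the chunk currently being
// written. Assumes the "chunk\(n).mov" naming used in createWriterInput.
func deleteStaleChunks(in directory: URL, currentChunkNumber: Int, keepCount: Int = 3) {
    let fileManager = FileManager.default
    let staleLimit = currentChunkNumber - keepCount
    guard staleLimit > 0 else { return }
    for n in 0..<staleLimit {
        let url = directory.appendingPathComponent("chunk\(n).mov")
        // Ignore errors: the file may already have been removed.
        try? fileManager.removeItem(at: url)
    }
}
```

If you call something like this from createWriterInput after incrementing chunkNumber, make sure keepCount is large enough to cover your playback delay, so you never delete a chunk that the AVQueuePlayer still has enqueued.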

P.S. It's very curious to see what you were doing 30 seconds ago. What exactly are you making?

Rhythmic Fistman
  • Thanks for your reply, going to give it a try soon. I'm experimenting with different approaches to help people in different kinds of sport review their tricks right away. – Maklaus Sep 06 '17 at 16:09
  • Does this method record audio + video? – Hardik1344 Aug 19 '19 at 09:30
  • No, you’d need to add an audio AVAssetWriterInput to that asset writer and audio inputs and outputs to the capture session. It’s a lot like the video part. – Rhythmic Fistman Aug 19 '19 at 09:44
  • @RhythmicFistman: Thanks, it is working now for both audio + video. – Hardik1344 Aug 19 '19 at 10:58
  • @RhythmicFistman: Audio is not recorded if a Bluetooth headphone is connected. – Hardik1344 Sep 09 '19 at 08:30
  • Hi @Hardik1344!! Did you figure out using this way to record both audio and video rather than using ReplayKit? – swiftlearneer Aug 19 '20 at 00:06
  • @RhythmicFistman Is there a way using this example to capture both audio and video rather than using ReplayKit?? – swiftlearneer Aug 19 '20 at 00:08
  • I’m not sure what ReplayKit has to do with anything here. What’s the problem with audio and video? Is the problem that this answer leaves audio as a TODO? – Rhythmic Fistman Aug 19 '20 at 00:38
  • Hi @RhythmicFistman! Thanks so much for the response! Yes, I am wondering whether your suggestion could replace the usage of ReplayKit. Currently I am using ReplayKit to capture audio and video, but its user permission prompt makes it not sync well. I am wondering whether your suggestion can be combined with audio and video capturing to replace the use of ReplayKit. – swiftlearneer Aug 19 '20 at 23:32
  • Probably - I don’t think it matters where the audio and video come from. – Rhythmic Fistman Aug 20 '20 at 07:21
  • Does not work with audio completely. Found frame loss and glitches. – khunshan Aug 20 '20 at 12:49
  • Without more information it’s hard to say where your audio loss and frame drops are coming from. Start a new question and show some code! – Rhythmic Fistman Aug 20 '20 at 17:00
  • I think the frame loss and glitches are related to the duration of video and the duration of the loop. If we capture only video frames they video duration will contain exactly duration of those frames. If we add audio samples the duration now will not be exact and we might not have video frame for some number of audio samples. Something like: VVVVVVVV (for only video) AAVAAAVAAAVAAVAAAVAA (video and audio) @RhythmicFistman do you think there might be way around it? – bojan Aug 23 '20 at 16:29
  • There's definitely a way around it, but it's hard to understand from all these comments - can you summarise the problem in a new question? – Rhythmic Fistman Aug 31 '20 at 15:55
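For readers following the audio discussion in the comments: the suggestion there is to add an audio AVAssetWriterInput to the same asset writer, plus microphone input/output on the capture session. A hedged sketch in the same Swift 3 style as the answer (the function name and the AAC settings are illustrative assumptions, not tested code from the answer):

```swift
import AVFoundation

// Hedged sketch, not from the answer above: wires microphone audio into
// the same chunked AVAssetWriter. Names and settings are illustrative.
func addAudio(to assetWriter: AVAssetWriter,
              session: AVCaptureSession,
              delegate: AVCaptureAudioDataOutputSampleBufferDelegate) -> AVAssetWriterInput {
    // capture side: microphone device input + audio data output
    let mic = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
    session.addInput(try! AVCaptureDeviceInput(device: mic))
    let audioOutput = AVCaptureAudioDataOutput()
    audioOutput.setSampleBufferDelegate(delegate, queue: DispatchQueue.main)
    session.addOutput(audioOutput)

    // writer side: an AAC audio input, mirroring the video AVAssetWriterInput
    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 1,
        AVSampleRateKey: 44100,
    ]
    let input = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: audioSettings)
    input.expectsMediaDataInRealTime = true
    assetWriter.add(input)
    return input
}
```

The delegate would append audio sample buffers to this input the same way the video path does. Note the chunk-boundary caveat raised in the comments: audio sample buffers cover many frames, so cutting exactly at a video chunk boundary can leave small gaps, which is likely the source of the glitches discussed above.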