
I've been looking all over the web and can't seem to find a tutorial or any help for what I need.

Using AVFoundation and the Dlib library, I've created an app that can detect a face in real-time video from the phone's front camera. I'm doing this with the Shape Predictor 68 Face Landmarks model. For this to work, I'm pretty sure I have to use AVCaptureVideoDataOutput rather than AVCaptureMovieFileOutput so that each frame can be analysed.
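
For context, this is roughly how I've attached the video data output so that every frame reaches the delegate. Treat it as a sketch only: it assumes a view controller that conforms to AVCaptureVideoDataOutputSampleBufferDelegate and owns a captureSession, and the queue label and pixel format are just placeholders from my setup:

let videoDataOutput = AVCaptureVideoDataOutput()

// 32BGRA is a common choice for CPU-side frame processing; pick whatever your detector expects.
videoDataOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
videoDataOutput.alwaysDiscardsLateVideoFrames = true

// Every captured frame is delivered to captureOutput(_:didOutput:from:),
// which is where the face-landmark detection runs.
videoDataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoDataQueue"))

if captureSession.canAddOutput(videoDataOutput) {
    captureSession.addOutput(videoDataOutput)
}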

I now want to be able to save the video to a file, and from what I gather I need to use AVAssetWriter to do this. I just can't find much information anywhere about how to get started. I'm completely new to Swift and iOS programming and can't make much sense of Apple's documentation.

If anyone could help me it would be greatly appreciated!


1 Answer

I was able to work out how to use AVAssetWriter. In case anyone else needs help, the code I used is as follows:

func setUpWriter() {

    do {
        outputFileLocation = videoFileLocation()
        videoWriter = try AVAssetWriter(outputURL: outputFileLocation!, fileType: AVFileType.mov)

        // add video input
        videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 720,
            AVVideoHeightKey: 1280,
            AVVideoCompressionPropertiesKey: [
                AVVideoAverageBitRateKey: 2300000,
            ],
        ])

        videoWriterInput.expectsMediaDataInRealTime = true

        if videoWriter.canAdd(videoWriterInput) {
            videoWriter.add(videoWriterInput)
            print("video input added")
        } else {
            print("no input added")
        }

        // add audio input
        audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)

        audioWriterInput.expectsMediaDataInRealTime = true

        if videoWriter.canAdd(audioWriterInput!) {
            videoWriter.add(audioWriterInput!)
            print("audio input added")
        }


        videoWriter.startWriting()
    } catch let error {
        debugPrint(error.localizedDescription)
    }


}

func canWrite() -> Bool {
    return isRecording && videoWriter != nil && videoWriter?.status == .writing
}


// MARK: Video file location
func videoFileLocation() -> URL {
    let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
    let videoOutputUrl = URL(fileURLWithPath: documentsPath.appendingPathComponent("videoFile")).appendingPathExtension("mov")
    do {
        if FileManager.default.fileExists(atPath: videoOutputUrl.path) {
            try FileManager.default.removeItem(at: videoOutputUrl)
            print("file removed")
        }
    } catch {
        print(error)
    }

    return videoOutputUrl
}

// MARK: AVCaptureVideoDataOutputSampleBufferDelegate / AVCaptureAudioDataOutputSampleBufferDelegate
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    let writable = canWrite()

    if writable,
        sessionAtSourceTime == nil {
        // start writing
        sessionAtSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        videoWriter.startSession(atSourceTime: sessionAtSourceTime!)
        //print("Writing")
    }

    if output == videoDataOutput {
        connection.videoOrientation = .portrait

        if connection.isVideoMirroringSupported {
            connection.isVideoMirrored = true
        }
    }

    if writable,
        output == videoDataOutput,
        (videoWriterInput.isReadyForMoreMediaData) {
        // write video buffer
        videoWriterInput.append(sampleBuffer)
        //print("video buffering")
    } else if writable,
        output == audioDataOutput,
        (audioWriterInput.isReadyForMoreMediaData) {
        // write audio buffer
        audioWriterInput?.append(sampleBuffer)
        //print("audio buffering")
    }

}

// MARK: Start recording
func start() {
    guard !isRecording else { return }
    isRecording = true
    sessionAtSourceTime = nil
    setUpWriter()
    print(isRecording)
    print(videoWriter)
    if videoWriter.status == .writing {
        print("status writing")
    } else if videoWriter.status == .failed {
        print("status failed")
    } else if videoWriter.status == .cancelled {
        print("status cancelled")
    } else if videoWriter.status == .unknown {
        print("status unknown")
    } else {
        print("status completed")
    }

}

// MARK: Stop recording
func stop() {
    guard isRecording else { return }
    isRecording = false
    videoWriterInput.markAsFinished()
    print("marked as finished")
    videoWriter.finishWriting { [weak self] in
        self?.sessionAtSourceTime = nil
    }
    //print("finished writing \(self.outputFileLocation)")
    captureSession.stopRunning()
    performSegue(withIdentifier: "videoPreview", sender: nil)
}
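
For completeness, these methods rely on properties along the following lines being declared on the same class, and the class needs to adopt both AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate so captureOutput(_:didOutput:from:) is called for both outputs. The exact types here are only inferred from how the properties are used above, so treat this as a sketch and adjust to your own setup:

// Capture side
var captureSession = AVCaptureSession()
var videoDataOutput = AVCaptureVideoDataOutput()
var audioDataOutput = AVCaptureAudioDataOutput()

// Writing side
var videoWriter: AVAssetWriter!
var videoWriterInput: AVAssetWriterInput!
var audioWriterInput: AVAssetWriterInput!
var outputFileLocation: URL?
var sessionAtSourceTime: CMTime?
var isRecording = false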

I now have another problem: this solution doesn't work when I use AVCaptureMetadataOutput, AVCaptureVideoDataOutput and AVCaptureAudioDataOutput together. The app crashes when I add the AVCaptureAudioDataOutput.

Hardy143
  • I have now solved this issue. I just needed to state in the captureOutput function that face detection should only occur when the capture output is videoDataOutput, e.g. if output == videoDataOutput { doFaceDetection() } (sketched below). Before, I didn't have this if clause, so the audioDataOutput was interfering with the face detection. – Hardy143 Jul 18 '18 at 10:56
  • I experienced the same problem when trying to detect objects and record a movie at the same time: the recorded movie had no sound. Later I found out that the video data output, audio data output and object detection all took place on the same thread, and therefore the audio samples were never captured. I resolved this by moving the detection task to another thread. – Gang Fang Dec 13 '18 at 01:44
  • After following this, @Hardy143, I can't find my file. – droid May 29 '20 at 14:12
  • Great, works a treat. I know you're not supposed to leave comments that basically just say "it worked", but this is a very comprehensive and detailed answer to a specifically complex problem. Nice one. – Mick Byrne Mar 27 '21 at 21:45
  • This was incredibly helpful, thank you! – Michael N Aug 17 '23 at 11:02
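
To make the fix described in the comments concrete, here is a rough sketch. doFaceDetection(on:) is just a placeholder for whatever detection routine you call, and the queue labels are arbitrary: detection only runs for buffers coming from videoDataOutput, and giving each data output its own serial queue is one way to keep heavy detection work from starving audio capture (the issue in the second comment).

// Only run detection for video buffers; audio buffers go straight to the writer.
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if output == videoDataOutput {
        doFaceDetection(on: sampleBuffer)   // placeholder for the face-landmark call
    }
    // ... the writing logic from the answer above stays the same ...
}

// During session setup, give each data output its own queue so detection work
// on the video queue can't block delivery of audio samples.
videoDataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoDataQueue"))
audioDataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audioDataQueue"))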