
I am trying to use AVVideoComposition to add some text on top of a video and save the video. This is the code I use:

I. Create an AVMutableComposition and an AVVideoComposition

var mutableComp =    AVMutableComposition()
var mutableVidComp = AVMutableVideoComposition()
var compositionSize: CGSize?
var snapshot:        AVMutableComposition?   // declared here so the assignment in configureAsset() compiles

func configureAsset() {

    // Pass `true` (a Bool, not the string "true") for precise duration and timing.
    let options    = [AVURLAssetPreferPreciseDurationAndTimingKey: true]
    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "Car", withExtension: "mp4")!, options: options)
    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first!

    compositionSize = videoAssetSourceTrack.naturalSize

    let mutableVidTrack = mutableComp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let trackRange      = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)

    do {
        try mutableVidTrack.insertTimeRange(trackRange, of: videoAssetSourceTrack, at: kCMTimeZero)
        mutableVidTrack.preferredTransform = videoAssetSourceTrack.preferredTransform
    } catch { print(error) }

    snapshot       = mutableComp
    mutableVidComp = AVMutableVideoComposition(propertiesOf: videoAsset)
}

II. Set up the layers

func applyVideoEffectsToComposition() {

    // 1 - Set up the text layer
    let subTitle1Text = CATextLayer()
    subTitle1Text.font            = "Helvetica-Bold" as CFTypeRef
    subTitle1Text.frame           = CGRect(x: self.view.frame.midX - 60, y: self.view.frame.midY - 50, width: 120, height: 100)
    subTitle1Text.string          = "Bench"
    subTitle1Text.foregroundColor = UIColor.black.cgColor
    subTitle1Text.alignmentMode   = kCAAlignmentCenter

    // 2 - The usual overlay
    let overlayLayer = CALayer()
    overlayLayer.addSublayer(subTitle1Text)
    overlayLayer.frame = CGRect(x: 0, y: 0, width: compositionSize!.width, height: compositionSize!.height)
    overlayLayer.masksToBounds = true

    // 3 - Set up the parent layer
    let parentLayer = CALayer()
    let videoLayer  = CALayer()
    parentLayer.frame = CGRect(x: 0, y: 0, width: compositionSize!.width, height: compositionSize!.height)
    videoLayer.frame  = CGRect(x: 0, y: 0, width: compositionSize!.width, height: compositionSize!.height)

    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    mutableVidComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
}

III. Save the video with AVMutableVideoComposition

func saveAsset() {

    func deleteFile(_ filePath: URL) {
        guard FileManager.default.fileExists(atPath: filePath.path) else { return }
        do {
            try FileManager.default.removeItem(atPath: filePath.path)
        } catch {
            fatalError("Unable to delete file: \(error) : \(#function).")
        }
    }

    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL
    // Note: AVFileTypeQuickTimeMovie below writes a QuickTime (.mov) container;
    // the ".mp4" extension is kept from the original code.
    let filePath = documentsDirectory.appendingPathComponent("rendered-audio.mp4")
    deleteFile(filePath)

    if let exportSession = AVAssetExportSession(asset: mutableComp, presetName: AVAssetExportPresetHighestQuality) {

        exportSession.videoComposition = mutableVidComp

        // exportSession.canPerformMultiplePassesOverSourceMediaData = true
        exportSession.outputURL = filePath
        exportSession.shouldOptimizeForNetworkUse = true
        exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComp.duration)
        exportSession.outputFileType = AVFileTypeQuickTimeMovie

        exportSession.exportAsynchronously {
            print("finished: \(filePath) : \(exportSession.status.rawValue)")

            // Compare against the enum case instead of the raw value 4.
            if exportSession.status == .failed {
                print("Export failed -> Reason: \(exportSession.error!.localizedDescription)")
                print(exportSession.error!)
            }
        }
    }
}

Then I run all three methods in viewDidLoad for a quick test. The problem is that when I run the app, the result of the export is the original video without the title on it.
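
For reference, the quick test is just this (a sketch; nothing else happens in the view controller):

override func viewDidLoad() {
    super.viewDidLoad()

    // Quick test: build the composition, attach the text overlay, then export.
    configureAsset()
    applyVideoEffectsToComposition()
    saveAsset()
}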

What am I missing here?

UPDATE

I noticed that setting a subTitle1Text.backgroundColor in part II of the code makes a colored rectangle corresponding to subTitle1Text.frame appear on top of the exported video.

(See Image)
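
That experiment amounts to one extra line in part II (the color itself is an arbitrary choice):

subTitle1Text.backgroundColor = UIColor.red.cgColor // any opaque color makes the frame visible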

When this code is modified for playback using AVSynchronizedLayer, the desired layer can be seen on top of the video, text and all. So perhaps this is a bug in AVFoundation itself.
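
A minimal sketch of that playback check, assuming the overlay from part II is kept in an `overlayLayer` property (the player-related names here are assumptions):

// Playback-only check: drive the same overlay with an AVSynchronizedLayer
// instead of the export-time animation tool.
let playerItem = AVPlayerItem(asset: mutableComp)
let player     = AVPlayer(playerItem: playerItem)

let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = view.bounds
view.layer.addSublayer(playerLayer)

let syncLayer = AVSynchronizedLayer(playerItem: playerItem)
syncLayer.frame = view.bounds
syncLayer.addSublayer(overlayLayer) // the overlay built in part II
view.layer.addSublayer(syncLayer)

player.play()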

I suppose I am only left with the option of using a customVideoCompositorClass. The problem with that is that it takes a lot of time to render the video. Here is an example that uses AVVideoCompositing.
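
For orientation, a minimal pass-through skeleton of that approach could look like the following (a sketch only, not the linked example; a real compositor would draw the text into each frame before finishing the request):

class PassthroughCompositor: NSObject, AVVideoCompositing {

    var sourcePixelBufferAttributes: [String : Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    var requiredPixelBufferAttributesForRenderContext: [String : Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) { }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Hand the first source frame straight back; text drawing would go here.
        if let trackID = request.sourceTrackIDs.first,
           let frame = request.sourceFrame(byTrackID: trackID.int32Value) {
            request.finish(withComposedVideoFrame: frame)
        } else {
            request.finish(with: NSError(domain: "Compositor", code: -1, userInfo: nil))
        }
    }
}

// Opting in replaces the animation tool entirely:
// mutableVidComp.customVideoCompositorClass = PassthroughCompositor.self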

Bhavesh Nayi
  • 705
  • 4
  • 15
6994
  • 107
  • 2
  • 9

1 Answer


Here is the full working code that I used in my project. It shows a CATextLayer at the bottom (0,0), and when the export session finishes it swaps the newly exported file into the player item. I use a model class written in Objective-C to get the video orientation. Please test on a device: AVPlayer will not show the text layer properly in the simulator.

    let composition = AVMutableComposition.init()

    let videoComposition = AVMutableVideoComposition()
    videoComposition.frameDuration = CMTimeMake(1, 30)
    videoComposition.renderScale  = 1.0

    let compositionCommentaryTrack: AVMutableCompositionTrack? = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)


    let compositionVideoTrack: AVMutableCompositionTrack? = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)


    let clipVideoTrack: AVAssetTrack = self.currentAsset.tracks(withMediaType: AVMediaTypeVideo)[0]

    // Use `first` so assets without an audio track don't crash on [0].
    let audioTrack: AVAssetTrack? = self.currentAsset.tracks(withMediaType: AVMediaTypeAudio).first

    if let audioTrack = audioTrack {
        try? compositionCommentaryTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, self.currentAsset.duration), of: audioTrack, at: kCMTimeZero)
    }

    try? compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, self.currentAsset.duration), of: clipVideoTrack, at: kCMTimeZero)

    let orientation = VideoModel.videoOrientation(self.currentAsset)
    var isPortrait = false

    switch orientation {
    case .landscapeRight:
        isPortrait = false
    case .landscapeLeft:
        isPortrait = false
    case .portrait:
        isPortrait = true
    case .portraitUpsideDown:
        isPortrait = true
    }

    var naturalSize = clipVideoTrack.naturalSize

    if isPortrait
    {
        naturalSize = CGSize.init(width: naturalSize.height, height: naturalSize.width)
    }

    videoComposition.renderSize = naturalSize

    let scale = CGFloat(1.0)

    var transform = CGAffineTransform.init(scaleX: CGFloat(scale), y: CGFloat(scale))

    switch orientation {
    case .landscapeRight:
        break
    case .landscapeLeft:
        transform = transform.translatedBy(x: naturalSize.width, y: naturalSize.height)
        transform = transform.rotated(by: .pi)
    case .portrait:
        transform = transform.translatedBy(x: naturalSize.width, y: 0)
        transform = transform.rotated(by: CGFloat.pi / 2)
    case .portraitUpsideDown:
        break
    }

    let frontLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: compositionVideoTrack!)
    frontLayerInstruction.setTransform(transform, at: kCMTimeZero)

    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    mainInstruction.layerInstructions = [frontLayerInstruction]
    videoComposition.instructions = [mainInstruction]

    let parentLayer = CALayer.init()
    parentLayer.frame = CGRect.init(x: 0, y: 0, width: naturalSize.width, height: naturalSize.height)

    let videoLayer = CALayer.init()
    videoLayer.frame = parentLayer.frame


    let layer = CATextLayer()
    layer.string = "HELLO ALL"
    layer.foregroundColor = UIColor.white.cgColor
    layer.backgroundColor = UIColor.orange.cgColor
    layer.fontSize = 32
    layer.frame = CGRect.init(x: 0, y: 0, width: 300, height: 100)

    // Scale the text layer's on-screen frame up into the video's coordinate space.
    var rct = layer.frame

    let widthScale = self.playerView.frame.size.width / naturalSize.width

    rct.size.width  /= widthScale
    rct.size.height /= widthScale
    rct.origin.x    /= widthScale
    rct.origin.y    /= widthScale

    layer.frame = rct // the original snippet computed `rct` but never assigned it back

    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(layer)

    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool.init(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let videoPath = documentsPath+"/cropEditVideo.mov"

    let fileManager = FileManager.default

    if fileManager.fileExists(atPath: videoPath)
    {
        try! fileManager.removeItem(atPath: videoPath)
    }

    print("video path \(videoPath)")

    var exportSession = AVAssetExportSession.init(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    exportSession?.videoComposition = videoComposition
    exportSession?.outputFileType = AVFileTypeQuickTimeMovie
    exportSession?.outputURL = URL.init(fileURLWithPath: videoPath)
    var exportProgress: Float = 0
    let queue = DispatchQueue(label: "Export Progress Queue")
    queue.async {
        // Poll export progress once a second until the session is released.
        while let session = exportSession {
            exportProgress = session.progress
            print("current progress == \(exportProgress)")
            sleep(1)
        }
    }

    exportSession?.exportAsynchronously(completionHandler: {


        if exportSession?.status == AVAssetExportSessionStatus.failed {
            print("Failed \(exportSession?.error)")
            exportSession = nil // also stop the progress-polling loop on failure
        } else if exportSession?.status == AVAssetExportSessionStatus.completed {
            exportSession = nil

            let asset = AVAsset.init(url: URL.init(fileURLWithPath: videoPath))
            DispatchQueue.main.async {
                let item = AVPlayerItem.init(asset: asset)


                self.player.replaceCurrentItem(with: item)

                let assetDuration = CMTimeGetSeconds(composition.duration)
                self.progressSlider.maximumValue = Float(assetDuration)

                self.syncLayer.removeFromSuperlayer()
                self.lblIntro.isHidden = true

                self.player.play()
                //                    let url =  URL.init(fileURLWithPath: videoPath)
                //                    let activityVC = UIActivityViewController(activityItems: [url], applicationActivities: [])
                //                    self.present(activityVC, animated: true, completion: nil)
            }

        }
    })

Below is the code of my VideoModel class:

// Declared as a class method (+) because it is called as VideoModel.videoOrientation(...) from Swift.
+ (AVCaptureVideoOrientation)videoOrientation:(AVAsset *)asset
{
    AVCaptureVideoOrientation result = 0; // 0 = no orientation detected
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;
        // Portrait
        if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)
        {
            result = AVCaptureVideoOrientationPortrait;
        }
        // PortraitUpsideDown
        if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0)  {

            result = AVCaptureVideoOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0)
        {
            result = AVCaptureVideoOrientationLandscapeRight;
        }
        // LandscapeLeft
        if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0)
        {
            result = AVCaptureVideoOrientationLandscapeLeft;
        }
    }
    return result;
}
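
If you would rather stay in Swift, a direct (hedged) translation of the helper above would be the following; it returns nil where the Objective-C version falls back to 0:

import AVFoundation

func videoOrientation(_ asset: AVAsset) -> AVCaptureVideoOrientation? {
    guard let t = asset.tracks(withMediaType: AVMediaTypeVideo).first?.preferredTransform else {
        return nil // no video track
    }
    if t.a ==  0   && t.b ==  1.0 && t.c == -1.0 && t.d ==  0   { return .portrait }
    if t.a ==  0   && t.b == -1.0 && t.c ==  1.0 && t.d ==  0   { return .portraitUpsideDown }
    if t.a ==  1.0 && t.b ==  0   && t.c ==  0   && t.d ==  1.0 { return .landscapeRight }
    if t.a == -1.0 && t.b ==  0   && t.c ==  0   && t.d == -1.0 { return .landscapeLeft }
    return nil // a transform that matches none of the four cases
}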

Let me know if you need any more help with this.

  • Thanks for the answer Amrit. I updated my code with your answer but unfortunately there is still no subtitle on the output video. Apparently adding `AVMutableVideoCompositionInstruction` doesn't seem to solve the problem. This has been very distressing. – 6994 Jul 29 '17 at 11:37
  • Check out my updated answer; it will help you solve your problem. – Amrit Trivedi Aug 03 '17 at 18:57
  • I changed my code to match your answer again. The video exports fine in the simulator, but without any title in the `CATextLayer`. So, as per your recommendation, I tried to run the project on a device, but the video doesn't export at all and just keeps waiting, with the message `current progress == 0` printed continuously in the debug area. I couldn't find a way around this problem either. – 6994 Aug 08 '17 at 04:52
  • I think the problem all along could have been with using the simulator. Another sample application that I tried displayed the `CATextLayer` title just fine on the device, but not on the simulator. I wonder why this is the case. Thanks for pointing that out. – 6994 Aug 09 '17 at 07:15
  • I am preparing a demo for this; I will share the GitHub URL so it will be easy for other developers too. – Amrit Trivedi Aug 09 '17 at 07:16
  • I have a draggable view and I want to set the CATextLayer frame to match the dragged position. How can I do that? When I set the CATextLayer position from the dragged position, it displays at a small size in the wrong position, and only ever on the left. Can you help me with this issue? @AmritTrivedi – Vivek Goswami Oct 03 '17 at 07:26
  • @AmritTrivedi Please see https://stackoverflow.com/questions/50039910/how-to-add-moving-text-over-video-in-ios and advise me. – Anand Gautam Apr 26 '18 at 11:11