
I'm new to AVFoundation, and I'm trying to add an image to a video. However, I keep getting this error: failed Optional(Error Domain=AVFoundationErrorDomain Code=-11823 "Cannot Save" UserInfo={NSLocalizedDescription=Cannot Save, NSLocalizedRecoverySuggestion=Try saving again.}). What am I doing wrong? Here is my code:

func createVideo() {

    let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString

    let fileURL = NSURL(fileURLWithPath: "\(documentsPath)/\(self.randomVideoFileName).mov")

    let composition = AVMutableComposition()
    let vidAsset = AVURLAsset(URL: fileURL, options: nil)

    // get video track
    let vtrack =  vidAsset.tracksWithMediaType(AVMediaTypeVideo)
    let videoTrack:AVAssetTrack = vtrack[0]
    let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)


    do {
        let compositionvideoTrack:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
        try compositionvideoTrack.insertTimeRange(vid_timerange, ofTrack: videoTrack, atTime: kCMTimeZero)

        compositionvideoTrack.preferredTransform = videoTrack.preferredTransform
    } catch {
        print(error)
    }

    //Get the video
    let fullSizeImage = videoTrack

    let newOverLayHeight = fullSizeImage.naturalSize.width / self.containerView!.frame.width * self.containerView!.frame.height

    UIGraphicsBeginImageContext(CGSizeMake(fullSizeImage.naturalSize.width, newOverLayHeight));
    self.containerView!.drawViewHierarchyInRect(CGRectMake(0, 0, fullSizeImage.naturalSize.width, newOverLayHeight), afterScreenUpdates: true)
    let overlayImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    let imglogo = UIImage(named: "image.png")
    let imglayer = CALayer()
    imglayer.contents = imglogo?.CGImage
    imglayer.frame = CGRectMake(0,fullSizeImage.naturalSize.height - newOverLayHeight, overlayImage.size.width, overlayImage.size.height)

    let videolayer = CALayer()
    videolayer.frame = CGRectMake(0, 0, fullSizeImage.naturalSize.width, fullSizeImage.naturalSize.height)

    let parentlayer = CALayer()
    parentlayer.frame = CGRectMake(0, 0, fullSizeImage.naturalSize.width, fullSizeImage.naturalSize.height)
    parentlayer.addSublayer(imglayer)

    let layercomposition = AVMutableVideoComposition()
    layercomposition.frameDuration = CMTimeMake(1, 30)
    layercomposition.renderSize = fullSizeImage.naturalSize
    layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videolayer, inLayer: parentlayer)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    let videotrack = composition.tracksWithMediaType(AVMediaTypeVideo)[0] as AVAssetTrack
    let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)
    instruction.layerInstructions = NSArray(object: layerinstruction) as! [AVVideoCompositionLayerInstruction]
    layercomposition.instructions = NSArray(object: instruction) as! [AVVideoCompositionInstructionProtocol]


    //  create new file to receive data
    let docsDir: AnyObject = documentsPath
    let movieFilePath = docsDir.stringByAppendingPathComponent("result.mov")
    let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)

    // use AVAssetExportSession to export video
    let assetExport = AVAssetExportSession(asset: composition, presetName:AVAssetExportPresetHighestQuality)
    assetExport!.outputFileType = AVFileTypeQuickTimeMovie
    assetExport!.outputURL = movieDestinationUrl
    assetExport!.exportAsynchronouslyWithCompletionHandler({
        switch assetExport!.status{
        case  AVAssetExportSessionStatus.Failed:
            print("failed \(assetExport!.error)")
        case AVAssetExportSessionStatus.Cancelled:
            print("cancelled \(assetExport!.error)")
        default:
            print("Movie complete")


            // save to photoalbum
            NSOperationQueue.mainQueue().addOperationWithBlock({ () -> Void in
                UISaveVideoAtPathToSavedPhotosAlbum(movieDestinationUrl.absoluteString,  self, "image:didFinishSavingWithError:contextInfo:", nil)
            })
        }
    })

}
Peter Pik
    This might not help, but I have two suggestions: (1) Your mixed use of string file paths and NSURL file URLs is dangerous and liable to error. Use NSURL throughout. (Also your use of AnyObject shows that you don't know what you're doing even with string file paths; clearly you are trying to work around something that mystifies you, but in fact there's a simple explanation.) (2) Make sure the file at your target URL doesn't exist before trying to save to it. – matt Nov 22 '15 at 18:13
  • You're probably right; I'm not that familiar with AVFoundation yet, and the information around the internet is limited, and when there is something, it's in `obj-c`. So I would appreciate any concrete input as an answer. Thank you in advance – Peter Pik Nov 22 '15 at 18:18
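matt's first suggestion can be sketched like this (Swift 2 syntax to match the question; `result.mov` is the output name the question's code uses later):

```swift
// Sketch of suggestion (1): build file URLs with NSURL throughout
// instead of concatenating string paths.
let documentsURL = NSFileManager.defaultManager()
    .URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0]
let movieDestinationUrl = documentsURL.URLByAppendingPathComponent("result.mov")
```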

2 Answers


As matt commented, you've forgotten to delete the output file first (AVAssetExportSession refuses to overwrite an existing file). So do that:

let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)
_ = try? NSFileManager().removeItemAtURL(movieDestinationUrl)

That fixes the error, but you won't yet see your watermark because you're not setting the AVAssetExportSession's videoComposition:

assetExport?.videoComposition = layercomposition  // important!

assetExport!.exportAsynchronouslyWithCompletionHandler({...})
Rhythmic Fistman
  • I've added `assetExport?.videoComposition = layercomposition`, but it doesn't add the watermark. – Peter Pik Nov 23 '15 at 08:16
  • I don't know if it has something to do with this, but even though the video is filmed in portrait, it is exported in landscape. – Peter Pik Nov 23 '15 at 10:20
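Regarding the portrait/landscape comment: an AVMutableVideoComposition does not pick up the track's preferredTransform automatically; it has to be applied in the layer instruction. A sketch in the question's Swift 2 syntax, reusing its `layerinstruction`, `videoTrack`, and `layercomposition` names; the 90-degree check is an assumption about how the capture device encodes portrait:

```swift
// Apply the source track's transform in the layer instruction so
// portrait footage is not rendered sideways.
layerinstruction.setTransform(videoTrack.preferredTransform, atTime: kCMTimeZero)

// If the transform encodes a 90-degree rotation, swap the render size.
let t = videoTrack.preferredTransform
if (t.b == 1.0 && t.c == -1.0) || (t.b == -1.0 && t.c == 1.0) {
    layercomposition.renderSize = CGSizeMake(videoTrack.naturalSize.height,
                                             videoTrack.naturalSize.width)
}
```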

Hi, I have done this in Objective-C; the following is my code:

AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];

CGSize videoSize = CGSizeApplyAffineTransform(a_compositionVideoTrack.naturalSize, a_compositionVideoTrack.preferredTransform);

CATextLayer *titleLayer = [CATextLayer layer];
titleLayer.string = @"lippieapp.com";
titleLayer.font = (__bridge CFTypeRef)(@"Helvetica-Bold");
titleLayer.fontSize = 32.0;
//titleLayer.alignmentMode = kCAAlignmentCenter;
titleLayer.frame = CGRectMake(30, 0, 250, 60); // You may need to adjust this for proper display

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:titleLayer]; // don't forget to add the text layer itself

// Hook the layer tree into the video composition.
videoComp.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
Abhishek
    The question is tagged "swift". Your answer should be using Swift. – Eric Aya Nov 23 '15 at 10:00
    While this code may answer the question, it would be better to explain how it solves the problem without introducing others and why to use it. Code-only answers are not useful in the long run. – JAL Nov 30 '15 at 21:56