
Can someone please advise? I am trying to add a text overlay (title) to a video I am composing with AVFoundation. I found a few online resources (see http://stackoverflow.com/questions/21684549/add-a-text-overlay-with-avmutablevideocomposition-to-a-specific-timerange), but they are all in Objective-C; my project is in Swift and I cannot find any related resources in Swift. I cannot get the text to overlay properly: it appears distorted, as if the frame it gets rendered in is skewed (see the attached screenshot, "Distorted text in AVPlayer"). I have attempted to convert the Objective-C code I found to Swift, but I am obviously missing something. Below is the code I am using. (Some of the code for the player and the video file comes from www.raywenderlich.com/90488/calayer-in-ios-with-swift-10-examples.)

func MergeUnWeldedVideoByUserPref(showInBounds: CGRect) -> (AVMutableComposition, AVMutableVideoComposition)
{
    let fps: Int32 = 30

    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    let mixComposition = AVMutableComposition()
    // 2 - Create a video track for each of the video assets. Add your media data to the appropriate tracks
    let url = NSBundle.mainBundle().URLForResource("colorfulStreak", withExtension: "m4v")!
    let avAsset = AVAsset(URL: url)
    let track = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    let segmentInMovie = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
    let videoTrack = avAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    do
    {
        try track.insertTimeRange(segmentInMovie, ofTrack: videoTrack, atTime: kCMTimeZero)

    } catch{
        print("Failed to load track")
    }

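    // 3 - Create the video composition instruction that spans the full duration of the asset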
    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, avAsset.duration)
    let instruction = videoCompositionInstructionForTrack(showInBounds, track: track, asset: avAsset)
    mainInstruction.layerInstructions.append(instruction)

    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
    mainComposition.frameDuration = CMTimeMake(1, fps)
    mainComposition.renderSize = CGSize(width: showInBounds.width, height: showInBounds.height)

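    // 4 - Build the text layer that will be composited on top of the video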
    let textLayer = CATextLayer()
    textLayer.backgroundColor = UIColor.clearColor().CGColor
    textLayer.foregroundColor = UIColor.whiteColor().CGColor
    textLayer.string = "T E S T"
    textLayer.font = UIFont(name: "Arial", size: 18)
    textLayer.shadowOpacity = 0.5
    textLayer.alignmentMode = kCAAlignmentCenter
    textLayer.frame = CGRectMake(5, 5, 100, 50)
    textLayer.shouldRasterize = true
    textLayer.rasterizationScale = showInBounds.width / videoTrack.naturalSize.width

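    // 5 - Video frames are rendered into videoLayer; parentLayer (video + text) becomes the final output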
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRectMake(0, 0, showInBounds.width, showInBounds.height)
    videoLayer.frame = CGRectMake(0, 0, showInBounds.width, showInBounds.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(textLayer)

    mainComposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
    return (mixComposition, mainComposition)
}
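
For context, videoCompositionInstructionForTrack is adapted from the Ray Wenderlich tutorial; a rough sketch of what it does (scale the video track's natural size to fit showInBounds, keeping its preferred transform) looks like this, though the exact transform maths in my version may differ slightly:

func videoCompositionInstructionForTrack(showInBounds: CGRect, track: AVCompositionTrack, asset: AVAsset) -> AVMutableVideoCompositionLayerInstruction
{
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let assetTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0]

    // Scale the video's natural size to fill showInBounds, keeping the
    // track's preferred transform (e.g. rotation from the capture device).
    let scale = CGAffineTransformMakeScale(showInBounds.width / assetTrack.naturalSize.width,
                                           showInBounds.height / assetTrack.naturalSize.height)
    instruction.setTransform(CGAffineTransformConcat(assetTrack.preferredTransform, scale), atTime: kCMTimeZero)

    return instruction
}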

1 Answer


There is nothing wrong with your Swift interpretation; this is rather an issue with the rendering engine of the simulator. I tried your code on the simulator and it did indeed look skewed and distorted, but when compiled to a device it worked beautifully.
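
If you want to rule out the simulator's player entirely, you can also export the composition offline with AVAssetExportSession and inspect the resulting file; the overlay is baked in during that export via the video composition you return. A rough sketch (the 640x360 render rect, the output path and the exportForInspection name are just placeholders, and it assumes the method sits next to MergeUnWeldedVideoByUserPref):

func exportForInspection()
{
    let (mixComposition, videoComposition) = MergeUnWeldedVideoByUserPref(CGRect(x: 0, y: 0, width: 640, height: 360))

    // Placeholder output path in the app's Documents directory; remove any
    // previous file first, otherwise the export fails.
    let outputPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] + "/overlayTest.mov"
    let _ = try? NSFileManager.defaultManager().removeItemAtPath(outputPath)

    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.outputURL = NSURL(fileURLWithPath: outputPath)
    exporter.outputFileType = AVFileTypeQuickTimeMovie
    exporter.videoComposition = videoComposition   // the text layer is rendered during this export
    exporter.exportAsynchronouslyWithCompletionHandler {
        print("Export status: \(exporter.status.rawValue), error: \(exporter.error)")
    }
}

Pull the exported file off the device (e.g. via Xcode's Devices window) and the overlay should look the same as it does in AVPlayer on the device.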

simplexity