
I am trying to merge an audio track and a video track. The merge itself works fine, but the exported video comes out in landscape orientation, and I want it in portrait.

I know this question has already been answered on Stack Overflow, but those answers are not working for me. Can anyone please help? I need a Swift 3 version.

These are the answers I have already followed, but none of them worked for me:

1. First
2. Second
3. Third
4. Fourth
5. Fifth
6. Sixth

Note: Please do not vote to close this as a duplicate. I know the question has been asked before, but none of the existing answers solve my problem. I have been stuck on this for a long time.

Here is what I have tried to implement:

    let aVideoAsset: AVAsset = AVAsset(url: videoUrl as URL)
    let aAudioAsset: AVAsset = AVAsset(url: audioUrl as URL)

    let mainComposition = AVMutableComposition()

    // Add the video track from the video asset to the composition.
    let videoTrack = mainComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let videoAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaTypeVideo).first!
    try? videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAsset.duration), of: videoAssetTrack, at: kCMTimeZero)

    // Add the audio track from the audio asset to the composition.
    let audioTrack = mainComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaTypeAudio).first!
    try? audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, aAudioAsset.duration), of: audioAssetTrack, at: kCMTimeZero)

    // Carry the source track's preferred transform over so the recorded orientation is kept.
    let videoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    videoCompositionLayerInstruction.setTransform(videoAssetTrack.preferredTransform, at: kCMTimeZero)
    let videoCompositionInstruction = AVMutableVideoCompositionInstruction()
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, mainComposition.duration)
    videoCompositionInstruction.layerInstructions = [videoCompositionLayerInstruction]

    // Derive the render size from the natural size with the preferred transform applied.
    var renderSize = videoAssetTrack.naturalSize
    renderSize = renderSize.applying(videoAssetTrack.preferredTransform)
    renderSize = CGSize(width: fabs(renderSize.width), height: fabs(renderSize.height))

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.renderSize = CGSize(width: 1280, height: 720)
    videoComposition.frameDuration = CMTimeMake(1, 30)
    videoComposition.instructions = [videoCompositionInstruction]

    let savePathUrl: NSURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")

    let assetExport = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputURL = savePathUrl as URL
    assetExport?.outputFileType = AVFileTypeQuickTimeMovie
    assetExport?.shouldOptimizeForNetworkUse = true
    assetExport?.exportAsynchronously { () -> Void in
        switch assetExport?.status {
        case AVAssetExportSessionStatus.completed?:
            // Save the exported video to the photo library if needed.
            let assetsLib = ALAssetsLibrary()
            assetsLib.writeVideoAtPath(toSavedPhotosAlbum: savePathUrl as URL, completionBlock: nil)
            print("success")
        case AVAssetExportSessionStatus.failed?:
            print("failed \(String(describing: assetExport?.error))")
        case AVAssetExportSessionStatus.cancelled?:
            print("cancelled \(String(describing: assetExport?.error))")
        default:
            print("complete")
        }
    }

1 Answer


Just remove the following line from your code:

    videoComposition.renderSize = CGSize(width: 1280, height: 720)
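For what it's worth, here is a minimal sketch (not from the original answer) of how the orientation-related part could look once that hard-coded landscape size is gone. It assumes videoAssetTrack, videoCompositionInstruction, and assetExport are the same objects as in the question, and it also assumes the video composition gets attached to the export session, which the snippet in the question does not show:

    // Keep the render size derived from the track's natural size with its
    // preferred transform applied, so a portrait recording stays portrait.
    var renderSize = videoAssetTrack.naturalSize
    renderSize = renderSize.applying(videoAssetTrack.preferredTransform)
    renderSize = CGSize(width: fabs(renderSize.width), height: fabs(renderSize.height))

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize          // e.g. 720 x 1280 for a portrait clip
    videoComposition.frameDuration = CMTimeMake(1, 30)
    videoComposition.instructions = [videoCompositionInstruction]

    // Assumption: without this line the export session ignores the render size
    // and the layer instructions entirely.
    assetExport?.videoComposition = videoComposition

With the 1280x720 override removed, renderSize keeps whatever naturalSize becomes after preferredTransform is applied, which is a portrait size for a portrait recording.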