22

I need to apply "slow motion" to a video file, including its audio, between certain frames, and store the ramped result as a new video.

Ref: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10s)

From my analysis, I've found that the AVFoundation framework can be helpful.

Ref: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html

Copied and pasted from the above link:

"Editing: AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks; and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it's collapsed to a file. On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer."

Questions: Can I apply "slow motion" to the video/audio file using the AVFoundation framework? Or is there another package available? If I want to handle the audio and video separately, please guide me on how to do that.

Update: code for the AV export session:

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputPath = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputPath withIntermediateDirectories:YES attributes:nil error:nil];
    outputPath = [outputPath stringByAppendingPathComponent:@"output.mp4"];
    // Remove any existing file at the output path
    [manager removeItemAtPath:outputPath error:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.inputAsset presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = [NSURL fileURLWithPath:outputPath];
    // The file type must match the ".mp4" extension, or the export fails
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath] completionBlock:^(NSURL *assetURL, NSError *error){
                if (error) {
                    NSLog(@"Video could not be saved");
                }
            }];
        } else {
            NSLog(@"error: %@", [exportSession error]);
        }
    }];
2vision2

7 Answers

41

You can scale video using the AVFoundation and CoreMedia frameworks. Take a look at this AVMutableCompositionTrack method:

- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;

Sample:

AVURLAsset* videoAsset = nil; //self.inputAsset;

//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                               preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];

//export
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];

(The audio track from videoAsset should probably also be added to mixComposition.)

vrmaks
  • Thanks for your reply. Video Scaling is working fine. But audio is muted. Implemented the audio track as you mentioned but it doesn't work for me and i pasted the code here : http://pastebin.com/UN3mtpH9. Can you please check my code and let me know what i did wrong.? – 2vision2 Jul 09 '13 at 06:05
  • @vrmarks - Can you please let me know how to add the audio track in slow motion Video? – 2vision2 Jul 09 '13 at 14:14
  • 1
    can you please let know how to add audio with scale time range. Because the audio is not slowed,original audio is playing. when i'm using the below code. AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio;[compositionCommentaryTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration) toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)]; – 2vision2 Jul 10 '13 at 11:40
  • Sorry for late response. I can't make AVFoundation/AVMutableTrack to scale audio using method scaleTimeRange:toDuration:. One of the possible solutions is to export audio track using AVAssetExportSession with presetName:AVAssetExportPresetAppleM4A and outputFileType:AVFileTypeAppleM4A. And then slow down audio using other api and add updated audio track back to video. – vrmaks Jul 10 '13 at 13:45
  • Thank you very much. I tried with .m4a but it couldn't make it to slow down the audio as i couldn't find the api. " And then slow down audio using other api and add updated audio track back to video " can you exactly point out me the api name, it would be really helpful for me if you have any sample codes. Thanks. – 2vision2 Jul 10 '13 at 13:58
  • Take a look at the comment for audio specific question http://stackoverflow.com/a/5851926/1023384 – vrmaks Jul 10 '13 at 14:39
  • 1
    @2vision2 http://dirac.dspdimension.com/Dirac3_Technology_Home_Page/License.html last 4 free licence type. Sample Dirac3-Mobile/Time Stretching Example it generates slowed down audio file into Documents folder (check factor in iPhoneTestAppDelegate.mm:107 time) – vrmaks Jul 10 '13 at 15:02
  • Thanks for your reply, i can vary the speed of audio using dirac. Now I need to merge the video and audio. Thanks for your help. Your replies worth a bounty. – 2vision2 Jul 11 '13 at 09:45
  • I completed the slow motion audio and video separately. So i tried to merge the video and audio now. How i'm doing is store the audio in "Documents folder"(/var/mobile/Applications/FAD-B6D5-A6ACB/Documents/outnew.aif) folder and retrieve the audio and merge while storing in to video. But it's not working the audio is muted. If i'm taking that audio file from "documents" and add in the project means, video and audio merging with slow motion perfectly and the path(/var/mobile/Applications/FA-C5-B6D5-A6A/ramp.app/out.aif). can you please guide what's wrong?code here: http://pastebin.com/GT67a1yF – 2vision2 Jul 11 '13 at 14:50
  • I found the solution there's some delay in writing the audio file. . That's the reason for audio is muted. . – 2vision2 Jul 12 '13 at 04:56
  • [compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration) toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)]; Won't work in IOS 9.1 if videoScaleFactor is less than 1. Does anyone know why? – privateson Jan 28 '16 at 10:53
  • Thanks for such a nice answer but can you please help me with fast video too ? I tried adding videoScaleFactor to -2 but didnt help – Aadil Keshwani Mar 28 '16 at 09:41
  • What if we would like to have multiple scale range? Like 1st to 3rd second slow and 5th to 10th seconds faster? – Amrit Trivedi Aug 31 '18 at 18:05
  • is there any swift version for this? – Vin May 15 '21 at 11:35
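Regarding the comments above asking for a Swift version and for multiple scale ranges: scaleTimeRange(_:toDuration:) can be applied to several sub-ranges of one composition track. Below is a hedged Swift sketch with placeholder times; the key detail is that each scale shifts the media that follows it, so scaling the later range first keeps the earlier range's original time coordinates valid.

```swift
import AVFoundation

/// Hypothetical sketch: slow seconds 1-3 to half speed and speed up seconds
/// 5-10 to double speed on one composition track. Each scale shifts the
/// media after it, so the later range is scaled first to keep the earlier
/// range's original start time valid.
func applyRamps(to track: AVMutableCompositionTrack) {
    let ts: CMTimeScale = 600
    let fastRange = CMTimeRange(start: CMTime(seconds: 5, preferredTimescale: ts),
                                duration: CMTime(seconds: 5, preferredTimescale: ts))
    let slowRange = CMTimeRange(start: CMTime(seconds: 1, preferredTimescale: ts),
                                duration: CMTime(seconds: 2, preferredTimescale: ts))

    // 5 s of media now plays in 2.5 s (2x faster)
    track.scaleTimeRange(fastRange, toDuration: CMTime(seconds: 2.5, preferredTimescale: ts))
    // 2 s of media now plays in 4 s (2x slower)
    track.scaleTimeRange(slowRange, toDuration: CMTime(seconds: 4, preferredTimescale: ts))
}
```

The same back-to-front ordering applies to a matching audio track if you want the two to stay in sync.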
14

Slower + faster, with or without an audio track

I have tried this and was able to slow down the asset.

compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration) did the trick.

I made a class which will help you generate a slower video from an AVAsset. A plus point is that you can also make it faster, and another plus point is that it handles the audio too.

Here is my custom class sample:

import UIKit
import AVFoundation

enum SpeedoMode {
    case Slower
    case Faster
}

class VSVideoSpeeder: NSObject {

    /// Singleton instance of `VSVideoSpeeder`
    static var shared: VSVideoSpeeder = {
       return VSVideoSpeeder()
    }()

    /// Valid scales are 1x, 2x, and 3x. Nothing happens if the scale is out of range. The exporter will be nil if the URL is invalid or an asset instance cannot be created.
    func scaleAsset(fromURL url: URL,  by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

        /// Check the valid scale
        if scale < 1 || scale > 3 {
            /// Can not proceed, Invalid range
            completion(nil)
            return
        }

        /// Asset
        let asset = AVAsset(url: url)

        /// Video Tracks
        let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
        if videoTracks.count == 0 {
            /// Can not find any video track
            completion(nil)
            return
        }

        /// Get the scaled video duration
        let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(asset.duration.value / scale, asset.duration.timescale) : CMTimeMake(asset.duration.value * scale, asset.duration.timescale)
        let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)

        /// Video track
        let videoTrack = videoTracks.first!

        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

        /// Audio Tracks
        let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
        if audioTracks.count > 0 {
            /// Use audio if video contains the audio track
            let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio track
            let audioTrack = audioTracks.first!
            do {
                try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: kCMTimeZero)
                compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
            } catch _ {
                /// Ignore audio error
            }
        }

        do {
            try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)
            compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

            /// Keep original transformation
            compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

            /// Initialize Exporter now
            let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
           /// Note: use a writable directory (e.g. the app's Documents folder) when testing on a device.

            if FileManager.default.fileExists(atPath: outputFileURL.path) { // .path, not .absoluteString
                try FileManager.default.removeItem(at: outputFileURL)
            }

            let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
            exporter?.outputURL = outputFileURL
            exporter?.outputFileType = AVFileType.mov
            exporter?.shouldOptimizeForNetworkUse = true
            exporter?.exportAsynchronously(completionHandler: {
                completion(exporter)
            })

        } catch let error {
            print(error.localizedDescription)
            completion(nil)
            return
        }
    }

}

I took 1x, 2x, and 3x as valid scales. The class contains the proper validation and handling. Below is a sample of how to use this function.

let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
     if let exporter = exporter {
         switch exporter.status {
                case .failed: do {
                      print(exporter.error?.localizedDescription ?? "Error in exporting..")
                }
                case .completed: do {
                      print("Scaled video has been generated successfully!")
                }
                case .unknown: break
                case .waiting: break
                case .exporting: break
                case .cancelled: break
           }
      }
      else {
           /// Error
           print("Exporter is not initialized.")
      }
}

This line handles the audio scaling:

compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
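One caveat, which also comes up in comments on the other answers here: scaleTimeRange time-stretches the audio while preserving its original pitch, so slowed audio does not sound deeper. If you want the pitch to follow the speed, AVAssetExportSession exposes an audioTimePitchAlgorithm property. A minimal sketch, assuming an exporter configured like the one in the class above:

```swift
import AVFoundation

/// Make slowed-down audio drop in pitch as well. By default the exporter
/// time-stretches audio and keeps the original pitch; the .varispeed
/// algorithm resamples instead, so pitch follows the new playback rate.
func preferVarispeed(on exporter: AVAssetExportSession) {
    exporter.audioTimePitchAlgorithm = .varispeed
}
```

Set this before calling exportAsynchronously; it applies to all audio tracks in the composition being exported.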

TheTiger
10

I have achieved adding slow motion to a video, including the audio, with the proper output orientation.

 - (void)SlowMotion:(NSURL *)URl
 {
   AVURLAsset* videoAsset = [AVURLAsset URLAssetWithURL:URl options:nil]; //self.inputAsset;

AVAsset *currentAsset = [AVAsset assetWithURL:URl];
AVAssetTrack *vdoTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
//create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *videoInsertError = nil;
BOOL videoInsertResult = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                                         atTime:kCMTimeZero
                                                          error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    //handle error
    return;
}

NSError *audioInsertError =nil;
BOOL audioInsertResult =[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                                       ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                                        atTime:kCMTimeZero
                                                         error:&audioInsertError];

if (!audioInsertResult || nil != audioInsertError) {
    //handle error
    return;
}

CMTime duration =kCMTimeZero;
duration=CMTimeAdd(duration, currentAsset.duration);
//slow down whole video by 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionAudioTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value*videoScaleFactor, videoDuration.timescale)];
[compositionVideoTrack setPreferredTransform:vdoTrack.preferredTransform];

        NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *docsDir = [dirPaths objectAtIndex:0];
        NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"slowMotion.mov"]];
        if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
        NSURL *_filePath = [NSURL fileURLWithPath:outputFilePath];

//export
AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];
assetExport.outputURL = _filePath;
assetExport.outputFileType = AVFileTypeQuickTimeMovie;
assetExport.shouldOptimizeForNetworkUse = YES;
[assetExport exportAsynchronouslyWithCompletionHandler:^{
    switch ([assetExport status]) {
        case AVAssetExportSessionStatusFailed:
        {
            NSLog(@"Export session failed with error: %@", [assetExport error]);
            dispatch_async(dispatch_get_main_queue(), ^{
                // completion(nil);
            });
        }
            break;
        case AVAssetExportSessionStatusCompleted:
        {
            NSLog(@"Successful");
            NSURL *outputURL = assetExport.outputURL;
            // writeExportedVideoToAssetsLibrary: performs the compatibility check itself
            [self writeExportedVideoToAssetsLibrary:outputURL];
            dispatch_async(dispatch_get_main_queue(), ^{
                // completion(_filePath);
            });
        }
            break;
        default:
            break;
    }
}];


 }

  - (void)writeExportedVideoToAssetsLibrary :(NSURL *)url {
NSURL *exportURL = url;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
    [library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error){
        dispatch_async(dispatch_get_main_queue(), ^{
            if (error) {
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:[error localizedDescription]
                                                                    message:[error localizedRecoverySuggestion]
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
            if(!error)
            {
               // [activityView setHidden:YES];
                UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Success"
                                                                    message:@"Video added to gallery successfully"
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                [alertView show];
            }
 #if !TARGET_IPHONE_SIMULATOR
            [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
#endif
        });
    }];
} else {
    NSLog(@"Video could not be exported to assets library.");
}

}
halfer
objectiveCoder
  • when videoScaleFactor is less than 1 it can't compose the video in IOS 9.1. Any idea? – privateson Jan 28 '16 at 10:53
  • there should be at least one video else AVmutable composer will not get any asset and it will crash the app – objectiveCoder Jan 28 '16 at 12:11
  • @objectiveCoder you're code is awesome and helped solve this issue for me but the audio pitch seems to not change when exporting using this method. Is there any attribute you can add to the export for it to behave sort of like when you set `asset.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed` on a playeritem? – simplexity Mar 03 '17 at 22:23
  • Any idea how to achieve this using `PHAsset` in `Photos` framework? – atulkhatri Jun 28 '17 at 10:32
  • Is there any way to convert video to fast track means in fast motion ? really needed help – Anita Nagori Nov 20 '18 at 05:37
  • @objectiveCoder Thanks a lot man. Your code is complete and proper. – iPhone 7 Dec 13 '18 at 11:17
2

I would extract all frames from the initial video using ffmpeg and then assemble them with AVAssetWriter, but at a lower frame rate. To get more fluid slow motion you may need to apply some blur effect, or even generate intermediate frames by blending two existing frames.
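The two steps above can be sketched with the ffmpeg CLI alone; the second command stands in for the AVAssetWriter pass, and the file names and frame rates are placeholder assumptions:

```shell
# Extract every frame of the source clip as numbered PNGs (paths are hypothetical).
mkdir -p frames
ffmpeg -i input.mp4 frames/%05d.png

# Reassemble the frames at half an assumed 30 fps source rate, which doubles
# the visual duration. Audio must be slowed separately and remuxed.
ffmpeg -framerate 15 -i frames/%05d.png -c:v libx264 -pix_fmt yuv420p slow.mp4
```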

Ivan Alek
2

An example in Swift:

I

var asset: AVAsset?  
func configureAssets(){

    let videoAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4v")!)
    let audioAsset = AVURLAsset(url: Bundle.main.url(forResource: "sample", withExtension: "m4a")!)
    //    let audioAsset2 = AVURLAsset(url: Bundle.main.url(forResource: "audio2", withExtension: "m4a")!)

    let comp = AVMutableComposition()

    let videoAssetSourceTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first! as AVAssetTrack
    let audioAssetSourceTrack = audioAsset.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack // take the audio from the separately loaded audio asset
    //    let audioAssetSourceTrack2 = audioAsset2.tracks(withMediaType: AVMediaTypeAudio).first! as AVAssetTrack

    let videoCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioCompositionTrack = comp.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {

        try videoCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9 , 600)),
            of: videoAssetSourceTrack,
            at: kCMTimeZero)



        try audioCompositionTrack.insertTimeRange(
            CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(9, 600)),
            of: audioAssetSourceTrack,
            at: kCMTimeZero)

        //
        //      try audioCompositionTrack.insertTimeRange(
        //        CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(3, 600)),
        //        of: audioAssetSourceTrack2,
        //        at: CMTimeMakeWithSeconds(7, 600))

        let videoScaleFactor = Int64(2.0)
        let videoDuration: CMTime = videoAsset.duration


        videoCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        audioCompositionTrack.scaleTimeRange(CMTimeRangeMake(kCMTimeZero, videoDuration), toDuration: CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale))
        videoCompositionTrack.preferredTransform = videoAssetSourceTrack.preferredTransform



    }catch { print(error) }

    asset = comp
}

II

  func createFileFromAsset(_ asset: AVAsset){

let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0] as URL

let filePath = documentsDirectory.appendingPathComponent("rendered.mov") // extension must match AVFileTypeQuickTimeMovie below
deleteFile(filePath)

if let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetLowQuality){


  exportSession.canPerformMultiplePassesOverSourceMediaData = true
  exportSession.outputURL = filePath
  exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
  exportSession.outputFileType = AVFileTypeQuickTimeMovie
  exportSession.exportAsynchronously {
    _ in
    print("finished: \(filePath) :  \(exportSession.status.rawValue) ")
  }
}

 }

 func deleteFile(_ filePath:URL) {
guard FileManager.default.fileExists(atPath: filePath.path) else {
  return
}

do {
  try FileManager.default.removeItem(atPath: filePath.path)
}catch{
  fatalError("Unable to delete file: \(error) : \(#function).")
}
}
209135
  • Do you know how to make the pitch of the audio go down with the speed as well? Audio slows down but pitch does not follow. – simplexity Mar 04 '17 at 00:49
  • The `scaleTimeRange(...)` method isn't equipped to allow **Pitch Shifting**. So the slowed Audio's pitch will not follow , but I guess you know that. Many people have recommended **Dirac** for this. You can probably do some research on it. I haven't worked with dirac yet , so I don't know how it works . Goodluck! – 209135 Mar 06 '17 at 10:14
0

Swift 5

Here is @TheTiger's code converted to Swift 5:

import UIKit
import AVFoundation


    enum SpeedoMode {
        case Slower
        case Faster
    }

    class VSVideoSpeeder: NSObject {

        /// Singleton instance of `VSVideoSpeeder`
        static var shared: VSVideoSpeeder = {
           return VSVideoSpeeder()
        }()

        /// Valid scales are 1x, 2x, and 3x. Nothing happens if the scale is out of range. The exporter will be nil if the URL is invalid or an asset instance cannot be created.
        func scaleAsset(fromURL url: URL,  by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

            /// Check the valid scale
            if scale < 1 || scale > 3 {
                /// Can not proceed, Invalid range
                completion(nil)
                return
            }

            /// Asset
            let asset = AVAsset(url: url)

            /// Video Tracks
            let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
            if videoTracks.count == 0 {
                /// Can not find any video track
                completion(nil)
                return
            }

            /// Get the scaled video duration
            let scaledVideoDuration = (mode == .Faster) ? CMTimeMake(value: asset.duration.value / scale, timescale: asset.duration.timescale) : CMTimeMake(value: asset.duration.value * scale, timescale: asset.duration.timescale)
            let timeRange = CMTimeRangeMake(start: CMTime.zero, duration: asset.duration)

            /// Video track
            let videoTrack = videoTracks.first!

            let mixComposition = AVMutableComposition()
            let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio Tracks
            let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
            if audioTracks.count > 0 {
                /// Use audio if video contains the audio track
                let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

                /// Audio track
                let audioTrack = audioTracks.first!
                do {
                    try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: CMTime.zero)
                    compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
                } catch _ {
                    /// Ignore audio error
                }
            }

            do {
                try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: CMTime.zero)
                compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

                /// Keep original transformation
                compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

                /// Initialize Exporter now
                let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
               /// Note: use a writable directory (e.g. the app's Documents folder) when testing on a device.

                if FileManager.default.fileExists(atPath: outputFileURL.path) { // .path, not .absoluteString
                    try FileManager.default.removeItem(at: outputFileURL)
                }

                let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
                exporter?.outputURL = outputFileURL
                exporter?.outputFileType = AVFileType.mov
                exporter?.shouldOptimizeForNetworkUse = true
                exporter?.exportAsynchronously(completionHandler: {
                    completion(exporter)
                })

            } catch let error {
                print(error.localizedDescription)
                completion(nil)
                return
            }
        }

    }


With the same use case:

        let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
        VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
             if let exporter = exporter {
                 switch exporter.status {
                        case .failed: do {
                              print(exporter.error?.localizedDescription ?? "Error in exporting..")
                        }
                        case .completed: do {
                              print("Scaled video has been generated successfully!")
                        }
                        case .unknown: break
                        case .waiting: break
                        case .exporting: break
                        case .cancelled: break
                   }
              }
              else {
                   /// Error
                   print("Exporter is not initialized.")
              }
        }
Brody Higby
  • 137
  • 1
  • 8
0

Creating a "slow motion" video in iOS Swift is not easy: many of the "slow motion" examples I came across either did not work or used deprecated code. So I finally figured out a way to make slow motion in Swift. Note: this code can be used for 120 fps and greater too, and you can slow down the audio in the same way.

Here is the code snippet I created for achieving slow motion:

    // Requires `import AVFoundation` and `import Photos`.
    func slowMotion(pathUrl: URL) {

    let videoAsset = AVURLAsset(url: pathUrl, options: nil)
    let currentAsset = AVAsset(url: pathUrl)

    let vdoTrack = currentAsset.tracks(withMediaType: .video)[0]
    let mixComposition = AVMutableComposition()

    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

    do {
        try compositionVideoTrack?.insertTimeRange(
            CMTimeRangeMake(start: .zero, duration: videoAsset.duration),
            of: videoAsset.tracks(withMediaType: .video)[0],
            at: .zero)
    } catch {
        // insertion failed; handle the error
        return
    }


    var duration: CMTime = .zero
    duration = CMTimeAdd(duration, currentAsset.duration)
    
    
    //MARK: This constant (videoScaleFactor) achieves the slow motion by stretching the video's time range.
    // Increase videoScaleFactor to play the video more slowly.
    let videoScaleFactor = 2.0
    let videoDuration = videoAsset.duration
    
    compositionVideoTrack?.scaleTimeRange(
        CMTimeRangeMake(start: .zero, duration: videoDuration),
        toDuration: CMTimeMake(value: videoDuration.value * Int64(videoScaleFactor), timescale: videoDuration.timescale))
    compositionVideoTrack?.preferredTransform = vdoTrack.preferredTransform
    
    let dirPaths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).map(\.path)
    let docsDir = dirPaths[0]
    let outputFilePath = URL(fileURLWithPath: docsDir).appendingPathComponent("slowMotion\(UUID().uuidString).mp4").path
    
    if FileManager.default.fileExists(atPath: outputFilePath) {
        do {
            try FileManager.default.removeItem(atPath: outputFilePath)
        } catch {
        }
    }
    let filePath = URL(fileURLWithPath: outputFilePath)
    
    let assetExport = AVAssetExportSession(
        asset: mixComposition,
        presetName: AVAssetExportPresetHighestQuality)
    assetExport?.outputURL = filePath
    assetExport?.outputFileType = .mp4
    
    assetExport?.exportAsynchronously(completionHandler: {
        switch assetExport?.status {
        case .failed:
            print("asset output media url = \(String(describing: assetExport?.outputURL))")
            print("Export session failed with error: \(String(describing: assetExport?.error))")
            DispatchQueue.main.async(execute: {
                // completion(nil);
            })
        case .completed:
            print("Successful")
            let outputURL = assetExport!.outputURL
            print("url path = \(String(describing: outputURL))")
            
            PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
            }) { saved, error in
                if saved {
                    print("video successfully saved in photos gallery view video in photos gallery")
                }
                if (error != nil) {
                    print("error in saving video \(String(describing: error?.localizedDescription))")
                }
            }
            DispatchQueue.main.async(execute: {
                //      completion(_filePath);
            })
        case .none:
            break
        case .unknown:
            break
        case .waiting:
            break
        case .exporting:
            break
        case .cancelled:
            break
        case .some(_):
            break
        }
    })
}
Marolean James