
I have a video file which I can load into an AVAsset. I want to change the rate at which this video is played and double its speed, so that if the video was 10 seconds long, it would finish in 5 seconds.

Here is the code I am trying; can someone tell me where I am going wrong?

Is frameDuration the property I need in order to achieve my goal?

AVAsset *anAsset = self.moviePlayer.currentItem.asset;


NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
if ([compatiblePresets containsObject:AVAssetExportPreset640x480]) {
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                           initWithAsset:anAsset presetName:AVAssetExportPreset640x480];
    // Implementation continues.

    NSString *tmpStr = [[aclip selectedTakeUrl] stringByReplacingOccurrencesOfString:@".m4v" withString:@""];

    NSString *filePath = [NSString stringWithFormat:@"%@_applied.m4v", tmpStr];


    exportSession.outputURL = [NSURL fileURLWithPath:filePath];

    AVMutableVideoComposition *vidcomp = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:anAsset];

    vidcomp.frameDuration = CMTimeMake(1, 24);
    //self.moviePlayer.currentItem.videoComposition = vidcomp;

    exportSession.videoComposition = vidcomp;
    //exportSession.videoComposition.frameDuration = CMTimeMake(1, 50);

    // what is the song URL before loading startRecordingViewController?
    NSLog(@"From Save settings Choose Song - Song URL : %@", exportSession.outputURL);

    //   NSLog(@"start time%f, end time %f", CMTimeGetSeconds(self.startTime),CMTimeGetSeconds(self.endTime));

    exportSession.outputFileType = AVFileTypeMPEG4;

    //CMTimeRange range = CMTimeRangeMake(CMTimeMake(0.0, 1), CMTimeMake(diffTime/currentround,1));
    // startTime and endTime is the duration that we need to save.
    //exportSession.timeRange = range;


    [exportSession exportAsynchronouslyWithCompletionHandler:^{

        switch ([exportSession status]) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export Completed");

                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"Export Waiting");
                break;
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Export Exporting");
                break;
            case AVAssetExportSessionStatusFailed:
            {
                NSError *error = [exportSession error];
                NSLog(@"Export failed: %@", [error localizedDescription]);

                break;
            }
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export canceled");

                break;
            default:
                break;
        }

        dispatch_async(dispatch_get_main_queue(), ^{

            //[activityIndicator stopAnimating];

            //CMTime range = CMTimeSubtract(self.endTime, self.startTime);
            //NSUInteger theLength = (int)CMTimeGetSeconds(range);

            //[ self.songPickedDelegate songPickerDidMakeSelection:self.songTitle :self.audioPath :theLength];

        });

        //[exportSession release];
    }];
}
Jatin

1 Answer


You can use -[AVMutableCompositionTrack scaleTimeRange:toDuration:] to scale the entire asset from 10 seconds to 5 seconds. The frameDuration property will simply perform frame rate conversion.
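Since the comments below ask for an example, here is a minimal sketch of that approach. It reuses the question's `anAsset` variable; the composition setup and the half-speed factor are assumptions for illustration, not Apple sample code:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Copy the whole asset into a mutable composition so its timing can be edited.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, anAsset.duration);
NSError *error = nil;
[composition insertTimeRange:fullRange ofAsset:anAsset atTime:kCMTimeZero error:&error];

// Scale the full range to half its duration: a 10-second asset now plays in 5 seconds.
CMTime halfDuration = CMTimeMultiplyByFloat64(anAsset.duration, 0.5);
[composition scaleTimeRange:fullRange toDuration:halfDuration];

// Export the composition (an AVAsset subclass) instead of the original asset.
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                       initWithAsset:composition
                                       presetName:AVAssetExportPreset640x480];
```

From here the rest of the question's export flow (outputURL, outputFileType, exportAsynchronouslyWithCompletionHandler:) applies unchanged; frameDuration can be left alone unless you also want to resample the frame rate.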

  • @user3196356 do you happen to have an example of this? There seem to be very few examples of this, and reading the docs doesn't help. If needed let me know and I will share my code, but it doesn't work at all so there's no point – sudoExclaimationExclaimation Sep 02 '15 at 04:25
  • @PranoyC use this code to get started https://developer.apple.com/library/content/samplecode/AVSimpleEditoriOS/Introduction/Intro.html – Josh Bernfeld Jan 26 '18 at 07:59
  • A gotcha I discovered: you cannot use .positiveInfinity for the CMTimeRange, that'll be ignored. The time range needs to be real. – xaphod Aug 28 '19 at 20:50