
I've seen this question asked a few times, but none of them seem to have any working answers.

The requirement is to reverse and output a video file (not just play it in reverse) keeping the same compression, format, and frame rate as the source video.

Ideally, the solution would do this entirely in memory or in a buffer, avoiding generating the frames as image files (e.g., using AVAssetImageGenerator) and then recompiling them, which is resource intensive, produces unreliable timing, and can degrade frame/image quality relative to the original.

--

My contribution: this is still not working, but it's the best approach I've tried so far:

  • Read the sample frames into an array of CMSampleBufferRef[] using AVAssetReader.
  • Write them back in reverse order using AVAssetWriter.
  • Problem: the timing information for each frame is stored inside the CMSampleBufferRef, so even appending the buffers backwards will not work.
  • Next, I tried swapping the timing information of each frame with that of its mirror frame.
  • Problem: this causes an unknown error with AVAssetWriter.
  • Next step: I'm going to look into AVAssetWriterInputPixelBufferAdaptor.

    - (AVAsset *)assetByReversingAsset:(AVAsset *)asset {
        NSURL *tmpFileURL = [NSURL fileURLWithPath:@"/tmp/test.mp4"];
        NSError *error;
    
        // initialize the AVAssetReader that will read the input asset track
        AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    
        AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
        [reader addOutput:readerOutput];
        [reader startReading];
    
        // Read in the samples into an array
        NSMutableArray *samples = [[NSMutableArray alloc] init];
    
        while(1) {
            CMSampleBufferRef sample = [readerOutput copyNextSampleBuffer];
    
            if (sample == NULL) {
                break;
            }
    
            [samples addObject:(__bridge id)sample];
            CFRelease(sample);
        }
    
        // initialize the writer that will save to our temporary file.
        CMFormatDescriptionRef formatDescription = CFBridgingRetain([videoTrack.formatDescriptions lastObject]);
        AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:nil sourceFormatHint:formatDescription];
        CFRelease(formatDescription);
    
        AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:tmpFileURL
                                                          fileType:AVFileTypeMPEG4
                                                             error:&error];
        [writerInput setExpectsMediaDataInRealTime:NO];
        [writer addInput:writerInput];
        // startWriting must be called before startSessionAtSourceTime:,
        // otherwise the writer raises an exception.
        [writer startWriting];
        [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[0])];
    
    
        // Traverse the sample frames in reverse order
        for(NSInteger i = samples.count-1; i >= 0; i--) {
            CMSampleBufferRef sample = (__bridge CMSampleBufferRef)samples[i];
    
            // Since the timing information is built into the CMSampleBufferRef 
            // We will need to make a copy of it with new timing info. Will copy
            // the timing data from the mirror frame at samples[samples.count - i -1]
    
            CMItemCount numSampleTimingEntries;
            CMSampleBufferGetSampleTimingInfoArray((__bridge CMSampleBufferRef)samples[samples.count - i -1], 0, nil, &numSampleTimingEntries);
            CMSampleTimingInfo *timingInfo = malloc(sizeof(CMSampleTimingInfo) * numSampleTimingEntries);
            // Fetch the timing from the mirror frame, not from the frame being appended.
            CMSampleBufferGetSampleTimingInfoArray((__bridge CMSampleBufferRef)samples[samples.count - i - 1], numSampleTimingEntries, timingInfo, &numSampleTimingEntries);
    
            CMSampleBufferRef sampleWithCorrectTiming;
            CMSampleBufferCreateCopyWithNewTiming(
                                                  kCFAllocatorDefault,
                                                  sample,
                                                  numSampleTimingEntries,
                                                  timingInfo,
                                                  &sampleWithCorrectTiming);
    
            // Block until the writer input can accept another buffer;
            // otherwise the frame would be silently dropped.
            while (!writerInput.readyForMoreMediaData) {
                [NSThread sleepForTimeInterval:0.05];
            }
            [writerInput appendSampleBuffer:sampleWithCorrectTiming];
    
            CFRelease(sampleWithCorrectTiming);
            free(timingInfo);
        }
    
        [writer finishWriting];
    
        return [AVAsset assetWithURL:tmpFileURL];
    }
    
Andy Hin
    I don't think that this is possible because of the way video compression works... from my understanding you can only go forward from a keyframe, but not backwards .. without calculating all frames between the key frames – Bastian May 13 '15 at 14:33
  • @Bastian can you elaborate a bit on what you mean? I have the raw sample data (CMSampleBufferRef) for each frame stored in an array. – Andy Hin May 14 '15 at 15:40
  • Just an FYI to anyone reading this. I figured it out and will be posting an answer in the next few days. – Andy Hin May 15 '15 at 19:15

2 Answers


Worked on this over the last few days and was able to get it working.

Source code here: http://www.andyhin.com/post/5/reverse-video-avfoundation

It uses AVAssetReader to read out the sample frames, extracts each image/pixel buffer, and then appends it with the presentation time of its mirror frame.
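The mirror-frame timing idea can be sketched independently of AVFoundation: the i-th frame of the reversed output reuses the i-th original presentation time, so the timeline still ascends while the content plays backwards. Plain Doubles and strings stand in for CMTime and pixel buffers here; the names are illustrative only:

```swift
// Mirror-frame timing sketch: frame content is reversed, timestamps are not.
let presentationTimes: [Double] = [0.0, 0.04, 0.08, 0.12]   // e.g. 25 fps
let frames = ["A", "B", "C", "D"]                            // stand-ins for pixel buffers

// Pair the last frame with the first timestamp, and so on.
let reversedPairs = Array(zip(frames.reversed(), presentationTimes))
for (frame, time) in reversedPairs {
    print("append \(frame) at \(time)s")
}
```

This is why no re-encoding of timing metadata is needed: only which image goes with which timestamp changes.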

Andy Hin
  • this code has problems. If you pass a landscape or portrait video then its output orientation gets all messed up. Don't even bother with a regular iphone high res video, this control just runs the memory to infinity and crashes. – Sam B Jul 08 '15 at 23:21
  • 1
    @SamB the code is a proof of concept. It actually only supports a single filetype and orientation (have a look at the source code). If you want to support other settings and formats you will need to modify it. Also, if you are processing large files you probably don't want to store it in memory - just use the generic apple solution and write it out to disk. – Andy Hin Jul 09 '15 at 14:57
  • 2
    The URL appears to have been changed to http://www.andyhin.com/post/5/reverse-video-avfoundation Answers with links are discouraged, for reasons like this... GitHub link for the source, which may be less likely to change, is https://github.com/whydna/ReverseAVAsset – Oliver Dec 07 '15 at 22:41
  • 2
    Swift version: https://github.com/tempire/ReverseAVAsset/blob/master/AVAsset.swift – Tempire Feb 06 '16 at 21:56
  • 2
    That sleep on NSThread is not a good solution, you should vend your frames to your `AVAssetWriterInput` using `requestMediaDataWhenReady(on:using:)`. See: https://developer.apple.com/reference/avfoundation/avassetwriterinput/1387508-requestmediadatawhenreadyonqueue – HHK Jul 15 '16 at 22:18
  • use `writerInput.transform = videoTrack.preferredTransform;` to handle the orientation in the code – rahulg Jul 28 '16 at 12:53
  • @tempire For some reason the Swift version doesn't work for me :(. The Objective-C version works fine though. – Chan Jing Hong Dec 09 '16 at 07:04
  • @ChanJingHong It's likely the swift versions don't match up. Considering the date of the gist, it was probably for Swift 2.2. – Tempire Dec 09 '16 at 23:18
  • 2
    Any idea on how to reverse the audio as well? – Chan Jing Hong Dec 10 '16 at 09:35
  • I posted a Swift 5 version of this here: https://stackoverflow.com/questions/12193773/ios-reversing-video-file-mov/57681411#57681411 – xaphod Aug 27 '19 at 19:54
  • 1
    - No audio in any solution. Please add it too. – AsifHabib Mar 19 '20 at 10:37
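The pull model that one commenter suggests in place of the sleep loop could look roughly like this. This is a hypothetical sketch, not the answer's actual code: `buffers` stands in for reversed pixel-buffer/timestamp pairs prepared elsewhere, and the function name is made up for illustration:

```swift
import AVFoundation

// Sketch: vend frames when the writer asks for them, instead of
// polling isReadyForMoreMediaData and sleeping on a thread.
func writeReversed(buffers: [(CVPixelBuffer, CMTime)],
                   writerInput: AVAssetWriterInput,
                   adaptor: AVAssetWriterInputPixelBufferAdaptor,
                   writer: AVAssetWriter,
                   completion: @escaping () -> Void) {
    var index = 0
    let queue = DispatchQueue(label: "reverse.video.writer")
    writerInput.requestMediaDataWhenReady(on: queue) {
        // The writer invokes this block whenever it can accept more data;
        // append until it applies back-pressure or we run out of frames.
        while writerInput.isReadyForMoreMediaData {
            guard index < buffers.count else {
                writerInput.markAsFinished()
                writer.finishWriting(completionHandler: completion)
                return
            }
            let (pixelBuffer, time) = buffers[index]
            adaptor.append(pixelBuffer, withPresentationTime: time)
            index += 1
        }
    }
}
```

The design advantage is that the writer drives the pacing, so no thread is blocked and memory pressure from queued frames stays bounded.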

Swift 5 version of the original answer:

extension AVAsset {
    func getReversedAsset(outputURL: URL) -> AVAsset? {
        do {
            let reader = try AVAssetReader(asset: self)

            guard let videoTrack = tracks(withMediaType: AVMediaType.video).last else {
                return .none
            }

            let readerOutputSettings = [
                kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
            ]

            let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: readerOutputSettings)

            reader.add(readerOutput)
            reader.startReading()

            // Read in frames (CMSampleBuffer is a frame)
            var samples = [CMSampleBuffer]()
            while let sample = readerOutput.copyNextSampleBuffer() {
                samples.append(sample)
            }

            // Write to AVAsset
            let writer = try AVAssetWriter(outputURL: outputURL, fileType: AVFileType.mp4)

            let writerOutputSettings = [
                AVVideoCodecKey: AVVideoCodecType.h264,
                AVVideoWidthKey: videoTrack.naturalSize.width,
                AVVideoHeightKey: videoTrack.naturalSize.height,
                AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: videoTrack.estimatedDataRate]
            ] as [String : Any]
            
            let sourceFormatHint = videoTrack.formatDescriptions.last as! CMFormatDescription
            let writerInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writerOutputSettings, sourceFormatHint: sourceFormatHint)
            writerInput.expectsMediaDataInRealTime = false

            let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput, sourcePixelBufferAttributes: .none)
            writer.add(writerInput)
            writer.startWriting()
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(samples[0]))

            for (index, sample) in samples.enumerated() {
                let presentationTime = CMSampleBufferGetPresentationTimeStamp(sample)

                // Wait until the writer input can accept more data *before*
                // appending; otherwise the append fails and frames are dropped.
                while !writerInput.isReadyForMoreMediaData {
                    Thread.sleep(forTimeInterval: 0.1)
                }

                // Append the mirror frame's pixel buffer at this frame's time.
                if let imageBufferRef = CMSampleBufferGetImageBuffer(samples[samples.count - index - 1]) {
                    pixelBufferAdaptor.append(imageBufferRef, withPresentationTime: presentationTime)
                }
            }

            writerInput.markAsFinished()

            // finishWriting is asynchronous; wait until the file is complete
            // before returning an asset that points at it.
            let semaphore = DispatchSemaphore(value: 0)
            writer.finishWriting { semaphore.signal() }
            semaphore.wait()

            return AVAsset(url: outputURL)
        }
        catch let error as NSError {
            print("\(error)")
            return .none
        }
    }
}
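A minimal call site for the extension above might look like this; the file paths are placeholders, and note that (as the comments point out) only the video track is reversed, so any audio is dropped:

```swift
import AVFoundation

// Hypothetical usage of the getReversedAsset(outputURL:) extension above.
let sourceAsset = AVAsset(url: URL(fileURLWithPath: "/path/to/source.mp4"))
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("reversed.mp4")

if let reversedAsset = sourceAsset.getReversedAsset(outputURL: outputURL) {
    // reversedAsset points at the reversed file written to outputURL.
    print("Reversed duration: \(CMTimeGetSeconds(reversedAsset.duration))s")
}
```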
Vikas Saini