
I'm attempting to apply a CIFilter to an AVAsset and then save it with the filter applied. I'm doing this with an AVAssetExportSession whose videoComposition is set to an AVMutableVideoComposition object that uses a custom AVVideoCompositing class.

I am also setting the instructions of my AVMutableVideoComposition object to an instance of a custom composition instruction class (a subclass of AVMutableVideoCompositionInstruction). This class is passed a track ID, along with a few other unimportant variables.

Unfortunately, I've run into a problem - the startVideoCompositionRequest: function in my custom video compositor class (conforming to AVVideoCompositing) is not being called correctly.

When I set the passthroughTrackID variable of my custom instruction class to the track ID, the startVideoCompositionRequest(request) function in my AVVideoCompositing is not called.

Yet, when I do not set the passthroughTrackID variable of my custom instruction class, the startVideoCompositionRequest(request) function is called, but not correctly - printing request.sourceTrackIDs results in an empty array, and request.sourceFrameByTrackID(trackID) results in a nil value.

Something interesting that I found was that the cancelAllPendingVideoCompositionRequests: function is always called twice when attempting to export the video with filters. It is either called once before startVideoCompositionRequest: and once after, or just twice in a row in the case that startVideoCompositionRequest: is not called.

I've created three classes for exporting the video with filters. Here's the utility class, which basically just includes an export function and calls all of the required code:

class VideoFilterExport{

    let asset: AVAsset
    init(asset: AVAsset){
        self.asset = asset
    }

    func export(toURL url: NSURL, callback: (url: NSURL?) -> Void){
        guard let track: AVAssetTrack = self.asset.tracksWithMediaType(AVMediaTypeVideo).first else{callback(url: nil); return}

        let composition = AVMutableComposition()
        let compositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)

        do{
            try compositionTrack.insertTimeRange(track.timeRange, ofTrack: track, atTime: kCMTimeZero)
        }
        catch _{callback(url: nil); return}

        let videoComposition = AVMutableVideoComposition(propertiesOfAsset: composition)
        videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
        videoComposition.frameDuration = CMTimeMake(1, 30)
        videoComposition.renderSize = compositionTrack.naturalSize

        let instruction = VideoFilterCompositionInstruction(trackID: compositionTrack.trackID) // (filters and context arguments omitted here for brevity)
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.asset.duration)
        videoComposition.instructions = [instruction]

        let session: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetMediumQuality)!
        session.videoComposition = videoComposition
        session.outputURL = url
        session.outputFileType = AVFileTypeMPEG4

        session.exportAsynchronouslyWithCompletionHandler(){
            callback(url: url)
        }
    }
}

Here are the other two classes. I'll put them both into one code block to make this post shorter:

// Video Filter Composition Instruction Class - from what I gather,
// AVVideoCompositionInstruction is used only to pass values to
// the AVVideoCompositing class

class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{

    let trackID: CMPersistentTrackID
    let filters: ImageFilterGroup
    let context: CIContext


    // When I leave this line as-is, startVideoCompositionRequest: isn't called.
    // When commented out, startVideoCompositionRequest(request) is called, but there
    // are no valid CVPixelBuffers provided by request.sourceFrameByTrackID(below value)
    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return []}}
    override var containsTweening: Bool{get{return false}}


    init(trackID: CMPersistentTrackID, filters: ImageFilterGroup, context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        //self.timeRange = timeRange
        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

}


// My custom AVVideoCompositing class. This is where the problem lies -
// although I don't know if this is the root of the problem

class VideoFilterCompositor : NSObject, AVVideoCompositing{

    var requiredPixelBufferAttributesForRenderContext: [String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA), // The video is in 32 BGRA
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]
    var sourcePixelBufferAttributes: [String : AnyObject]? = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA),
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]

    let renderQueue = dispatch_queue_create("co.getblix.videofiltercompositor.renderingqueue", DISPATCH_QUEUE_SERIAL)

    override init(){
        super.init()
    }

    func startVideoCompositionRequest(request: AVAsynchronousVideoCompositionRequest){
       // This code block is never executed when the
       // passthroughTrackID variable is in the above class  

        autoreleasepool(){
            dispatch_async(self.renderQueue){
                guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction else{
                    request.finishWithError(NSError(domain: "getblix.co", code: 760, userInfo: nil))
                    return
                }
                guard let pixels = request.sourceFrameByTrackID(instruction.passthroughTrackID) else{
                    // This code block is executed when I comment out the
                    // passthroughTrackID variable in the above class            

                    request.finishWithError(NSError(domain: "getblix.co", code: 761, userInfo: nil))
                    return
                }
                // I have not been able to get the code to reach this point
                // This function is either not called, or the guard
                // statement above executes

                let image = CIImage(CVPixelBuffer: pixels)
                let filtered: CIImage = image // placeholder: apply the filters here

                let width = CVPixelBufferGetWidth(pixels)
                let height = CVPixelBufferGetHeight(pixels)
                let format = CVPixelBufferGetPixelFormatType(pixels)

                var newBuffer: CVPixelBuffer?
                CVPixelBufferCreate(kCFAllocatorDefault, width, height, format, nil, &newBuffer)

                if let buffer = newBuffer{
                    instruction.context.render(filtered, toCVPixelBuffer: buffer)
                    request.finishWithComposedVideoFrame(buffer)
                }
                else{
                    request.finishWithComposedVideoFrame(pixels)
                }
            }
        }
    }

    func renderContextChanged(newRenderContext: AVVideoCompositionRenderContext){
        // I don't have any code in this block
    }

    // This is interesting - this is called twice,
    // Once before startVideoCompositionRequest is called,
    // And once after. In the case when startVideoCompositionRequest
    // Is not called, this is simply called twice in a row
    func cancelAllPendingVideoCompositionRequests(){
        dispatch_barrier_async(self.renderQueue){
            print("Cancelled")
        }
    }
}

I've been looking at Apple's AVCustomEdit sample project a lot for guidance with this, but I can't find anything in it that explains why this is happening.

How can I get the request.sourceFrameByTrackID(_:) call to work correctly and provide a valid CVPixelBuffer for each frame?

Jojodmo
  • Any chance you managed to solve the rotation issue with a custom AVVideoCompositing class? It's not respecting the asset transform; all the CIFilters in the request are always in landscape... We've been stuck on this for over a week – Roi Mulia Dec 12 '18 at 12:04
  • @RoiMulia Sorry about the late response; I'm not quite sure why this could be happening. Are all other filters working, and are you only rotating by 90° increments? If so, you could post a question and I could take a look (or someone else on SO might figure it out) – Jojodmo Dec 21 '18 at 07:13

2 Answers


All of the code for this utility is on GitHub

It turns out that the requiredSourceTrackIDs variable in the custom AVVideoCompositionInstruction class (VideoFilterCompositionInstruction in the question) has to be set to an array containing the track IDs:

override var requiredSourceTrackIDs: [NSValue]{
  get{
    return [
      NSNumber(value: Int(self.trackID))
    ]
  }
}

So the final custom composition instruction class is:

class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: [CIFilter]
    let context: CIContext

    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return [NSNumber(value: Int(self.trackID))]}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: [CIFilter], context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context
    
        super.init()
    
        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder){
        fatalError("init(coder:) has not been implemented")
    }
}

All of the code for this utility is also on GitHub
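
For reference, a minimal Swift 3 compositor to pair with this instruction class could look something like the sketch below. This is only an illustration, not the exact code from the GitHub project: the class name and error domain are placeholders, and the actual filtering step is left out.

import AVFoundation
import CoreImage

// Sketch only: a minimal Swift 3 compositor matching the instruction class above.
// With requiredSourceTrackIDs populated, sourceFrame(byTrackID:) should now return
// a valid CVPixelBuffer. Filtering is omitted; the source frame is passed through.
class MinimalFilterCompositor: NSObject, AVVideoCompositing {

    var requiredPixelBufferAttributesForRenderContext: [String : Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ]
    var sourcePixelBufferAttributes: [String : Any]? = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)
    ]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction,
              let pixels = request.sourceFrame(byTrackID: instruction.trackID) else {
            request.finish(with: NSError(domain: "compositor", code: -1, userInfo: nil))
            return
        }
        // Build a CIImage from `pixels`, apply the CIFilters from instruction.filters,
        // render the result into a fresh CVPixelBuffer with instruction.context, and
        // finish with that buffer instead of the untouched source frame.
        request.finish(withComposedVideoFrame: pixels)
    }

    func cancelAllPendingVideoCompositionRequests() {}
}

The important part is that sourceFrame(byTrackID:) only returns buffers for track IDs that the current instruction lists in requiredSourceTrackIDs.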

Jojodmo
  • Hello, I've just tried your sample project on GitHub, but it does not work on my side: func startRequest(_ asyncVideoCompositionRequest: AVAsynchronousVideoCompositionRequest) is never called in my AVVideoCompositing class – Sam Nov 09 '16 at 13:55
  • Do you know what is wrong? I've implemented everything as suggested. – Sam Nov 09 '16 at 13:55
  • @Sam I just updated the GitHub project for Swift 3, although that should not have been the cause unless you made a mistake when migrating the code to Swift 3 yourself... Are you doing this on iOS or macOS? [There is an issue with this on macOS](https://github.com/jojodmo/VideoFilterExporter/issues/1) in that it simply does not work; the utility is intended for iOS, but it would be nice if it worked for macOS too. If you still can't figure it out and you're using iOS, I'll just send you the exact code I'm using in one of my projects and we'll see what went wrong. – Jojodmo Nov 10 '16 at 05:10
  • @Sam also, make sure the export url is a URL and not NSURL, and ends in .mov or .mp4 – Jojodmo Nov 10 '16 at 06:12
  • Thanks a lot for your kind help. I will try it right away and let you know. Yes I made sure it was URL and .mov thanks a lot. Appreciate it. – Sam Nov 10 '16 at 07:28
  • Unfortunately, for some reason that I can't figure out, the AVVideoCompositing protocol methods do not get called. Any idea why this could happen? – Sam Nov 10 '16 at 08:05
  • I'm on iOS using Swift 3 and tried your updated project on GitHub. – Sam Nov 10 '16 at 08:14
  • OK, my bad: I was messing things up with the export URL. It's working fine now; I just have to fix the orientation issue, because when the video is exported the orientation goes 90 degrees to the left. How can I contact you in private? – Sam Nov 10 '16 at 08:34
  • @Sam The orientation is screwed up because of the way videos are recorded on iOS. To fix it, you'll have to use the `CIAffineTransform` filter. I just updated the [GitHub utility with info on how to do this](https://github.com/jojodmo/VideoFilterExporter#rotating-a-video) (see the sketch after this comment thread) – Jojodmo Nov 12 '16 at 19:44
  • Thanks a lot man, really appreciate your help! I really enjoy this GitHub utility. Have you seen this question? http://stackoverflow.com/questions/40530367/swift-3-how-to-add-watermark-on-video-avvideocompositioncoreanimationtool-ios I wonder how you would approach it, because you seem to master AVFoundation and iOS programming. Many thanks for your kind help :) – Sam Nov 13 '16 at 09:21
  • @Sam No problem! Unfortunately, I don't know how to add text to videos. The way I created this utility was by looking at an Apple sample project written in Objective-C, and I translated it to Swift and made it into a usable utility. – Jojodmo Nov 17 '16 at 05:11
  • I'm sorry to resurrect an old thread, but this project seems great. However, like @Sam, I've been unable to get the project to ever call `startRequest`, and therefore, it never exports anything (exporting to documents directory as a.mp4). Are you still maintaining this project? – ZbadhabitZ Dec 20 '18 at 22:58
  • Please suggest a solution for my problem: https://stackoverflow.com/questions/54652637/i-want-to-apply-cifilter-on-video-and-save-that-filter-applied-video-export-ta – Deep Aug 06 '19 at 16:41
  • @Jojodmo how did you get the compositor to adopt an AVMutableVideoCompositionInstruction class and not an AVVideoCompositionInstruction class? Thanks – user1752054 Apr 17 '22 at 15:24
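
Regarding the rotation issue raised in the comments above, a rough sketch of the CIAffineTransform approach could look like the following (Swift 3). This is only an assumption about how it might be wired up, not the code from the GitHub utility: track is the source AVAssetTrack and sourceImage is the CIImage for the current frame, both supplied by the surrounding compositor code.

import UIKit
import AVFoundation
import CoreImage

// Sketch only: rotate each frame with the track's preferredTransform so the
// filtered output keeps the recorded orientation. `track` and `sourceImage`
// are assumed to come from the surrounding compositor code.
func orientedImage(from sourceImage: CIImage, track: AVAssetTrack) -> CIImage {
    guard let filter = CIFilter(name: "CIAffineTransform") else { return sourceImage }
    filter.setValue(sourceImage, forKey: kCIInputImageKey)
    filter.setValue(NSValue(cgAffineTransform: track.preferredTransform), forKey: kCIInputTransformKey)

    guard let output = filter.outputImage else { return sourceImage }

    // After rotating, the image's extent may no longer start at the origin, so
    // shift it back before rendering into the destination pixel buffer.
    return output.applying(CGAffineTransform(translationX: -output.extent.origin.x,
                                             y: -output.extent.origin.y))
}

For 90° rotations, the video composition's renderSize would also need to use the swapped width and height.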

As you've noted, having passthroughTrackID return the track you want to filter isn't the right approach — you need to return the track to be filtered from requiredSourceTrackIDs instead. (And it looks like once you do that, it doesn't matter if you also return it from passthroughTrackID.) To answer the remaining question of why it works this way...

The docs for passthroughTrackID and requiredSourceTrackIDs certainly aren't Apple's clearest writing ever. (File a bug about it and they might improve.) But if you look closely in the description of the former, there's a hint (emphasis added)...

If for the duration of the instruction, the video composition result is one of the source frames, this property returns the corresponding track ID. The compositor won't be run for the duration of the instruction and the proper source frame is used instead.

So, you use passthroughTrackID only when you're making an instruction class that passes a single track through without processing.

If you plan to perform any image processing, even if it's just to a single track with no compositing, specify that track in requiredSourceTrackIDs instead.
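
To make the distinction concrete, here is a rough sketch of both kinds of instruction (Swift 3; the class names are made up for illustration, and timeRange still has to be set on each instance, as in the question):

import AVFoundation

// Pure passthrough: the compositor is never invoked for this instruction's time
// range, and the source frame from `trackID` is used directly.
class PassthroughInstruction: AVMutableVideoCompositionInstruction {
    let trackID: CMPersistentTrackID

    override var passthroughTrackID: CMPersistentTrackID { return trackID }
    override var requiredSourceTrackIDs: [NSValue] { return [] }

    init(trackID: CMPersistentTrackID) {
        self.trackID = trackID
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

// Processing: the compositor runs, and sourceFrame(byTrackID:) hands it a buffer
// for every ID listed in requiredSourceTrackIDs.
class FilterInstruction: AVMutableVideoCompositionInstruction {
    let trackID: CMPersistentTrackID

    override var passthroughTrackID: CMPersistentTrackID { return kCMPersistentTrackID_Invalid }
    override var requiredSourceTrackIDs: [NSValue] { return [NSNumber(value: Int(trackID))] }

    init(trackID: CMPersistentTrackID) {
        self.trackID = trackID
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}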

rickster
  • Hello, could you please elaborate? How do I make the GitHub project work with Swift 3? – Sam Nov 09 '16 at 14:09
  • Any chance you managed to solve the rotation issue with a custom AVVideoCompositing class? It's not respecting the asset transform; all the CIFilters in the request are always in landscape... We've been stuck on this for over a week – Roi Mulia Dec 12 '18 at 12:04