This is sort of an extension of this question of mine, but I think it is different enough to merit its own question:
I am filtering videos of various sizes, scales, etc. by feeding them into an AVMutableVideoComposition.
This is the code that I currently have:
private func filterVideo(with filter: Filter?) {
    if let player = playerLayer?.player, let playerItem = player.currentItem {
        let composition = AVMutableComposition()
        let videoAssetTrack = playerItem.asset.tracks(withMediaType: .video).first
        let videoCompositionTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

        try? videoCompositionTrack?.insertTimeRange(CMTimeRange(start: kCMTimeZero, duration: playerItem.asset.duration), of: videoAssetTrack!, at: kCMTimeZero)

        let videoComposition = AVMutableVideoComposition(asset: composition, applyingCIFiltersWithHandler: { request in
            print(request.sourceImage.pixelBuffer) // Sometimes => nil

            if let filter = filter {
                if let filteredImage = filter.filterImage(request.sourceImage) {
                    request.finish(with: filteredImage, context: nil)
                } else {
                    request.finish(with: RenderError.couldNotFilter)
                }
            } else {
                request.finish(with: request.sourceImage, context: nil)
            }
        })

        playerItem.videoComposition = videoComposition
    }
}
filter is an instance of my custom Filter class, which has functions to filter a UIImage or a CIImage.
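For reference, the CIImage side of Filter boils down to something like this (a simplified sketch; the stored CIFilter and its setup are placeholders for what the real class does):

class Filter {
    private let ciFilter: CIFilter?

    init(name: String) {
        ciFilter = CIFilter(name: name)
    }

    // Returns nil when the underlying CIFilter fails to produce an output.
    func filterImage(_ image: CIImage) -> CIImage? {
        guard let ciFilter = ciFilter else { return nil }
        ciFilter.setValue(image, forKey: kCIInputImageKey)
        return ciFilter.outputImage
    }
}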
The problem is that some videos come out messed up, and it is exactly those problematic videos for which filteredImage is also nil. This suggests that some source images are empty: their pixelBuffers are nil. Note that pixelBuffer is already nil before I even feed the image into the filter.
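To confirm this, I swapped in a pass-through handler that only logs what the composition hands me (a diagnostic sketch, not a fix):

// Pass frames through unfiltered and log them, to check whether the
// broken videos correlate with a nil pixelBuffer or an empty extent.
let videoComposition = AVMutableVideoComposition(asset: composition, applyingCIFiltersWithHandler: { request in
    let source = request.sourceImage
    print("time:", request.compositionTime,
          "extent:", source.extent,
          "pixelBuffer:", source.pixelBuffer as Any)
    request.finish(with: source, context: nil)
})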
Why is this happening, and how can I fix it?