
Here is a complete project if you care to run this yourself: https://www.dropbox.com/s/5p384mogjzflvqk/AVPlayerLayerSoundOnlyBug_iOS10.zip?dl=0

This is a new problem on iOS 10, and it has been fixed as of iOS 10.2. After exporting a video using AVAssetExportSession and AVVideoCompositionCoreAnimationTool to composite a layer on top of the video during export, videos played in an AVPlayerLayer fail to display, even though their audio plays fine. This doesn't seem to be caused by hitting the AV encode/decode pipeline limit, because it often happens after a single export, which as far as I know only spins up two pipelines: one for the AVAssetExportSession and another for the AVPlayer. I am also setting the layer's frame properly, as you can verify by running the code below, which gives the layer a blue background you can plainly see.

Waiting for some time after an export before playing a video seems to make playback far more reliable, but that's not an acceptable workaround to offer your users.

Any ideas on what's causing this, or how I can fix or work around it? Have I messed something up, or am I missing an important step or detail? Any help or pointers to documentation are much appreciated.

import UIKit
import AVFoundation

/* After exporting an AVAsset using AVAssetExportSession with AVVideoCompositionCoreAnimationTool, we
 * will attempt to play a video using an AVPlayerLayer with a blue background.
 *
 * If you see the blue background and hear audio you're experiencing the missing-video bug. Otherwise
 * try hitting the button again.
 */

class ViewController: UIViewController {
    private var playerLayer: AVPlayerLayer?
    private let button = UIButton()
    private let indicator = UIActivityIndicatorView(activityIndicatorStyle: .gray)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = UIColor.white
        button.setTitle("Cause Trouble", for: .normal)
        button.setTitleColor(UIColor.black, for: .normal)
        button.addTarget(self, action: #selector(ViewController.buttonTapped), for: .touchUpInside)
        view.addSubview(button)
        button.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -16),
        ])

        indicator.hidesWhenStopped = true
        view.insertSubview(indicator, belowSubview: button)
        indicator.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            indicator.centerXAnchor.constraint(equalTo: button.centerXAnchor),
            indicator.centerYAnchor.constraint(equalTo: button.centerYAnchor),
        ])
    }

    func buttonTapped() {
        button.isHidden = true
        indicator.startAnimating()
        playerLayer?.removeFromSuperlayer()

        let sourcePath = Bundle.main.path(forResource: "video.mov", ofType: nil)!
        let sourceURL = URL(fileURLWithPath: sourcePath)
        let sourceAsset = AVURLAsset(url: sourceURL)

        //////////////////////////////////////////////////////////////////////
        // STEP 1: Export a video using AVVideoCompositionCoreAnimationTool //
        //////////////////////////////////////////////////////////////////////
        let exportSession = { () -> AVAssetExportSession in
            let sourceTrack = sourceAsset.tracks(withMediaType: AVMediaTypeVideo).first!

            let parentLayer = CALayer()
            parentLayer.frame = CGRect(origin: .zero, size: CGSize(width: 1280, height: 720))
            let videoLayer = CALayer()
            videoLayer.frame = parentLayer.bounds
            parentLayer.addSublayer(videoLayer)

            let composition = AVMutableVideoComposition(propertiesOf: sourceAsset)
            composition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: sourceTrack)
            layerInstruction.setTransform(sourceTrack.preferredTransform, at: kCMTimeZero)
            let instruction = AVMutableVideoCompositionInstruction()
            instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            instruction.layerInstructions = [layerInstruction]
            composition.instructions = [instruction]

            let e = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPreset1280x720)!
            e.videoComposition = composition
            e.outputFileType = AVFileTypeQuickTimeMovie
            e.timeRange = CMTimeRange(start: kCMTimeZero, duration: sourceAsset.duration)
            let outputURL = URL(fileURLWithPath: NSTemporaryDirectory().appending("/out2.mov"))
            _ = try? FileManager.default.removeItem(at: outputURL)
            e.outputURL = outputURL
            return e
        }()

        print("Exporting asset...")
        exportSession.exportAsynchronously {
            assert(exportSession.status == .completed)

            //////////////////////////////////////////////
            // STEP 2: Play a video in an AVPlayerLayer //
            //////////////////////////////////////////////
            DispatchQueue.main.async {
                // Reuse player layer, shouldn't be hitting the AV pipeline limit
                let playerItem = AVPlayerItem(asset: sourceAsset)
                let layer = self.playerLayer ?? AVPlayerLayer()
                if layer.player == nil {
                    layer.player = AVPlayer(playerItem: playerItem)
                }
                else {
                    layer.player?.replaceCurrentItem(with: playerItem)
                }
                layer.backgroundColor = UIColor.blue.cgColor
                if UIDeviceOrientationIsPortrait(UIDevice.current.orientation) {
                    layer.frame = self.view.bounds
                    layer.bounds.size.height = layer.bounds.width * 9.0 / 16.0
                }
                else {
                    layer.frame = self.view.bounds.insetBy(dx: 0, dy: 60)
                    layer.bounds.size.width = layer.bounds.height * 16.0 / 9.0
                }
                self.view.layer.insertSublayer(layer, at: 0)
                self.playerLayer = layer

                layer.player?.play()
                print("Playing a video in an AVPlayerLayer...")

                self.button.isHidden = false
                self.indicator.stopAnimating()
            }
        }
    }
}
Sami Samhuri
  • `AVAssetExportSession` seems to be buggy on iOS 10: http://stackoverflow.com/q/39560386/22147 http://stackoverflow.com/a/39746140/22147 – Rhythmic Fistman Sep 29 '16 at 02:15
  • @RhythmicFistman Thanks! I hadn't come across that yet. Looks like I can work around the issue using a custom video compositor instead of AVVideoCompositionCoreAnimationTool. – Sami Samhuri Sep 29 '16 at 04:39

4 Answers


The answer for me in this case is to work around the issue with AVVideoCompositionCoreAnimationTool by using a custom video compositing class implementing the AVVideoCompositing protocol, together with a custom composition instruction implementing the AVVideoCompositionInstructionProtocol. Because I need to overlay a CALayer on top of the video, I'm including that layer in the composition instruction instance.

You need to set the custom compositor on your video composition like so:

composition.customVideoCompositorClass = CustomVideoCompositor.self

and then set your custom instructions on it:

let instruction = CustomVideoCompositionInstruction(...) // whatever parameters you need and are required by the instruction protocol
composition.instructions = [instruction]
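
For illustration, here's a minimal sketch of what such an instruction class could look like in Swift 3. The overlayImage property and the initializer parameters are my own illustrative additions, not part of any API:

import AVFoundation
import CoreGraphics

// Sketch of a custom instruction conforming to AVVideoCompositionInstructionProtocol.
class CustomVideoCompositionInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = false
    let containsTweening = true
    let requiredSourceTrackIDs: [NSValue]?
    let passthroughTrackID = kCMPersistentTrackID_Invalid

    // Illustrative: the overlay rendered from a CALayer, cached once per export.
    let overlayImage: CGImage?

    init(trackID: CMPersistentTrackID, timeRange: CMTimeRange, overlayImage: CGImage?) {
        self.timeRange = timeRange
        self.requiredSourceTrackIDs = [NSNumber(value: trackID)]
        self.overlayImage = overlayImage
        super.init()
    }
}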

EDIT: Here is a working example of how to use a custom compositor to overlay a layer on a video using the GPU: https://github.com/samsonjs/LayerVideoCompositor ... original answer continues below

As for the compositor itself, you can implement one by watching the relevant WWDC sessions and checking out their sample code. I cannot post the one I wrote here, but I am using Core Image to do the heavy lifting when processing each AVAsynchronousVideoCompositionRequest, making sure to use an OpenGL Core Image context for best performance (if you do it on the CPU it will be abysmally slow). You may also need an autorelease pool if you hit a memory usage spike during the export.
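
Since I can't share my exact implementation, here's a rough sketch of the shape such a compositor might take, assuming the CustomVideoCompositionInstruction sketched above; pixel formats and error handling are simplified:

import AVFoundation
import CoreImage

class CustomVideoCompositor: NSObject, AVVideoCompositing {
    // GPU-backed Core Image context; rendering on the CPU is far too slow for video.
    private let ciContext = CIContext(eaglContext: EAGLContext(api: .openGLES2)!)

    var sourcePixelBufferAttributes: [String: Any]? {
        return [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    }

    var requiredPixelBufferAttributesForRenderContext: [String: Any] {
        return [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // An autorelease pool keeps per-frame allocations from piling up during export.
        autoreleasepool {
            guard let instruction = request.videoCompositionInstruction as? CustomVideoCompositionInstruction,
                let trackID = (instruction.requiredSourceTrackIDs?.first as? NSNumber)?.int32Value,
                let sourceBuffer = request.sourceFrame(byTrackID: trackID),
                let outputBuffer = request.renderContext.newPixelBuffer() else {
                    request.finish(with: NSError(domain: "CustomVideoCompositor", code: -1, userInfo: nil))
                    return
            }

            // Composite the cached overlay (if any) over the source video frame.
            var frame = CIImage(cvPixelBuffer: sourceBuffer)
            if let overlay = instruction.overlayImage {
                frame = CIImage(cgImage: overlay).compositingOverImage(frame)
            }
            ciContext.render(frame, to: outputBuffer)
            request.finish(withComposedVideoFrame: outputBuffer)
        }
    }
}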

If you're overlaying a CALayer like me, then make sure to set layer.isGeometryFlipped = true when you render that layer out to a CGImage, before sending it off to Core Image. And make sure to cache the rendered CGImage from frame to frame in your compositor instead of re-rendering it for every frame.
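
For example, a sketch of rendering the overlay layer once up front (the helper name is mine); the result is what you'd stash in the instruction and reuse for every frame:

import UIKit

func renderedOverlay(from layer: CALayer) -> CGImage? {
    // Core Animation's coordinate space is flipped relative to Core Image's.
    layer.isGeometryFlipped = true
    UIGraphicsBeginImageContextWithOptions(layer.bounds.size, false, 1.0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()?.cgImage
}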

Sami Samhuri
  • Thanks for the workaround. It would have been great to have more explicit code to make it clear. – Sam Oct 12 '16 at 07:27
  • Hello, can you show more explicit code to help the community, please? This issue is annoying and I haven't managed to fix it. – Sam Nov 09 '16 at 09:13
  • @Sam I need to build a custom video compositor for my project to work around this issue. I'll throw a link up on GitHub this week when it's ready. My solution will simply allow you to place a watermark on a video. But if nothing else, it'll give you a place to start if you need something more complex. – Clay Garrett Nov 16 '16 at 04:48
  • @kleezy thanks a lot you rock!! Appreciate your help :) – Sam Nov 16 '16 at 11:04
  • See below @Sam. Just posted a link. – Clay Garrett Nov 16 '16 at 22:37

We had the same issue on iOS 10 and 10.1. It looks fixed as of iOS 10.2 beta 3, though.

Dimitar08
  • Excellent! Do you know if the fix is mentioned by Apple somewhere? Or have you just observed it while using the beta? – Clay Garrett Nov 16 '16 at 04:44
  • Great, but this means it won't work for users on iOS 10 and 10.1 even though it has been fixed in 10.2, is that right? – Sam Nov 16 '16 at 11:05
  • @kleezy I've not seen anything in the release notes. We worked around the issue on 10.0 and 10.1 by artificially delaying exportAsynchronously. This reduces the likelihood of the issue happening. We also put error messaging behind the player view, so that if the issue does happen, the user doesn't just see a black screen. The delay and error messaging are removed if the user is running 10.2 or higher. – Dimitar08 Nov 18 '16 at 15:49

To expand upon Sami Samhuri's answer, here's a small sample project I worked up that uses a custom AVVideoCompositing class with custom instructions implementing AVVideoCompositionInstructionProtocol:

https://github.com/claygarrett/CustomVideoCompositor

The project allows you to place a watermark over a video, but the idea can be extended to whatever you need. It prevents the AVPlayer bug in question from surfacing.

Another interesting solution on a separate thread that might help: AVPlayer playback fails while AVAssetExportSession is active as of iOS 10

Clay Garrett
  • Thanks for putting this together! This is the basic solution but I really recommend using the GPU instead of CoreGraphics for performance. I'll try to submit a pull request. – Sami Samhuri Nov 17 '16 at 00:25
  • My solution is rather different because I'm using a layer instead of an image, similar to how the CA animation tool works. Here's a gist with my compositor and instruction. Maybe somebody can cobble together a complete working example: https://gist.github.com/samsonjs/71e27c1f500725d3d0c48064af7c1fd3 – Sami Samhuri Nov 17 '16 at 01:18
  • Ok I finally packaged up my solution into an example project: https://github.com/samsonjs/LayerVideoCompositor – Sami Samhuri Nov 17 '16 at 04:07
  • Great @SamiSamhuri! Thanks for doing that. I'll definitely give it a look. Interested in the GPU optimizations for sure. – Clay Garrett Nov 17 '16 at 15:49

I ran into this issue on iOS 10.1, and it was fixed in iOS 10.2. One way I found to work around it on iOS 10.1 is to wait several seconds before adding the playerLayer to your container layer.
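
For instance, a minimal sketch of that delay using the layer from the question's code; the two-second figure is an assumption you'd tune for your app:

// In the exportAsynchronously completion handler from the question's code,
// instead of attaching and playing the layer immediately:
DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) { [weak self] in
    guard let strongSelf = self else { return }
    strongSelf.view.layer.insertSublayer(layer, at: 0)
    strongSelf.playerLayer = layer
    layer.player?.play()
}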

Chuyang