I'm using this Swift class (shown originally in the answer to this question: Capture Metal MTKView as Movie in realtime?) to try to record my Metal app frames to a movie file.

import AVFoundation
import Metal
import QuartzCore

class MetalVideoRecorder {
    var isRecording = false
    var recordingStartTime = TimeInterval(0)

    private var assetWriter: AVAssetWriter
    private var assetWriterVideoInput: AVAssetWriterInput
    private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

    init?(outputURL url: URL, size: CGSize) {
        do {
            assetWriter = try AVAssetWriter(outputURL: url, fileType: .m4v)
        } catch {
            return nil
        }

        let outputSettings: [String: Any] = [
            AVVideoCodecKey : AVVideoCodecType.h264,
            AVVideoWidthKey : size.width,
            AVVideoHeightKey : size.height
        ]

        assetWriterVideoInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
        assetWriterVideoInput.expectsMediaDataInRealTime = true

        let sourcePixelBufferAttributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String : size.width,
            kCVPixelBufferHeightKey as String : size.height
        ]

        assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterVideoInput,
                                                                           sourcePixelBufferAttributes: sourcePixelBufferAttributes)

        assetWriter.add(assetWriterVideoInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)

        recordingStartTime = CACurrentMediaTime()
        isRecording = true
    }

    func endRecording(_ completionHandler: @escaping () -> ()) {
        isRecording = false

        assetWriterVideoInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completionHandler)
    }

    func writeFrame(forTexture texture: MTLTexture) {
        if !isRecording {
            return
        }

        // Busy-wait until the input can accept another sample
        while !assetWriterVideoInput.isReadyForMoreMediaData {}

        guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
            print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }

        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Use the bytes-per-row value from the pixel buffer, since its stride may be rounded up to 16-byte alignment
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)

        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let frameTime = CACurrentMediaTime() - recordingStartTime
        let presentationTime = CMTimeMakeWithSeconds(frameTime, preferredTimescale: 240)
        assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }
}
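For context, here is a hypothetical way to drive the class (the `recorder` name and output path are illustrative, not from my actual app):

```swift
import AVFoundation
import CoreGraphics

// Hypothetical setup; size should match the view's drawable size
let url = FileManager.default.temporaryDirectory.appendingPathComponent("capture.m4v")
let recorder = MetalVideoRecorder(outputURL: url, size: CGSize(width: 1280, height: 720))

recorder?.startRecording()
// ... render frames, calling recorder?.writeFrame(forTexture:) once per frame ...
recorder?.endRecording {
    print("Finished writing movie to \(url.path)")
}
```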

I am not seeing any errors, but the frames in the resulting QuickTime file are all black. The frames are the correct size, and my pixel format is correct (bgra8Unorm). Does anyone know why it might not be working?

I am calling the writeFrame function before I present and commit the current drawable, like this:

    if let drawable = view.currentDrawable {

        if BigVideoWriter != nil && BigVideoWriter!.isRecording {
            commandBuffer.addCompletedHandler { commandBuffer in
                BigVideoWriter?.writeFrame(forTexture: drawable.texture)
            }
        }

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }

I did get an error initially, saying that my MTKView's layer was framebufferOnly. So I set that property to false before trying to record. That got rid of the error, but the frames are still all black. I also tried setting it to false at the very beginning of the program, with the same result.
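For reference, disabling that flag is a one-liner (assuming `view` is the MTKView driving the render loop):

```swift
// framebufferOnly defaults to true for performance; CPU readback of the
// drawable's texture (texture.getBytes) requires it to be false
view.framebufferOnly = false
```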

I also tried using addCompletedHandler instead of addScheduledHandler, but that gives me the error "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."

Thanks for any suggestions!


EDIT: I got this resolved with the help of @ldoogy. Testing revealed that the original version worked on iOS but not on the Mac. He pointed out that since I have an NVIDIA GPU, the framebuffers are private, so I had to add a blit command encoder with a synchronize call on the texture. Then it started working. Like this:

    if let drawable = view.currentDrawable {

        if BigVideoWriter != nil && BigVideoWriter!.isRecording {
            #if os(macOS)
            // On discrete GPUs the drawable's texture lives in private VRAM;
            // synchronize() copies the rendered pixels back so the CPU can read them
            if let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder() {
                blitCommandEncoder.synchronize(resource: drawable.texture)
                blitCommandEncoder.endEncoding()
            }
            #endif
            commandBuffer.addCompletedHandler { commandBuffer in
                BigVideoWriter?.writeFrame(forTexture: drawable.texture)
            }
        }

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
bsabiston
1 Answer

I believe you are writing your frames too early -- by calling writeFrame from within your render loop, you are essentially capturing the drawable at a time when it is still empty (the GPU just hasn't rendered it yet).

Remember that before you call commandBuffer.commit(), the GPU hasn't even begun rendering your frame. You need to wait for the GPU to finish rendering before trying to grab the resulting frame. The sequence is a bit confusing because you're also calling present() before calling commit(), but that isn't the actual order of operations at runtime. That present call merely tells Metal to schedule a call to present your frame to the screen once the GPU has finished rendering.

You should call writeFrame from within a completion handler (using commandBuffer.addCompletedHandler()). That should take care of this.
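A minimal sketch of that call-site change (assuming `view` is your MTKView and `recorder` is an optional `MetalVideoRecorder`; both names are illustrative):

```swift
if let drawable = view.currentDrawable {
    // Grab the texture *before* presenting; calling drawable.texture after
    // presentation triggers "[CAMetalLayerDrawable texture] should not be
    // called after already presenting this drawable"
    let texture = drawable.texture

    commandBuffer.addCompletedHandler { _ in
        // Runs only after the GPU has finished rendering this frame
        recorder?.writeFrame(forTexture: texture)
    }

    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```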

UPDATE: While the answer above is correct, it is only partial. Since the OP was using a discrete GPU with private VRAM, the CPU wasn't able to see the render target pixels. The solution to that problem is to add an MTLBlitCommandEncoder, and use the synchronize() method to ensure the rendered pixels are copied back to RAM from the GPU's VRAM.
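One way to see why the blit is needed is to inspect the drawable texture's storage mode; this macOS-oriented sketch (the `.managed` case doesn't exist on iOS) shows which modes allow direct CPU reads:

```swift
// Where the drawable's texture lives determines whether getBytes can
// read it directly or an explicit synchronize/blit is required first
switch drawable.texture.storageMode {
case .shared:
    print("CPU-visible; getBytes works directly (typical on iOS)")
case .managed:
    print("needs blitEncoder.synchronize(resource:) before CPU reads")
case .private:
    print("GPU-only; must blit into a shared/managed texture first")
default:
    break
}
```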

ldoogy
  • Oh, right. That was stupid. However, it didn't fix the problem. I still get the black frames. I tried it with addCompletedHandler() and addScheduledHandler() like the example in the link I provided. No dice. Any other ideas? I updated my post to show what I have now. – bsabiston Jun 01 '18 at 16:05
  • Also -- when I use 'addCompletedHandler' it gives me this error: "[CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead." Using 'addScheduledHandler', like the sample code, does not give an error but still gives black frames. – bsabiston Jun 01 '18 at 16:21
  • I think you should retain the texture before committing your command buffer. So you'll save view.currentDrawable.texture BEFORE entering the completion handler, and you'll reference it from within the block. I believe that should work. To be clear, you must call your writeFrames from the completion handler (not from scheduled, by the way). Even if you're still getting black frames, keep it in the completion handler -- it just means you have another bug elsewhere... – ldoogy Jun 01 '18 at 16:26
  • Well -- holding onto the texture does prevent the error message, but I still get only black frames. – bsabiston Jun 01 '18 at 16:49
  • Interesting. Have you inspected your pixelBufferBytes to ensure there are actual values in there and not all zeros? – ldoogy Jun 01 '18 at 17:21
  • trying to figure out how to do that now! I suspect they are all zeroes. – bsabiston Jun 01 '18 at 17:22
  • Yep -- getting all zeroes. – bsabiston Jun 01 '18 at 17:30
  • Right. You should try writing some values in there, just to make sure the video is encoding correctly. I assume your Metal app is correctly showing your frames on screen? – ldoogy Jun 01 '18 at 17:37
  • That works as expected. It just isn't getting the right texture from the drawable somehow. The frames onscreen are right. Maybe I will try it on iOS to see if the problem is specific to Mac. – bsabiston Jun 01 '18 at 17:52
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/172267/discussion-between-ldoogy-and-bsabiston). – ldoogy Jun 01 '18 at 18:24
  • Hello ldoogy & bsabiston! Looks like you've been through some pain on this matter. I'm dealing with what I believe is a related issue. Would you mind taking a look? https://stackoverflow.com/questions/56018503/making-cicontext-renderciimage-cvpixelbuffer-work-with-avassetwriter – Ian Bytchek May 07 '19 at 10:58