I adapted my answer here and the Apple Metal game template to create this sample, which demonstrates how to record a video file directly from a sequence of frames rendered by Metal.
Since all rendering in Metal draws to a texture, it's not too hard to adapt normal Metal code so that it's suitable for rendering offline into a movie file. To recap the core recording process:
- Create an `AVAssetWriter` that targets your URL of choice.
- Create an `AVAssetWriterInput` of type `.video` so you can write video frames.
- Wrap an `AVAssetWriterInputPixelBufferAdaptor` around the input so you can append `CVPixelBuffer`s as frames to the video.
- Once recording has started, copy the pixels of each rendered frame's texture into a pixel buffer obtained from the adaptor's pixel buffer pool, and append it with the frame's presentation time.
- When you're done, mark the input as finished and finish writing on the asset writer.
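The steps above can be condensed into a small recorder class. This is a sketch, not the full sample: the `MetalVideoRecorder` name is illustrative, error handling is trimmed, and it assumes your render target is a BGRA8 texture whose layout matches the pixel buffer's.

```swift
import AVFoundation
import Metal

// Illustrative sketch of the recording steps above; error handling omitted.
class MetalVideoRecorder {
    private let assetWriter: AVAssetWriter
    private let writerInput: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, size: CGSize) throws {
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: size.width,
            AVVideoHeightKey: size.height
        ]
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        writerInput.expectsMediaDataInRealTime = false  // offline rendering

        let attributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height
        ]
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                       sourcePixelBufferAttributes: attributes)
        assetWriter.add(writerInput)
    }

    func startRecording() {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)
    }

    func writeFrame(forTexture texture: MTLTexture, time: TimeInterval) {
        guard let pool = adaptor.pixelBufferPool else { return }

        var maybePixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &maybePixelBuffer)
        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

        // Assumes the texture's pixel format matches the buffer's (BGRA8).
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow,
                         from: region, mipmapLevel: 0)

        let presentationTime = CMTime(seconds: time, preferredTimescale: 240)
        adaptor.append(pixelBuffer, withPresentationTime: presentationTime)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    }

    func endRecording(_ completion: @escaping () -> Void) {
        writerInput.markAsFinished()
        assetWriter.finishWriting(completionHandler: completion)
    }
}
```

Setting `expectsMediaDataInRealTime` to `false` tells the writer it can buffer aggressively, which is the right trade-off when you're rendering offline rather than capturing live.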
As for driving the recording: since you aren't getting delegate callbacks from an `MTKView` or `CADisplayLink`, you need to do it yourself. The basic pattern looks like this:
```swift
for t in stride(from: 0, through: duration, by: frameDelta) {
    draw(in: renderBuffer, depthTexture: depthBuffer, time: t) { (texture) in
        recorder.writeFrame(forTexture: texture, time: t)
    }
}
```
If your rendering and recording code is asynchronous and thread-safe, you can run this loop on a background queue to keep your interface responsive. You could also add a progress callback to update your UI if rendering takes a long time.
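One way to sketch that, assuming `draw`, `recorder`, and the surrounding variables from the loop above, plus a hypothetical `progressView` for UI feedback:

```swift
import Dispatch

// Sketch: run the offline render loop off the main thread and report
// progress back to the UI. Names like `progressView` are illustrative.
let renderQueue = DispatchQueue(label: "offline-render", qos: .userInitiated)
renderQueue.async {
    for t in stride(from: 0, through: duration, by: frameDelta) {
        draw(in: renderBuffer, depthTexture: depthBuffer, time: t) { texture in
            recorder.writeFrame(forTexture: texture, time: t)
        }
        // UI updates must happen on the main queue.
        DispatchQueue.main.async {
            progressView.progress = Float(t / duration)
        }
    }
    recorder.endRecording {
        DispatchQueue.main.async {
            // e.g. dismiss the progress UI, reveal the finished movie
        }
    }
}
```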
Note that since you're not running in real-time, you'll need to ensure that any animation takes into account the current frame time (or the timestep between frames) so things run at the proper rate when played back. In my sample, I do this by just having the rotation of the cube depend directly on the frame's presentation time.
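In code, that means deriving animation state from the frame time rather than accumulating per-frame deltas. A minimal illustration (the `rotationMatrix` helper and `uniforms` struct are stand-ins for whatever your renderer uses):

```swift
// Deterministic animation: state is a pure function of the frame's
// presentation time, so playback speed is independent of render speed.
func updateUniforms(time: TimeInterval) {
    let rotationRate: Float = .pi / 2          // quarter turn per second
    let angle = Float(time) * rotationRate     // same angle regardless of frame rate
    uniforms.modelMatrix = rotationMatrix(axis: SIMD3<Float>(0, 1, 0), angle: angle)
}
```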