I am using ARKit 4 (+ MetalKit) with a new LiDAR-equipped iPhone, and I am trying to access the depth data from the LiDAR and save it as a depth map along with the actual RGB image. While I have seen many demos showing this, none of them showed the code for actually visualising the depth map. I have made a button, and this is the code that runs when it is tapped:
import ARKit
import UIKit

func saveCapturedBuffer(buffer: CVPixelBuffer) {
    // Wrap the pixel buffer in a CIImage, render it to a CGImage,
    // and save the result to the photo library.
    let depthCIImage = CIImage(cvPixelBuffer: buffer)
    let context = CIContext(options: nil)
    guard let depthCGImage = context.createCGImage(depthCIImage, from: depthCIImage.extent) else { return }
    let depthUIImage = UIImage(cgImage: depthCGImage)
    UIImageWriteToSavedPhotosAlbum(depthUIImage, self, #selector(saveError), nil)
}
@objc func capture(sender: UIButton) {
    // Bail out if the session has no current frame or no LiDAR depth data yet.
    guard let frame = session.currentFrame,
          let depthData = frame.sceneDepth else { return }
    // Save the depth map first, then the RGB camera image.
    saveCapturedBuffer(buffer: depthData.depthMap)
    let capturedImageBuffer = frame.capturedImage
    print(CVPixelBufferGetPlaneCount(capturedImageBuffer))
    saveCapturedBuffer(buffer: capturedImageBuffer)
}
@objc func saveError(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
    if let error = error {
        print("error: \(error.localizedDescription)")
    }
}
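For completeness, the session itself is configured roughly like this (a minimal sketch of what I believe is the relevant setup; my understanding is that the .sceneDepth frame semantics option is what makes frame.sceneDepth available):

let configuration = ARWorldTrackingConfiguration()
// Request per-frame LiDAR depth if the device supports it.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}
session.run(configuration)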
So, the RGB image is saved by this code without any issues. However, the saved depth map is sometimes completely empty (even though the depth map looks fine when I render it live). So I guess something is wrong with the CVPixelBuffer-to-UIImage conversion. I am absolutely new to Swift, and I have no clue what the problem could be.
I have figured out that the CVPixelBuffer is non-planar, so I would guess this is my issue, but I have no idea what to do about it, and I have not found any answer explaining how to proceed.
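For reference, this is a small sketch of how I have been inspecting the buffer (the helper name is mine); as far as I understand, sceneDepth.depthMap is a single-plane kCVPixelFormatType_DepthFloat32 buffer, so the plane count printed here is 0:

func inspectDepthBuffer(_ buffer: CVPixelBuffer) {
    print("planes:", CVPixelBufferGetPlaneCount(buffer))      // 0 for a non-planar buffer
    print("format:", CVPixelBufferGetPixelFormatType(buffer)) // expected: kCVPixelFormatType_DepthFloat32
    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return }
    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    // Sample the centre pixel as a sanity check; depth values are in metres.
    let row = base.advanced(by: (height / 2) * bytesPerRow)
    let centreDepth = row.assumingMemoryBound(to: Float32.self)[width / 2]
    print("centre depth (m):", centreDepth)
}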
So, how do I properly convert the non-planar CVPixelBuffer from sceneDepth.depthMap to a UIImage?
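One direction I have been considering (an untested sketch, not something I have verified) is normalising the metre-valued Float32 depths into 0...1 before rendering, e.g. with CIColorMatrix, since raw depths above 1.0 would presumably be clipped to white when treated as pixel intensities. The 5 m maximum below is an arbitrary value I picked:

func normalizedDepthUIImage(from buffer: CVPixelBuffer) -> UIImage? {
    let depthImage = CIImage(cvPixelBuffer: buffer)
    let maxDepth: CGFloat = 5.0 // arbitrary clipping range in metres
    // Scale the depth value (carried in the red channel) into 0...1
    // on all colour channels, producing a grayscale depth visualisation.
    let scaled = depthImage.applyingFilter("CIColorMatrix", parameters: [
        "inputRVector": CIVector(x: 1 / maxDepth, y: 0, z: 0, w: 0),
        "inputGVector": CIVector(x: 1 / maxDepth, y: 0, z: 0, w: 0),
        "inputBVector": CIVector(x: 1 / maxDepth, y: 0, z: 0, w: 0),
        "inputAVector": CIVector(x: 0, y: 0, z: 0, w: 1)
    ])
    let context = CIContext()
    guard let cgImage = context.createCGImage(scaled, from: scaled.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}

Would something like this be the right direction, or is there a proper way to do this conversion?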