I want to store the depth data of a captured ARFrame on iOS in a losslessly compressed image file.
According to this WWDC talk:
"In iOS 11 we support two kinds of images with depth. The first is HEIF HEVC, the new format, also called HEIC files, and there, there is first-class support for depth...The second format we support is JPEG. Boy, JPEG wasn't meant to do tricks like this, but we made it do this trick anyway. The map is 8-bit lossy JPEG if it's filtered, or if it has not a numbers in it, we use 16-bit lossless JPEG encoding to preserve all of the not a numbers, and we store it as a second image at the bottom of the JPEG, so it's like a multipicture object, if you're familiar with that."
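Per that description, the 16-bit lossless path should only be taken for unfiltered depth that still contains NaNs. A minimal sketch of how one could check which case applies, assuming an `AVDepthData` instance named `depthData` taken from the frame:

```swift
import AVFoundation

// Sketch: inspect the properties that (per the talk) decide
// whether depth is stored as 8-bit lossy or 16-bit lossless JPEG.
func describeDepth(_ depthData: AVDepthData) {
    // Filtered depth has its NaN holes interpolated away, so it
    // should take the 8-bit lossy path.
    print("isDepthDataFiltered:", depthData.isDepthDataFiltered)
    // e.g. kCVPixelFormatType_DepthFloat16 or ...DepthFloat32
    print("depthDataType:", depthData.depthDataType)
    print("quality:", depthData.depthDataQuality.rawValue)
}
```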
When I compare the original (16-bit) depth buffer with the depth buffer I retrieve from the stored image, pixel by pixel, I get these results:
First        Second
0.61865234   0.6196289
0.62109375   0.6196289
0.6269531    0.6274414
0.6298828    0.63134766
0.6328125    0.63134766
nan          0.003921509
nan          0.0
nan          0.0
nan          0.007843018
nan          0.003921509
Even when I pass unfiltered depth data containing NaNs, the stored file doesn't preserve them, and it apparently doesn't use lossless encoding either.
This is the code I wrote:
if let currentFrame = session.currentFrame, let depthData = currentFrame.capturedDepthData {
    // The session variable is an ARSession object
    guard let outputURL = filePath(forKey: "test"),
          let cgImageDestination = CGImageDestinationCreateWithURL(outputURL as CFURL, kUTTypeJPEG, 1, nil) else {
        return
    }

    depthData.depthDataMap.normalize() // Custom extension normalizing depth values to 0.0...1.0
    let sixteenBitDepthData = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat16)

    // Add the captured color image as the primary JPEG image
    let ciImage = CIImage(cvPixelBuffer: currentFrame.capturedImage)
    let context = CIContext(options: nil)
    let dict: NSDictionary = [
        kCGImageDestinationLossyCompressionQuality: 1.0,
        kCGImagePropertyIsFloat: kCFBooleanTrue,
    ]
    if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
        CGImageDestinationAddImage(cgImageDestination, cgImage, dict as CFDictionary)
    }

    // Attach the depth map as auxiliary data and write the file
    var auxDataType: NSString?
    let auxData = sixteenBitDepthData.dictionaryRepresentation(forAuxiliaryDataType: &auxDataType)
    CGImageDestinationAddAuxiliaryDataInfo(cgImageDestination, auxDataType!, auxData! as CFDictionary)
    CGImageDestinationFinalize(cgImageDestination)

    if let second = getDepthBufferFromFile(key: "test") {
        self.compareBuffers(first: sixteenBitDepthData.depthDataMap, second: second)
    }
}
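For completeness, the helpers referenced above (`filePath` and `getDepthBufferFromFile`) are not shown in the snippet; this is a hedged sketch of what they are assumed to do, with the read-back path going through `CGImageSourceCopyAuxiliaryDataInfoAtIndex`:

```swift
import AVFoundation
import ImageIO

// Hypothetical helper: a file URL in the documents directory for the given key.
func filePath(forKey key: String) -> URL? {
    FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)
        .first?
        .appendingPathComponent(key)
        .appendingPathExtension("jpg")
}

// Hypothetical helper: reads the auxiliary depth map back out of the stored JPEG
// and converts it to 16-bit float so it can be compared with the original buffer.
func getDepthBufferFromFile(key: String) -> CVPixelBuffer? {
    guard let url = filePath(forKey: key),
          let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDepth) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else { return nil }
    return depthData
        .converting(toDepthDataType: kCVPixelFormatType_DepthFloat16)
        .depthDataMap
}
```

`compareBuffers` is assumed to simply lock both pixel buffers and print corresponding `Float16` values side by side, which is how the table above was produced.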