I'm trying to create an environment mesh with textures. The mesh is built successfully, but I'm not sure how to store the captured textures so that I can apply them to the mesh later. From what I've read, since a geometry is created for each anchor at the final state of the scan, I suppose each frame's texture needs to be stored for the corresponding anchor with the help of the ARSession delegate.
This is how the textures are recorded:
// Dictionary that stores the captured texture for each anchor
// (named anchorTextures to avoid shadowing the delegate parameter)
private var anchorTextures = [UUID: UIImage]()

// Store the current frame image for every newly added anchor
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    guard let cameraImage = captureCamera() else { return }
    anchors.forEach { anchor in
        self.anchorTextures[anchor.identifier] = cameraImage
    }
}

// When an anchor is updated, refresh its texture as well
// (I'm not sure whether this is actually necessary)
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    guard let cameraImage = captureCamera() else { return }
    anchors.forEach { anchor in
        self.anchorTextures[anchor.identifier] = cameraImage
    }
}

// Drop the texture of every removed anchor to free up memory
func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
    anchors.forEach { anchor in
        self.anchorTextures.removeValue(forKey: anchor.identifier)
    }
}
// Capture the current camera frame as a UIImage
func captureCamera() -> UIImage? {
    guard let frame = metalARSession.currentFrame else { return nil }
    let pixelBuffer = frame.capturedImage
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext(options: nil)
    guard let cameraImage = context.createCGImage(image, from: image.extent) else { return nil }
    return UIImage(cgImage: cameraImage)
}
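These UIImages are later written to disk as the texture files. A minimal sketch of that step (the documents directory and the anchor-based naming scheme are simplifications of what I actually do):

import UIKit

// Write a captured texture to the documents directory, named after the
// anchor's identifier, and return the file URL (naming scheme is illustrative).
func saveTexture(_ image: UIImage, for anchorID: UUID) -> URL? {
    guard let data = image.jpegData(compressionQuality: 0.9) else { return nil }
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let url = documents.appendingPathComponent("\(anchorID.uuidString).jpg")
    do {
        try data.write(to: url)
        return url
    } catch {
        print("Failed to save texture: \(error)")
        return nil
    }
}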
In the rest of the code I create an MDLAsset and export it as an OBJ file; the textures are saved to storage, and the texture coordinates are calculated as well. Now, how can I apply these textures to my OBJ file?
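The direction I was considering is to attach each saved texture to its submesh through an MDLMaterial before export, so that the OBJ exporter can write a companion MTL file referencing the image. A minimal sketch of that idea, assuming one submesh per anchor and that textureURL is the file URL returned by saveTexture above:

import ModelIO

// Attach a saved texture image to a submesh via an MDLMaterial,
// using the baseColor semantic so the exporter can reference the file.
func applyTexture(to submesh: MDLSubmesh, textureURL: URL) {
    let scattering = MDLScatteringFunction()
    let material = MDLMaterial(name: "anchorMaterial", scatteringFunction: scattering)
    material.setProperty(MDLMaterialProperty(name: "baseColor",
                                             semantic: .baseColor,
                                             url: textureURL))
    submesh.material = material
}

Is this the right mechanism, or is there a better way to bind the stored per-anchor textures to the exported OBJ?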