
I’m trying to render a face mesh with RealityKit, with no success yet. When ARKit detects a human face, the ARSession generates an ARFaceAnchor, which contains a face geometry mesh.

But I can’t find a way to turn that geometry into a ModelEntity.

Could anyone help on this?

Frank Lan

1 Answer

Canonical Face Mesh in RealityKit

To programmatically generate and render ARKit's canonical face mesh (an ARFaceGeometry object consisting of 1220 vertices) in RealityKit 2.0, use the following code:

import ARKit
import RealityKit

class ControllerView: UIViewController {
    
    @IBOutlet var arView: ARView!
    var anchor = AnchorEntity()
    var model = ModelEntity()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Configure the session manually so our delegate receives ARSession callbacks
        arView.automaticallyConfigureSession = false
        arView.session.delegate = self

        // Face tracking requires a front-facing TrueDepth camera
        guard ARFaceTrackingConfiguration.isSupported else {
            fatalError("We can't run a face tracking config")
        }
                
        let config = ARFaceTrackingConfiguration()
        config.maximumNumberOfTrackedFaces = 1
        arView.session.run(config)
    }
}

Then create a method that converts the face anchor's sub-properties into a MeshDescriptor. Note that I used a for-in loop to convert the indices from [Int16] to [UInt32] (a direct type cast of the whole array doesn't work here).

extension ControllerView {
    
    private func nutsAndBoltsOf(_ anchor: ARFaceAnchor) -> MeshDescriptor {
        
        let vertices: [simd_float3] = anchor.geometry.vertices
        var triangleIndices: [UInt32] = []
        let texCoords: [simd_float2] = anchor.geometry.textureCoordinates
        
        for index in anchor.geometry.triangleIndices {         // [Int16]
            triangleIndices.append(UInt32(index))
        }
        print(vertices.count)         // 1220 vertices
        
        var descriptor = MeshDescriptor(name: "canonical_face_mesh")
        descriptor.positions = MeshBuffers.Positions(vertices)
        descriptor.primitives = .triangles(triangleIndices)
        descriptor.textureCoordinates = MeshBuffers.TextureCoordinates(texCoords)
        return descriptor
    }
}
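
By the way, the same element-wise conversion can be written as a one-liner with map (a purely stylistic alternative to the for-in loop above; the behavior is identical):

// [Int16] can't be cast to [UInt32] wholesale,
// but converting each element individually works fine
let triangleIndices: [UInt32] = anchor.geometry.triangleIndices.map { UInt32($0) }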

And, at last, implement the delegate method that feeds the mesh resource:

extension ControllerView: ARSessionDelegate {
    
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

        guard let faceAnchor = anchors.first as? ARFaceAnchor else { return }

        // The session already tracks this anchor, so there's no need to re-add it;
        // just pin a RealityKit AnchorEntity to it
        self.anchor = AnchorEntity(anchor: faceAnchor)
        self.anchor.scale *= 1.2

        let mesh: MeshResource = try! .generate(from: [nutsAndBoltsOf(faceAnchor)])
        let material = SimpleMaterial(color: .magenta, isMetallic: true)
        self.model = ModelEntity(mesh: mesh, materials: [material])
        self.anchor.addChild(self.model)
        arView.scene.anchors.append(self.anchor)
    }
}
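
Keep in mind that session(_:didAdd:) fires only once per anchor, so the mesh above is a static snapshot of the face at detection time. If you need the geometry to follow facial expressions, a minimal sketch (untested, and regenerating a MeshResource every frame is expensive, so consider throttling in production) is to rebuild it in session(_:didUpdate:):

extension ControllerView {
    
    // Sketch: regenerate the mesh on every anchor update so it
    // tracks blend-shape changes (blinking, smiling, etc.).
    // Assumes nutsAndBoltsOf(_:) from the extension above lives in the same file.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {

        guard let faceAnchor = anchors.compactMap({ $0 as? ARFaceAnchor }).first
        else { return }

        if let updatedMesh = try? MeshResource.generate(from: [nutsAndBoltsOf(faceAnchor)]) {
            self.model.model?.mesh = updatedMesh
        }
    }
}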

Result (tested on an iPad Pro 4th gen running iPadOS 16.2):



I also recommend taking a look at the post about visualizing detected planes in RealityKit 2.0.

Merry Christmas!

Andy Jazz