Apple's documentation says that setting userFaceTrackingEnabled enables simultaneous front and back camera tracking. After adding an ARView and configuring the session correctly, I can confirm that the ARSessionDelegate methods are called normally, as shown below:
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // triggered when a face anchor is added
    }
}

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // triggered whenever the face anchor updates
    }
}
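For reference, my session setup looks roughly like the sketch below. The class name and startSession method are just placeholders for context; only the ARKit/RealityKit calls are the real API.

import ARKit
import RealityKit

class FaceTrackingController: NSObject, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    func startSession() {
        let configuration = ARWorldTrackingConfiguration()
        // Only enable user face tracking when the device supports running
        // the front and back cameras at the same time.
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        arView.session.delegate = self
        arView.session.run(configuration)
    }

    // didAdd / didUpdate delegate methods as shown above
}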
So now that I have an ARFaceAnchor object, what should I do next? Is it possible to render this ARFaceAnchor using RealityKit, or can it only be rendered with SceneKit? All the examples I have found on the internet are implemented with SceneKit.