
Apple's documentation says you can set userFaceTrackingEnabled to use the front and back cameras simultaneously. After adding an ARView and setting up the configuration correctly, I can confirm that the ARSessionDelegate methods are called as expected, like below:

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // triggered when a face anchor is added
    }
}

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // triggered on every update of the face anchor
    }
}
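
For reference, here's roughly how I set up the session (a sketch; the view controller and outlet wiring are assumed):

import ARKit
import RealityKit
import UIKit

class ViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!   // assumed to be wired up in a storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        // Rear-camera world tracking plus front-camera face tracking
        // (iOS 13+; requires a device with a TrueDepth camera).
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            config.userFaceTrackingEnabled = true
        }
        arView.session.delegate = self
        arView.session.run(config)
    }

    // ... the didAdd/didUpdate delegate methods shown above go here ...
}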

So now that I have an ARFaceAnchor object, what should I do next? Is it possible to render this ARFaceAnchor using RealityKit, or can it only be rendered with SceneKit? All the examples I can find on the internet are implemented with SceneKit.

– Eros Cai
2 Answers


If you want to use the RealityKit rendering technology, you should use its own anchors.

So, for a RealityKit face-tracking experience, you just need:

AnchorEntity(AnchoringComponent.Target.face)
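
For example, a minimal programmatic sketch might look like this (the box entity is just a placeholder):

import ARKit
import RealityKit

// A minimal sketch: anchor RealityKit content to the user's face.
func addFaceContent(to arView: ARView) {
    // Run a front-camera face-tracking session explicitly.
    arView.automaticallyConfigureSession = false
    arView.session.run(ARFaceTrackingConfiguration())

    // RealityKit keeps this entity attached to the detected face.
    let faceAnchor = AnchorEntity(.face)

    // Any child entity follows the face; here, a small box.
    let box = ModelEntity(mesh: .generateBox(size: 0.05),
                          materials: [SimpleMaterial(color: .green,
                                                     isMetallic: false)])
    box.position = [0, 0.06, 0]   // roughly above the forehead
    faceAnchor.addChild(box)

    arView.scene.addAnchor(faceAnchor)
}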

And you don't even need the session(_:didAdd:) and session(_:didUpdate:) instance methods if you're using a Reality Composer scene.

If you prepare a scene in Reality Composer, the .face type of anchor is available for you from the start. Here's what the non-editable, hidden Swift code in a .reality file looks like:

public static func loadFace() throws -> Facial.Face {

    guard let realityFileURL = Foundation.Bundle(for: Facial.Face.self).url(forResource: "Facial", 
                                                                          withExtension: "reality") 
    else {
        throw Facial.LoadRealityFileError.fileNotFound("Facial.reality")
    }

    let realityFileSceneURL = realityFileURL.appendingPathComponent("face", isDirectory: false)
    let anchorEntity = try Facial.Face.loadAnchor(contentsOf: realityFileSceneURL)
    return createFace(from: anchorEntity)
}
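
Calling the generated loader is then a one-liner (a usage sketch, following the names in the snippet above):

// Usage sketch (names follow the generated code above):
let faceScene = try! Facial.loadFace()
arView.scene.addAnchor(faceScene)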

If you need more detailed info about anchors, please read this post.

P.S.

But at the moment there's one unpleasant problem: if you're using a scene built in Reality Composer, you can only use one type of anchor at a time (horizontal, vertical, image, face, or object). Hence, if you need to use an ARWorldTrackingConfiguration along with an ARFaceTrackingConfiguration, don't use Reality Composer scenes. I'm sure this situation will be fixed in the near future.
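
If you set your scene up in code instead, nothing stops you from combining several anchor types at once; a rough sketch:

// A sketch of mixing anchor types programmatically (the entities
// you attach to these anchors are up to you). Remember to run a
// configuration that supports both, e.g. a world-tracking
// configuration with userFaceTrackingEnabled, as in the question.
let faceAnchor = AnchorEntity(.face)
let planeAnchor = AnchorEntity(.plane(.horizontal,
                                      classification: .any,
                                      minimumBounds: [0.2, 0.2]))
arView.scene.addAnchor(faceAnchor)
arView.scene.addAnchor(planeAnchor)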

– Andy Jazz

I believe it cannot be done with RealityKit. As I read the documentation on face tracking, I could not find anything about tracking with RealityKit. But you can use SceneKit and also SpriteKit. Please check this document:

https://developer.apple.com/documentation/arkit/tracking_and_visualizing_faces

This sentence also caught my attention:

This sample uses ARSCNView to display 3D content with SceneKit, but you can also use SpriteKit or build your own renderer using Metal (see ARSKView and Displaying an AR Experience with Metal).
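
For completeness, here's a minimal SceneKit sketch in the spirit of that sample, rendering the tracked face as a wireframe mesh via ARSCNFaceGeometry (the delegate class name is mine):

import ARKit
import SceneKit

// A sketch of the SceneKit route: render the tracked face as a
// wireframe mesh using ARSCNFaceGeometry.
class FaceMeshDelegate: NSObject, ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = (renderer as? ARSCNView)?.device,
              let faceGeometry = ARSCNFaceGeometry(device: device)
        else { return nil }
        faceGeometry.firstMaterial?.fillMode = .lines   // wireframe look
        return SCNNode(geometry: faceGeometry)
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        // Keep the mesh in sync with the face's current expression.
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry
        else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }
}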

– Yucel Bayram
  • Yeah, I also searched all the documents I could find, but found nothing. Apple recommends RealityKit so much, I can hardly imagine that it doesn't support it! – Eros Cai Jan 10 '20 at 15:48