ARKit 2.0
The TrueDepth front-facing camera of the iPhone X/Xr/Xs gives you a Depth channel at a 15 fps frame rate, while the front-facing RGB camera gives you color channels at a 60 fps frame rate.
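As a rough sketch of what that means in code (the class name FaceDepthReceiver is just a placeholder; assign an instance of it as your session's delegate), capturedImage is present on every frame while capturedDepthData only arrives on the less frequent depth frames:

import ARKit
import CoreVideo

// Minimal sketch: the RGB image is present on every frame (~60 fps),
// while the depth map arrives at a lower rate (~15 fps) and is nil in between.
class FaceDepthReceiver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let colorBuffer: CVPixelBuffer = frame.capturedImage
        print("RGB frame:", CVPixelBufferGetWidth(colorBuffer), "x", CVPixelBufferGetHeight(colorBuffer))

        if let depthData = frame.capturedDepthData {
            let depthMap: CVPixelBuffer = depthData.depthDataMap
            print("Depth frame:", CVPixelBufferGetWidth(depthMap), "x", CVPixelBufferGetHeight(depthMap))
        }
    }
}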

How it works: it's like the depth-sensing system in the Microsoft Xbox Kinect, but more powerful. An infrared emitter projects over 30,000 dots in a known pattern onto the user's face. Those dots are then photographed by a dedicated infrared camera for analysis. There is also a proximity sensor, presumably so that the system knows when a user is close enough to activate, and an ambient light sensor that helps the system set output light levels.
At the moment only the iPhone X/Xr/Xs models have a TrueDepth camera. If your iPhone doesn't have a TrueDepth camera and sensor system (as the iPhone SE, iPhone 6s, iPhone 7 and iPhone 8 don't), you cannot use it for features such as Animoji, Face ID, or depth occlusion effects.
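As a quick, hedged illustration, you can gate such features at runtime on ARFaceTrackingConfiguration.isSupported, which is false on devices without a TrueDepth camera:

import ARKit

// isSupported is false on devices without a TrueDepth camera
// (e.g. iPhone SE, 6s, 7, 8), so gate face-based features on it.
if ARFaceTrackingConfiguration.isSupported {
    // enable your Animoji-like / face-tracking features
} else {
    // hide or disable them on this device
}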
In the ARKit 2.0 framework, the configuration that tracks the movement and expressions of the user's face with the TrueDepth camera is the ARFaceTrackingConfiguration class.
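A minimal sketch of running it, assuming an ARSCNView outlet named sceneView and a view controller of your own (both names here are placeholders, not fixed API):

import UIKit
import SceneKit
import ARKit

class FaceViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!     // assumed outlet; wire it up in your storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }

    // Each update delivers an ARFaceAnchor with face geometry and expression blend shapes.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
        print("jawOpen:", jawOpen)
    }
}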
So, the answer is NO: you can only use the front-facing camera of iPhones with an A11 or A12 chipset (or a later one), i.e. iPhones with a TrueDepth camera and its sensor system.
ARKit 3.0 (addition)
Now ARKit allows you to simultaneously track the surrounding environment with the rear camera and your face with the front camera. You can also track up to three faces at a time (see the sketch after the two snippets below).
Here are two code snippets showing how to set up your configuration.
First scenario:
let configuration = ARWorldTrackingConfiguration()

// supportsUserFaceTracking is a class property, so check it on the type
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
session.run(configuration)

// ARSessionDelegate callback: face anchors arrive alongside world-tracking anchors
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // your code here...
    }
}
Second scenario:
let configuration = ARFaceTrackingConfiguration()

// supportsWorldTracking is a class property, so check it on the type
if ARFaceTrackingConfiguration.supportsWorldTracking {
    configuration.isWorldTrackingEnabled = true
}
session.run(configuration)

// ARSessionDelegate callback: the frame now carries the world-space camera transform
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let transform = frame.camera.transform
    // your code here...
}
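And a short sketch for the "up to three faces" part, in the same fragment style as the two snippets above:

let configuration = ARFaceTrackingConfiguration()

// Ask for as many faces as the device supports (capped at 3).
configuration.maximumNumberOfTrackedFaces = ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
session.run(configuration)

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    let faces = anchors.compactMap { $0 as? ARFaceAnchor }
    print("Tracking \(faces.count) face(s)")
}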