3

I am developing an app that needs to use the front-facing camera of the iPhone for an Augmented Reality experience, using Swift. I have tried to use ARKit, but ARKit's front-facing camera tracking is only supported on the iPhone X.

So, which frameworks or libraries can I use with Swift to develop apps with an AR experience, especially for the front-facing camera, other than ARKit?


2 Answers

4

ARKit 2.0

The TrueDepth front-facing camera of the iPhone X/XR/XS gives you a depth channel at a 15 fps frame rate, while the front-facing image camera gives you RGB channels at a 60 fps frame rate.

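For example, here's a minimal sketch (my own illustration, not from the original answer) of how both streams show up in an ARSessionDelegate callback: the RGB image is present on every frame, while the depth map only arrives on the less frequent depth frames:

import ARKit

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let colorImage = frame.capturedImage                      // RGB stream, up to 60 fps
    print("Color frame \(CVPixelBufferGetWidth(colorImage)) x \(CVPixelBufferGetHeight(colorImage))")

    if let depthData = frame.capturedDepthData {               // TrueDepth stream, ~15 fps
        let depthMap = depthData.depthDataMap
        print("Depth map \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}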

Principle of work: it's like the depth-sensing system in the Microsoft Xbox Kinect, but more powerful. An infrared emitter projects over 30,000 dots in a known pattern onto the user's face. Those dots are then photographed by a dedicated infrared camera for analysis. There is a proximity sensor, presumably so that the system knows when a user is close enough to activate. An ambient light sensor helps the system set output light levels.

At the moment only the iPhone X/XR/XS models have a TrueDepth camera. If your iPhone doesn't have a TrueDepth camera and sensor system (the iPhone SE, iPhone 6s, iPhone 7 and iPhone 8 don't), you cannot use it for features such as Animoji, Face ID, or depth occlusion effects.

In the ARKit 2.0 framework, the configuration that tracks the movement and expressions of the user's face with the TrueDepth camera is the special class ARFaceTrackingConfiguration.
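
A minimal sketch of running such a configuration (assuming an ARSession named `session`, as in the snippets further below; the availability check is my own addition):

import ARKit

// Face tracking only works on devices with a TrueDepth camera
guard ARFaceTrackingConfiguration.isSupported else {
    fatalError("This device doesn't support face tracking")
}

let faceConfiguration = ARFaceTrackingConfiguration()
session.run(faceConfiguration, options: [.resetTracking, .removeExistingAnchors])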

So, the answer is no: you can only use the front-facing camera of iPhones with an A11 or A12 chipset (or newer), that is, iPhones with a TrueDepth camera and its sensor system.


ARKit 3.0 (addition)

Now ARKit allows you to simultaneously track the surrounding environment with the back camera and your face with the front camera. Also, you can track up to 3 faces at a time (see the small sketch after the two snippets below).

Here are two code snippets showing how to set up your configuration.

First scenario:

let configuration = ARWorldTrackingConfiguration()

// supportsUserFaceTracking is a class property, so check it on the type
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    configuration.userFaceTrackingEnabled = true
}
session.run(configuration)

// ARSessionDelegate callback: face anchors arrive while the back camera tracks the world
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors where anchor is ARFaceAnchor {
        // your code here...
    }
}

Second scenario:

let configuration = ARFaceTrackingConfiguration()

// supportsWorldTracking is a class property, so check it on the type
if ARFaceTrackingConfiguration.supportsWorldTracking {
    configuration.isWorldTrackingEnabled = true
}
session.run(configuration)

// ARSessionDelegate callback: the camera transform is given in world coordinates
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let transform = frame.camera.transform
    // your code here...
}
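
As a small addition to the note above about tracking several faces at once, here is a sketch (my own, assuming ARKit 3 and a TrueDepth device) that raises the face limit to whatever the framework reports as supported:

let multiFaceConfiguration = ARFaceTrackingConfiguration()

// supportedNumberOfTrackedFaces reports 3 on devices that support ARKit 3 face tracking
multiFaceConfiguration.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces

session.run(multiFaceConfiguration)
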
Andy Jazz
  • But still there are applications that use ARKit for AR app development. One of the apps I have found working on an iPhone 7 with both the front and rear cameras is called Bemo. So I am sure there is a way to use AR with the front camera, even though I know that facial recognition for the AR experience is only available on the iPhone X. – Mohammad Albardaweel Aug 29 '18 at 14:00
  • The App Store rejects apps with third-party frameworks and libraries (like Bemo). But if you're using the Apple Developer Enterprise Program (which provides resources for developing and distributing proprietary, in-house iOS apps to employees) – you may be welcome! – Andy Jazz Aug 29 '18 at 14:04
  • The Bemo app uses only ARKit, as noted in its description, and it is in the App Store – Mohammad Albardaweel Aug 29 '18 at 14:07
  • Sometimes the App Store may miss such an app due to inattention, but after some time an app with a third-party framework will be removed from the App Store (there are countless examples of this). Have you seen the code of the `Bemo` app? – Andy Jazz Aug 29 '18 at 14:24
  • Apple locked the front camera (for ARKit developers on the iPhone 6s/7/8) for a definite purpose, didn't it? It's pure marketing for the iPhone X. – Andy Jazz Aug 29 '18 at 14:31
  • Read this SO post for detailed info: https://stackoverflow.com/questions/47083462/can-i-require-a-user-to-have-a-true-depth-camera-to-download-my-app-from-the-app – Andy Jazz Aug 29 '18 at 14:53
4

ARKit isn't the only possible way to create "AR" experiences on iOS, nor is it the only way Apple permits creating "AR" apps for the App Store.

If you define "front-facing-camera AR" as something like "uses front camera, detects faces, allows placing virtual 2D/3D content overlays that appear to stay attached to the face", there are any number of technologies one could use. Apps like Snapchat have been doing this kind of "AR" since before ARKit existed, using technology they've either developed in-house or licensed from third parties. How you do it and how well it works depends on the technology you use. ARKit guarantees a certain precision of results by requiring a front-facing depth camera.

It's entirely possible to develop an app that uses ARKit for face tracking on TrueDepth devices and a different technology on other devices. For example, looking only at what you can do "out of the box" with Apple's SDKs, there's the Vision framework, which locates and tracks faces in 2D. There are probably a few third-party libraries out there, too... or you could go looking through academic journals, since face detection/tracking is a pretty active area of computer vision research.
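
For instance, here's a rough sketch (identifiers are mine, not part of this answer) of locating face rectangles with Vision in a CVPixelBuffer delivered by the front camera, e.g. through an AVCaptureSession:

import Vision

func detectFaces(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is normalized (0...1) in the image's coordinate space
            print("Face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .leftMirrored,
                                        options: [:])
    try? handler.perform([request])
}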

rickster