
I am using ARKit 2 under iOS 12 (16A5288q), building with Xcode 10 beta 6 and running on an iPhone X, and `lookAtPoint` always comes back as all zeroes.

I access the face data (in Swift) with:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    FaceAnchorsProcessedCount += 1
    let rightEyeTransform: simd_float4x4 = faceAnchor.rightEyeTransform
    let leftEyeTransform:  simd_float4x4 = faceAnchor.leftEyeTransform
    let lookAtPoint:       simd_float3   = faceAnchor.lookAtPoint
}

And I get data like:

rightEyeTransform    simd_float4x4
[ [ 9.999874e-01,  0.000000e+00,  5.010252e-03, -3.208227e-02],
  [ 2.375229e-04,  9.988756e-01, -4.740678e-02,  2.703529e-02],
  [-5.004618e-03,  4.740737e-02,  9.988630e-01,  2.525132e-02],
  [ 0.000000e+00,  0.000000e+00,  0.000000e+00,  1.000000e+00] ]

leftEyeTransform     simd_float4x4
[ [ 9.978353e-01,  0.000000e+00, -6.576237e-02,  3.208223e-02],
  [-3.110934e-03,  9.988804e-01, -4.720329e-02,  2.703534e-02],
  [ 6.568874e-02,  4.730569e-02,  9.967182e-01,  2.525137e-02],
  [ 0.000000e+00,  0.000000e+00,  0.000000e+00,  1.000000e+00] ]

lookAtPoint          simd_float3
(0.000000e+00, 0.000000e+00, 0.000000e+00)

What am I doing wrong? Or is this a known bug?
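For context, `lookAtPoint` is expressed in the face anchor's own coordinate space, in meters. If it were returning data, mapping it into world space would look something like this sketch (the helper name is mine, not part of ARKit):

```swift
import ARKit
import simd

// Sketch only: map the face-local lookAtPoint into world coordinates by
// multiplying through the anchor's transform. Helper name is hypothetical.
func worldLookAtPoint(for faceAnchor: ARFaceAnchor) -> simd_float3 {
    let local = simd_float4(faceAnchor.lookAtPoint, 1)  // homogeneous point
    let world = faceAnchor.transform * local
    return simd_float3(world.x, world.y, world.z)
}
```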

UPDATED 4 Oct 2018: I did a simple test of lookAtPoint today. I moved my face close to the handset, then farther away, then close again, repeatedly. The minimum z for lookAtPoint was 38.59 inches and the maximum was 39.17 inches (converted from meters).

The actual distances, measured with a measuring tape, were ~4.5 inches and ~33 inches.

Apple's documentation, which says that lookAtPoint will "[...] estimate what point, relative to the face, the user's eyes are focused upon", does not seem to be borne out by these measurements.
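The conversion in the update above is plain meters-to-inches arithmetic; back-computing the quoted figures (a sketch, with the meter values inferred from the inch values) gives:

```swift
// Sketch: lookAtPoint.z is reported in meters; the update converts to inches.
// The meter values below are back-computed from the quoted inch values.
func metersToInches(_ meters: Float) -> Float {
    meters * 39.3701
}

let minZ: Float = 0.9802   // metersToInches(minZ) ≈ 38.59
let maxZ: Float = 0.9949   // metersToInches(maxZ) ≈ 39.17
```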


2 Answers


Today iOS 12 was released, along with Xcode 10 (replacing the beta releases). I tested accessing lookAtPoint with these new releases and am now getting populated vectors.

Swift code:

import os.log

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    let lookAtPoint: simd_float3 = faceAnchor.lookAtPoint
    os_log("lookAtPoint: %.12f,%.12f,%.12f", type: .debug, lookAtPoint.x, lookAtPoint.y, lookAtPoint.z)
}

Log output:

2018-09-17 16:17:12.097369-0700 EyeSync[512:41060] lookAtPoint: 0.049317009747,-0.004630976822,0.981833696365
2018-09-17 16:17:12.113925-0700 EyeSync[512:41060] lookAtPoint: 0.050239805132,-0.006484962534,0.981752157211
2018-09-17 16:17:12.130867-0700 EyeSync[512:41060] lookAtPoint: 0.051697697490,-0.011350239627,0.981206715107
2018-09-17 16:17:12.147272-0700 EyeSync[512:41060] lookAtPoint: 0.052744854242,-0.012763299979,0.981896817684
2018-09-17 16:17:12.163683-0700 EyeSync[512:41060] lookAtPoint: 0.054889015853,-0.015469233505,0.982917487621
2018-09-17 16:17:12.180636-0700 EyeSync[512:41060] lookAtPoint: 0.056391790509,-0.017265520990,0.983718335629
2018-09-17 16:17:12.197387-0700 EyeSync[512:41060] lookAtPoint: 0.059109147638,-0.018527992070,0.983208477497
2018-09-17 16:17:12.214021-0700 EyeSync[512:41060] lookAtPoint: 0.061453290284,-0.019032688811,0.981536626816
2018-09-17 16:17:12.230689-0700 EyeSync[512:41060] lookAtPoint: 0.063107110560,-0.019657038152,0.978309571743
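Note that the z values hover around 0.98 m, which lines up with the ~38–39 inch figure from the question's update. A small sketch (values copied from the first log line above) of reducing one vector to a straight-line gaze distance:

```swift
import simd

// Sketch: the logged vectors are in the face anchor's space, in meters.
// The components below are taken from the first log line.
let lookAtPoint = simd_float3(0.049317, -0.004631, 0.981834)
let gazeDistance = simd_length(lookAtPoint)   // ≈ 0.983 m
```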

Yep, I tried it a month ago and I can confirm you're absolutely right – at the moment the lookAtPoint instance property doesn't work, or is even stubbed out: it always returns float3(0.0, 0.0, 0.0).

I guess Apple hasn't implemented it yet (iOS 12 is still in beta). Gaze detection is an ARKit feature we'll seemingly see in the final stable release of iOS 12.

At the moment I don't have a Mac at hand, so I can't check it myself, but for reference these are the relevant declarations on ARFaceAnchor:

open class ARFaceAnchor: ARTrackable {
    open var leftEyeTransform: simd_float4x4 { get } 
    open var rightEyeTransform: simd_float4x4 { get } 
    open var lookAtPoint: simd_float3 { get }
}

Hope it helps!

  • There is seemingly an example of lookAtPoint working here: However, I downloaded the same sample code from Apple Developer, added the one-liner to print, and got zeroes again for lookAtPoint. Was it working earlier, and then stopped? – Rob Tow Aug 24 '18 at 21:48
  • Thanks, @andy, but to me as a beginning Swift programmer (I am more of a C and Java guy) this is a bit obscure as a code snippet; I am not quite able to figure out how to use it in the context I am exploring. :-( – Rob Tow Aug 25 '18 at 17:18
  • I did a simple test of lookAtPoint today. I moved my face close to the handset, and then farther away, and close again; repeatedly. The _minimum_ z for lookAtPoint was 38.59 inches, and the _max_ was 39.17 inches (converted from meters). – Rob Tow Oct 04 '18 at 22:44
  • And what's the result? – Andy Jazz Oct 04 '18 at 22:46
  • I did a simple test of `lookAtPoint` today. I moved my face close to the handset, and then farther away, and close again; repeatedly. The _minimum_ z for lookAtPoint was 38.59 inches, and the _max_ was 39.17 inches (converted from meters). The **actual** distances, measured with a measuring tape, were ~4.5 inches and ~33 inches. Apple's declaration that `lookAtPoint` will "[...] estimate what point, relative to the face, the user's eyes are focused upon." does not seem to be correct. – Rob Tow Oct 04 '18 at 22:52
  • That's interesting. Please post it in your question as UPDATED or P.S. issue. – Andy Jazz Oct 04 '18 at 22:54