
We're working on HoloLens 2 and have created our own button design, following an MRTK tutorial. Sadly, we now cannot trigger the buttons using the gaze cursor on HoloLens 2.

We are using our own configuration profile, but the same problem occurs with the default HoloLens 2 configuration profile.

There is also some odd behaviour (with both profiles mentioned above): when the app starts, the gaze cursor is visible; the moment my hands are recognized, the gaze cursor disappears (all good so far), but when I then move my hands behind my back, the gaze cursor no longer reappears.

Has anybody run into a similar problem, observed something similar, or does anyone know how to solve it?

We are using:

    Unity 2020.3.6f1
    MRTK 2.7.0
    All XR Packages up to date, except XR Plugin Management 4.0.1

Here are some screenshots showing which components our buttons have attached (first, second, and third part of the button components).

Cheers and thanks for the help

Spoon

2 Answers


The reason is that MRTK is currently designed so that, at a distance, hand rays act as the prioritized focus pointers; eye gaze is therefore suppressed as a cursor input while hand rays are in use.

If you want to use eye focus and hand rays at the same time, please follow this documentation: Use hand rays and eye-gaze input together. However, with that approach, voice commands will be the only way to interact with the hologram being focused on.
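As a sketch of the eye-gaze + voice route, MRTK 2.7 ships an `EyeTrackingTarget` component whose `OnSelected` event fires when the "select" voice command is spoken while the object has eye focus. A minimal illustration (the class name `GazeSelectButton` and the log messages are my own, for demonstration):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Attach to a button GameObject. Wires up MRTK's EyeTrackingTarget
// so the object reacts to being looked at and to the "select" voice command.
[RequireComponent(typeof(EyeTrackingTarget))]
public class GazeSelectButton : MonoBehaviour
{
    private void Awake()
    {
        var target = GetComponent<EyeTrackingTarget>();
        target.OnLookAtStart.AddListener(() => Debug.Log("Gaze entered button"));
        target.OnLookAway.AddListener(() => Debug.Log("Gaze left button"));
        // Fires when the user says "select" while looking at this object.
        target.OnSelected.AddListener(() => Debug.Log("Button selected"));
    }
}
```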

Besides, if you want to support a 'look and pinch' interaction, you need to disable the hand ray as described in this document: How to support look + hand motions (eye gaze & hand gestures).
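For the 'look and pinch' route, hand rays can also be turned off at runtime instead of in the profile; a minimal sketch using MRTK's `PointerUtils` helper (the class name `DisableHandRays` is mine, for illustration):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Disables the far hand rays for both hands so that eye gaze remains
// the primary focus pointer, allowing "look at target, then pinch".
public class DisableHandRays : MonoBehaviour
{
    private void Start()
    {
        PointerUtils.SetHandRayPointerBehavior(PointerBehavior.AlwaysOff, Handedness.Any);
    }
}
```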

Hernando - MSFT
  • Thanks for the answer. I just tried the first link you provided; sadly it didn't work. Testing it in the hand-tracking demo, there are buttons that can be controlled using gaze and voice, but in the demos the cursor in the center of the view is the preferred option. I will try a bit more, but if you have more input, it is always welcome. – Spoon Jul 07 '21 at 14:07
  • Could you check out our already configured MRTK [eye tracking examples](https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/features/example-scenes/eye-tracking-examples-overview?view=mrtkunity-2021-05) with tons of great examples that you can directly build on? – Hernando - MSFT Jul 13 '21 at 09:14
  • Thanks for the idea. I just built the eye-tracking example. Sadly, the eye tracking doesn't work properly there either: I can look at the menu and see the cursor on it, and saying 'select' makes the 'select' message pop up, but no click is executed. I also included the eye-tracking profile in our scene, with the same behaviour. Do you know where this could come from, or should I file a bug on GitHub? Cheers – Spoon Jul 13 '21 at 18:36
  • We recommend you run the Calibration process manually first (please navigate to System > Calibration > Eye Calibration > Run eye calibration). For more information, please see: https://learn.microsoft.com/en-us/hololens/hololens-calibration#manually-starting-the-calibration-process – Hernando - MSFT Jul 15 '21 at 08:15
  • I did run the eye calibration and it still doesn't work. In an old project of mine running on MRTK 2.3 to 2.4, selecting buttons with gaze worked just fine; same device and same eye calibration. – Spoon Jul 16 '21 at 08:30

I filed the following GitHub issue, and it is being investigated: "Select" voice command does not fire the appropriate events when using OpenXR on HoloLens 2.