Hello everyone.
The Mixed Reality Forums here are no longer being used or maintained.
There are a few other places we would like to direct you to for support, both from Microsoft and from the community.
The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.
For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.
If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.
And always feel free to hit us up on Twitter @MxdRealityDev.
Pupil Tracking
Is it possible to do pupil tracking for cursor manipulation? If not, is this something that is in development for future revisions?
Best Answers
Brekel ✭✭
There is no camera facing the eyes in the current hardware, so pupil tracking is not possible unless you find a way to integrate your own sensor(s)/camera(s).
HoloSheep mod
@LanceLarsen there is a built-in calibration app that guides you through a multi-step process where you are asked to match your finger placement with a system-generated view target. Your pupillary distance is calculated from your responses and is not measured by an eye-tracking sensor.
Windows Holographic User Group Redmond
WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
WinHUGR YouTube Channel -- live streamed meetings
Answers
Personally, I think something like this might be a bit disorienting.
Understood that there are no eye-tracking cameras -- how does the HoloLens measure the pupillary distance?
//Lance Larsen | www.lancelarsen.com | @lancelarsen | 608.31LANCE
//HoloSoft | holosoft.net | @holosoftnet | www.facebook.com/holosoft.net
//MADdotNET President | maddotnet.com
@runamuck, I don't think this would be disorienting, as it would not move the camera in, for instance, a Unity scene; it would only be used to place a cursor or cast a ray from the camera. This could cause less strain on a user's neck in games like RoboRaid. I really think this would be a very useful feature, and the same camera could be used to make a much faster IPD measurement.
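To make the idea concrete: the eye direction would only drive a ray cast from the camera position to place a cursor on a surface, not move the camera itself. A minimal ray-plane intersection sketch in plain Python (the function name and scene setup are hypothetical, not any HoloLens API):

```python
def place_cursor(cam_pos, gaze_dir, plane_point, plane_normal):
    """Intersect a gaze ray from the camera with a flat surface.

    Returns the 3D point where the cursor should be drawn, or None
    if the gaze is parallel to the surface or the surface is behind
    the viewer.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-6:
        return None  # gaze parallel to the surface
    t = dot([p - c for p, c in zip(plane_point, cam_pos)], plane_normal) / denom
    if t < 0:
        return None  # surface is behind the viewer
    return [c + t * d for c, d in zip(cam_pos, gaze_dir)]

# Gaze straight ahead at a wall 2 m in front of the camera:
hit = place_cursor([0, 0, 0], [0, 0, 1], [0, 0, 2], [0, 0, -1])  # -> [0.0, 0.0, 2.0]
```

In a real Unity scene you would use the engine's own raycast against the spatial mesh instead; the point is only that the eye data moves the cursor, never the view.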
FWIW I have not had much luck (in terms of comfort and accuracy) on other projects with eye tracking for repetitive, precision selection tasks using Tobii EyeX. Gaze is also challenging, but a bit easier to work with than eye tracking.
There is some good research on Fitts's Law comparisons for eye-tracking-based selection: https://www.researchgate.net/publication/221052647_A_Fitts_Law_comparison_of_eye_tracking_and_manual_input_in_the_selection_of_visual_targets tl;dr: hybrid approaches like gaze + clicker are the most accurate.
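For anyone who hasn't met Fitts's Law: it predicts movement time for a pointing task from target distance D and width W. A tiny sketch using the common Shannon formulation, where the constants a and b are made-up placeholders (in practice they are fit per input device from measured data):

```python
import math

def fitts_movement_time(distance, width, a=0.2, b=0.1):
    """Predicted movement time (seconds) via the Shannon formulation
    of Fitts's law: MT = a + b * log2(D/W + 1).

    a and b are device-specific constants; the defaults here are
    illustrative placeholders, not measured values.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A far, small target is predicted to take longer than a near, large one:
near_large = fitts_movement_time(distance=0.2, width=0.1)
far_small = fitts_movement_time(distance=2.0, width=0.05)
```

This is why the paper's comparison is apples-to-apples across input methods: each device gets its own a and b, and the index of difficulty stays the same.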
With gaze-based selection, one trick is to get folks to move their entire body. Take a page from the book of VR avatar "physics": "humans lead with their head, followed by their shoulders, and then torso".
Repetitive, precision selection tasks can become uncomfortable or feel less natural when the player moves only their neck. Space out or move content within the scene enough to get the player to engage their head, shoulders, and torso more. This can even work with vertical lists if tuned well. If a player is to follow a character, keep the character far enough ahead of or behind them to avoid small saccades.
This can be a challenge with world-anchored menus. The rub is that the player can position themselves at an arbitrary distance [or angle] from the objects they are selecting between. The RoboRaid wall menu is a good example of this: it is difficult to select between options when far away and easy up close. For this reason, I want to explore a menu system that adjusts the list-view padding based on the viewer's distance.
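One way that padding adjustment could work (a sketch of the idea, not any shipped menu system): scale the spacing so each item subtends a constant visual angle, which makes the linear padding grow proportionally with distance. The 3-degree target angle below is a made-up tuning value:

```python
import math

def spacing_for_angle(distance_m, target_angle_deg):
    """Linear spacing between menu items that subtends a constant
    visual angle at the viewer: s = 2 * d * tan(theta / 2).

    Padding grows linearly with distance, so far-away items stay
    roughly as easy to gaze-select as near ones.
    """
    theta = math.radians(target_angle_deg)
    return 2.0 * distance_m * math.tan(theta / 2.0)

# With a hypothetical 3-degree spacing target:
# at 1 m that is ~5.2 cm of padding; at 4 m it is ~21 cm.
near = spacing_for_angle(1.0, 3.0)
far = spacing_for_angle(4.0, 3.0)
```

In Fitts's Law terms this holds the angular D/W ratio steady, so the index of difficulty of the menu doesn't blow up as the player backs away.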
UX Designer @ Anki
https://www.twitter.com/wintermoot
@wintermoot, thanks for the link; it's a good starting point for some human factors engineering topics. I hadn't heard of Fitts's Law before, but I instantly recognized it as quite similar to Shannon's theorem on channel capacity. It makes sense, though: the human mind deals with information in channels and noise all the time, so something linked to our nervous system would plausibly behave similarly.
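The resemblance is no accident: the "Shannon formulation" of Fitts's Law deliberately borrows its form from the Shannon-Hartley capacity theorem. A side-by-side sketch of the two formulas:

```python
import math

# Shannon-Hartley channel capacity: C = B * log2(1 + S/N)
def channel_capacity(bandwidth_hz, snr):
    return bandwidth_hz * math.log2(1 + snr)

# Shannon formulation of Fitts's law: ID = log2(D/W + 1)
def fitts_index_of_difficulty(distance, width):
    return math.log2(distance / width + 1)

# The parallel: target distance D plays the role of the signal and
# target width W the role of the noise, so D/W acts like an SNR and
# the index of difficulty is the bits of information the movement
# conveys; throughput (ID / movement time) is the channel capacity
# of the motor system.
```

Both reduce to 1 bit when the ratio is 1, which is a neat sanity check on the analogy.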