The Mixed Reality Forums here are no longer being used or maintained.
There are a few other places we would like to direct you to for support, both from Microsoft and from the community.
The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.
For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.
If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.
And always feel free to hit us up on Twitter @MxdRealityDev.
Detecting if an Airtap is tapped/held, regardless of gaze
I currently detect a held Airtap using IHoldHandler and its "OnHoldStarted"/"OnHoldCompleted" callbacks on a GameObject. However, this means the release of the Airtap is only detected while gazing at that object.
How can I always detect the release of a held Airtap, regardless of gaze? Similarly: how do I detect an Airtap click when not looking at any detectable object?
Thanks
Best Answer
stepan_stulov
Hey, @Thomvdm
Gestures such as air tap, hold, navigation, etc. are fundamentally global, and gaze is irrelevant (as long as HoloLens can still see your hands). To catch gestures globally you can simply use the GestureRecognizer class directly.
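As a minimal sketch of that approach, the raw GestureRecognizer can be wired up in a plain MonoBehaviour. This assumes the UnityEngine.XR.WSA.Input namespace (Unity 2017+; older Unity 5.x builds used UnityEngine.VR.WSA.Input), and uses the delegate-style events of that era:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input; // UnityEngine.VR.WSA.Input on Unity 5.x

public class GlobalGestures : MonoBehaviour
{
    private GestureRecognizer recognizer;

    private void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap | GestureSettings.Hold);

        // These fire wherever the user is gazing, as long as the hand is tracked.
        recognizer.TappedEvent += (source, tapCount, headRay) =>
            Debug.Log("Air tap");
        recognizer.HoldStartedEvent += (source, headRay) =>
            Debug.Log("Hold started");
        recognizer.HoldCompletedEvent += (source, headRay) =>
            Debug.Log("Hold released");
        recognizer.HoldCanceledEvent += (source, headRay) =>
            Debug.Log("Hold canceled (e.g. hand lost)");

        recognizer.StartCapturingGestures();
    }

    private void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```

Note that in Unity 2017.2+ these delegate events were deprecated in favor of args-based events (Tapped, HoldStarted, etc.); the overall pattern is the same.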
It's just that HoloToolkit (which you apparently use) bakes the gaze and gesture concepts together into handlers such as ITapHandler, IHoldHandler, etc. for convenience. There is also a way to do global gestures in HoloToolkit, but I'm not sure how.
Remember that HoloToolkit is built on top of existing Unity (and native UWP) APIs; its use is optional and its approaches are not the only ones. It just so happens that HoloToolkit is traditionally promoted as the default solution.
Hope this helps.
Building the future of holographic navigation. We're hiring.
Answers
Detect an airtap when not looking at any object: use modal input.
Release of a held airtap when not focused: same thing (modal input).
Events are consumed in this priority order:
Modal
Focused Object
Fallback
You could also check for when hands are no longer detected and treat that as an airtap release (which sounds like what you want to solve).
Hope this helps!
Src:
https://github.com/Microsoft/MixedRealityToolkit-Unity/issues/277
I use this code in my projects:
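Along those lines, a minimal sketch of the modal-input approach with HoloToolkit's InputManager singleton (assuming its PushModalInputHandler/PopModalInputHandler methods and the IHoldHandler interface; exact event-data type names vary across HoloToolkit versions) might look like:

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class GlobalHoldListener : MonoBehaviour, IHoldHandler
{
    private void OnEnable()
    {
        // While this object sits on the modal stack, it receives input
        // events even when the user's gaze is not focused on it.
        InputManager.Instance.PushModalInputHandler(gameObject);
    }

    private void OnDisable()
    {
        InputManager.Instance.PopModalInputHandler();
    }

    public void OnHoldStarted(HoldEventData eventData)
    {
        Debug.Log("Hold started (gaze-independent)");
    }

    public void OnHoldCompleted(HoldEventData eventData)
    {
        Debug.Log("Hold released (gaze-independent)");
    }

    public void OnHoldCanceled(HoldEventData eventData)
    {
        Debug.Log("Hold canceled, e.g. hand lost");
    }
}
```

Remember to pop the modal handler when you no longer need global input, or objects behind it will stop receiving focused events.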
For current versions of the MRTK, check out how it is done in OverrideFocusedObjectTest.unity and TestOverrideFocusedObject.cs. See https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/master/Assets/HoloToolkit/Input/README.md