Toggle of tools
I am trying to implement a zooming function that can be toggled on when a zoom button is pressed. This is very similar to what the Galaxy Explorer project does, so I first tried to see how it is done there, but I couldn't really find the point where the toggling of functions (of the cursor) happens.
The README suggests it is in ToolPanel (Tool section of the README). But having looked through the code, to me it only handles the situation where the panel is out of view, its repositioning, and the related fading. It doesn't show how to switch the cursor's function from one tool, for example tilt, to another, such as zoom.
To my understanding, there are two general gesture functions: manipulation and navigation. Since both tilt and zoom seem to fall under navigation, I am guessing there is no toggle between manipulation and navigation at all. But then the question becomes how you toggle from tilt (where NavigationY is enabled) to zoom (where NavigationX is enabled), since in the GE demo moving your hands vertically along Y obviously does not zoom. ToolPanel does not seem to do this.
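To make the question concrete, here is a minimal sketch of the kind of toggle I was expecting to find — a single "current tool" that decides which navigation axis is consumed. All names here are hypothetical, not taken from the GE code base:

```csharp
// Hypothetical sketch, NOT actual Galaxy Explorer code: one selected tool
// determines which navigation axis is acted on each frame.
public enum ToolType { None, Tilt, Zoom }

public class ToolToggle
{
    public ToolType Current { get; private set; } = ToolType.None;

    // Pressing a tool button selects it, or deselects it if already active.
    public void Toggle(ToolType tool)
    {
        Current = (Current == tool) ? ToolType.None : tool;
    }

    // Called every frame with the normalized navigation offsets (-1..1).
    // Only the axis relevant to the current tool is returned.
    public float? GetActiveDelta(float navX, float navY)
    {
        switch (Current)
        {
            case ToolType.Tilt: return navY; // vertical hand motion tilts
            case ToolType.Zoom: return navX; // horizontal hand motion zooms
            default:            return null; // no tool selected
        }
    }
}
```

Something along these lines is what I assumed ToolPanel would contain, but I couldn't locate an equivalent.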
Another piece I cannot understand is the interaction source of zoom. To me it should be Hand, but the code I found says it is Controller. If you look at the HandleUpdatedInput function in Tool.cs, you can see how the zoom is processed frame by frame. In the Select function of Tool.cs, it is explicitly added to the InputUpdated event, which is raised by OnNavigationUpdated in InputRouter.cs. In turn, OnNavigationUpdated is called from the Update function with InteractionSourceKind.Controller as an argument, and the second argument suggests the input is actually fake. So why am I able to zoom with my hands in GE?
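To illustrate what confuses me, here is a rough reconstruction of the flow as I read it (signatures are my approximation, not copied from GE). The point is that the source kind is a constant supplied by the caller, so passing Controller says nothing about where the navigation delta actually came from:

```csharp
// Rough reconstruction of the dispatch flow described above; types like
// Vector3 and InteractionSourceKind are the Unity ones, everything else
// is my guess at the shape of InputRouter.cs.
public class InputRouterSketch
{
    public event Action<InteractionSourceKind, Vector3, bool> InputUpdated;

    Vector3 latestNavigationDelta; // filled in by the gesture callbacks

    void Update()
    {
        // The source kind is hard-coded here, but the delta itself could
        // still be coming from hand-driven navigation gestures.
        OnNavigationUpdated(InteractionSourceKind.Controller,
                            latestNavigationDelta,
                            true /* "fake" flag */);
    }

    void OnNavigationUpdated(InteractionSourceKind kind, Vector3 delta, bool fake)
    {
        InputUpdated?.Invoke(kind, delta, fake);
    }
}
```

If that reading is right, the Controller argument is just a label the router attaches, which might explain why hand gestures still drive the zoom — but I'd like confirmation.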
Sorry if the post seems a little messy; I have split it into parts 1 and 2. Any discussion/suggestions are welcome. Thank you.
What I plan to do now: probably just let NavigationZ handle the zoom directly and forget about the whole toggle issue. But still, I am very curious about what I missed in the GE code base.
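For reference, the fallback I have in mind is roughly this — drive the zoom straight from the Z component of the navigation gesture and skip tool toggling entirely (a sketch of my own plan, not GE code; field names and values are placeholders):

```csharp
using UnityEngine;

// Hypothetical fallback: scale the target directly from the navigation
// gesture's Z offset, no tool selection involved.
public class DirectZoom : MonoBehaviour
{
    [SerializeField] float zoomSpeed = 2f;
    [SerializeField] float minScale = 0.1f;
    [SerializeField] float maxScale = 10f;

    // Call this from the NavigationUpdated handler;
    // normalizedOffset.z is in the -1..1 range.
    public void OnNavigationUpdated(Vector3 normalizedOffset)
    {
        float factor = 1f + normalizedOffset.z * zoomSpeed * Time.deltaTime;
        float newScale = Mathf.Clamp(transform.localScale.x * factor,
                                     minScale, maxScale);
        transform.localScale = Vector3.one * newScale;
    }
}
```

It loses the nice toggle UX, but it would at least make zoom usable while I figure out how GE really does it.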