The Mixed Reality Forums here are no longer being used or maintained.
There are a few other places we would like to direct you to for support, both from Microsoft and from the community.
The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.
For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.
If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.
And always feel free to hit us up on Twitter @MxdRealityDev.
Hand movement tracking and detection help
I'm a student at the University of Washington, currently developing a HoloLens application for my VR capstone course. We are trying to get our application to detect hand movement (in the X, Y, and Z directions), but we can't get it to work. The Holograms 211 tutorial isn't very clear about how the pieces of code interact to achieve hand navigation, and the documentation for navigation events doesn't specify what is actually happening. We are hoping someone can shed some light on what we need to accomplish hand movement detection.
We have tried using the Holograms 211 course materials, incorporating the HandsManager, InteractibleManager, Interactible, and CursorStates scripts into our project. We expected hand detection to trigger a change in the cursor type, as seen in the tutorial, but the cursor stays the same, which suggests the HandsManager is not working correctly.
For some context, we are trying to build something similar to Skype's drawing tool, so we need to track hand movement in order to draw on the canvas.
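For anyone landing here with the same question: on HoloLens (1st gen), the 211 HandsManager script is essentially a thin wrapper around Unity's InteractionManager events, and you can subscribe to those events directly to read the hand's world-space position each frame. Below is a minimal sketch using the Unity 2017.2+ API (`UnityEngine.XR.WSA.Input`; older Unity versions expose a similar API under `UnityEngine.VR.WSA.Input` with slightly different event signatures). The class name and logging are illustrative, not part of the tutorial:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA.Input; // Unity 2017.2+; earlier versions: UnityEngine.VR.WSA.Input

// Illustrative sketch: tracks the position of a detected hand on HoloLens (1st gen).
public class HandPositionTracker : MonoBehaviour
{
    private void Awake()
    {
        InteractionManager.InteractionSourceDetected += OnSourceDetected;
        InteractionManager.InteractionSourceUpdated += OnSourceUpdated;
        InteractionManager.InteractionSourceLost += OnSourceLost;
    }

    private void OnSourceDetected(InteractionSourceDetectedEventArgs args)
    {
        // Only react to hands (other source kinds include voice and controllers).
        if (args.state.source.kind == InteractionSourceKind.Hand)
        {
            Debug.Log("Hand detected");
        }
    }

    private void OnSourceUpdated(InteractionSourceUpdatedEventArgs args)
    {
        if (args.state.source.kind != InteractionSourceKind.Hand)
        {
            return;
        }

        // The hand's world-space position (X, Y, Z), updated every frame
        // while the hand is visible to the sensors.
        Vector3 handPosition;
        if (args.state.sourcePose.TryGetPosition(out handPosition))
        {
            // For a Skype-style drawing tool, you would feed handPosition
            // (or its delta since the last frame) into your line renderer here.
            Debug.Log("Hand at: " + handPosition);
        }
    }

    private void OnSourceLost(InteractionSourceLostEventArgs args)
    {
        if (args.state.source.kind == InteractionSourceKind.Hand)
        {
            Debug.Log("Hand lost");
        }
    }

    private void OnDestroy()
    {
        // Always unsubscribe to avoid callbacks on a destroyed object.
        InteractionManager.InteractionSourceDetected -= OnSourceDetected;
        InteractionManager.InteractionSourceUpdated -= OnSourceUpdated;
        InteractionManager.InteractionSourceLost -= OnSourceLost;
    }
}
```

If the cursor never changes, a good first check is whether OnSourceDetected ever fires at all: if it doesn't, the problem is likely in project capabilities or the scene setup rather than in the 211 scripts themselves. Note that navigation gestures (GestureRecognizer) sit on top of this lower-level source tracking, so for free-form drawing the raw InteractionSourceUpdated position is usually what you want.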