Hand movement tracking and detection help

boruil
edited April 2016 in Questions And Answers

Hey,
I'm a student at the University of Washington, currently developing a HoloLens application for my VR capstone course. We are trying to get our application to detect hand movement (in the X, Y, and Z directions), but we are unable to get it to work. The Holograms 211 tutorial isn't very clear about how the pieces of code interact to achieve hand navigation, and the documentation for navigation events doesn't give any specifics about what is actually happening. We are hoping someone can shed some light on what we need to do to detect hand movement.
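
For reference, this is roughly the pattern we understood from the navigation events documentation. It is only a minimal sketch assuming Unity's 2016-era UnityEngine.VR.WSA.Input API; the class name and the logging are just our placeholders, so please correct us if we have the flow wrong:

    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    // Reports the hand's normalized offset (X, Y, Z) while a navigation
    // gesture is in progress. Sketch only; assumes Unity 5.4/5.5 HoloLens APIs.
    public class NavigationTracker : MonoBehaviour
    {
        private GestureRecognizer recognizer;

        void Awake()
        {
            recognizer = new GestureRecognizer();

            // Ask for navigation updates on all three axes.
            recognizer.SetRecognizableGestures(
                GestureSettings.NavigationX |
                GestureSettings.NavigationY |
                GestureSettings.NavigationZ);

            // normalizedOffset is the hand's offset from where the gesture
            // started, normalized to roughly -1..1 on each axis.
            recognizer.NavigationUpdatedEvent += (source, normalizedOffset, headRay) =>
            {
                Debug.LogFormat("Navigation offset: {0}", normalizedOffset);
            };

            recognizer.StartCapturingGestures();
        }

        void OnDestroy()
        {
            recognizer.StopCapturingGestures();
        }
    }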

We have tried using the Holographic Academy 211 course materials and incorporating the HandsManager, InteractibleManager, Interactible, and CursorStates scripts into our project. We expected hand detection to trigger a change in the cursor, as it does in the tutorial, but the cursor stays the same, which probably means the HandsManager is not working correctly.
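
As far as we can tell, HandsManager in the 211 materials is a thin wrapper over Unity's low-level InteractionManager source events, so we also tried subscribing to those directly. Again, this is a minimal sketch with placeholder names, assuming the UnityEngine.VR.WSA.Input namespace; the comments mark where we intended to switch the cursor:

    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    // Raw hand detected/lost events; sketch only, assuming Unity 5.4/5.5 APIs.
    public class HandDetection : MonoBehaviour
    {
        void Awake()
        {
            InteractionManager.SourceDetected += OnSourceDetected;
            InteractionManager.SourceLost += OnSourceLost;
        }

        void OnDestroy()
        {
            InteractionManager.SourceDetected -= OnSourceDetected;
            InteractionManager.SourceLost -= OnSourceLost;
        }

        void OnSourceDetected(InteractionSourceState state)
        {
            if (state.source.kind == InteractionSourceKind.Hand)
            {
                Debug.Log("Hand entered the gesture frame");
                // This is where we expected to switch the cursor to its
                // "hand detected" state.
            }
        }

        void OnSourceLost(InteractionSourceState state)
        {
            if (state.source.kind == InteractionSourceKind.Hand)
            {
                Debug.Log("Hand left the gesture frame");
                // ...and switch it back to the default state here.
            }
        }
    }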

For some context, we are trying to do something identical to Skype's drawing tool, so we need to detect hand movement in order to draw on the canvas.

Thanks!

Best Answer

Answers

  • Yeah I figured that out. Thank you so much!

  • How much detail of "finger motion" can be captured? In other words, can HoloLens track each joint in the hand/fingers? I realize that navigation gestures are part of the OS, but in a gaming environment, how much hand and finger detail can be tracked? Thanks.

  • @mrlinfisher said:
    How much detail of "finger motion" can be captured? In other words, can HoloLens track each joint in the hand/fingers? I realize that navigation gestures are part of the OS, but in a gaming environment, how much hand and finger detail can be tracked? Thanks.

    None of this data appears to escape the HPU. Unless there are undocumented calls (which you ought not build on top of anyway), the SDK doesn't provide a finger-and-joint level abstraction; it exposes only hand position (which is pretty rough) and tap state.

    This is something that I'd very much like to see made available soon, and I have high hopes that it will be, but if you've ever coded gesture recognition, you know how finicky it is. Since Windows Holographic is a platform, not just a device, there is a certain sense in providing a very high-level abstraction that can be swapped out for a clicker or a touchpad controller on future devices that may lack refined gesture recognition. The things it recognizes and converts to clicks and drags are the most predictable and repeatable of gestures.

    I'd love to have Leap Motion-like skeletal geometry, but I understand why you might keep that out of the initial offering. There's no point encouraging people to use a feature that may never be as dependable as a mouse click. Air tap and air tap-and-drag meet that standard where many other gestural combinations would not.
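
    To make that concrete, here is a minimal sketch of everything the public API hands back per source, assuming Unity's UnityEngine.VR.WSA.Input namespace (the class name is just a placeholder): a rough position when tracking succeeds, plus the pressed (tap) state, and nothing at the finger or joint level.

        using UnityEngine;
        using UnityEngine.VR.WSA.Input;

        // Logs the only per-hand data exposed: rough position and tap state.
        // Sketch only; assumes Unity 5.4/5.5 HoloLens APIs.
        public class HandStateLogger : MonoBehaviour
        {
            void Awake()
            {
                InteractionManager.SourceUpdated += OnSourceUpdated;
            }

            void OnDestroy()
            {
                InteractionManager.SourceUpdated -= OnSourceUpdated;
            }

            void OnSourceUpdated(InteractionSourceState state)
            {
                if (state.source.kind != InteractionSourceKind.Hand)
                    return;

                Vector3 handPosition;
                if (state.properties.location.TryGetPosition(out handPosition))
                {
                    Debug.LogFormat("Hand at {0}, pressed: {1}",
                        handPosition, state.pressed);
                }
            }
        }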

  • HoloLens can only detect your hands when they are in the air-tap (ready) or bloom pose, right? If so, isn't that close to useless? Is there any way to get a hand position at any time?
