
Unity Get Hand Position

Jarrod1937
edited April 2016 in Questions And Answers

Since the HoloLens is already tracking the user's hand, is there a way (aside from OpenCV) to get the position of the hand in the user's view? I'd like to recognize when the user has their hand open so I can show a menu at the position of their hand. I've been trying to think of an intuitive method for letting the user access internal menus, and I think this would work pretty well. Getting the actual position of the hand is obviously tricky; there are some well-documented methods in OpenCV, but since the HoloLens is already doing some of the work, I'm wondering if there is already a method available.

Edit: Or if anyone has any other ideas for spontaneously bringing up a menu, I'm all for them. Voice could potentially work (a command in the scene to bring up a menu), but I'd like to have a physical method as well in case the user has issues with speech or the environment is too noisy.


According to that, I can access its position. However, it also states that "When hands are in other poses, the HoloLens will ignore them." So am I able to get the hand position in any pose? Can I perhaps define my own gesture?


  • @Jarrod1937 said:
    I'd like to recognize if the user has their hand open to provide them with a menu at the position of their hand.

    With respect to placing menus at the position of their hand, you may want to consider Microsoft's guidance against placing content within a meter of the user (which is typically where their hands are):

    "Place holograms in the optimal zone - between 1.25m and 5m. Two meters is the most optimal, and the experience is very poor closer than one meter. Assume that a user will not be able to touch your holograms and consider ways of gracefully clipping or fading out your content when it gets too close so as not to jar the user into an unexpected experience."

    However, it does sound like there are some gesture APIs that might be helpful for detecting when the user's hands are in the gesture frame and when they get near the edge of that frame.


  • Ah, I didn't consider the distance aspect. You're right; maybe I'll try to use the navigation gesture as a swipe gesture to switch out menus at the appropriate distance.

  • Micheal
    edited April 2016

    You can accomplish what you're looking for (at least the actual hand tracking part) through the interaction manager. Documentation is kind of sparse, but I highly recommend looking at how Galaxy Explorer integrates some of the hand tracking.


    It's a bit tricky, but if you filter for sources of kind hand and work primarily with the update event, you should be able to get a feel for how it works by logging the position Vector3, e.g. Debug.Log(handPosition). One interesting thing I learned is that the emulator's hand tracking does not report the position accurately; it's typically in the general area, but not precise at all. I was able to roughly build my own swipe gesture by playing around with this.

    I hope this helps!

  • How about GetHand[r]Finger[i].tip.position? I would also like to project to the hand or fingertip location... despite the recommendations. Anyone know how to access the code? Or is this a Visual Studio problem at this point?

  • Michael, thanks for the pointer to Galaxy Explorer. For those trying this, note the position returned is for the center of your hand (palm area), not your finger, and I could only get it to track my hand in a few positions (e.g. open air-tap, closed air-tap), and not in a general open hand position. I want to echo MoHolo's request for fingertip position.

  • Vote for "cahorton". HoloLens cannot recognize your hands in arbitrary poses. It only tracks them in the open air-tap, closed air-tap, and bloom poses. In other words, IT'S USELESS!!!!
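For anyone landing on this thread later, the interaction-manager approach Micheal describes can be sketched roughly as follows. This is a minimal sketch assuming the Unity 5.5-era UnityEngine.VR.WSA.Input API (the one Galaxy Explorer used); the class name HandPositionLogger is my own, and event signatures changed in later Unity versions, so treat this as a starting point rather than a drop-in script.

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // Unity 5.5-era HoloLens input namespace

// Minimal sketch: subscribe to the interaction manager's update event,
// filter for hand sources, and log the reported position each frame
// the hand is tracked.
public class HandPositionLogger : MonoBehaviour
{
    void Awake()
    {
        InteractionManager.SourceUpdated += OnSourceUpdated;
    }

    void OnDestroy()
    {
        InteractionManager.SourceUpdated -= OnSourceUpdated;
    }

    private void OnSourceUpdated(InteractionSourceState state)
    {
        // Only react to hands (the source could also be a clicker, for example).
        if (state.source.kind != InteractionSourceKind.Hand)
            return;

        // TryGetPosition returns false when the position is unavailable,
        // e.g. when the hand leaves the gesture frame or changes pose.
        Vector3 handPosition;
        if (state.properties.location.TryGetPosition(out handPosition))
        {
            // As noted above, this is the center of the hand, not a fingertip.
            Debug.Log(handPosition);
        }
    }
}
```

As the replies above note, the position is only reported while the hand is in a pose the system tracks (ready or pressed), so a fingertip position or arbitrary-pose tracking is not available through this API.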
