Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Hand posture tracking

I'd like to conduct some studies on hand posture tracking with the HoloLens.
Is it possible to reconstruct the hand segments and reproduce their movements by extracting angular data?

Thanks in advance




  • Hey, @AlessandroGreco87

    To my knowledge, the hand-related APIs of HoloLens are very limited: they only let you recognize a set of predefined gestures, both event-like (tap, bloom) and continuous (manipulation, navigation). No internals are exposed to the app developer while those gestures are being captured, only the end result.

    However, there is the Leap Motion sensor, which I believe was discussed here on the forums earlier.
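    As a follow-up on the angular-data part of the question: once you have per-joint 3D positions from an external sensor such as the Leap Motion, recovering a joint's flexion angle is simple vector math. A minimal sketch (the function name and tuple-based points are my own illustration, not any SDK's API):

```python
from math import acos, degrees, sqrt

def joint_angle(proximal, joint, distal):
    """Interior angle in degrees at `joint`, given three 3D points
    along a finger's bone chain; 180 means fully extended."""
    # Vectors pointing from the joint toward its two neighbouring points.
    u = tuple(p - j for p, j in zip(proximal, joint))
    v = tuple(d - j for d, j in zip(distal, joint))
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    # Clamp the cosine to guard against floating-point drift outside [-1, 1].
    return degrees(acos(max(-1.0, min(1.0, dot / norm))))

# A finger bent at a right angle at the middle joint:
print(joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0)))  # 90.0
# A fully extended finger:
print(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0)))  # 180.0
```

    Applying this per joint over a stream of tracked frames gives the angular time series you would need to reproduce the hand's movements on a reconstructed segment model.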


    Building the future of holographic navigation. We're hiring.
