Hello everyone.
The Mixed Reality Forums here are no longer being used or maintained.
There are a few other places we would like to direct you to for support, both from Microsoft and from the community.
The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.
For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.
If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.
And always feel free to hit us up on Twitter @MxdRealityDev.
Hand or depth camera API? Getting hand or individual finger position?
I'd like to explore creating some of my own gestures (e.g., scrolling by flicking your hand), finger-based pointing, etc.
Is there any way I can get the hand or finger positions aside from using a manipulation gesture?
Something with the richness of the Leap Motion API would be amazing.
Short of that, is there any way to access the depth camera data to try and infer it myself?
Best Answer
HoloSheep (mod)
Hi @Anandx, this question has been asked and discussed (a lot) in previous threads.
Search the forums on terms like "custom gestures", or check out this thread as a good example of the discussion.
Bottom line: advanced custom gesture support is not currently available.
Windows Holographic User Group Redmond
WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
WinHUGR YouTube Channel -- live streamed meetings
Answers
That link is very helpful, thanks! I didn't see it in my initial search.
It seems the InteractionSourceState in HandsManager.cs mentioned here could be useful, but I can't find documentation on it anywhere. I don't suppose you have any pointers?
@anandx HandsManager.cs can be found in the HoloToolkit under Input | Scripts.
The online docs on GitHub simply say:
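For anyone landing here later: at the time, the hand position (a single point per hand, no finger joints) was exposed through Unity's InteractionManager events rather than through HandsManager.cs itself. A minimal sketch, assuming the Unity 5.x HoloLens tooling (the UnityEngine.VR.WSA.Input namespace; later Unity versions moved this to UnityEngine.XR.WSA.Input):

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input; // UnityEngine.XR.WSA.Input in later Unity versions

public class HandTracker : MonoBehaviour
{
    void Awake()
    {
        // Fires every frame for each tracked interaction source
        // (hand, controller, or voice).
        InteractionManager.SourceUpdated += OnSourceUpdated;
    }

    void OnDestroy()
    {
        InteractionManager.SourceUpdated -= OnSourceUpdated;
    }

    private void OnSourceUpdated(InteractionSourceState state)
    {
        // Only hands are of interest here; HoloLens reports one
        // approximate position per detected hand, not finger joints.
        if (state.source.kind != InteractionSourceKind.Hand)
            return;

        Vector3 position;
        if (state.properties.location.TryGetPosition(out position))
        {
            Debug.Log("Hand at " + position);
        }
    }
}
```

This only reports a position while a hand is detected (ready or pressed); anything richer, such as finger joints or fully custom gestures, had to be inferred with external hardware.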
@HoloSheep is there any API that lets us get each and every joint, as with a Kinect or Leap device?
@krishna_chaithanya31
As for the skeleton and body tracking question: the current version of HoloLens does not expose any skeleton or body tracking API (for either the device user or other people in the environment), but it has been mentioned that you could set up an environment that includes a Kinect and use its skeleton and body tracking capability.