Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Left/Right hand tracking

I am trying to detect when the left or right hand is in the FOV. I couldn't find any variable that does this, so I am trying to do it manually by checking state.source.id from the HandInput script. It seems the ID is assigned arbitrarily to each hand, being 0 or 1 regardless of which is left or right.
When both hands are in the FOV, one is 0 and the other is 1. I've noticed that's similar to what the Leap Motion SDK does.

It seems a unique ID is assigned to the hand, and its value remains the same across consecutive frames while the tracked hand remains visible. If tracking is lost (for example, when a hand is occluded by the other hand, or when it is withdrawn from or reaches the edge of the HoloLens field of view), a new ID seems to be assigned when the hand is detected again in a future frame.
I was planning to identify which hand is farthest to the left or right, but I am curious: is there already a variable that tells you whether it's the left or right hand?
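To make that ID behaviour concrete, here is a rough sketch (untested, assuming HoloToolkit's ISourceStateHandler interface; the class name and the HashSet are illustrative, not part of any shipped API) of caching source IDs as hands appear and disappear:

```csharp
using System.Collections.Generic;
using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class HandIdTracker : MonoBehaviour, ISourceStateHandler
{
    // IDs currently being tracked; an ID stays stable while its hand is visible.
    private readonly HashSet<uint> trackedHands = new HashSet<uint>();

    public void OnSourceDetected(SourceStateEventData eventData)
    {
        // A fresh ID is assigned whenever tracking is (re)acquired.
        trackedHands.Add(eventData.SourceId);
    }

    public void OnSourceLost(SourceStateEventData eventData)
    {
        // Once the hand leaves the FOV, its ID is retired and does not come back.
        trackedHands.Remove(eventData.SourceId);
    }
}
```

Note that because a re-entering hand gets a new ID, any left/right label you attach to an ID has to be re-derived every time OnSourceDetected fires.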

Thanks :)

www.judithamores.com

Best Answer

    @jdthamores,
    You are correct in your interpretation that once a hand is seen it is assigned an ID that remains constant while it is being tracked. There is no indication of which hand is the left or the right.

    One thing to watch out for in your approach to determining which hand is which is that users can cross their arms and cause the hands to switch sides within the gesture frame.

    Thanks!
    David

Answers

  • Ok, thanks! It would be great to have that in future API versions.

    Judith

    www.judithamores.com

  • @jdthamores, as mentioned, there are no APIs or values that give you left-vs-right information. Honestly, this might be a dangerous assumption to make in your app :) There are several cases where the left-vs-right assumption can break:
    1. Crossed hands.
    2. Occluded hands, as you point out.
    3. Multiple hands, or someone else's hand.

    Out of curiosity, what scenario are you trying to build that would require this knowledge? Thanks!

  • Regarding crossed hands: I guess once you have identified the ID of the hand that is farthest to the left or right (at the moment the hands enter the FOV), you already know whether it's the left or right hand?
    So you can tell whether the hands are crossed from the ID and position recorded at the beginning.
    The problem would be if the user starts with hands completely crossed, before left/right is assigned, and then moves them into the gesture frame?

    As I mentioned in the other comment, I am just trying to experiment with other kinds of gestures. I would love to experiment with both hands to zoom in/out, etc., since there is no finger tracking currently available.

    Judith

    www.judithamores.com

  • @jdthamores,
    Correct, if the user starts with crossed hands, your logic will be backwards.

    From experience, two handed gestures are very challenging. It is very easy for one hand to fall out of the gesture frame, as the user performs the gesture, moves his/her head, etc.

    Thanks!
    David

  • I know this was a year ago, but is it possible, for example, to do something like open a menu if two hands are detected, not just handle each hand by itself?

  • Hey Judith,

    I just wanted to know if you found a solution to this problem. I am also trying to track both hands simultaneously and be able to differentiate which hand is which (not necessarily right and left). I am struggling with the current iteration: when I get the sourceId from what I believe to be the event data corresponding to each hand, and a second source is detected, the source ID is different and automatically updates the event data of the first hand that was detected.

    Any help is very much appreciated, thanks.


    Maximiliano
  • > @Deaa said:
    > i know this a year ago but is it possible for example:
    > if the two hands are detected do something like open a menu??
    > not only every hand by it self

    The cursor script has an int variable called visibleHandsCount that tells you how many hands are currently detected, so I believe this is very much possible.
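
    If you would rather not depend on the cursor script, here is a rough sketch (untested, assuming HoloToolkit's ISourceStateHandler; the class name and menu field are illustrative) of counting visible sources yourself and showing a menu when two hands are in view:

```csharp
using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class TwoHandMenu : MonoBehaviour, ISourceStateHandler
{
    public GameObject menu;   // assign in the inspector
    private int visibleHands;

    public void OnSourceDetected(SourceStateEventData eventData)
    {
        visibleHands++;
        if (visibleHands >= 2)
        {
            menu.SetActive(true);   // both hands in view: show the menu
        }
    }

    public void OnSourceLost(SourceStateEventData eventData)
    {
        visibleHands--;
        if (visibleHands < 2)
        {
            menu.SetActive(false);  // a hand left the FOV: hide it again
        }
    }
}
```

    One caveat: on HoloLens other input sources (e.g. a clicker) can also raise these events, so in a real app you may want to filter by source kind before counting.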
  • Here's HOW to do it; at least it worked for me several months ago. I had similar concerns about "should I do it?", but since the original post just asks HOW, here it is.

    In the "OnSourceDetected" method from ISourceStateHandler, you can add code like the following. It takes the detected hand position and the gaze origin, forms a heading vector between them, and uses a cross/dot product to determine which side of the gaze the hand is on.

    Likewise, this interface's methods can be used to increment/decrement a 'hands visible' counter.

    Vector3 handDetectedPosition;
    if (eventData.InputSource.TryGetPosition(eventData.SourceId, out handDetectedPosition))
    {
        // Vector from the gaze origin to the detected hand.
        Vector3 heading = handDetectedPosition - GazeManager.Instance.GazeOrigin;

        // Cross the gaze direction with the heading; the sign of its dot product
        // with the head's up vector tells us which side of the gaze the hand is on.
        Vector3 perp = Vector3.Cross(GazeManager.Instance.GazeTransform.forward, heading);
        float dot = Vector3.Dot(perp, GazeManager.Instance.GazeTransform.up);

        // Left of gaze => assume left hand; otherwise assume right hand.
        detectedHandOnLeft = (dot <= 0);
    }
