Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Changing Navigation Range

The Navigation gesture appears to use a range between -1 and 1 that corresponds to moving your hand less than a foot left or right. Is it possible to "unlock" this range so that Navigation can detect movement across the whole FOV?

Use case: I am trying to use the Navigation gesture to turn an object in a way that corresponds to the distance the hand moved, rather than the standard Navigation use of rotating the object continuously. Something like: move your hand a few inches and the object rotates 30 degrees; move your hand all the way across the FOV and it rotates 180 degrees.

I am using Navigation rather than Manipulation because Navigation appears to use coordinates relative to the direction the user is currently facing. With Manipulation the coordinates are absolute, so moving the object to the other side of me resulted in the object rotating backwards.


Best Answer


  • You will need to use Manipulation. Navigation is meant to enable continuous gestures, with your hand position reported as an offset from the starting position, whereas Manipulation is 1:1 with your hand movements. If you find your object is rotating backwards, isolate that condition with an if statement and multiply the rotation by -1.
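As a minimal sketch of that idea (Python, all names and the gain value are made up for illustration — in practice this would live in a Unity Manipulation event handler), mapping the 1:1 hand displacement to a rotation angle and flipping the sign for the backwards case:

```python
def rotation_from_hand_delta(hand_delta_x, object_behind_user,
                             degrees_per_meter=600.0):
    """Map lateral hand displacement (meters) to a rotation angle (degrees).

    Manipulation deltas track the hand 1:1, so a fixed gain converts
    distance moved into degrees rotated. When the object sits behind the
    user, world-space X is mirrored relative to the view, so the sign is
    flipped. The gain here is an arbitrary example, not a platform value.
    """
    angle = hand_delta_x * degrees_per_meter
    if object_behind_user:
        angle = -angle
    return angle
```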


  • VRLake
    edited June 2016

    Is there a way to use Manipulation tracked relative to the user's Gaze? Preferably without doing a great deal of mucking about with projections and such between the gaze vector and the manipulation vector.

    My issue is not that rotation is backwards all the time. It's that rotation is correct when I start, but changes based on where I move the object. When I am facing the direction in which I launched the app, moving my hand to the right is in the positive X direction. But if I move the object behind me, moving my hand to the right is negative X. If I move the object to my left or right of the starting direction, sideways hand movements end up on the Z axis.

  • Why not transform the position of the hand from world space into camera space?
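A sketch of that suggestion in Python/NumPy — just the math, with made-up names. (In Unity the equivalent one-liner should be `Camera.main.transform.InverseTransformPoint(handPosition)`.)

```python
import numpy as np

def world_to_camera(hand_world, cam_pos, cam_forward, cam_up):
    """Express a world-space point in the camera's local (view) frame.

    Equivalent to R^T * (p - cam_pos), where the columns of R are the
    camera's right/up/forward axes in world space.
    """
    forward = np.array(cam_forward, float)
    forward /= np.linalg.norm(forward)
    right = np.cross(np.array(cam_up, float), forward)  # left-handed, Unity-style
    right /= np.linalg.norm(right)
    up = np.cross(forward, right)
    R = np.column_stack((right, up, forward))           # world-from-camera rotation
    return R.T @ (np.array(hand_world, float) - np.array(cam_pos, float))
```

With this, "hand to the right" is always positive camera-space X, no matter which way the user has turned since launch.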

  • @Pentose said:
    Why not transform the position of the hand from world space into camera space?

    Is there a function that already exists for doing that?

    My current solution is to create two rotations that would rotate the gaze vector to point along the positive Z axis, and apply those rotations to the position vector for the hand.
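The "two rotations" idea can be collapsed into a single from-to rotation. A Python/NumPy sketch using Rodrigues' formula (the names are mine, not from any HoloLens API):

```python
import numpy as np

def from_to_rotation(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (Rodrigues)."""
    a = np.array(a, float) / np.linalg.norm(a)
    b = np.array(b, float) / np.linalg.norm(b)
    v = np.cross(a, b)            # rotation axis (unnormalized)
    c = float(np.dot(a, b))       # cosine of the angle
    if np.isclose(c, -1.0):       # opposite vectors: 180-degree turn
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K / (1.0 + c)

# Rotate the gaze onto +Z, then apply the same rotation to the hand
# position, so "right of gaze" always lands on the same axis.
R = from_to_rotation([1.0, 0.0, 0.0], [0.0, 0.0, 1.0])  # gaze happened to be +X
hand_in_gaze_frame = R @ np.array([1.0, 0.0, 1.0])
```

In Unity the same one-step rotation should be available as `Quaternion.FromToRotation(gazeDirection, Vector3.forward)`.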

  • Continuing on this issue, is there any way to make the clicker trigger the Manipulation event?

  • If you click and hold the clicker, then move your hand, it should act as a manipulation.


  • Actually, holding the clicker and rotating your hand will still trigger Navigation (along the Z plane), not Manipulation. Manipulation, which allows for free-form movement along all planes, requires hand tracking.

  • It's my fault, I told Jackson it worked. I was thinking about the shell (which uses navigation events to allow you to 'manipulate' your windows), not the API.

    This post provided as-is with no warranties and confers no rights. Using information provided is done at own risk.

    (Daddy, what does 'now formatting drive C:' mean?)
