Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

How to transform a gesture's relative move vector to move an object?

Maybe it is a newbie question.
In my first HoloLens app I can move an object in a spatially mapped scene.
If I stand directly in front of the scene, the coordinate axes of both worlds are parallel, and the moving directions of my hand and of the object match.
Which transformation(s) do I have to use so that this works correctly from all viewpoints?
I tried a few things without success.

Answers

  • utekai ✭✭✭

    You can capture the position of the object at the start of the gesture, then take the translation reported at the completion of the gesture and add that translation to the object's starting position and ... voila ... you've moved the object relative to the gesture.

  • Thanks utekai for the quick answer!

    But that is what I do, I think: I add the gesture translation to the object's position, using the difference between relativePosition values in the NavigationUpdateEvent as the translation vector.
    But the relativePosition of my hand and the object's position live in different coordinate spaces.
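    A common way to bridge those two spaces (a sketch of the usual approach, not a confirmed solution from this thread) is to rotate the hand-relative delta by the camera's heading before adding it to the object's world position, so that "forward" and "right" always mean forward and right from the user's current viewpoint. The function names below are hypothetical; the math is just a yaw rotation about the world up axis:

    ```python
    import math

    def rotate_about_y(v, yaw_radians):
        """Rotate vector v = (x, y, z) about the world Y (up) axis.

        Uses Unity-style left-handed convention: at yaw = 90 degrees,
        the +Z (forward) axis maps onto +X (right).
        """
        x, y, z = v
        c, s = math.cos(yaw_radians), math.sin(yaw_radians)
        return (c * x + s * z, y, -s * x + c * z)

    def world_delta(hand_delta, camera_yaw_radians):
        """Map a viewer-relative translation (right, up, forward) into world space."""
        return rotate_about_y(hand_delta, camera_yaw_radians)

    # Viewer has turned 90 degrees, so they face world +X. Pushing the hand
    # "forward" (0, 0, 1) in hand space should move the object along world +X.
    dx, dy, dz = world_delta((0.0, 0.0, 1.0), math.pi / 2)
    # dx is now ~1.0 and dz ~0.0: the forward push became a +X move.
    ```

    In a Unity HoloLens app, `Camera.main.transform.TransformDirection(handDelta)` performs the equivalent full rotation (all three axes) into world space; if you only want the horizontal heading, rotate by the camera's yaw as above so vertical hand motion stays vertical.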
