Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Issues with Tracked Controller Position in Unity 2017.2.0b10

Hi,

So I've been getting the Acer Mixed Reality HMD working in Unity over the weekend, and I'm running into some issues with the controllers.

First, I can only get the tracked controllers to be detected at all if I follow the steps outlined here:
https://forums.hololens.com/discussion/8389/detecting-motion-controllers-in-unity
(Holding down the play button, then putting on the headset.) How they worked that out I'll never know.

InteractionSourceState.sourcePose.TryGetPosition(out pos, InteractionSourceNode.Pointer) is working for me; however, the position it returns is right next to the origin. From what I can see, it seems to be centering the controller's positioning system wherever the controllers are first tracked, so the left and right controllers overlap if I just pass the position straight into their transforms.

I've tried parenting them to the camera, but that makes them move with my head, which makes sense.

Is there a specific offset the controllers need, or an object they need to be a child of?

Also,
InteractionSourceState.sourcePose.TryGetPosition(out pos)
and
InteractionSourceState.sourcePose.TryGetPosition(out pos, InteractionSourceNode.Grip)
fail to find a position at all. Does anyone know why that is?
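
For reference, here's roughly how I'm reading the poses right now. This is a trimmed-down sketch rather than my exact project code; the script and field names are just placeholders, and I'm assuming the two controller objects and the camera share the same parent so the tracking-space positions line up.

    using UnityEngine;
    using UnityEngine.XR.WSA.Input;

    public class ControllerPoseTest : MonoBehaviour
    {
        // Placeholder references: both controller transforms and the camera
        // are children of the same rig object.
        public Transform leftController;
        public Transform rightController;

        void OnEnable()
        {
            InteractionManager.InteractionSourceUpdated += OnSourceUpdated;
        }

        void OnDisable()
        {
            InteractionManager.InteractionSourceUpdated -= OnSourceUpdated;
        }

        void OnSourceUpdated(InteractionSourceUpdatedEventArgs args)
        {
            if (args.state.source.kind != InteractionSourceKind.Controller)
                return;

            // The Pointer node gives me a position, but both controllers
            // end up right next to the origin.
            Vector3 pos;
            if (args.state.sourcePose.TryGetPosition(out pos, InteractionSourceNode.Pointer))
            {
                Transform target = args.state.source.handedness == InteractionSourceHandedness.Left
                    ? leftController
                    : rightController;
                target.localPosition = pos;
            }

            // These two both come back false for me.
            Vector3 unused;
            bool anyNode = args.state.sourcePose.TryGetPosition(out unused);
            bool gripNode = args.state.sourcePose.TryGetPosition(out unused, InteractionSourceNode.Grip);
            Debug.LogFormat("no node: {0}  grip: {1}", anyNode, gripNode);
        }
    }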

Answers

  • I don't have a set of motion controllers; however, in the Mixed Reality Toolkit repo there is a branch titled Dev_Unity_2017.2.0 that has an implementation of displaying and tracking the controllers. Might be worth a look.

    https://github.com/Microsoft/MixedRealityToolkit-Unity/tree/Dev_Unity_2017.2.0

  • Hi,

    Thanks, Jesse, for pointing this out. I have just tested the motion controllers in the MotionControllerTest.unity scene of the Dev_Unity_2017.2.0 branch; buttons are detected, but I don't see the motion controllers displayed anywhere. The reported grip position and rotation in the VR debug console are (0,0,0).

    Can you reproduce the issue, or did you manage to see, or at least get the pose of, the motion controllers in the latest Unity 2017.2.0b10?

  • I've been having this issue as well; I'd love to hear if there's a solution to it.

  • Sorry, the reason they don't show up yet is that the implementation in the HTK uses glTF streaming of the model. The model for the controllers will be delivered in a driver update for the Motion Controllers, but that update has not shipped yet.

    That's why nothing is showing up yet in that implementation.

  • Actually, please correct me if I am wrong, but it is not just that the motion controller model is not visible; the reported position and rotation are also (0,0,0).

    For example, in the MixedRealityToolkit-Unity project, in the ControllerVisualizer.InteractionManager_InteractionSourceUpdated method, adding a bit of debug logging shows that the body of:

    if (obj.state.sourcePose.TryGetPosition(out newPosition, InteractionSourceNode.Grip))

    is never entered, and consequently the controller position remains (0,0,0). In addition, there are even LeftControllerOverride and RightControllerOverride fields to override the displayed motion controller 3D model.
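
    For anyone who wants to check the same thing, this is roughly the debug I added (a sketch along the lines of the toolkit handler, not its exact code):

    // Inside ControllerVisualizer (assumes the usual UnityEngine and
    // UnityEngine.XR.WSA.Input usings are already in place).
    private void InteractionManager_InteractionSourceUpdated(InteractionSourceUpdatedEventArgs obj)
    {
        Vector3 newPosition;
        if (obj.state.sourcePose.TryGetPosition(out newPosition, InteractionSourceNode.Grip))
        {
            // Never reached on 2017.2.0b10 in my tests, so the visualizer stays at (0,0,0).
            Debug.Log("Grip position: " + newPosition);
        }
        else
        {
            Debug.Log("TryGetPosition(Grip) returned false for " + obj.state.source.handedness);
        }
    }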

  • Hi!

    I have exactly the same problem on Unity 2017.2.0b10, so I downgraded to Unity 2017.2.0b8 and it "almost" works like a charm.

    Hope Unity will fix this issue in the next version.

  • New version of the Unity beta this morning. I don't have motion controllers to test with, so I don't know whether that fixes it or not. Also, I am expecting a Windows build to drop today as well, and that could always have improvements too...

    https://unity3d.com/unity/beta

  • @Jesse_McCulloch said:
    New version of the Unity beta this morning. I don't have motion controllers to test with, so I don't know whether that fixes it or not. Also, I am expecting a Windows build to drop today as well, and that could always have improvements too...

    https://unity3d.com/unity/beta

    Still not working for me in 2017.2.0b11, but the release notes suggest it should be working.

  • demonixis (edited September 2017)

    Hi there,

    Why don't you use the built-in input API for motion controllers? It's the same for the Vive and the Rift, with no need for a 3rd-party SDK.

    leftControllerTransform.localPosition = InputTracking.GetLocalPosition(XRNode.LeftHand);
    leftControllerTransform.localRotation = InputTracking.GetLocalRotation(XRNode.LeftHand);
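
    A slightly fuller, self-contained version of that idea might look like this (the class name and the hand transforms are placeholders; assign transforms that share a parent with the camera so the local poses line up):

    using UnityEngine;
    using UnityEngine.XR;

    public class XRHandTracker : MonoBehaviour
    {
        // Placeholder references set in the inspector.
        public Transform leftHand;
        public Transform rightHand;

        void Update()
        {
            // Poll the built-in XR input API every frame.
            leftHand.localPosition  = InputTracking.GetLocalPosition(XRNode.LeftHand);
            leftHand.localRotation  = InputTracking.GetLocalRotation(XRNode.LeftHand);
            rightHand.localPosition = InputTracking.GetLocalPosition(XRNode.RightHand);
            rightHand.localRotation = InputTracking.GetLocalRotation(XRNode.RightHand);
        }
    }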
    

    I made a small script to work with the Unity XR input implementation. You can get the sources here.

    var input = XRInput.Instance;
    var triggerLeft = input.GetButtonDown(XRButton.Trigger, true);
    var gripRight = input.GetButtonDown(XRButton.Grip, false);
    

    Finally, this is how I detect the motion controllers. It's tested with the simulator; I don't know how well it works because the simulator is not so easy to use.
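
    If it helps, one way to do the detection part is to ask Unity which XR nodes are currently tracked, for example (just a sketch, not necessarily what my script does internally):

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR;

    public class MotionControllerDetector : MonoBehaviour
    {
        private readonly List<XRNodeState> nodeStates = new List<XRNodeState>();

        void Update()
        {
            InputTracking.GetNodeStates(nodeStates);

            foreach (var state in nodeStates)
            {
                bool isHand = state.nodeType == XRNode.LeftHand || state.nodeType == XRNode.RightHand;
                if (isHand && state.tracked)
                {
                    Vector3 position;
                    if (state.TryGetPosition(out position))
                        Debug.Log(state.nodeType + " tracked at " + position);
                }
            }
        }
    }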

    I hope it can help you.

    French XR/MR developer / Indie Game Developer

  • @demonixis said:
    Hi there,

    Why don't you use the built-in input API for motion controllers? It's the same for the Vive and the Rift, with no need for a 3rd-party SDK.

    leftControllerTransform.localPosition = InputTracking.GetLocalPosition(XRNode.LeftHand);
    leftControllerTransform.localRotation = InputTracking.GetLocalRotation(XRNode.LeftHand);
    

    Yes, good question, and that is what I actually started to use, but the MixedRealityToolkit-Unity project still uses its own API, so I guess we were looking at both to check which one would work.

    In the meantime, with the latest Windows updates and on Unity editor 2017.2.0b11, I just got it to detect the motion controllers and report the right position (with both APIs, by the way; a quick script to compare the two is at the end of this post). So they must have just fixed something.

    However, the 3D model of the motion controller in MixedRealityToolkit-Unity does not seem to be integrated or downloaded yet (Jesse mentioned glTF streaming of the 3D model).

    You also still need the hack of holding down the Unity editor play button while you put the headset on (or putting your finger in front of the proximity sensor) to get the editor to actually detect the controllers, but at least that's enough to start working with them.
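
    In case anyone wants to double-check on their own setup, a throwaway script like this (names are mine) logs the left controller position from both APIs side by side:

    using UnityEngine;
    using UnityEngine.XR;
    using UnityEngine.XR.WSA.Input;

    public class ControllerApiComparison : MonoBehaviour
    {
        void Update()
        {
            // Built-in XR input API.
            Vector3 xrLeft = InputTracking.GetLocalPosition(XRNode.LeftHand);

            // Windows Mixed Reality interaction source API.
            foreach (var state in InteractionManager.GetCurrentReading())
            {
                if (state.source.kind != InteractionSourceKind.Controller ||
                    state.source.handedness != InteractionSourceHandedness.Left)
                    continue;

                Vector3 gripLeft;
                if (state.sourcePose.TryGetPosition(out gripLeft, InteractionSourceNode.Grip))
                    Debug.LogFormat("XR node: {0}  grip: {1}", xrLeft, gripLeft);
            }
        }
    }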

  • Well, I don't use the MRTK, so I can't give you more information about that, sorry. Glad to read that the motion controllers work well with the latest update. I can't wait to get mine soon ;)

    French XR/MR developer / Indie Game Developer

  • Using 2017.2.1f1, I could get it to work by shutting down the MR Portal every time. When you hit the play button in Unity, it will launch the MR Portal and start tracking the controllers.
