Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Any way to fix sporadic motion controller input detection?

Hello everyone, I've spent the past two days working with the immersive motion controllers and the input manager. The code for creating input events is very straightforward and useful, and yet I'm having a difficult time getting the motion controllers to activate consistently within my Unity scenes. I can see the rendered versions of the controllers in the home environment scene before entering play mode, but once my scene starts there's about a 20/80 chance at best that the controllers respond to my given input buttons.

Someone mentioned a neat trick to get the controllers working: hold the Unity editor play button, put on the headset, and then release the button once in the headset (not sure how much that release matters, considering control leaves the desktop at that point anyway). That worked for me the very first time, but never again after that.

One thing I did notice is that sometimes, if I press the Windows button to exit and then return to the editor scene by selecting the now-open app window, control is restored. Could it be an issue where events aren't being registered during the first launch, but are upon returning to a suspended app? Any help is appreciated, thank you.

I'll also add my current custom controller class, in case I am doing something wrong in how I declare and assign these input events.

        // Assumes the UnityEngine.XR.WSA.Input namespace (Unity 2017.2+).
        public class OVRImmersiveButton : OVR3DButton {

            // Use this for initialization
            void Start() {
                base.Start();

                InteractionManager.InteractionSourcePressed += InteractionManager_SourcePressed;
                InteractionManager.InteractionSourceDetected += InteractionManager_SourceDetected;
            }

            void Update() {
                base.Update();
            }

            // Unsubscribe so stale handlers don't accumulate across scene
            // reloads or repeated editor play sessions.
            void OnDestroy() {
                InteractionManager.InteractionSourcePressed -= InteractionManager_SourcePressed;
                InteractionManager.InteractionSourceDetected -= InteractionManager_SourceDetected;
            }

            void InteractionManager_SourceDetected(InteractionSourceDetectedEventArgs args)
            {
                Debug.Log("Source detected, initializing immersive button");
                // args.state has the current state of the source,
                // including id, position, kind, etc.
            }

            void InteractionManager_SourcePressed(InteractionSourcePressedEventArgs args)
            {
                // args carries the press type plus the source state:
                // position, velocity, source loss risk, source id
                // (which hand, for example), and source kind
                // (hand, voice, controller, or other).
                Debug.Log("Touchpad pressed; targeted object == " + base.crosshair.TargettedObject.name + ", this.gameObject == " + this.gameObject.name);
                if (args.pressType == InteractionSourcePressType.Touchpad && base.crosshair.TargettedObject == this.gameObject) {
                    Debug.Log("This button pressed");
                    base.Click();
                }
            }
        }

Best Answer

  • EvoxVR
    edited September 2017 · Accepted Answer

    Just an update: I switched from using input events to a hard input check in the controllers, and it works much better. I get the feeling that assigning an input event to all of the buttons caused every one of them to process an event call whenever input was detected, so in a scene where I have 30 button objects and up to 8 different instances active at a time, it was just overloading the stack with event calls.

    My current implementation looks more like this for a simple touchpad press:

    var interactionSourceStates = InteractionManager.GetCurrentReading ();

    // Guard against an empty reading before indexing into the array.
    if (interactionSourceStates.Length > 0 && interactionSourceStates [0].touchpadPressed) {
        Debug.Log("Touchpad pressed; targeted object == " + base.crosshair.TargettedObject.name + ", this.gameObject == " + this.gameObject.name);
        base.Click ();
    }

    and this for a simple thumbstick reading of left or right:

    var interactionSourceStates = InteractionManager.GetCurrentReading ();

    if (interactionSourceStates.Length > 0) {
        // Positive x on the thumbstick axis is right, negative x is left.
        if (interactionSourceStates [0].thumbstickPosition.x >= 0.1f) {
            Debug.Log ("Stick moved right");
        } else if (interactionSourceStates [0].thumbstickPosition.x <= -0.1f) {
            Debug.Log ("Stick moved left");
        }
    }

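    Putting the two checks together, here is a minimal sketch of the polling approach as a single Unity component, assuming the `UnityEngine.XR.WSA.Input` namespace from Unity 2017.2+ (the class name `PolledControllerInput` is hypothetical; the `crosshair`/`TargettedObject` members from the poster's base class are omitted here):

        // Sketch only: polls controller state each frame instead of
        // subscribing to InteractionManager events.
        using UnityEngine;
        using UnityEngine.XR.WSA.Input;

        public class PolledControllerInput : MonoBehaviour {

            const float StickDeadzone = 0.1f;

            void Update() {
                var states = InteractionManager.GetCurrentReading();

                foreach (var state in states) {
                    // Skip hand and voice sources; only poll motion controllers.
                    if (state.source.kind != InteractionSourceKind.Controller)
                        continue;

                    if (state.touchpadPressed)
                        Debug.Log("Touchpad pressed on source " + state.source.id);

                    if (state.thumbstickPosition.x >= StickDeadzone)
                        Debug.Log("Stick moved right");
                    else if (state.thumbstickPosition.x <= -StickDeadzone)
                        Debug.Log("Stick moved left");
                }
            }
        }

    Filtering on `InteractionSourceKind.Controller` and iterating over every returned source avoids assuming that index 0 is always a controller, which may be part of why the event-based version only fired intermittently.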