Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way to connect with us is through the Mixed Reality Developer Program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow and tag your questions with either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Hands No Longer Detected After Switching Back to 3D View From 2D View


I am using HoloToolkit 2017.4.1 and Unity 2017.3.1f1 in my project.

My project uses the ApplicationViewManager class from the HoloToolkit to navigate to a 2D XAML page that displays a WebView, and then returns to the 3D Unity view when the user presses a button on the XAML page.

The app uses a tap gesture to navigate to the XAML page by calling the LaunchXaml function from the object currently being focused/gazed at.
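For context, the underlying UWP view-switching pattern that the HoloToolkit's ApplicationViewManager wraps looks roughly like this (a simplified sketch using standard Windows.UI APIs; the exact LaunchXaml signature in the HoloToolkit differs, and WebViewPage here stands in for my actual XAML page):

```csharp
using Windows.ApplicationModel.Core;
using Windows.UI.Core;
using Windows.UI.ViewManagement;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

// Create a secondary app view hosting the 2D XAML page,
// then switch from the holographic (3D) view to it.
CoreApplicationView newView = CoreApplication.CreateNewView();
int newViewId = 0;

await newView.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
{
    var frame = new Frame();
    frame.Navigate(typeof(WebViewPage)); // WebViewPage is a placeholder name
    Window.Current.Content = frame;
    Window.Current.Activate();
    newViewId = ApplicationView.GetForCurrentView().Id;
});

// Switching back later goes through ApplicationViewSwitcher again,
// this time targeting the original holographic view's id.
await ApplicationViewSwitcher.SwitchAsync(newViewId);
```

The round trip back to the 3D view is the step where the problem below appears.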

The problem I have is that when the CallBack function brings the user back to the 3D Unity app, hands are no longer detected: the user can no longer use any tap or manipulation gestures.

If only the navigation tap were affected, I would work around it using speech commands; but since all gesture functionality is lost, there doesn't seem to be a way to work around this issue while preserving the app's key functionality.

If anyone has encountered or solved this problem before, I would appreciate any assistance you can provide.

