Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Using a Windows MR HMD with a motion platform/simulator

There is motion platform software (e.g. SimTools) that captures in-game G-force/telemetry data and outputs it to a motion platform. There is also software that can take the motion data from a Vive controller and subtract it from the HMD's motion data. This lets a person sit on a motion platform (with the controller mounted to it) while wearing a Vive HMD, with the motion of the platform removed from the view displayed inside the HMD. Why you wouldn't just mount the lighthouse base stations to the platform is beyond me.
A Windows MR device has no lighthouse tracking, so all motion data comes from the HMD's inside-out sensors, and the controllers have to remain in view of those sensors, which makes the above solution impossible.
So my question is: is there a way for the G-force/motion data from the game (DCS, Project Cars 2, etc.) to be subtracted from the HMD's positional data, so that the screen displays the direction you are looking instead of the direction you are looking plus the motion of the platform?
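For what it's worth, the core of what you're asking for is a per-frame pose subtraction. This is only a minimal sketch of that idea, with hypothetical names and simplified math (component-wise Euler-angle subtraction, which is only a small-angle approximation; a real implementation would compose quaternions and would need a driver-level hook, which Windows MR doesn't expose the way an OpenVR lighthouse rig does):

```python
# Sketch of motion cancellation: subtract the platform's commanded pose
# (known from the game telemetry driving the motion rig) from the pose
# the HMD's inside-out tracking reports, leaving only the user's head
# movement relative to the seat. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float            # position in metres
    y: float
    z: float
    yaw: float          # orientation in degrees (Euler angles;
    pitch: float        # component-wise subtraction is a small-angle
    roll: float         # approximation only)

def cancel_platform_motion(hmd: Pose, platform: Pose) -> Pose:
    """Return the HMD pose with the platform's motion removed."""
    return Pose(
        hmd.x - platform.x,
        hmd.y - platform.y,
        hmd.z - platform.z,
        hmd.yaw - platform.yaw,
        hmd.pitch - platform.pitch,
        hmd.roll - platform.roll,
    )

# Example: the platform pitches up 5 degrees and surges 0.1 m forward
# while the user's head stays still relative to the seat. The tracker
# sees that motion, but after cancellation the pose reads as unmoved.
hmd_pose = Pose(0.1, 0.0, 0.0, 0.0, 5.0, 0.0)       # reported by the HMD
platform_pose = Pose(0.1, 0.0, 0.0, 0.0, 5.0, 0.0)  # commanded by the rig
corrected = cancel_platform_motion(hmd_pose, platform_pose)
print(corrected)
```

The hard part isn't the arithmetic, it's getting the corrected pose back into the compositor: with lighthouse tracking you can intercept poses in an OpenVR driver, but Windows MR's inside-out tracking has no equivalent public hook, which is exactly the gap the question is about.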
I’ve re-read this a few times and it makes sense to me. Hopefully it makes sense to you all as well. Thanks for taking the time to read this.
John
