Shared experience - Moving objects
Based on the 240 tutorial, we are trying to build a shared-experience app that lets two or more users see the same objects in a room and also move and manipulate them.
We are using the same message-based server from the tutorial and, for now, are attempting to send an object's position. In addition to each remote user's head position, we also send their gaze information: for each remote user we draw a line (a "laser") that starts at their head and ends at their gaze's hit position.
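The gaze laser described above boils down to one message per remote user and a simple endpoint calculation. A minimal, engine-agnostic Python sketch of that idea (the names `GazeMessage`, `laser_endpoint`, and the message fields are hypothetical, not from the 240 tutorial):

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other):
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

    def scaled(self, s):
        return Vec3(self.x * s, self.y * s, self.z * s)

@dataclass
class GazeMessage:
    """One gaze update per remote user: head position plus gaze direction
    and hit distance, enough for everyone else to draw the laser line."""
    user_id: int
    head: Vec3
    direction: Vec3   # assumed to be normalized
    hit_distance: float

def laser_endpoint(msg: GazeMessage) -> Vec3:
    """End of the laser: head position plus direction scaled by hit distance."""
    return msg.head + msg.direction.scaled(msg.hit_distance)
```

Each client would serialize a `GazeMessage` into the tutorial's custom-message stream and, on receipt, draw a line from `msg.head` to `laser_endpoint(msg)`.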
The main issue is that every time we run the app we see strange behavior, such as:
1) Only one user sends information (the other can only watch and has no functionality).
2) Only one user, or neither, can see the shared object.
3) In general, "funny" and unexpected behavior.
We also have doubts about the architecture of the sharing logic.
As it stands, every user broadcasts what they see and receives what every other user sees.
Is this considered a good approach to sharing? We suspect it is causing the strange behavior we are experiencing.
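For context on the question above: when every user broadcasts an object's transform at once, clients can receive conflicting updates, which matches the symptoms described. A common alternative is a single-owner (authority) model: each shared object has exactly one owner at a time, only the owner broadcasts its transform, and a user must acquire ownership before manipulating it. A minimal sketch of that idea, with hypothetical names (not an API from the 240 tutorial):

```python
class SharedObject:
    """Shared object with a single authoritative owner.

    Only the current owner may move the object and broadcast its
    position; everyone else just applies what the owner sends.
    This prevents two users from writing conflicting transforms.
    """

    def __init__(self, object_id):
        self.object_id = object_id
        self.owner_id = None
        self.position = (0.0, 0.0, 0.0)

    def request_ownership(self, user_id):
        # First come, first served; a real server would arbitrate
        # concurrent requests. Returns True if user_id now owns it.
        if self.owner_id is None:
            self.owner_id = user_id
        return self.owner_id == user_id

    def release_ownership(self, user_id):
        if self.owner_id == user_id:
            self.owner_id = None

    def try_move(self, user_id, new_position):
        """Accept a move only from the current owner."""
        if self.owner_id != user_id:
            return False
        self.position = new_position
        return True
```

Under this scheme, a manipulation gesture would first call `request_ownership`, move the object while owned, then `release_ownership` when done; non-owners simply render the broadcast position.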
Thanks in advance!