Holograms 240 Issue: Shared Room Offset in Real World
I'm creating an app using the sharing service from Holograms 240. Everything works as it's supposed to: the two users join the room, and each can see the cube that represents the other's head in the proper position around the central hologram. (To simplify things, I swapped in the RemoteHeadManager script from HoloToolkit-Unity in place of the RemotePlayerManager; it seems to work fine, and the head movement matches perfectly.)
However, the room itself sits in a different location for each user, offset by roughly the distance between the two HoloLenses. As a result, the cubes representing the heads don't line up with the actual HoloLens devices in the real world.
From what I understand from other questions here, the first HoloLens exports a world anchor and sends it to any connecting HoloLens. The receiving HoloLens then tries to locate that anchor against what its own sensors can see. I'm not sure why the second HoloLens in my case can't accurately place the world anchor, given that both HoloLenses are in the same room.
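For reference, the hand-off the tutorial relies on looks roughly like this. This is a sketch, not the tutorial's exact code; identifiers such as `sharedObject` and `"shared-anchor"` are illustrative, and the API is `WorldAnchorTransferBatch` from `UnityEngine.VR.WSA.Sharing` in the Unity version the tutorial targets:

```csharp
// On the first (exporting) HoloLens: serialize the anchor and stream the
// bytes to peers via the sharing service.
WorldAnchor anchor = sharedObject.AddComponent<WorldAnchor>();
WorldAnchorTransferBatch batch = new WorldAnchorTransferBatch();
batch.AddWorldAnchor("shared-anchor", anchor);
WorldAnchorTransferBatch.ExportAsync(batch,
    (byte[] data) => { /* append to a buffer, send over the sharing service */ },
    (SerializationCompletionReason reason) => { /* check reason == Succeeded */ });

// On the second (importing) HoloLens: deserialize and lock the shared
// object to the imported anchor.
WorldAnchorTransferBatch.ImportAsync(receivedBytes,
    (SerializationCompletionReason reason, WorldAnchorTransferBatch imported) =>
    {
        if (reason == SerializationCompletionReason.Succeeded)
        {
            // LockObject only pins the object correctly once the device has
            // matched the anchor's spatial data against its own scan of the room.
            imported.LockObject("shared-anchor", sharedObject);
        }
        // Until the import succeeds and the anchor is located, the two
        // devices are NOT in a shared frame, which would produce exactly
        // the kind of offset described above.
    });
```

If the import silently fails (or the anchor never becomes located), each device falls back to its own coordinate frame, and the offset between the rooms would be whatever the difference between those frames happens to be.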
Am I missing something? Or is this how the tutorial is supposed to work?