Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Would it be possible to share the spatial mapping mesh between two hololens in addition to anchors?

I'm having issues with having placed objects in a shared experience being offset from where they should be despite exporting a world anchor and transferring that anchor to the second hololens.

I've followed the basic setup from the Academy 240 sharing course, with the only difference being that I'm not using the sharing service but rather Unity's UNET networking to transfer the anchor between the HoloLenses. This method is demonstrated on the Microsoft GitHub here:

https://github.com/Microsoft/HoloToolkit-Unity/tree/master/Assets/HoloToolkit-Examples/SharingWithUNET
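For reference, the anchor export/import flow that both the 240 course and the UNET sample build on is Unity's WorldAnchorTransferBatch API. A minimal sketch, with the networking stripped out (the `SendAnchorData` helper is hypothetical, standing in for whatever UNET transport you use, and the `"sharedRoot"` identifier is illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.WSA;          // WorldAnchor (UnityEngine.VR.WSA in Unity 5.x)
using UnityEngine.XR.WSA.Sharing;  // WorldAnchorTransferBatch

public class AnchorSharer : MonoBehaviour
{
    private readonly List<byte> exportedData = new List<byte>();

    // Host side: serialize the anchor attached to 'anchorRoot'.
    public void ExportAnchor(GameObject anchorRoot)
    {
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor("sharedRoot", anchorRoot.GetComponent<WorldAnchor>());
        WorldAnchorTransferBatch.ExportAsync(batch,
            data => exportedData.AddRange(data),   // serialized chunks arrive incrementally
            reason =>
            {
                if (reason == SerializationCompletionReason.Succeeded)
                    SendAnchorData(exportedData.ToArray()); // hypothetical: your UNET send
            });
    }

    // Client side: deserialize the received bytes and lock the matching
    // root object to the imported anchor.
    public void ImportAnchor(byte[] received, GameObject anchorRoot)
    {
        WorldAnchorTransferBatch.ImportAsync(received, (reason, batch) =>
        {
            if (reason == SerializationCompletionReason.Succeeded)
                batch.LockObject("sharedRoot", anchorRoot);
        });
    }

    private void SendAnchorData(byte[] data) { /* UNET transfer omitted */ }
}
```

If the import succeeds, the anchored root occupies the same physical pose on both devices, so holograms parented under it should line up.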

Once the host creates the game and successfully exports a world anchor, and a client connects with a second HoloLens, they both see each other with no problem, and the relative positions of the users' heads are fairly accurate (I parent a cube under the camera in Unity, and all players are children of the anchored game object).
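The hierarchy described above (one shared anchored root, with per-player objects and placed holograms as its children) implies a convention worth making explicit: any position sent over the network should be expressed in the root's local space, not world space, because each HoloLens has its own world origin. A sketch of that convention, with illustrative names:

```csharp
using UnityEngine;

// All shared content lives under one anchored root. Positions sent over the
// network are local to that root, so each device resolves them against its
// own imported copy of the same world anchor.
public class SharedRoot : MonoBehaviour
{
    // Place a hologram at a position received from the network (root-local space).
    public GameObject PlaceHologram(GameObject prefab, Vector3 localPos, Quaternion localRot)
    {
        GameObject hologram = Instantiate(prefab, transform);
        hologram.transform.localPosition = localPos;
        hologram.transform.localRotation = localRot;
        return hologram;
    }

    // Convert a world-space hit point (e.g. from a gaze raycast) into the
    // root-local coordinates that should be sent to the other device.
    public Vector3 ToSharedSpace(Vector3 worldPos)
    {
        return transform.InverseTransformPoint(worldPos);
    }
}
```

An offset like the one described in this thread often comes from accidentally sending world-space coordinates, which differ per device.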

The problem is that when a user places a hologram, both the host and the client see it, but it is offset in space. User A may see the hologram directly between themselves and User B, while User B sees that same hologram behind User A, off to the side, or behind User B; there doesn't seem to be any rhyme or reason to the offset.

I've tried starting each experience standing in exactly the same spot, etc.; nothing seems to make a difference.

Both HoloLenses are completely up to date.

One idea I've been wondering about: can the scanned meshes from spatial mapping be shared in order to help accurately place the anchors? Or is that completely off base?

Has anyone else seen this behavior? Any advice?

Best Answer

  • stepan_stulov ✭✭✭
    edited April 2017 · Answer ✓

    One of the ideas I was wondering is can the scanned meshes of the spatial mapping be shared in order to help accurately place the anchors?

Even if it could be shared, it wouldn't help, because anchors are not stabilised against the spatial mapping. Besides, whatever data anchors are stabilised against, that stabilisation always runs against the unmodified original data; you can neither modify nor substitute it. All you have is a one-way feed of smoothed/processed/reconstructed spatial-mesh snapshots.

I'm sorry, I can't comment on the shared experience otherwise; I haven't tried it yet.

    Building the future of holographic navigation. We're hiring.

Answers


  •

    @stepan_stulov said:

    One of the ideas I was wondering is can the scanned meshes of the spatial mapping be shared in order to help accurately place the anchors?

    Even if it can be shared it won't help because anchors are not stabilised against spatial mapping. Besides, whatever data anchors are stabilised against, that stabilisation always runs against the unmodified original data. You can neither modify, nor substitute it. All you have is a one way feed of a smoothened/processed/reconstructed spatial mesh snapshots.

    OK, thanks for the info. I was under the impression that the triangulation happened against landmarks in the spatial mapping mesh, and thus assumed that sharing the same mesh would help the anchors be co-located.
