Hello everyone.

We have decided to phase out the Mixed Reality Forums over the next few months in favor of other ways to connect with us.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

The plan between now and the beginning of May is to clean up old, unanswered questions that are no longer relevant. The forums will remain open and usable.

On May 1st we will be locking the forums to new posts and replies. They will remain available for another three months for the purposes of searching them, and then they will be closed altogether on August 1st.

So, where does that leave our awesome community to ask questions? Well, there are a few places we want to engage with you. For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality. If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers. And always feel free to hit us up on Twitter @MxdRealityDev.

World anchor sharing with UNET

rajraj
edited August 2016 in Questions And Answers

Hi,
I've been trying to share world anchors across HoloLens devices with UNET, so that there's no need for a separate machine for "hosting". I'm trying to do simple head tracking like the one in the HoloToolkit sharing test scene. Here's where I'm at so far.

Similar to the ImportExportAnchorManager, I have my own class that creates the byte array to be shared across devices, using the WorldAnchorTransferBatch class.
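For reference, a minimal sketch of that export path, assuming the Unity 5.x-era `UnityEngine.VR.WSA.Sharing` API (the `AnchorExporter` class and the `"sharedAnchor"` id are placeholders, not from the poster's project):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.VR.WSA;
using UnityEngine.VR.WSA.Sharing;

// Hypothetical helper: serializes a WorldAnchor into a byte array
// that can then be sent to the other clients over UNET.
public class AnchorExporter : MonoBehaviour
{
    private List<byte> exportBuffer = new List<byte>();

    public void ExportAnchor(WorldAnchor anchor)
    {
        var batch = new WorldAnchorTransferBatch();
        batch.AddWorldAnchor("sharedAnchor", anchor);
        WorldAnchorTransferBatch.ExportAsync(batch, OnExportDataAvailable, OnExportComplete);
    }

    private void OnExportDataAvailable(byte[] data)
    {
        // The export data arrives in chunks; accumulate them.
        exportBuffer.AddRange(data);
    }

    private void OnExportComplete(SerializationCompletionReason reason)
    {
        if (reason == SerializationCompletionReason.Succeeded)
        {
            byte[] blob = exportBuffer.ToArray();
            // Send 'blob' to the other clients, e.g. over a UNET channel,
            // where it is passed to WorldAnchorTransferBatch.ImportAsync.
        }
    }
}
```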

The generated anchor blob is received at the other end, and I'm getting the import-complete callback too. But the holograms are not where they are supposed to be. I saw that the InverseTransform of the point is being passed in the HoloToolkit sample for head tracking; I didn't really understand why that's needed, but I tried it as well, with no luck. Is there anything more I should be doing?
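On the InverseTransform question: each device has its own world coordinate frame, so a raw world position is meaningless on another device; the shared anchor is the only common reference, which is why poses are sent anchor-relative. A hedged sketch of that idea (helper names are illustrative, not from the HoloToolkit):

```csharp
using UnityEngine;

// Illustrative helpers showing why the HoloToolkit sample sends
// anchor-relative coordinates rather than raw world positions.
public static class AnchorRelative
{
    // Sending device: convert the head's world position into the
    // anchor's local space before putting it on the wire.
    public static Vector3 Encode(Transform anchor, Vector3 worldPos)
    {
        return anchor.InverseTransformPoint(worldPos);
    }

    // Receiving device: convert back using that device's own copy
    // of the imported (and locked) anchor.
    public static Vector3 Decode(Transform anchor, Vector3 relativePos)
    {
        return anchor.TransformPoint(relativePos);
    }
}
```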


Answers

  • Just wanted to add some additional info :smile:
    I'm using the UNET HLAPI for this. I'm thinking of leveraging Unity's NetworkTransform component to sync transforms, and the head objects are the player objects in the network.

  • You are right. My issue was that I was expecting the UNET NetworkTransform component to track the placement on all the clients. That conflicts with the requirement that the position of an object on each client be updated relative to that client's own imported anchor.

    So far I've got sharing working with UNET alone; there's no need for additional sharing services now.

  • @raj
    Would it be possible to share your UNET-only project (without the sharing service) via GitHub? That would be a great help. Thanks.

  • @JonathanHager said:

    On the client use the relative position and rotation to calculate where and in what orientation the object should be.
    Calculate the orientation from the anchor and the relative orientation as:

    Quaternion rotation = relativeOrientation * anchor.transform.localRotation;

    I would love to hear how it goes using the Shared WorldAnchors. I have it on my list to try.

    Most of what you said is good stuff and got me on the right track, but relativeOrientation and anchor.transform.localRotation are in the wrong order for the 'decoding' on the receiver's end. The * operator for quaternions is not commutative.
    The correct line would be: Quaternion rotation = anchor.transform.localRotation * relativeOrientation;

    I'm not great with rotations in a 3D world, so I just think of it as having to point in the direction of the anchor before I apply an anchor-relative rotation.

    Sorry for the post-necro, but this is the most fitting thread I found.
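Putting the replies above together, here is a sketch of a player script that syncs an anchor-relative head pose over the UNET HLAPI, using the corrected quaternion order. It assumes the anchor object is at the scene root (so localRotation equals rotation); the `SharedHead` class and its fields are illustrative, not the poster's actual project:

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch: sync a player's head pose as anchor-relative values.
// 'anchor' is each device's own, locally imported copy of the shared anchor.
public class SharedHead : NetworkBehaviour
{
    public Transform anchor;  // assign after the anchor import completes

    [SyncVar] private Vector3 relativePosition;
    [SyncVar] private Quaternion relativeOrientation;

    void Update()
    {
        if (anchor == null)
            return;

        if (isLocalPlayer)
        {
            // Encode this device's camera pose relative to its anchor.
            Transform head = Camera.main.transform;
            CmdSendPose(
                anchor.InverseTransformPoint(head.position),
                Quaternion.Inverse(anchor.localRotation) * head.rotation);
        }
        else
        {
            // Decode with this device's own anchor. Note the order:
            // anchor rotation first, then the anchor-relative rotation.
            transform.position = anchor.TransformPoint(relativePosition);
            transform.rotation = anchor.localRotation * relativeOrientation;
        }
    }

    [Command]
    void CmdSendPose(Vector3 pos, Quaternion rot)
    {
        // SyncVars propagate from the server to all clients.
        relativePosition = pos;
        relativeOrientation = rot;
    }
}
```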
