Better Manipulation Like Hologram App
I've followed Holograms 211 and gotten rotation and translation to work. However, translation (manipulation) feels sluggish and inaccurate, especially when compared to the Holograms app. How can I get a better translation experience than what is shown in 211?
211 essentially stores the first hand position on ManipulationStarted, then computes a vector between that first position and the current position on ManipulationUpdated, and updates transform.position by the same offset. This works, but again it feels very sluggish compared to whatever the Holograms app is doing.
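For reference, here's a minimal sketch of that 211-style direct-offset approach as I understand it. The class and method names are illustrative, not the actual tutorial code; it assumes the hand position is fed in from whatever gesture handler you've wired up:

```csharp
using UnityEngine;

// Sketch of the 211-style direct-offset manipulation (names are illustrative).
public class DirectManipulation : MonoBehaviour
{
    private Vector3 handStartPosition;    // hand position when manipulation began
    private Vector3 objectStartPosition;  // hologram position when manipulation began

    // Call from your gesture handler's ManipulationStarted event.
    public void OnManipulationStarted(Vector3 handPosition)
    {
        handStartPosition = handPosition;
        objectStartPosition = transform.position;
    }

    // Call from ManipulationUpdated: move the hologram by the same offset the hand moved.
    public void OnManipulationUpdated(Vector3 handPosition)
    {
        Vector3 offset = handPosition - handStartPosition;
        transform.position = objectStartPosition + offset; // 1:1 mapping feels sluggish at distance
    }
}
```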
I could apply a scale factor to this vector, which would increase speed, but I don't know if that would feel natural. The Holograms app almost makes it seem as if the hologram is attached to the end of a pole that the user is holding: the further the hologram is from the user, the faster it moves. Is this the way it's actually working? And can anyone share sample code that would achieve a similar behavior?
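To make the question concrete, here's a hedged sketch of the "pole" idea I'm describing: scale the hand offset by the ratio of the hologram's distance to the hand's distance from the head, so a far hologram moves proportionally faster. This is a guess at the behavior, not the actual Holograms app code, and all names here are my own:

```csharp
using UnityEngine;

// Hedged sketch: lever-arm scaling of the manipulation offset.
// A hologram N times further from the head than the hand moves N times as fast.
public class LeverArmManipulation : MonoBehaviour
{
    private Vector3 handStartPosition;
    private Vector3 objectStartPosition;
    private float leverScale = 1f;

    public void OnManipulationStarted(Vector3 handPosition)
    {
        handStartPosition = handPosition;
        objectStartPosition = transform.position;

        // Lever arm: ratio of hologram distance to hand distance from the head.
        Vector3 head = Camera.main.transform.position;
        float handDistance = Mathf.Max(Vector3.Distance(head, handPosition), 0.1f);
        float objectDistance = Vector3.Distance(head, objectStartPosition);
        leverScale = Mathf.Max(1f, objectDistance / handDistance);
    }

    public void OnManipulationUpdated(Vector3 handPosition)
    {
        Vector3 offset = handPosition - handStartPosition;
        transform.position = objectStartPosition + offset * leverScale;
    }
}
```

Clamping the scale to at least 1 keeps nearby holograms tracking the hand 1:1 instead of slowing down, but I don't know if that matches what the Holograms app does.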