Hologram positioning and spatial anchors
I have a problem displaying underground holograms in a Unity app. The scenario is as follows:
- The app is supposed to run outdoors.
- I need to display some GameObjects (say, a "treasure chest") underground (up to 3m deep) at a relatively far distance (up to 20m).
- Objects are added to the scene on load.
- The position of these objects is relative to the user's initial position in a rigid coordinate system (for example, 4m forward and 2m left of the user's initial position).
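The placement described above could be sketched roughly like this in Unity. This is a minimal illustration of the setup, not code from the original post; the class and field names (`UndergroundPlacer`, `treasurePrefab`) are assumptions for the example.

```csharp
using UnityEngine;

// Hypothetical sketch: on scene load, spawn an object at a fixed offset
// from the user's initial (camera) pose, in a rigid coordinate system.
public class UndergroundPlacer : MonoBehaviour
{
    public GameObject treasurePrefab; // assigned in the Inspector (assumption)

    void Start()
    {
        Transform cam = Camera.main.transform;
        // e.g. 4m forward, 2m left, 3m underground relative to the start pose
        Vector3 offset = cam.forward * 4f - cam.right * 2f + Vector3.down * 3f;
        Instantiate(treasurePrefab, cam.position + offset, Quaternion.identity);
    }
}
```

With this rigid scheme, nothing ties the spawned object back to the real world after placement, which matches the drift symptoms described below.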
The issue is as follows:
Such holograms are displayed with a large shift (towards the camera). Some notes:
- The shift is proportional to distance: the greater the distance, the greater the positioning error.
- As I said, I run this app outdoors, so the surroundings also matter. In a field with no objects around, the error is larger; the more real-world objects there are nearby, the better the positioning. I suspect this is related to spatial mapping.
- To rule out an optical illusion, I added a thin cylinder projecting from the object down to the ground.
- The shift only appears at a distance; when I walk up to the object, it seems to be almost in its correct place.
- The same objects placed at ground level are displayed MUCH more accurately.
What I already checked:
1. Microsoft docs
From these I learned that one possible cause is using rigid relationships. The docs say that [spatial anchors](https://developer.microsoft.com/en-us/windows/mixed-reality/spatial_anchors) could be a solution, but I'm a bit confused by them:
- The common scenario for using them is when the user places an object relative to the real world, which is not my case.
- The distance between objects can sometimes be more than 5m, so I'm not sure whether I can group my objects into a single cluster.
- Unity API. From the MS docs, anchors look like something I should place in the scene relative to the real world and then position my objects in their space. But in Unity, WorldAnchor is a component bound to a GameObject itself. I tried playing with it a little, but haven't succeeded in improving the positioning.
- To my knowledge, anchors are based on the spatial mapping mesh, but my objects are below it (I assume this is the main reason for the positioning error on underground objects).
2. Experiments from Digital Reality Guru. He shares some experience and conclusions about working with HoloLens outdoors.
3. Holographic Academy 101 and 240. These are the only samples I found that display something underground. They show an interesting approach, and just to check, I tried putting my objects in a "black box" as they do, so there would be some mesh near the objects, but the positioning issue remains.
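One pattern I have been experimenting with (sketched below, assuming Unity 2017+ with the `UnityEngine.XR.WSA` namespace) is to put the WorldAnchor on an empty parent object at ground level, where spatial mapping data exists, and make the underground hologram a child with a local downward offset. The class and member names here are illustrative assumptions, not a confirmed solution:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;

// Sketch of the anchor-parenting pattern: the anchor sits at a point on
// mapped surfaces (ground level); the underground object rides along as
// a child in the anchor's local space.
public class AnchoredUnderground : MonoBehaviour
{
    public GameObject chest; // the hologram to stabilize (assumption)

    public void AnchorAt(Vector3 groundPoint)
    {
        // The anchor object itself stays at ground level, near real geometry
        var anchorRoot = new GameObject("UndergroundAnchor");
        anchorRoot.transform.position = groundPoint;
        anchorRoot.AddComponent<WorldAnchor>(); // locks anchorRoot to the real world

        // Note: an anchored object's own transform must not be moved,
        // but children can be positioned freely, e.g. 3m below ground.
        chest.transform.SetParent(anchorRoot.transform, worldPositionStays: false);
        chest.transform.localPosition = Vector3.down * 3f;
    }
}
```

In my understanding this at least lets the tracking correct the anchor against nearby mapped surfaces, even though the hologram itself is below the mesh; I have not verified that it fully removes the distance-dependent shift.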
Maybe I'm asking too much of the currently available tech, but I hope to improve the positioning anyway.
If anyone has comments/remarks/ideas/corrections regarding what I've described, I'll appreciate any feedback.
Thanks for reading!