Hello everyone.

We have decided to phase out the Mixed Reality Forums over the next few months in favor of other ways to connect with us.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

The plan between now and the beginning of May is to clean up old, unanswered questions that are no longer relevant. The forums will remain open and usable.

On May 1st we will be locking the forums to new posts and replies. They will remain available for another three months for the purposes of searching them, and then they will be closed altogether on August 1st.

So, where does that leave our awesome community to ask questions? Well, there are a few places we want to engage with you. For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality. If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers. And always feel free to hit us up on Twitter @MxdRealityDev.

Infrared Scanning for Spatial Mapping

AR123
edited June 2016 in Discussion

The next version of the HoloLens should provide infrared scanning so it can spatially map dark spaces. Walking around areas with poor lighting really affects the spatial mapping experience. Industries where low lighting is inevitable, such as subway construction, could really take advantage of such a feature in the device.

Comments

  • This could probably be used to distinguish and map reflective surfaces as well.

  • Jarrod1937 ✭✭✭

    Technically they're already using infrared. There are two components to the HoloLens's tracking: spatial mapping and anchoring. As best I can tell, it uses infrared to scan the environment and the visual camera to track anchor points. So it can certainly keep scanning in the dark, but it will have difficulty tracking the anchors.
    So the trick would be finding two distinct bands of infrared that can be used without interfering with each other. However, materials react differently to infrared and may not reflect as much as you'd like, which can easily let natural sources of infrared overwhelm the intended signal.
    The better option might be low-light image sensors and optics that gather as much light as possible from the environment. That may add weight and draw more power, which in turn means more battery capacity and yet more weight.
    Reflective surfaces, by their nature, are difficult to scan spatially with light. Their high directionality means that any scanning technique that relies on reflected data needs more samples at varying angles to get the same quality of scan as it would from a more diffuse material. While higher directionality means a stronger signal, it also means you either need a larger sensor to pick up a wider angle of reflected light, or you have to move the sensor around and take multiple samples; otherwise the light may bounce off in a direction that never reaches the sensor at all, and holes form in the scan. (There's a toy illustration of this after the comments.)
    I'm just mentioning this so people understand that engineering is usually a game of tradeoffs. While I'd of course like to see improvement in this area as well (imagine a horror game in the semi-dark at home!), I'm not sure how else they could do it given the current constraints.
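As a rough illustration of the reflective-surface point above, here is a small Python sketch. It uses a toy Phong-style reflectance model, not anything the HoloLens actually does, and the shininess values and 0.2 "usable return" threshold are made-up assumptions purely for illustration. It estimates over what range of sensor angles a co-located emitter and camera still get a usable return from a diffuse surface versus a mirror-like one.

```python
import numpy as np

# Toy comparison of how much of the viewing range returns a usable signal
# for a diffuse surface versus a mirror-like one, for a depth sensor whose
# emitter and camera sit together. The reflectance model and every number
# below are illustrative assumptions, not device specifics.

def returned_signal(theta_deg, shininess):
    """Approximate fraction of emitted light bounced straight back toward a
    co-located emitter/sensor at angle theta_deg from the surface normal.
    shininess=0 models an ideal diffuse (Lambertian) surface; larger values
    model an increasingly narrow, mirror-like specular lobe."""
    theta = np.radians(theta_deg)
    if shininess == 0:
        # Lambertian: the return falls off only with the cosine of the angle.
        return np.cos(theta)
    # Phong-style lobe: with the emitter and sensor co-located, the mirror
    # direction lies 2*theta away from the sensor, so the return drops off
    # sharply as the sensor moves away from the surface normal.
    return np.clip(np.cos(2.0 * theta), 0.0, None) ** shininess

angles = np.linspace(0.0, 85.0, 500)   # sensor angles from the normal, degrees
threshold = 0.2                        # assumed minimum usable return fraction

for label, shininess in [("diffuse (Lambertian)", 0),
                         ("glossy", 20),
                         ("mirror-like", 200)]:
    usable = returned_signal(angles, shininess) >= threshold
    max_angle = angles[usable].max()
    print(f"{label:22s} usable up to ~{max_angle:4.1f} deg off-normal "
          f"({100.0 * usable.mean():4.1f}% of tested angles)")
```

Under these assumed numbers, the diffuse surface returns a usable signal out to roughly 78 degrees off the normal, while the mirror-like one only does so within a few degrees of it, which is why a specular surface needs many more sensor poses to cover without holes.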
