Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we'd like to connect with you is through our Mixed Reality Developer Program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Mixed Reality HMD Clarification

Hi, I understand that the Acer and HP headsets should be capable of inside-out head tracking and of displaying 3D virtual reality graphics. But I'm wondering whether they and the SDK can actually mix camera images into the graphics before they are displayed in the headset. Can the headset deliver the color and depth of what the user is looking at into the render pipeline? Do the cameras on the front actually capture all the necessary information, or are they only used for positional tracking?

Thanks for any clarification. I have experience tracking the ZED depth-sensing camera in space and matching a 3D camera/projection to the ZED camera spec, then exposing the ZED camera's color and depth image buffers into the render pipeline to mix/augment real-world images with 3D graphics. The depth information lets the render process tell, on a per-pixel level, whether the 3D graphics are in front of or behind what the camera captures.
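For anyone unfamiliar with that approach, here is a minimal CPU-side sketch of the per-pixel depth test described above. The buffer names and the compositing function are hypothetical, and in practice this comparison would normally run in a fragment shader with the camera color/depth bound as textures; this is just to illustrate the idea.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical per-pixel composite of rendered 3D graphics over a real-world
// camera image, using depth buffers from both sources (same resolution and
// depth units assumed). Where the virtual geometry is closer than the real
// scene, the rendered color wins; otherwise the camera color shows through.
struct Rgba { uint8_t r, g, b, a; };

void compositeWithDepth(const std::vector<Rgba>&  cameraColor,
                        const std::vector<float>& cameraDepth,   // meters, from the depth sensor
                        const std::vector<Rgba>&  renderedColor,
                        const std::vector<float>& renderedDepth, // meters, linearized from the Z-buffer
                        std::vector<Rgba>&        output)
{
    output.resize(cameraColor.size());
    for (std::size_t i = 0; i < cameraColor.size(); ++i) {
        // Per-pixel occlusion test: virtual content is kept only if it is nearer.
        output[i] = (renderedDepth[i] < cameraDepth[i]) ? renderedColor[i]
                                                        : cameraColor[i];
    }
}
```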

Are the headsets capable of mixing the camera images into the typical 3D pipeline before the 3D graphics are lens corrected?

Answers

  • Is the headset capable of delivering color and depth of what a user is looking at into the render pipeline?

    This is not exposed to developers by the SDK on the HoloLens headset, so it will not be available on the immersive headsets either, since they run plain Windows 10. The cameras are not Kinect 2 cameras, which is currently the only camera that gives developers access to the depth stream (there is a short sketch after these answers showing what that kind of access looks like).

  • Thanks for the response. While I completely back inside-out head tracking, I think these developer-focused hardware releases should be clearer about the specifics developers actually care about, such as what the sensors can deliver to an application.

  • @Poppyspy: The reality is that only a very, very small number of headsets are out in developers' hands, and those developers are the only ones with the new SDK, so we don't really know which APIs are available for the new headsets. It's not just Windows 10: the immersive headsets are also on the Mixed Reality platform, so there are APIs beyond the normal Windows 10 set. We just have not been given access to them yet.
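To make the Kinect 2 comparison in the first answer concrete, here is a minimal sketch (error handling trimmed) of what developer access to a raw depth stream looks like with the Kinect for Windows SDK 2.0. Per the answer above, this is the kind of depth buffer the headsets' tracking cameras do not expose.

```cpp
#include <Kinect.h>   // Kinect for Windows SDK 2.0
#include <cstdio>

// Minimal sketch: open the default Kinect v2 sensor and read one depth frame.
// Every call returns an HRESULT that real code should check.
int main()
{
    IKinectSensor*     sensor = nullptr;
    IDepthFrameSource* source = nullptr;
    IDepthFrameReader* reader = nullptr;

    GetDefaultKinectSensor(&sensor);
    sensor->Open();
    sensor->get_DepthFrameSource(&source);
    source->OpenReader(&reader);

    IDepthFrame* frame = nullptr;
    // AcquireLatestFrame fails until a frame is ready, so poll for one.
    while (FAILED(reader->AcquireLatestFrame(&frame))) { /* wait for a frame */ }

    UINT    capacity = 0;
    UINT16* depth    = nullptr;   // depth values in millimeters
    frame->AccessUnderlyingBuffer(&capacity, &depth);
    std::printf("Got %u depth samples, first sample: %u mm\n", capacity, depth[0]);

    frame->Release();
    reader->Release();
    source->Release();
    sensor->Close();
    sensor->Release();
    return 0;
}
```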
