Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Raw frames from locatable camera

Hi all,
Currently I don't have access to Hololens, but eventually I'd like to use it for object tracking. I've read the article about the locatable camera: https://developer.microsoft.com/pl-pl/windows/holographic/locatable_camera. It says that "the video and still image streams are undistorted in the system's image processing pipeline before the frames are made available to the application" but also that "the preview stream contains the original distorted frames".

Is it possible to access these distorted frames and correct the distortion manually? I would also like to know whether the frames made available to the application are raw bitmaps, or whether the processing pipeline performs some kind of compression that could introduce other artifacts?


  • I am trying to get raw data in C++ using Windows Media Foundation. Here is the sample code I have for the app; for some reason the ProcessSample method never gets called in my StreamSink. Here is the link to the repo: https://github.com/deaga/hololensdatacollection . Can any of you gurus here help me with it?
