Different Hologram displays between "normal use" and "MixedRealityCapture"

Bathur
edited February 2019 in Troubleshooting

Why do I get two different hologram displays between Mixed Reality Capture (through the Windows Device Portal) and normal use? When I use Mixed Reality Capture through the Windows Device Portal, holograms appear nearer than when I do not use this function. The same thing happens when I use Miracast.

My app is built with Unity 2017.4.19f1, which is an LTS version, and I use MRTK 2017.4. I noticed that the problem is especially serious with UGUI Canvases that use the Camera render mode; to be precise, I think it may only affect UGUI Canvases using the Camera render mode.

I really need Mixed Reality Capture or Miracast to show my app, but this problem is driving me crazy: I cannot design my UI at the original resolution, and to make the app display correctly during Mixed Reality Capture or Miracast I have to scale down almost every UI element. That's frustrating. By the way, the Windows 10 build on my HoloLens is the latest version.

Answers

  • If you are using a HUD, it pretty much cuts it out, so the user doesn't see it on the HoloLens and the livestream doesn't show it either. Is there an option we might have missed? @Devs

  • I have not noticed this behavior before, let me test today. This might be a bug.

  • We found out that the FoV of CameraCache.main changes when the livestream starts. The main problem is that when the livestream starts, the main camera gets overwritten, so GameObjects and scripts that use properties of the main camera calculate with larger values, which can lead to problems.
    @Bathur For UI, I will personally catch the case where the FoV is more than 18 (it is around 17 before livestreaming) and move the UI further away; that should not bother the user too much (a sketch of this is included after the answers below).
    For the HUD I will have to look into it, because my HUD isn't visible at all while livestreaming, only a dark filter over everything.

  • Bathur
    edited February 2019

    @Jesse_McCulloch said:
    I have not noticed this behavior before, let me test today. This might be a bug.

    @Jesse_McCulloch
    Have you confirmed that this is a bug? When can you fix it?

  • @tasibalint @Jesse_McCulloch
    Thank you very much. Your observation is more detailed than mine, and it matches what I see: when the livestream starts, the camera (or its parameters) does seem to get replaced. On HoloLens the UGUI Overlay render mode is not supported, and the UGUI Camera render mode is affected by the livestream. What would you advise for implementing the HUD? Using the UGUI World Space render mode and controlling the Transform component manually? Can we get the livestream state in Unity through some API, so that we can adjust the UGUI Canvas manually once the livestream has started? (One way to infer this is sketched after the answers below.)
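
A minimal sketch of the workaround described above, assuming a manually positioned (World Space) canvas: infer that Mixed Reality Capture or Miracast is running from the main camera's FoV crossing the ~18-degree mark mentioned in the answer, and push the canvas further away. The component name FovCanvasAdjuster, the serialized fields, and the distance values are illustrative assumptions, not part of MRTK; only Camera.main (MRTK's CameraCache.Main is equivalent) and standard Unity APIs are used.

    using UnityEngine;

    // Hypothetical helper: when the main camera's vertical FoV jumps past ~18 degrees
    // (observed in this thread while Mixed Reality Capture / Miracast is running),
    // keep the canvas centered in front of the camera but at a larger distance.
    public class FovCanvasAdjuster : MonoBehaviour
    {
        [SerializeField] private Transform canvasRoot;        // the World Space UGUI canvas to reposition
        [SerializeField] private float fovThreshold = 18f;    // around 17 before livestreaming, per the answer above
        [SerializeField] private float normalDistance = 2f;   // assumed comfortable distance without capture
        [SerializeField] private float captureDistance = 3f;  // assumed distance while capture is running

        private void LateUpdate()
        {
            Camera cam = Camera.main;                          // MRTK's CameraCache.Main would work the same way
            if (cam == null || canvasRoot == null)
            {
                return;
            }

            bool captureLikelyActive = cam.fieldOfView > fovThreshold;
            float distance = captureLikelyActive ? captureDistance : normalDistance;

            // Re-center the canvas in front of the camera at the chosen distance.
            canvasRoot.position = cam.transform.position + cam.transform.forward * distance;
            canvasRoot.rotation = Quaternion.LookRotation(cam.transform.forward, cam.transform.up);
        }
    }

Attach the component to any GameObject and assign the canvas root in the Inspector; the same check could instead scale the canvas down, which is effectively what the original poster has been doing by hand.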
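
As far as I know, Unity 2017.4 does not expose a documented API that reports whether Mixed Reality Capture is active, so the sketch below infers the livestream state from the same FoV change and raises an event that UI code can subscribe to. LivestreamStateWatcher, OnLivestreamChanged, and the threshold are illustrative assumptions, not an official API.

    using System;
    using UnityEngine;

    // Hypothetical watcher: treats a main-camera FoV above the threshold as
    // "livestream active" and notifies subscribers whenever that state flips,
    // so a Camera-rendered UGUI canvas can be rescaled or repositioned in response.
    public class LivestreamStateWatcher : MonoBehaviour
    {
        public static event Action<bool> OnLivestreamChanged;  // true when capture appears active

        [SerializeField] private float fovThreshold = 18f;     // heuristic value taken from this thread

        private bool lastState;

        private void Update()
        {
            Camera cam = Camera.main;
            if (cam == null)
            {
                return;
            }

            bool active = cam.fieldOfView > fovThreshold;
            if (active != lastState)
            {
                lastState = active;
                if (OnLivestreamChanged != null)
                {
                    OnLivestreamChanged(active);                // subscribers adjust their canvases here
                }
            }
        }
    }

A canvas script could subscribe to OnLivestreamChanged in OnEnable and, for example, adjust the planeDistance of a Screen Space - Camera canvas or scale a World Space canvas when the event fires.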
