Live Stream of Locatable Camera (Webcam) in Unity
Try as I might, I can't seem to determine the correct workflow for accessing a live stream of what the Locatable Camera sees from within the device.
I've come up with two feasible methods:
1) Take a picture à la the Locatable Camera in Unity tutorial, use InvokeRepeating() to keep taking them, and apply each one to a material as a texture.
2) Use the Device Portal API to GET the video stream and somehow turn it into a video texture to apply to a material.
Method 1 has the downside of very slow frame rates. I tried doing it with video instead of still images, but there doesn't seem to be any overload of StartRecordingAsync() that exposes individual frames rather than saving the whole recording to a video file, the way TakePhotoAsync() can deliver frames to memory.
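For reference, here's a minimal sketch of what I mean by Method 1, based on the PhotoCapture API from the Locatable Camera in Unity tutorial. The namespace is UnityEngine.VR.WSA.WebCam in Unity 5.x (UnityEngine.XR.WSA.WebCam in later versions), and the 0.5-second interval is just an illustrative value — in practice the capture round-trip is what limits the frame rate:

```csharp
using System.Linq;
using UnityEngine;
using UnityEngine.VR.WSA.WebCam; // UnityEngine.XR.WSA.WebCam in newer Unity versions

public class RepeatingPhotoCapture : MonoBehaviour
{
    PhotoCapture capture;
    Texture2D targetTexture;

    void Start()
    {
        // Pick the highest supported webcam resolution.
        Resolution res = PhotoCapture.SupportedResolutions
            .OrderByDescending(r => r.width * r.height).First();
        targetTexture = new Texture2D(res.width, res.height);

        var cameraParams = new CameraParameters
        {
            hologramOpacity = 0.0f,
            cameraResolutionWidth = res.width,
            cameraResolutionHeight = res.height,
            pixelFormat = CapturePixelFormat.BGRA32
        };

        PhotoCapture.CreateAsync(false, captureObject =>
        {
            capture = captureObject;
            capture.StartPhotoModeAsync(cameraParams, result =>
            {
                // Re-capture on a timer; each TakePhotoAsync round-trip is
                // slow, which is exactly the frame-rate problem described above.
                InvokeRepeating("TakeFrame", 0f, 0.5f);
            });
        });
    }

    void TakeFrame()
    {
        capture.TakePhotoAsync((result, frame) =>
        {
            // Copy the captured frame into the texture on this object's material.
            frame.UploadImageDataToTexture(targetTexture);
            GetComponent<Renderer>().material.mainTexture = targetTexture;
        });
    }
}
```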
Method 2 potentially has the downside of sizeable latency. There's close to a full second of delay when accessing the stream from a browser on the same network, and even accessing it from the Edge browser on the HoloLens itself has a substantial delay. Plus, I'm not even sure it's possible to use that kind of video stream within Unity.
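To illustrate Method 2, here's a rough sketch of pulling the Device Portal's mixed reality capture stream over HTTP from a plain C# client. The endpoint path and query parameters are my guess based on what the browser stream page requests — verify them against your own device — and the username/password and IP are placeholders. Even with the bytes in hand, you'd still need an MP4 demuxer/decoder to get frames into a Unity texture, which is the part I don't have an answer for:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class PortalStreamClient
{
    static async Task Main()
    {
        var handler = new HttpClientHandler
        {
            // The Device Portal serves a self-signed certificate by default.
            ServerCertificateCustomValidationCallback = (msg, cert, chain, errors) => true
        };
        using (var client = new HttpClient(handler))
        {
            // Device Portal uses basic auth; "user:password" is a placeholder.
            var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", credentials);

            // Assumed live-stream endpoint (fragmented MP4); check your device's
            // Device Portal to confirm the exact path and parameters.
            var url = "https://192.168.1.20/api/holographic/stream/live_med.mp4?holo=true&pv=true";

            using (var stream = await client.GetStreamAsync(url))
            {
                var buffer = new byte[64 * 1024];
                int read;
                while ((read = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
                {
                    // Feed the bytes to an MP4 decoder of your choice; Unity has
                    // no built-in decoder for this, which is the open question.
                }
            }
        }
    }
}
```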
Is there a third method I'm missing? How does Skype do it?