Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is through our Mixed Reality Developer Program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Live Stream of Locatable Camera (Webcam) in Unity

Try as I might, I can't seem to determine the correct workflow for accessing a live stream of what the Locatable Camera sees from within the device.

I've come up with two feasible methods:

1) Taking pictures à la the Locatable Camera in Unity tutorial, using InvokeRepeating() to keep capturing them and applying each one to a material as a texture.

2) Using the Device Portal API to GET the video stream and somehow turning it into a video texture to apply to a material.

Method 1 has the downside of very slow frame rates. I tried doing it with video instead of still images, but unlike TakePhotoAsync(), there doesn't seem to be any version of the StartRecordingAsync() method that gives access to individual frames rather than saving everything as a video file.
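
Here's roughly what Method 1 looks like in code (a sketch based on the tutorial's PhotoCapture API; the namespace was UnityEngine.VR.WSA.WebCam in Unity 5.4, and the resolution choice and callback chaining here are just illustrative):

    using System.Linq;
    using UnityEngine;
    using UnityEngine.VR.WSA.WebCam; // Unity 5.4-era namespace

    public class RepeatingPhotoCapture : MonoBehaviour
    {
        PhotoCapture photoCaptureObject = null;
        Texture2D targetTexture = null;

        void Start()
        {
            Resolution cameraResolution = PhotoCapture.SupportedResolutions
                .OrderByDescending(res => res.width * res.height).First();
            targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

            PhotoCapture.CreateAsync(false, delegate(PhotoCapture captureObject)
            {
                photoCaptureObject = captureObject;

                CameraParameters cameraParameters = new CameraParameters();
                cameraParameters.hologramOpacity = 0.0f;
                cameraParameters.cameraResolutionWidth = cameraResolution.width;
                cameraParameters.cameraResolutionHeight = cameraResolution.height;
                cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

                photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate(PhotoCapture.PhotoCaptureResult result)
                {
                    photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
                });
            });
        }

        void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
        {
            // Copy the frame into the texture and show it on this object's material.
            photoCaptureFrame.UploadImageDataToTexture(targetTexture);
            GetComponent<Renderer>().material.mainTexture = targetTexture;

            // Immediately request the next photo; this loop is why the
            // effective frame rate is so low.
            photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
        }
    }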

Method 2 potentially has the downside of sizeable latency. There's close to a full second of delay when accessing the stream from a browser on the same network, and even accessing it from the Edge browser on the HoloLens itself has a substantial delay. Plus, I'm not even sure it's possible to use that kind of video stream within Unity.
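
For Method 2, the starting point would be something like the sketch below (plain modern .NET, outside Unity, just to probe the stream; the endpoint path, query parameters, and credentials are assumptions on my part, so check the Device Portal API reference for your build). Even once bytes arrive they're fragmented MP4, and something would still have to decode them into a texture:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class DevicePortalStreamProbe
    {
        static async Task Main()
        {
            // Assumed endpoint; verify against the Device Portal API documentation.
            var url = "https://192.168.0.10/api/holographic/stream/live.mp4?holo=true&pv=true";

            var handler = new HttpClientHandler
            {
                // Device Portal serves a self-signed certificate; accepting it
                // unconditionally is only OK for a local experiment.
                ServerCertificateCustomValidationCallback = (msg, cert, chain, errors) => true
            };

            using (var client = new HttpClient(handler))
            {
                // Basic auth with the credentials configured in Device Portal.
                var token = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
                client.DefaultRequestHeaders.Add("Authorization", "Basic " + token);

                using (var stream = await client.GetStreamAsync(url))
                {
                    var buffer = new byte[64 * 1024];
                    int read;
                    while ((read = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
                    {
                        // 'buffer' holds raw fragmented-MP4 bytes; a video decoder
                        // would still be needed to turn them into a Unity texture.
                        Console.WriteLine("Read " + read + " bytes");
                    }
                }
            }
        }
    }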

Is there a third method I'm missing? How does Skype do it?

Answers

  • I'm also interested in doing this. Has anyone gotten this to work successfully?

    I think a third option would be to use a C# wrapper for the C++ Windows Media Foundation libraries that are used to access the camera in the DirectX approach. The only C# route to Media Foundation I found is http://mfnet.sourceforge.net/, but mfnet might be outdated. Does anyone know if this third approach is feasible? If it is, does anyone know of a good tutorial for learning how to do it?

  • I'm also interested in doing this, if anyone has an idea.

    Thanks

  • You can get the webcam stream as a WebCamTexture in Unity; it doesn't have the AR-projected view (holograms), just the webcam. https://docs.unity3d.com/ScriptReference/WebCamTexture.html

        void Start()
        {
            WebCamTexture _webTex = new WebCamTexture();
            _webTex.Play();
            GetComponent<Renderer>().material.mainTexture = _webTex;
        }
    
  • shftd
    edited June 2016

    @LeefromSeattle said:
    You can get the webcam stream as a WebCamTexture in Unity; it doesn't have the AR-projected view (holograms), just the webcam. https://docs.unity3d.com/ScriptReference/WebCamTexture.html

        void Start()
        {
            WebCamTexture _webTex = new WebCamTexture();
            _webTex.Play();
            GetComponent<Renderer>().material.mainTexture = _webTex;
        }
    

    I have already done this, but it doesn't appear on screen in the HoloLens emulator. Does your webcam feed show up on the device or in the emulator?

  • WebCamTexture works on the HoloLens but not in the emulator for me.

  • Yeah, I'm using the device for testing; it plays the webcam texture on a cube that I have the script attached to.

  • I don't think the webcam is being emulated by the emulator.


  • @yena, @LeefromSeattle, @Patrick: very useful information, thank you

  • holoben
    edited July 2016

    Here's one downside of using WebCamTexture compared to LocatableCamera: Using the Unity webcam doesn't give you the locatable camera's position and rotation. However, PhotoCaptureFrame has a function, TryGetCameraToWorldMatrix, that gives you this information.
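
    For example, inside the photo-capture callback (a minimal sketch; TryGetProjectionMatrix is the companion call, and reading position from the translation column and forward from -Z is just Unity's matrix convention):

        void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame frame)
        {
            Matrix4x4 cameraToWorld;
            Matrix4x4 projection;
            if (frame.TryGetCameraToWorldMatrix(out cameraToWorld) &&
                frame.TryGetProjectionMatrix(out projection))
            {
                // Translation column = the RGB camera's world position at capture time;
                // in Unity's convention the camera looks down -Z.
                Vector3 cameraPosition = cameraToWorld.GetColumn(3);
                Vector3 cameraForward = -cameraToWorld.GetColumn(2);
                Debug.Log("Camera was at " + cameraPosition + " facing " + cameraForward);
            }
        }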

    What's more worrisome for me is that WebCamTexture drops the frame rate to 15fps for a 1344x756 image (20fps for a 896x504 image). Can anyone confirm that it drops to 15fps for them as well? If not, how did you keep a higher frame rate? @LeefromSeattle what was the frame rate for you when showing the WebCamTexture on a cube?

  • @holoben the camera's position and rotation are the same as the main camera's. The world-to-camera matrix is very close to the one we can get from PhotoCaptureFrame, and the projection matrix is a little off.

    I also experience the same fps drop.

  • @YuJen said:
    @holoben the camera's position and rotation are the same as the main camera's. The world-to-camera matrix is very close to the one we can get from PhotoCaptureFrame, and the projection matrix is a little off.

    I also experience the same fps drop.

    @YuJen I haven't tested it thoroughly, but I would expect Camera.main.transform to be the human's "camera" (i.e., eyes) position & rotation, while the PhotoCaptureFrame's position & rotation would be the device's RGB camera's. Thus, there should be a small difference between them (as you said, "very close"), but they probably won't be equal. Furthermore, I would expect this to be a static rigid transformation that doesn't change (unless someone recalibrates the HoloLens)... Has anyone else tested this, or can anyone confirm or refute it?

  • Jimbohalo10 ✭✭✭
    edited August 2016

    @holoben said:
    Here's one downside of using WebCamTexture compared to LocatableCamera: Using the Unity webcam doesn't give you the locatable camera's position and rotation. However, PhotoCaptureFrame has a function, TryGetCameraToWorldMatrix, that gives you this information.

    What's more worrisome for me is that WebCamTexture drops the frame rate to 15fps for a 1344x756 image (20fps for a 896x504 image). Can anyone confirm that it drops to 15fps for them as well? If not, how did you keep a higher frame rate? @LeefromSeattle what was the frame rate for you when showing the WebCamTexture on a cube?

    I have only one thing for you to check, from the Holograms 100 tutorial's Unity performance settings (holograms_100 unity_performance_settings):
    In Edit > Project Settings > Quality, the default quality level is "Fantastic"; change it to "Fastest", and select "Fastest" for the Windows Store (WSA) column with the green tick.

    I have also deleted the other quality levels, leaving only "Fastest", so the default stays "Fastest".
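
    If you'd rather enforce this from script, something like the following should work (hypothetical helper; it assumes "Fastest" is level 0 in your project's quality-level list):

        using UnityEngine;

        public class ForceFastestQuality : MonoBehaviour
        {
            void Awake()
            {
                // 0 is "Fastest" in the default quality list; adjust the index
                // if your project's list differs. 'true' applies expensive
                // changes (e.g., anti-aliasing) immediately.
                QualitySettings.SetQualityLevel(0, true);
            }
        }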

  • @holoben said:
    Can anyone confirm that it drops to 15fps for them as well?

    Same here. If you're doing mesh processing for spatial mapping at the same time, the frame rate drops below 10. This is independent of the Unity quality settings.

  • @holoben said:
    I would expect this to be a static rigid transformation that doesn't change (unless someone recalibrates the HoloLens)... Has anyone else tested this, or can anyone confirm or refute it?

    I've tested this, and it is a fixed transformation at one webcam resolution. I haven't tested other resolutions, but I believe any difference should show up in the projection matrix.

  • @Waterine said:

    @holoben said:
    I would expect this to be a static rigid transformation that doesn't change (unless someone recalibrates the HoloLens)... Has anyone else tested this, or can anyone confirm or refute it?

    I've tested this, and it is a fixed transformation at one webcam resolution. I haven't tested other resolutions, but I believe any difference should show up in the projection matrix.

    @Waterine Great, thanks for the comment. That's good to know!

  • Actually @Waterine, if we use this fixed transformation, we'd still have to be careful about any lag between reading Camera.main.transform and the moment the webcam image was actually captured. If it's anything more than a few milliseconds, the resulting camera pose will have some drift/error in it.

  • @holoben
    Agreed. I tried capturing several frames and comparing the transform matrices.
    I solve Ax = B for x, where A is the matrix I get from Unity's camera and B is the matrix I get from PhotoCapture. The x is not always the same, but the values are very close. I think that's because there is a few-milliseconds offset between the webcam capture and Unity's camera transform.
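
    In Unity terms the comparison looks something like this (a sketch; note that A is sampled in the callback, not at the exact capture time, which is exactly the offset described above):

        void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame frame)
        {
            Matrix4x4 B;
            if (frame.TryGetCameraToWorldMatrix(out B))
            {
                // A: head pose (main camera) sampled now, not at the exact capture time.
                Matrix4x4 A = Camera.main.transform.localToWorldMatrix;

                // If the RGB camera is rigidly mounted relative to the head pose,
                // x = A^-1 * B should be (nearly) constant across frames.
                Matrix4x4 x = A.inverse * B;
                Debug.Log("Offset this frame:\n" + x);
            }
        }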

  • For anyone who finds this thread, it may help to check out the HoloLensCameraStream project, a C# Media Foundation implementation for getting locatable camera frames from the HoloLens into Unity (disclosure: I wrote it). I get access to the raw transform and projection matrices with each frame, but I haven't yet had time to learn how to make them useful in Unity. Community help with this part would be valuable.
