Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our Mixed Reality Developer Program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Accessing the raw video stream in Unity (not saving to a file)

I want to be able to access the raw video stream in Unity. Apparently, you can only save the video to a file right now and not access the raw bytes.

It seems that the only option is to use a C# wrapper for the C++ Windows Media Foundation libraries that are used to access the camera in the DirectX approach. The only C# wrapper for Media Foundation I found is http://mfnet.sourceforge.net/, but mfnet might be outdated. Does anyone know if the approach I'm describing is feasible?

If it is, does anyone know of a good tutorial for learning how to do this? The DirectX approach does not describe in detail how to access the camera. Furthermore, I'm not familiar with how to wrap C++ code in a C# class for use in Unity.

Answers

  • Jarrod1937 ✭✭✭

    What are you wanting to achieve? You can access the camera frames on a per frame basis, which is essentially a video.

  • @Jarrod1937 said:
    What are you wanting to achieve? You can access the camera frames on a per frame basis, which is essentially a video.

    Oh really? I would like to be able to do 2 things: (1) stream the video over a network and (2) analyze each frame with OpenCV or another library.

    Have you been able to get each camera frame in real-time? I have not tried it yet but someone else told me it didn't seem like the photo capture for Unity was made for that purpose... Was I incorrectly informed about this?

  • Jarrod1937 ✭✭✭

    @holoben said:

    Oh really? I would like to be able to do 2 things: (1) stream the video over a network and (2) analyze each frame with OpenCV or another library.

    Have you been able to get each camera frame in real-time? I have not tried it yet but someone else told me it didn't seem like the photo capture for Unity was made for that purpose... Was I incorrectly informed about this?

    Yep, the Locatable Camera has a few parameters specifically for image processing. You may only want to grab frames when needed, to ease the impact, but it's there.
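    For reference, here is a minimal sketch of grabbing one frame with the Locatable Camera's PhotoCapture API (the `UnityEngine.VR.WSA.WebCam` namespace from the Unity 5.4 era; the class and field names below are just placeholders, not part of any official sample):

    ```csharp
    using System.Collections.Generic;
    using System.Linq;
    using UnityEngine;
    using UnityEngine.VR.WSA.WebCam; // moved to UnityEngine.Windows.WebCam in later versions

    public class FrameGrabber : MonoBehaviour
    {
        PhotoCapture m_PhotoCapture;

        void Start()
        {
            // Pick the highest supported camera resolution.
            Resolution resolution = PhotoCapture.SupportedResolutions
                .OrderByDescending(r => r.width * r.height).First();

            PhotoCapture.CreateAsync(false, capture =>
            {
                m_PhotoCapture = capture;
                CameraParameters parameters = new CameraParameters
                {
                    hologramOpacity = 0.0f, // raw camera feed only, no holograms
                    cameraResolutionWidth = resolution.width,
                    cameraResolutionHeight = resolution.height,
                    pixelFormat = CapturePixelFormat.BGRA32
                };
                m_PhotoCapture.StartPhotoModeAsync(parameters, OnPhotoModeStarted);
            });
        }

        void OnPhotoModeStarted(PhotoCapture.PhotoCaptureResult result)
        {
            if (result.success)
                m_PhotoCapture.TakePhotoAsync(OnCapturedToMemory);
        }

        void OnCapturedToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame frame)
        {
            // Copy the raw BGRA32 pixel data for processing or streaming.
            List<byte> buffer = new List<byte>();
            frame.CopyRawImageDataIntoBuffer(buffer);
            // ... process 'buffer', then call TakePhotoAsync again for the next frame ...
            frame.Dispose();
        }
    }
    ```

    Capturing to memory this way also gives you the PhotoCaptureFrame, which carries the camera pose for the frame.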

  • LeefromSeattle
    edited June 2016

    You can access the camera in Unity as a WebCamTexture. It doesn't have the holograms on it, just the raw camera feed. https://docs.unity3d.com/ScriptReference/WebCamTexture.html

    I've tested the OpenCV assets off the Asset Store on the HoloLens; they use the WebCamTexture to convert to a Mat for OpenCV, and they run on the HoloLens directly.
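    A minimal sketch of reading raw pixels through WebCamTexture each frame (the processing step is a placeholder for whatever OpenCV or networking code you plug in):

    ```csharp
    using System.Linq;
    using UnityEngine;

    public class WebCamReader : MonoBehaviour
    {
        WebCamTexture webTex;
        Color32[] pixels;

        void Start()
        {
            // First available device at the requested resolution and rate.
            webTex = new WebCamTexture(WebCamTexture.devices.First().name, 896, 504, 30);
            webTex.Play();
        }

        void Update()
        {
            // Only copy pixels when the camera actually delivered a new frame.
            if (!webTex.didUpdateThisFrame)
                return;

            if (pixels == null || pixels.Length != webTex.width * webTex.height)
                pixels = new Color32[webTex.width * webTex.height];
            webTex.GetPixels32(pixels);

            // ... hand 'pixels' to OpenCV, a network stream, etc. ...
        }
    }
    ```

    Reusing one Color32[] buffer avoids allocating a new array (and GC pressure) on every frame.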

  • holoben
    edited July 2016

    I finally got around to testing per-frame access by calling PhotoCapture.TakePhotoAsync every frame (i.e., in Update()). This, however, appears to be too slow... I'm not getting 30 frames per second with this approach.

    I can't use Unity's WebCamTexture because it doesn't contain the position and rotation data associated with the camera frame (e.g., PhotoCaptureFrame has a function TryGetCameraToWorldMatrix). I guess I could just use the current Camera.main.transform instead of this? (UPDATE: I just checked and the transformation matrix represented by Camera.main.transform does NOT equal the Matrix4x4 returned by TryGetCameraToWorldMatrix)

    Can anyone verify that Unity's WebCamTexture updates at 30 frames per second?
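    One way to check this yourself is to count how often didUpdateThisFrame fires, independently of the render frame rate (a rough probe, not precise timing; the class name is just a placeholder):

    ```csharp
    using UnityEngine;

    public class WebCamFpsProbe : MonoBehaviour
    {
        public WebCamTexture webTex; // assign a WebCamTexture that is already playing
        int updates;
        float windowStart;

        void Update()
        {
            // didUpdateThisFrame is true only on render frames where the
            // camera delivered new data, so this counts camera frames,
            // not render frames.
            if (webTex != null && webTex.didUpdateThisFrame)
                updates++;

            if (Time.time - windowStart >= 1.0f)
            {
                Debug.Log("Camera updates/sec: " + updates);
                updates = 0;
                windowStart = Time.time;
            }
        }
    }
    ```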

  • holoben
    edited July 2016

    I've now tested WebCamTexture, and it seems fairly slow to me. Just by calling the following in Start()

    // Note: 'cameraResolution' must be declared, and First() needs 'using System.Linq;'
    Resolution cameraResolution = new Resolution();
    cameraResolution.width = 1344;
    cameraResolution.height = 756;
    cameraResolution.refreshRate = 30;
    webTex = new WebCamTexture(WebCamTexture.devices.First<WebCamDevice>().name, cameraResolution.width, cameraResolution.height, cameraResolution.refreshRate);
    webTex.Play();
    

    I get only 15 frames per second (UPDATE: with an 896x504 image, I get around 20 fps). If I comment out the last line (webTex.Play()), I get 60 frames per second. So it seems that neither WebCamTexture nor calling PhotoCapture.TakePhotoAsync once per frame can sustain 30 fps.

    Has anyone found a way to access every frame of the RGB camera at 30fps?

  • @LeefromSeattle said:

    I've tested the OpenCV assets off the Asset Store on the HoloLens; they use the WebCamTexture to convert to a Mat for OpenCV, and they run on the HoloLens directly.

    @LeefromSeattle did you get OpenCV working on HoloLens? If so, can I ask you some questions? I am trying to deploy an app that uses OpenCV and I am having some problems with it.
