Hello everyone.

We have decided to phase out the Mixed Reality Forums over the next few months in favor of other ways to connect with us.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

The plan between now and the beginning of May is to clean up old, unanswered questions that are no longer relevant. The forums will remain open and usable.

On May 1st we will be locking the forums to new posts and replies. They will remain available for another three months for the purposes of searching them, and then they will be closed altogether on August 1st.

So, where does that leave our awesome community to ask questions? Well, there are a few places we want to engage with you. For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality. If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers. And always feel free to hit us up on Twitter @MxdRealityDev.

What is Windows.Media.IVideoFrameStatics?

I am trying to implement FaceTracker on the HoloLens. For that I use HoloLensCameraStream, because MediaCapture relies on CaptureElement, which is a XAML control. My problem is that when I create the VideoFrame for the FaceTracker, I get this error:

System.InvalidCastException: Unable to cast object of type 'System.__ComObject' to type 'Windows.Media.IVideoFrameStatics'.
   at System.StubHelpers.StubHelpers.GetCOMIPFromRCW_WinRT(Object objSrc, IntPtr pCPCMD, IntPtr& ppTarget)
   at Windows.Media.VideoFrame.CreateWithSoftwareBitmap(SoftwareBitmap bitmap)
   at test02.OnFrameSampleAcquired(VideoCaptureSample sample)

And when I search for it on Google, I only get results about Windows Media Player, so I have no idea what it is or how to resolve it ...

Here is my code :
```csharp
void OnFrameSampleAcquired(VideoCaptureSample sample)
{
    if (frameSample == null || frameSample.Length < sample.dataLength)
        frameSample = new byte[sample.dataLength];

    try
    {
        IBuffer ibuffer = null;
        if (frameSample != null)
            ibuffer = frameSample.AsBuffer();
        else
            Debug.Log("frameSample null");

        SoftwareBitmap softwareBitmapFrameSample =
            new SoftwareBitmap(BitmapPixelFormat.Nv12, this._resolution.width, this._resolution.height);

        if (softwareBitmapFrameSample == null)
            Debug.Log("Failed to create SoftwareBitmap");
        else
            Debug.Log("SoftwareBitmap success");

        Debug.Log("Before VideoFrame");
        VideoFrame videoFrameSample = VideoFrame.CreateWithSoftwareBitmap(softwareBitmapFrameSample);

        UnityEngine.WSA.Application.InvokeOnAppThread(() =>
        {
            // faces = await this.faceTracker.ProcessNextFrameAsync();
        }, false);
    }
    catch (Exception ex)
    {
        Debug.Log(ex.Message);
    }
}
```

Thanks for the help
