Hello everyone.

We have decided to phase out the Mixed Reality Forums over the next few months in favor of other ways to connect with us.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

The plan between now and the beginning of May is to clean up old, unanswered questions that are no longer relevant. The forums will remain open and usable.

On May 1st we will be locking the forums to new posts and replies. They will remain available for another three months for the purposes of searching them, and then they will be closed altogether on August 1st.

So, where does that leave our awesome community to ask questions? Well, there are a few places we want to engage with you. For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality. If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers. And always feel free to hit us up on Twitter @MxdRealityDev.

Announcing Spectator View

When wearing a mixed reality headset, we often forget that a person who does not have one on is unable to experience the wonders that we can. Today, we are pleased to share our Spectator View solution with you. Spectator View allows others to see on a 2D screen what a HoloLens user sees in their world. Spectator View is a preferred approach for live demonstrations of your HoloLens experience because you can use a high-quality video camera to produce high-quality images meant for a big screen. Spectator View gives you the flexibility to choose the camera, lens, and framing that best suits how you want to showcase your app. It puts you in control of the video quality based on the video hardware you have available.

Documentation for Spectator View can be found at:

You can clone the repository from:

We trust you will find this useful and welcome your feedback & contributions.



  • Absolutely Excited! Thank you.

    Developer | Check out my new project blog

  • Anyone have a recommendation for a place to get the pieces for the bracket milled?

  • This looks fantastic.
    Is it possible to use this without a capture card, with the process:
    1. Mount hololens and camera as in the documentation
    2. Record the camera feed to the camera SD
    3. Record the holographic feed in the unity app (with black/transparent background)
    4. Composite the feeds together afterwards in after effects

    Whilst not ideal, this would allow it to be done easily through any capable laptop, rather than a dedicated machine.

    And, if so - how would one go about creating the calibration data?


  • Jackson mod
    edited February 2017

    @TimT: You can do an internet search for machine shops in your area. Note also that the included hardware does not have to be the only mounting solution, especially if you do not have any existing experience with machining. I will be updating the documentation soon with a design I am experimenting with that does not require any custom hardware. You can also experiment with the Spectator View code while waiting for your bracket. When starting the project, I taped or held a camera to my HoloLens to simulate the mount.

    We have added documentation for mounting your HoloLens without any custom hardware.

  • Jackson mod
    edited February 2017

    @newske : This is possible, but would require some code changes. Notably:
    1. The video texture is not cleared unless there is a valid color texture.
    2. The hologram texture is selected based on the timestamp of the color texture. Without a color texture, you will get the newest hologram texture instead, and you will have to work out the alignment manually.
    3. You will need to update the FRAME_WIDTH and FRAME_HEIGHT in the compositor to match the resolution of the video you are capturing on your camera. If this value is too large, you might overwhelm your PC's resources.
    4. You would also want to record the alpha channel so you can easily mask the hologram texture and easily add any transparency that is present in the hologram. This would require another video encoder to be active which might also overwhelm your PC's resources.
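    Point 2 above can be illustrated with a minimal sketch. This is not the actual compositor code - the struct and function names here are hypothetical - but it shows the timestamp-matching idea: given a buffer of hologram frames, pick the one whose timestamp is nearest to the color frame's.

```cpp
#include <cassert>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical hologram frame record: the real compositor keeps a buffer
// of rendered hologram textures alongside their timestamps.
struct HologramFrame {
    int64_t timestampNs;  // render time of this hologram texture
    int id;               // stand-in for the texture itself
};

// Pick the hologram frame whose timestamp is closest to the color frame's
// timestamp. Without a color texture you would fall back to the newest
// frame and have to tune the alignment (frame offset) by hand.
int SelectHologramFrame(const std::vector<HologramFrame>& frames,
                        int64_t colorTimestampNs) {
    assert(!frames.empty());
    int best = frames[0].id;
    int64_t bestDelta = std::llabs(frames[0].timestampNs - colorTimestampNs);
    for (const auto& f : frames) {
        int64_t delta = std::llabs(f.timestampNs - colorTimestampNs);
        if (delta < bestDelta) {
            bestDelta = delta;
            best = f.id;
        }
    }
    return best;
}
```

    In the offline-compositing workflow proposed above there is no color timestamp at all, which is exactly why the alignment must be found manually in post.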

    Note that we have tested Spectator View on laptops with various USB3 capture cards (linked to in the documentation). These might fit your needs instead.

  • Bozzyboy
    edited February 2017

    This is great! @Jackson, do you think it might be possible to use this with a USB webcam like the Logitech C920 Full HD, without a capture card, for realtime viewing?

  • Jackson mod
    edited February 2017

    @Bozzyboy - You can absolutely use this with a webcam. In CompositorShared.h, set USE_OPENCV to TRUE and the other frame providers to FALSE. (A static assert is thrown if more than one provider is TRUE.)

    You will also need to ensure you have OpenCV 3.1 or 3.2 installed and referenced in dependencies.props - This is needed for calibration too.
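    The switch pattern described above can be sketched as follows. This is an illustration of the CompositorShared.h convention, not a verbatim copy of the header - flag names other than USE_OPENCV are assumptions, and TRUE/FALSE normally come from the Windows headers:

```cpp
// Sketch of the frame-provider switches in CompositorShared.h. Exactly one
// provider may be TRUE at a time; the header enforces this with a static
// assert. Flag names other than USE_OPENCV are illustrative.
#define FALSE 0   // normally provided by the Windows headers
#define TRUE  1

#define USE_DECKLINK  FALSE
#define USE_ELGATO    FALSE
#define USE_OPENCV    TRUE   // webcam path; requires OpenCV 3.1 or 3.2

static_assert(USE_DECKLINK + USE_ELGATO + USE_OPENCV == 1,
              "Exactly one frame provider must be enabled.");
```

    If more than one flag is TRUE, the static assert fires at compile time, which is the error mentioned above.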

  • @JayAb and @Jackson: What would you recommend in terms of a PC setup for both still and video capture? I've heard mention of being able to use a decently fast laptop plus an external video card, but I wanted to hear from the experts in terms of recommended specs.

  • @Sungparkphoto: I have used Spectator View with a few different PCs:
    On the low-performance end, I have used a laptop with an i7 and an NVIDIA Quadro K1000M. Spectator View works, but will not hit target frame rates when recording video. (Note: you could use an external recorder to mitigate the performance hit from encoding video in software.)

    I have also used a Surface Book with Performance Base and performance was similar to a powerful desktop.

    I would expect a laptop with a more modern gaming GPU would perform well.

    On the desktop side, I have used an i7 with a GTX 680, 770, 980, and 1070 - all perform well, but frame rate improves as you move to a more powerful GPU.

  • Here is a new YouTube video of Microsoft HoloLens Spectator View in use.

  • If you are going to be in Redmond, WA next Wednesday, April 12 you should RSVP and come hear the author of Spectator View present on the topic.

    For those who are not in the area on that date, you can subscribe to our channel and join us on our YouTube live stream.

    Windows Holographic User Group Redmond

    WinHUGR.org | @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • Hi All,

    I'm working on getting my hardware together for Spectator View.

    Are there any external video capture cards that have worked? I have a gaming laptop that I'd like to use.

    Ideally it would be HD-SDI input and not HDMI, but if I have to convert, that's fine too.


  • Jimbohalo10 ✭✭✭
    edited April 2017

    @ChaseBowman said:
    Hi All,

    I'm working on getting my hardware together for Spectator View.

    Are there any external video capture cards that have worked? I have a gaming laptop that I'd like to use.

    Ideally it would be HD-SDI input and not HDMI, but if I have to convert, that's fine too.


    The box I use over USB 3.0 is the Blackmagic Design Intensity Shuttle for USB 3.0; the opening picture of it on the desk shows the usage of the device.

    The only software change is setting USE_OPENCV to TRUE in CompositorShared.h and building as Release, x64, in Visual Studio 2015.

    There is a possibility of using this other device, but it has not been tested. It is significantly cheaper, so the price risk is lower.

    Personally, the Kinect 2 for PC works well, as it has high resolution and runs as an ordinary camera - but it's not 4K and has no HD-SDI input.

    The GoPro 4K camera, as in the documentation, is a USB 3.0 device and could probably be an answer, but once again it does not support HD-SDI output.

    Several users have experienced problems setting up cameras with different frame settings, so extensive knowledge of how to adjust your camera from its user manual is required.

  • newske
    edited May 2017

    Hi @Jackson and others
    I've had some success getting spectator view working correctly, but I have a couple of issues which I would value advice on.

    I am using a Dell Precision 5510 with a Quadro M1000M and SSD, and an Elgato Game Capture HD60 (USB-3). I'm streaming video from a Canon 750D and attempting to composite at 1080p.

    1. (Major Issue) - Recorded video is riddled with artifacts like this, and has dropped frames / spikes every few seconds. It previews reasonably well in the compositor window - the issues are only prevalent when recording. Video preview in the capture card's preview application works as expected (with higher frame rates than in the compositor window, and smoother).
    2. (Minor Issue) - To synchronize the video stream with the holographic overlay, I require a frame offset of ~90. I'm not entirely sure why, as the default is set to 0-3.
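    As a back-of-the-envelope check on issue 2: the frame offset should be roughly the capture-chain latency divided by the frame time. This tiny sketch (a hypothetical helper, not compositor code) makes that concrete:

```cpp
#include <cassert>
#include <cmath>

// Rough relationship between end-to-end capture latency and the frame
// offset needed to line the video stream up with the hologram stream.
int FrameOffsetForLatency(double latencyMs, double fps) {
    return static_cast<int>(std::lround(latencyMs * fps / 1000.0));
}
```

    At 60 fps, an offset of ~90 frames implies roughly 1.5 seconds of latency in the capture chain, which is far beyond what the 0-3 default assumes.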

    I'd expected that I wouldn't get optimum performance out of this machine and capture card - however I was expecting slower frame rates rather than heavy glitches in the footage.

    Any suggestions and/or things to try would be very useful.


    Setting HARDWARE_ENCODE_VIDEO to FALSE removes the major glitches, but there are still significant pauses in the video. Reducing VIDEO_FPS, even to 10, doesn't solve the frame skipping.

    QUEUE_FRAMES does not appear to have any effect whether true or false.
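    For anyone retracing these experiments, the three switches mentioned above all live in CompositorShared.h. A sketch of the relevant lines follows; the values shown are the ones tried above, not necessarily the shipped defaults:

```cpp
// CompositorShared.h flags toggled while chasing the recording glitches.
#define HARDWARE_ENCODE_VIDEO  FALSE  // FALSE = software encoding; removed the worst glitches
#define VIDEO_FPS              10     // lowering this did not stop the frame skipping
#define QUEUE_FRAMES           FALSE  // no observed effect either way
```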

  • newske
    edited May 2017

    Further information with GPU-Z monitoring while recording:
    The integrated GPU (Intel HD Graphics P530) is running at max load and reaches about 90 degrees before presumably being throttled. The Quadro runs lower, at about 70% load. CPU usage is far from max - about 50% on all cores.

    My suspicion is that the Elgato capture card does not utilise Quadro cards, and hence hardware encoding falls to the Intel HD Graphics, which is underpowered for the task. Is this the likely issue?

  • @Jackson and others,
    Does this camera, the "Blackmagic Design Micro Studio Camera 4K", work for Spectator View?

  • Jimbohalo10 ✭✭✭

    @holofans said:
    @Jackson and others,
    Does this camera, the "Blackmagic Design Micro Studio Camera 4K", work for Spectator View?

    The box I use over USB 3.0 is the Blackmagic Design Intensity Shuttle for USB 3.0.
    The card mentioned above DOES NOT work with that camera, because the device only supports up to HDTV 1080i/59.94. The list is:
    Professional Video Standards: Intensity will instantly switch between HD and SD video standards including HDTV 1080i/59.94, 1080i/50, 720p/59.94, 720p/50, NTSC and PAL.

    Ultra HD, called 4K in the US, has a minimum resolution of 3840×2160 pixels.

    From the Blackmagic Design USB 3.0 Devices FAQ about supported PCs:
    This FAQ covers the following devices: UltraStudio Pro, UltraStudio SDI, Intensity Shuttle, Pocket UltraScope and ATEM 1 M/E & ATEM 2 M/E Production Switchers

    Certified Notebook Computers
    Notebooks certified for use with UltraStudio Pro, UltraStudio SDI, Intensity Shuttle, ATEM 1 M/E and ATEM 2 M/E:

    Sony VAIO SVE17115FG (recommended), Sony VAIO SVS13116FG, Asus N76VM, Lenovo T420s*, MSI GE620

    *The basic configuration of the Lenovo T420s uses integrated Intel HD Graphics 3000, therefore the preview window in Blackmagic Media Express may be a little choppy. This however, does not affect the output or any captured files.

    There are currently no Notebook computers certified for use with the Pocket Ultrascope

    Certified Desktop Motherboards
    Desktops certified for use with UltraStudio Pro, UltraStudio SDI, Intensity Shuttle, ATEM 1 M/E, ATEM 2 M/E and Pocket Ultrascope*

    Asus P6X58D-Premium, Asus P6X58D-E, Gigabyte GA-X58A-UD7, Gigabyte GA-X58A-UD5, Gigabyte G1.Sniper (rev. 1.0), Intel DH67BL, Asus Crosshair IV

    *Pocket Ultrascope requires a dedicated NVIDIA or AMD/ATI Graphics card.

    What Operating Systems are compatible with these devices?
    Only Windows 7 64-bit and Microsoft Windows Server 2008 R2 (Enterprise edition) are currently supported for use with Blackmagic Design USB 3.0 devices.

    What USB 3.0 spec is required?
    Computers must either have 3rd Generation Intel Core Processors (Ivy Bridge) which have Intel Native USB 3.0 or the NEC/Renesas USB 3.0 controller for older computers (Pre-2012). You can easily check if you have either of these by heading to the "Universal Serial Bus Controllers" tree in the Windows Device Manager. If your computer does not have either of these, you may be able to add a USB 3.0 PCIe expansion card instead.

    Which USB 3.0 PCIe expansion cards can I use?
    USB 3.0 PCIe expansion cards must use the NEC/Renesas µPD720200 (or a variant of it) controller, as it is the most compatible controller that provides enough bandwidth for Blackmagic USB 3.0 products. It is also recommended that the card has a 3-pin floppy or 4-pin Molex external power port, as some PCIe x1 lanes and PCIe buses cannot provide enough power. Here is a short list of USB 3.0 PCIe expansion cards that fit these criteria and are available from many PC retailers for typically $30: Aereon 2 Port USB3.0 PCIe Card, ICY BOX USB3.0 PCI-E Expansion Card, Patriot USB 3.0 SuperSpeed PCI-E, Vantec UGT-PC312 2-Port SuperSpeed USB 3.0 PCI-E Host Card, Welland Turbo Leopard UP-312C USB 3.0 PCIe card.

    USB 3.0 PCIe expansion cards will only work with Intel X58 or 2nd Generation Intel motherboards (2011)

    Can you recommend any motherboards with USB 3.0 PCIe expansion cards combinations that are compatible?
    The following combinations were tested by customers of New Magic and are not certified to work, only reported:

    Asus P8Z68-Pro + Welland Turbo Leopard UP-312C USB 3.0 PCIe card; ASRock Z68 Pro-3 + Silverstone EC03 USB 3.0 PCIe card

    Are there any Blackmagic Design Tested and Certified PC Workstations?
    The following workstations are certified for use with Blackmagic Design USB 3.0 devices with USB 3.0 expansion cards:

    Hewlett Packard Z400, Hewlett Packard Z800

    Last update 17/07/2012 (E. & O.E.) New Magic Australia Pty Ltd +61 3 9722 9700 [email protected]

    The "Blackmagic UltraStudio 4K Device Not Detected" issue seems a big problem to resolve, according to Blackmagic.

    There is no simple way to know if you can step down this camera, and it's the same price as a HoloLens, so it's a big risk.

  • braden
    edited May 2017

    Trying to build the calibration app and get in CompositorDLL\DeckLinkDevice.h:
    cannot open source file DeckLinkAPI_h.h

    DeckLinkAPI_h.h doesn't seem to exist in the Blackmagic SDK (10.9). Anyone else using this version of the SDK, or know whether we must be on an older one?

    EDIT: changed it to #include "DeckLinkAPI_v10_6.idl" - no idea if that is correct. Now it complains about "cannot open source file cpprest/http_client.h".

    Are there more steps to building the Calibration tool than the GitHub docs let on?

    Edit 2: it seems my problems were with VS2017. Tried on another PC with VS2015 and it's happy.

  • I'm having terrible trouble with everything Spectator View - specifically with the sample project and getting the compositor to work. The app builds fine for the HoloLens, but as soon as I press the Play button in Unity it crashes after a brief pause and goes straight to a Unity bug report.

    I've tried everything I can think of, and I suspect it's the compositor crashing, though I have no idea how to narrow it down.

    I've tried deploying the components and prefabs to a fresh basic project with just a cube with the same result.

    I've tried compiling the compositor in VS 2015 and 2017, to no avail. Any suggestions on how to narrow it down?

    I've tried the Spectator View in @Jackson's fork, where he's implementing UNET.

    This doesn't crash, but I only get black as the background in the compositor. I'm guessing this means the video isn't coming through. I have the USB3 Blackmagic Shuttle connected to a GoPro 4 Black, which is set to 1080p NTSC 60. It shows the video feed fine in Blackmagic Media Express, but I can't get it to display in Unity (using version 5.6.1f1).

    Anyone have any suggestions on what I can try next? We bought a second Hololens specifically to use spectator view but my experience thus far is not encouraging.

  • @Matt_Walker:
    When you get a crash dialogue in Unity, you can open the crash dump to get a better idea of what is happening:

    In a file explorer window, navigate to %temp%, sort by "Date modified", then type 'crash' to find the latest crash report (if the crash just happened, it will be the top item)

    In this directory, open the crash dump and select "debug with native only". You may need to add the compositor PDBs (which are in your Unity project). This should point to the offending line of code.

    If you run the calibration app, do you see color frames? (The frame provider for the calibration app is the same code as the compositor)

    Also make sure you are not running any other app that uses your capture card while trying to run the compositor (any such app, e.g. Blackmagic Media Express, will take exclusive control of the capture card).

  • Thanks @Jackson! I'll give the crash dump a look.
    The calibration worked, though the colour of the frames from the GoPro wasn't quite right. Maybe they were YUV instead of RGB?
    Nothing else was using the capture card at the time of running the app.
    Appreciate the reply. I'll try to dig a bit more.
  • I don't see the hologram feed coming from the HoloLens attached to the rig; instead it is coming from Unity. I am not sure whether this is how it should be, as my holograms' size and position do not match when I watch through Unity's compositor. Is it something I am doing wrong?

  • @Matt_Walker I'm not sure, but I had the same problem when I was setting it up. Once I closed the Blackmagic software it started working for me, so I assumed both should not run at the same time. Not sure though.

  • So the GitHub docs say to get calibration numbers close to 0, but it's a bit hard to know what counts as "close" to zero without knowing the range limit for the HoloLens RMS.

    Anyone got some guidance there?


  • Hi,
    I am facing a problem with the Sharing Service. I started the app on the device and on one system (with Spectator View implemented), and everything worked fine. When we exit the app, the device leaves the session, but on the system the session is not left, even after Unity exits. So the problem is: if the app is started again on the device, it does not create a new session - it joins the same session that was not closed previously. I'm stuck on this problem. Is this a problem with the implementation of Spectator View in my application?

  • Hi guys,

    Does anyone use Spectator View with Vuforia?

    It seems it doesn't work, because Vuforia has its own camera.

    Need help!


  • @JayAb said:

    Hey guys, I have a problem when I set up this project: when my camera moves, my monitor shows the hologram moving too. I don't know whether this phenomenon is good or bad. :(
    Can anyone give me some advice?

  • Hello @Jackson,
    I need to know how I can relay messages from Unity back to the connected HoloLens. I have two HoloLens devices: one acts as the anchor manager (the one worn by the user), and the other is mounted on my camera and acts as the Spectator View HoloLens. I can see both of my HoloLens devices in the Unity game engine when I run my demo. I want to understand how I can update and sync my data over the network so that all devices reflect what is happening. I am also able to get my live feed in Unity's compositor window. I am using the SpectatorViewSample demo app that comes with the companion kit GitHub project (and not the legacy one).

    I even read that
    "This sample has a simple network implementation that routes client pose data and game state to Unity. This network sample does not relay these messages back to all of the connected clients. For a more robust networking solution, check the Mixed Reality Academy 250 course which uses UNET.

    To Add your own network messages to this network stack:

    Update the NetworkData script to serialize and deserialize a new network message.
    Update the SharingManager script to route the new network message.
    Update the LocalPlayerManager script to send a message from the HoloLens application.
    Update the RemotePlayerManager script to react to a network message in Unity that changes the state of the HoloLens that sent the message.
    The anchor data is created with the AnchorManager script and shared with the AnchorNetworkTransmitter script over a StreamSocket. This is the same way the anchor is shared in the MixedReality 250 course. The SpectatorViewPoseProvider app expects an anchor to be shared like this. If your shared experience does sharing differently, you will need to modify the ConnectToServer function in SpectatorViewPoseProvider or the ConnectToAnchorOwner function in SpectatorViewPoseProviderMain.cpp to localize in your own way."

    How can I do it? Any help would be highly appreciated.


  • Cool, our team also works on Spectator View, and we have an amazing creative demo here.
    Welcome to the HoloLens AR world. We hope to see HoloLens 2 soon, with many updates and a cheaper price.
    Our site: https://onetech.jp/

