Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first place we'd like to point you is our Mixed Reality Developer Program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

A Star Trek-inspired, biometrically-controlled HoloLens app

In celebration of Star Trek and Augmented Reality, I had an awesome time getting a gang back together to recreate “The Game” from Star Trek: The Next Generation using HoloLens and Galvanic’s Pip.

Here's a summary of the game we made at this link: http://robburke.net/projects/star-trek-hololens-biometrics-game/

I'll do a more technical writeup and post a link to it here soon. The game itself was built with a Maya->Unity->VS pipeline. The animations are done using morph targets, and I'm impressed at how many polys the HL was able to render. We used SignalR to relay the biometric signal from the PC to the HoloLens.
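For anyone curious about the relay part before the full writeup: the shape of it is roughly a SignalR hub running alongside the PC-side capture code, with the HoloLens app connecting as a client. This is just a minimal sketch assuming ASP.NET Core SignalR — the hub name, method names, and URL below are made up for illustration, and our actual code may differ:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;
using Microsoft.AspNetCore.SignalR.Client;

// Hypothetical hub hosted on the PC doing the Pip capture.
// Readings come in from the capture code and get broadcast
// to every connected client, including the HoloLens app.
public class BiometricHub : Hub
{
    // Called by the PC-side capture loop with each new reading.
    public Task SendReading(float stressLevel)
    {
        return Clients.All.SendAsync("ReceiveReading", stressLevel);
    }
}

// On the HoloLens side, the app subscribes to the stream
// (hub URL and event name are placeholders):
public class BiometricClient
{
    public async Task ConnectAsync()
    {
        var connection = new HubConnectionBuilder()
            .WithUrl("http://relay-pc:5000/biometrics") // hypothetical relay address
            .Build();

        connection.On<float>("ReceiveReading", reading =>
        {
            // Drive game state from the biometric signal here.
            Console.WriteLine($"Pip reading: {reading}");
        });

        await connection.StartAsync();
    }
}
```

The nice thing about this arrangement is that the HoloLens app never needs to know anything about the Pip's SDK — it just sees a stream of floats.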

The biometric input device we used, Galvanic's Pip, is Bluetooth-enabled, and the HoloLens can see it in the Network UI. So in theory we could also connect it directly and build a UX within HoloLens to discover, pair, and stream the signal without needing a relay at all.

I don't think I've seen a biometrically-controlled HoloLens app yet, so I thought you guys would find this cool! It has been a joy working on the project and learning the ins and outs of HoloLens dev.

