Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Poor Man's Inverse HoloLens AKA HoloBoard (2x Kinect shoulder pad suit + Camera phone VR HMD) :-)

I've started thinking a bit about how to create cross-platform AR/VR Unity apps, perhaps leveraging pass-through cameras so that the VR experience could approximate the AR experience, or so my app could get into consumer hands sooner via Google Cardboard, Gear VR, HTC Vive, and Oculus. I haven't looked at the last two, but I've seen some camera pass-through apps on Cardboard and Gear VR. The main issue to me is the flatness of the image; it doesn't seem like it could sustain a holographic illusion.

I'm not a hardware expert and have limited physics knowledge, but I wonder if anyone has invented a type of endoscope/fiber-optic attachment/prosthesis that would let you film in 3D on today's mobile phones, where you typically have a camera on both sides... and adjust the point of view so that the source light input was adjustably aligned in front of the person's eyes at a similar IPD (inter-pupillary distance) to what the person is used to, so that the VR camera experience would map one-to-one. I guess there are too many sources of distortion or other reasons this wouldn't work, but I did buy an optics kit for my last phone, so I would think it might be theoretically possible if the harness were built phone-specific. Gear VR 2... or I guess I'll go on "Shark Tank" if no one's created it and it can be done.
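The one-to-one mapping idea above comes down to simple geometry: if the camera baseline doesn't match the viewer's IPD, perceived depth is scaled. Here's a minimal Python sketch of that similar-triangles approximation; the specific numbers (a ~1.2 cm lens spacing, a ~6.3 cm IPD) are illustrative assumptions, not specs of any real phone:

```python
def perceived_depth(true_depth_m, camera_baseline_m, viewer_ipd_m):
    """Similar-triangles approximation: the disparity of a point at
    depth Z filmed with stereo baseline B is proportional to B / Z,
    and a viewer with inter-pupillary distance IPD fuses that
    disparity at roughly Z * IPD / B.  B == IPD maps one-to-one."""
    return true_depth_m * viewer_ipd_m / camera_baseline_m

# Assumed example numbers: lenses ~1.2 cm apart, a typical adult IPD
# of ~6.3 cm, an object 2 m away.
print(perceived_depth(2.0, 0.012, 0.063))  # roughly 10.5 m: depth exaggerated ~5x
```

Which is exactly why an adjustable optical harness that widens the effective baseline to match IPD would matter for sustaining the illusion.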

Or, just thinking out loud, maybe I can get two endoscopic cameras with long USB cables on an adjustable harness to simulate... Has anyone seen a home-built rig on the cheap for quasi-AR?

Okay, in the end I'll probably just wait until March 31st, but it's fun thinking about it when I already have the parts for Gesture (Kinect), Gaze (Gear VR), and Voice (Kinect) lying around the house, and they could be used for diving into AR-Lite...

Comments

  • james_ashley ✭✭✭✭
    edited March 2016

    @BotGreet ,

    This is a great exercise. It helps in trying to understand what makes HoloLens special compared to other devices, as well as what's hard and what's not. It's often overlooked, but IMO the spatial mapping, done on a chip (the so-called HPU), is probably the coolest and most difficult thing, and it builds on lots of research from the Kinect.

    The exercise also highlights the importance of the decision to go untethered -- it makes so much sense for future AR scenarios. Finally, there's the choice of a very bright display technology with a smaller field of view, as opposed to a wider FOV with a dimmer image. That's probably the hardest to appreciate, because you really have to experience it for yourself to understand why it was the right choice (at least I think it was the right choice).

    Another cool thought experiment is to compare the sorts of experiences you plan to create with HoloLens to what Microsoft Research has already done with projection mapping. Here's a hint: being able to interact between two HoloLenses is going to be the killer-app maker. :)

    James Ashley
    VS 2017 v5.3.3, Unity 2017.3.0f3, MRTK 2017.1.2, W10 17063
    Microsoft MVP, Freelance HoloLens/MR Developer
    www.imaginativeuniversal.com

  • Thanks for the projection mapping link! Good to see the ending to every episode of Scooby-Doo ever has inspired some great research... e.g. with the ghoulish "The Other Resident."

    Yeah, no HPU catching dust around my house.

    For killer apps, I'm thinking what many people will want/need, pre-consumer at least, are AR/VR development and bootstrap productivity tools with the HoloLens as the enabler. If someone hasn't already created it (it's hard to search for a 3D IDE development environment -- what I've found so far makes me roll my eyes: 3d-ide?), I'd expect as soon as next month a Unity Editor add-on that lets you manipulate selected Unity GameObjects as a hologram, making it more intuitive to develop 3D VR and AR or traditional games, at least for newbies. One thing I dislike now in my beginner dev cycles for VR (with my poor spatial reasoning) is the workflow of switching between the VR headset and the IDE... I think having a HoloLens and a Unity Editor add-on to broadcast a hologram stream would shorten my dev cycles.

  • james_ashley ✭✭✭✭

    The HoloLens Emulator is supposed to be the tool that cuts down on the constant back-and-forth between IDE and headset. We're also going to get HoloLens Studio, which will allow us to create -- and I assume save -- 3D models that can then be used in Unity.

    James Ashley
    VS 2017 v5.3.3, Unity 2017.3.0f3, MRTK 2017.1.2, W10 17063
    Microsoft MVP, Freelance HoloLens/MR Developer
    www.imaginativeuniversal.com
