Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Requesting Light Source API

BoBoDev
edited April 2016 in Discussion

Can we have a built-in API that detects light sources and their intensity? Additionally, the API could do auto-shadowing with a default behavior, or let us implement our own shadows. It would be better if the API handled identification of light sources for us. This would make development so much easier and make virtual objects so much more believable.

Comments

  • I thought about doing this myself. As the user looks around the room/environment, you can take snapshots from the camera and stitch them together (easier said than done). Then you can generate a cube map, which can be used for reflections as well as lighting analysis.
    The limitation you may want to consider is that lighting, especially per-pixel lighting, carries a performance penalty per light in most modern 3D applications. This depends on your rendering pipeline; deferred rendering is better in this respect, but it still has a similar per-light cost. In this sense, you have to ask yourself: how many lights are required to moderately represent the real-world lighting you captured?
    I've looked at possibly using shaders for this, as you can have a single main light and then a shader that takes the cube map and uses it for the finer contributions the environment makes to the lighting appearance (a rough sketch of that analysis step is below).
    Getting this done is no easy feat, though. Maybe take a stab at it yourself, and share with others here if you get something going.
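    To make that concrete, here is a very rough prototype of the analysis step: it takes six captured cube map faces and boils them down to one dominant light direction plus an average ambient color. It is a Python/NumPy sketch for offline experimentation only, not HoloLens code, and the face order and orientation conventions below are just assumptions; a real implementation would capture the faces with the locatable camera and run on the device.

        import numpy as np

        # faces: list of six (N, N, 3) float arrays in [0, 1],
        # assumed order +X, -X, +Y, -Y, +Z, -Z (one common convention).
        def estimate_lighting(faces):
            n = faces[0].shape[0]
            # Texel-centre coordinates in [-1, 1] on the face plane.
            t = (np.arange(n) + 0.5) / n * 2.0 - 1.0
            u, v = np.meshgrid(t, t)
            one = np.ones_like(u)

            # Per-texel direction for each face (orientation is an assumption).
            dirs_per_face = [
                np.stack([ one, -v, -u], axis=-1),   # +X
                np.stack([-one, -v,  u], axis=-1),   # -X
                np.stack([ u,  one,  v], axis=-1),   # +Y
                np.stack([ u, -one, -v], axis=-1),   # -Y
                np.stack([ u, -v,  one], axis=-1),   # +Z
                np.stack([-u, -v, -one], axis=-1),   # -Z
            ]

            # Approximate solid angle of each texel so edge texels don't dominate.
            weight = (2.0 / n) ** 2 / (u * u + v * v + 1.0) ** 1.5

            dominant = np.zeros(3)
            ambient = np.zeros(3)
            total_w = 0.0
            for face, dirs in zip(faces, dirs_per_face):
                dirs = dirs / np.linalg.norm(dirs, axis=-1, keepdims=True)
                lum = 0.2126 * face[..., 0] + 0.7152 * face[..., 1] + 0.0722 * face[..., 2]
                dominant += (dirs * (weight * lum)[..., None]).sum(axis=(0, 1))
                ambient += (face * weight[..., None]).sum(axis=(0, 1))
                total_w += weight.sum()

            dominant /= np.linalg.norm(dominant) + 1e-8   # unit vector toward the brightest region
            ambient /= total_w                            # average environment color
            return dominant, ambient

    You would then feed the dominant direction into your single main light and use the ambient color (or the cube map itself) in the shader for the finer contributions.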

  • BoBoDev
    edited April 2016

    I think your idea is more advanced than mine. Mainly I just want to detect where the lights are. Automatic cube map generation as a light source is good too.

    How a user wants to use those lights (converted from the physical world into the virtual world) is up to individual developers. Personally, I don't need the light sources to light up the virtual objects accurately. Maybe just a simple global light per object to make the entire object darker when most of its body is in shadow. And then use a Forward+-style pass to select the top 3 light sources and compute a simple shadow for the virtual objects, so they don't feel like they are floating in the air (a rough sketch of that selection step is below). Even just 1 light source casting a shadow would be good enough.
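    To sketch the selection step I have in mind: given a list of detected light positions and intensities (hypothetical data, since no such platform API exists today), pick the few strongest lights at the object and derive a crude brightness factor. Python/NumPy prototype only, and the reference_level constant is just a tuning guess.

        import numpy as np

        def select_lights(light_positions, light_intensities, object_position, max_lights=3):
            """Indices of the strongest lights at the object, plus a crude 0..1 brightness factor."""
            offsets = light_positions - object_position            # (N, 3)
            dist_sq = (offsets ** 2).sum(axis=1) + 1e-6
            contribution = light_intensities / dist_sq             # inverse-square falloff
            order = np.argsort(contribution)[::-1][:max_lights]    # strongest first

            reference_level = 1.0   # tuning constant: contribution that counts as "fully lit"
            brightness = np.clip(contribution[order].sum() / reference_level, 0.0, 1.0)
            return order, brightness

        # Example with three made-up detected lights and one object.
        lights = np.array([[0.0, 2.5, 0.0], [3.0, 2.0, 1.0], [-2.0, 2.2, -1.5]])
        intensities = np.array([5.0, 1.0, 2.0])
        idx, brightness = select_lights(lights, intensities, np.array([0.5, 0.0, 0.0]))
        # Use lights[idx[0]] as the single shadow-casting light; darken the whole
        # object by (1 - brightness) when it sits mostly in shadow.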

  • @Jarrod1937 it almost sounds like you are considering creating a custom runtime variant of light probes (as opposed to the new Unity 5 light probes that are prebaked)? For those curious about light probes, here is a helpful video tutorial on the topic. No easy feat indeed :) I also spotted an interesting post from someone who did their own flavor of custom light probes that might be inspirational.

    @BoBoDev as far as detection goes, are you interested in identifying hot spots, or are you thinking of some sort of more advanced image analysis to detect the gradient change direction of surface color?

    It might be pretty easy, during app startup and initialization, to have the user scan the room, simply look for the hottest spot you can find in the photo camera image, and treat that as your primary light source (a rough sketch of that is below). Primitive for sure, but I am guessing much more immediately practical than waiting for a new platform API (unless one is already in development).
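    To illustrate the primitive version: take one RGB frame from the photo camera as an (H, W, 3) array, average it over small tiles so a single noisy pixel can't win, and treat the brightest tile as the primary light. Python/NumPy sketch only; capturing the frame and projecting the winning pixel back into the world via the camera pose are left out, and the tile size is just a guess.

        import numpy as np

        def hottest_spot(frame, tile=16):
            # Luminance of each pixel.
            lum = 0.2126 * frame[..., 0] + 0.7152 * frame[..., 1] + 0.0722 * frame[..., 2]
            h, w = lum.shape
            h2, w2 = h // tile * tile, w // tile * tile
            # Mean brightness of each tile x tile block.
            tiles = lum[:h2, :w2].reshape(h2 // tile, tile, w2 // tile, tile).mean(axis=(1, 3))
            iy, ix = np.unravel_index(np.argmax(tiles), tiles.shape)
            # Centre of the winning tile in normalized [0, 1] image coordinates.
            return ((ix + 0.5) * tile / w, (iy + 0.5) * tile / h), tiles[iy, ix]

        # (u, v), brightness = hottest_spot(frame)
        # Project (u, v) through the camera's view/projection to get a world-space
        # ray toward the presumed primary light source.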

    Windows Holographic User Group Redmond

    WinHUGR.org - @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • @HoloSheep, indeed, they'll probably remain just considerations for now. While the effect would be nice, I am not sure the processing required would be worth it. I'm thinking a simpler approximation might be better, so that it can be updated in real time as the lighting in the environment changes.

  • @BoBoDev,

    Thank you for your feature request. I have filed it with the appropriate team.

    David
