Hello everyone.

We have decided to phase out the Mixed Reality Forums over the next few months in favor of other ways to connect with us.

The first way we want to connect with you is our mixed reality developer program, which you can sign up for at https://aka.ms/IWantMR.

The plan between now and the beginning of May is to clean up old, unanswered questions that are no longer relevant. The forums will remain open and usable.

On May 1st we will be locking the forums to new posts and replies. They will remain available for another three months for the purposes of searching them, and then they will be closed altogether on August 1st.

So, where does that leave our awesome community to ask questions? Well, there are a few places we want to engage with you. For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality. If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers. And always feel free to hit us up on Twitter @MxdRealityDev.

Can I get spatial mapping mesh normals in a shader?

Tubeliar
edited October 2016 in Questions And Answers

I plan to visualise the spatial mapping mesh using a shader that makes use of normals. As a first step I made a shader that just visualises the normals. The shader works fine in the Unity editor, even on meshes I downloaded from the device. I know I can get the normal if I raycast against the mesh. The problem, however, is that the exact same shader that works in the editor does not work on the device: the normals for all vertices seem to point towards positive z. I'm using appdata_base.normal to get the normals. I also tried defining a struct that has a float4 with the NORMAL semantic. Is there another way to obtain mesh normals that will work on the device?
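A minimal normal-visualising shader of the kind described, mapping object-space normals to RGB, looks roughly like this (a sketch for Unity-era-2016 CG syntax; "Custom/ShowNormals" is an illustrative name, not from the original thread):

    Shader "Custom/ShowNormals"
    {
        SubShader
        {
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float3 normal : TEXCOORD0;
                };

                v2f vert(appdata_base v)
                {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.normal = v.normal;  // object-space normal from the mesh
                    return o;
                }

                fixed4 frag(v2f i) : SV_Target
                {
                    // Remap [-1,1] normal components into [0,1] colours
                    return fixed4(i.normal * 0.5 + 0.5, 1);
                }
                ENDCG
            }
        }
    }

If the mesh carries no normals at all, this shader will render a uniform colour, which matches the "all normals point to positive z" symptom.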

Answers

  • spt ✭✭
    edited October 2016

    You can also calculate the normals by adding a geometry shader and doing it there.
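    The core of that approach is a cross product over two triangle edges. A sketch of such a geometry-shader stage (HLSL; assumes a v2f struct carrying worldPos and normal fields — names illustrative, not from the original post):

        [maxvertexcount(3)]
        void geom(triangle v2f input[3], inout TriangleStream<v2f> stream)
        {
            // Flat face normal from the cross product of two edges
            float3 e0 = input[1].worldPos - input[0].worldPos;
            float3 e1 = input[2].worldPos - input[0].worldPos;
            float3 n = normalize(cross(e0, e1));

            for (int i = 0; i < 3; i++)
            {
                v2f o = input[i];
                o.normal = n;  // all three vertices share the face normal
                stream.Append(o);
            }
        }

    This gives flat shading per face, which is usually acceptable for visualising the spatial mapping mesh.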

  • @DanAndersen said:
    I believe that for the purposes of optimization, the normals are not precomputed.

    This sounds logical, but I don't think it's true. As I said, when I raycast against the meshes they do have normals, though maybe that is for physics only. Also, when I download a mesh from the device it does have normals, although Unity might compute them on import.

    I guess I'll have to test on the device whether or not the mesh actually has normals. I can also test the shader on objects that I know have normals.
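    In Unity, that first test can be as simple as logging the normal count of the incoming mesh (an illustrative helper, not from the original thread — Mesh.normals returns an empty array when the mesh has none):

        // Logs "N vertices, 0 normals" if the mesh arrived without normals
        void LogNormalCount(UnityEngine.Mesh mesh)
        {
            UnityEngine.Debug.Log(mesh.vertexCount + " vertices, "
                + mesh.normals.Length + " normals");
        }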

  • edited October 2016

    Unless I'm misunderstanding the question, I guess the answer depends on the tools you're using. In DirectX/C++ and SharpDX/C#, yes, you can. I'm not sure how you would do it in Unity, however; I'm sure there has to be a way, though.

    Here's the example in C++/Directx

    Dwight Goins
    CAO & Founder| Independent Architect | Trainer and Consultant | Sr. Enterprise Architect
    MVP | MCT | MCSD | MCPD | SharePoint TS | MS Virtual TS |Windows 8 App Store Developer | Linux Gentoo Geek | Raspberry Pi Owner | Micro .Net Developer | Kinect For Windows Device Developer
    http://dgoins.wordpress.com
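    The code listing referred to above did not survive the forum archive. As a rough sketch of the same idea in C# — requesting vertex normals through the UWP Windows.Perception.Spatial.Surfaces API; error handling omitted and the method name is illustrative:

        // Illustrative sketch, not the original example. Inside a class:
        using System.Threading.Tasks;
        using Windows.Perception.Spatial.Surfaces;

        async Task ProcessSurfaceAsync(SpatialSurfaceInfo surface)
        {
            var options = new SpatialSurfaceMeshOptions
            {
                IncludeVertexNormals = true  // ask the system to bake normals
            };

            // Up to 1000 triangles per cubic metre; tune to taste.
            SpatialSurfaceMesh mesh =
                await surface.TryComputeLatestMeshAsync(1000.0, options);
            if (mesh != null)
            {
                // Raw normal data; its layout is described by
                // mesh.VertexNormals.Format.
                var normalBuffer = mesh.VertexNormals.Data;
            }
        }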

  • I noticed that my lights in the shader weren't affecting anything below them. I eventually realised the normals on the mesh weren't correct, so I added a RecalculateNormals() call in the OnDataReady method, which is invoked as a result of the following call.

    surfaceObserver.RequestMeshAsync(sd, OnDataReady);

    // Called when the surface mesh has finished baking; the baked mesh
    // arrives without usable normals, so recompute them before rendering.
    public void OnDataReady(SurfaceData cookedData, bool outputWritten, float elapsedCookTimeSeconds)
    {
        cookedData.outputMesh.mesh.RecalculateNormals();
    }
