MR and Azure 303: Natural language understanding (LUIS) - Issue

Hi, I'm trying to run the MR and Azure 303 sample from the Mixed Reality Academy on a HoloLens. The application builds and starts running on the HoloLens, and the mic is recognized, but whenever I say one of the sentences declared as utterances, nothing happens. I'm getting the following in the output window:

Windows Mixed Reality spatial locatability state changed to 3.

(Filename: C:\buildslave\unity\build\Runtime/VR/HoloLens/HoloLensWorldManager.cpp Line: 388)

UnloadTime: 6.156009 ms

Capturing Audio...

(Filename: C:\buildslave\unity\build\artifacts/generated/Metro/runtime/DebugBindings.gen.cpp Line: 51)

Mic Detected

(Filename: C:\buildslave\unity\build\artifacts/generated/Metro/runtime/DebugBindings.gen.cpp Line: 51)

Setting up 1 worker threads for Enlighten.

Thread -> id: 1518 -> priority: 1

Failed to get spatial stage statics - can't retrieve or interact with boundaries! Error code: '0x80040154'.

(Filename: C:\buildslave\unity\build\Runtime/VR/HoloLens/StageRoot.cpp Line: 20)

Any ideas what could be going wrong, and why the voice commands are not being processed?
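One thing that might help narrow this down is testing the LUIS endpoint directly from a desktop machine, independent of the HoloLens build. The sketch below is only an illustration, not the tutorial's own code: it hits the LUIS v2.0 prediction REST endpoint with placeholder values (`REGION`, `YOUR_APP_ID`, `YOUR_KEY` are assumptions you would replace with your own). If the JSON response comes back with a sensible `topScoringIntent`, the LUIS app and key are fine and the problem is more likely on the device side (e.g. capabilities or connectivity); if it fails, the issue is with the LUIS app or credentials.

```python
# Hypothetical standalone check of a LUIS v2.0 app, outside of Unity/HoloLens.
# REGION, YOUR_APP_ID, and YOUR_KEY below are placeholders, not values from the tutorial.
import json
import urllib.parse
import urllib.request


def build_luis_url(region: str, app_id: str, key: str, query: str) -> str:
    """Construct a LUIS v2.0 prediction URL for a GET request."""
    base = f"https://{region}.api.cognitive.microsoft.com/luis/v2.0/apps/{app_id}"
    params = urllib.parse.urlencode(
        {"subscription-key": key, "verbose": "true", "q": query}
    )
    return f"{base}?{params}"


if __name__ == "__main__":
    # Replace these placeholders with your own LUIS region, app ID, and key.
    url = build_luis_url("westus", "YOUR_APP_ID", "YOUR_KEY", "change the color to red")
    with urllib.request.urlopen(url) as resp:
        # A healthy app returns JSON including "topScoringIntent" and "entities".
        print(json.dumps(json.load(resp), indent=2))
```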


  • Any feedback on this? I would really appreciate it.
    Thank you
