Hello everyone.

The Mixed Reality Forums here are no longer being used or maintained.

There are a few other places we would like to direct you to for support, both from Microsoft and from the community.

The first way we want to connect with you is through our Mixed Reality Developer Program, which you can sign up for at https://aka.ms/IWantMR.

For technical questions, please use Stack Overflow, and tag your questions using either hololens or windows-mixed-reality.

If you want to join in discussions, please do so in the HoloDevelopers Slack, which you can join by going to https://aka.ms/holodevelopers, or in our Microsoft Tech Communities forums at https://techcommunity.microsoft.com/t5/mixed-reality/ct-p/MicrosoftMixedReality.

And always feel free to hit us up on Twitter @MxdRealityDev.

Creating a new Gesture

Is it possible to create a new gesture? I'm looking to create a swipe action.

Thanks

Calzaretta


Answers

  • From the Gestures in Unity page of the docs:

    Unity provides both low level access (raw position and velocity information) and a high level gesture recognizer that exposes more complex gesture events (for example: tap, double tap, hold, manipulation and navigation).

    Sounds like you might want to experiment with the Manipulation gesture; a rough sketch is below.
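
    For reference, here's a minimal sketch of hooking the Manipulation gesture through Unity's GestureRecognizer. This assumes the Unity 5.x-era namespace (UnityEngine.XR.WSA.Input in later versions), and it skips gating by the gazed-at object for brevity:

    ```csharp
    // Minimal sketch, assuming the era's UnityEngine.VR.WSA.Input namespace.
    // Drags this GameObject by the hand's cumulative movement during manipulation.
    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    public class ManipulationMove : MonoBehaviour
    {
        GestureRecognizer recognizer;
        Vector3 startPosition;

        void Start()
        {
            recognizer = new GestureRecognizer();
            recognizer.SetRecognizableGestures(GestureSettings.ManipulationTranslate);

            recognizer.ManipulationStartedEvent += (source, cumulativeDelta, headRay) =>
            {
                startPosition = transform.position;
            };

            recognizer.ManipulationUpdatedEvent += (source, cumulativeDelta, headRay) =>
            {
                // cumulativeDelta is the hand's total world-space offset since the gesture began.
                transform.position = startPosition + cumulativeDelta;
            };

            recognizer.StartCapturingGestures();
        }
    }
    ```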

    Windows Holographic User Group Redmond

    WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • There's not really a built-in feature in Unity or the HoloLens SDK for recording gestures yet. I had to program a swipe gesture manually using the hand-position data from the InteractionManager. Building good gestures really calls for a machine-learning component to accommodate differences from human to human. There are probably some solid gesture machine-learning libraries you could pull into your scripts; from what I can tell (I don't have a device yet), the hand tracking is excellent and also accounts for depth, which would open development to a whole new frontier in gesture creation.

    My swipe gesture is far from perfect; it was a quick fix to get a menu system working in our prototype, but I can certainly see the potential here. It's definitely on my list to dive further into gesture creation, so stay in touch. A rough sketch of the approach is below.
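
    For illustration, a rough sketch of that heuristic approach, assuming the era's UnityEngine.VR.WSA.Input.InteractionManager; the distance and time thresholds are made-up values you would need to tune:

    ```csharp
    // Hedged sketch: detect a horizontal swipe from raw hand positions.
    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    public class SwipeDetector : MonoBehaviour
    {
        Vector3 startPos;
        float startTime;
        bool tracking;

        const float MinDistance = 0.15f; // meters of lateral travel (assumed)
        const float MaxDuration = 0.5f;  // seconds allowed for the swipe (assumed)

        void Awake()
        {
            InteractionManager.SourceDetected += OnSourceDetected;
            InteractionManager.SourceUpdated += OnSourceUpdated;
        }

        void OnSourceDetected(InteractionSourceState state)
        {
            if (state.properties.location.TryGetPosition(out startPos))
            {
                startTime = Time.time;
                tracking = true;
            }
        }

        void OnSourceUpdated(InteractionSourceState state)
        {
            Vector3 pos;
            if (!tracking || !state.properties.location.TryGetPosition(out pos))
                return;

            float dx = pos.x - startPos.x;
            if (Time.time - startTime > MaxDuration)
            {
                tracking = false; // took too long; not a swipe
            }
            else if (Mathf.Abs(dx) > MinDistance)
            {
                Debug.Log(dx < 0 ? "Swipe left" : "Swipe right");
                tracking = false;
            }
        }
    }
    ```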

    Skype or Email: Micheal.Reed@Outlook.com

    -- If you like this answer, please don't hesitate to give me a star --

  • Answer ✓

    @Calzaretta we recommend not creating custom gestures for a few reasons:
    1. Gestures on HoloLens work across all apps. Custom gestures would break that paradigm.
    2. Users can learn the known gestures once and always be able to use them. Less learning on their part.

    Out of curiosity and for the future, what can't you do with the current gestures provided by the GestureRecognizer?

  • @Bryan thank you for the detailed feedback. Will pass this along to our teams.

  • Actually, what would be pretty sweet is finger tracking, combined with hand tracking. This would open up many possibilities for custom gestures.

  • @neerajwadhwa said:
    Out of curiosity and for the future, what can't you do with the current gestures provided by the GestureRecognizer?

    I was looking for a left or right swipe that works just like a phone swipe. Each app could choose whether or not to implement it; for example, in the browser on phones you can swipe right to go back to the prior page.

    It just allows for quicker actions than having to gaze and use your finger to select.

    Thanks

    Calzaretta

  • HoloSheep mod
    edited April 2016

    @neerajwadhwa so would a variation of the swipe left or right interaction be feasible to implement with the "Manipulation Gesture"?

    The documentation makes it sound like the interaction might need to be triggered by an initial gaze and tap. But as an example: if the target hologram were a book of pages, it sounds like we could use the Manipulation gesture, once we had gazed at the book and tapped it, to then swipe left or right with our hands to move between the pages of the virtual book until we found the page we were looking for?

    So for experience-specific scenarios, it looks like we could use one or more of the existing gesture types to implement this kind of thing without building a new low-level gesture with machine-learning smarts.

    Am I correct in understanding that your recommendations on this point are geared toward system-wide operations, like dismissing windows, and less toward experience-specific ones?

    But for clarity, apart from the particular recommendations, is it possible to create a swipe experience inside an app by basically leveraging the Manipulation Gesture?

    Thanks.

    Windows Holographic User Group Redmond

    WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • edited April 2016 Answer ✓

    @HoloSheep yeah, we recommend using either the Navigation gesture (often useful for scrolling through large amounts of data, like web pages, or for rotating objects) or the Manipulation gesture (often useful for more precise, one-to-one movements, like moving objects or adjusting the size of holograms).

    I would say try the Navigation gesture in your scenario: enable it along the X axis, and when the user taps and starts moving left, you can write code to turn the page. The Holograms 211 Gesture course in the Academy can give you some code ideas if needed, and a minimal sketch follows below.

    Our recommendation is to use the standard supported gestures in your app experiences. You can certainly choose to do different things with those gestures.

    Hope the above clarifies the different use cases for Navigation vs. Manipulation.
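
    For illustration, here's a minimal sketch of that setup, assuming the Unity 5.x-era UnityEngine.VR.WSA.Input namespace; TurnPage is a hypothetical app method:

    ```csharp
    // Minimal sketch: recognize horizontal Navigation and turn pages on completion.
    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    public class PageSwipe : MonoBehaviour
    {
        GestureRecognizer recognizer;

        void Start()
        {
            recognizer = new GestureRecognizer();
            // Restrict recognition to navigation along the X axis only.
            recognizer.SetRecognizableGestures(GestureSettings.NavigationX);
            recognizer.NavigationCompletedEvent += OnNavigationCompleted;
            recognizer.StartCapturingGestures();
        }

        void OnNavigationCompleted(InteractionSourceKind source,
                                   Vector3 normalizedOffset,
                                   Ray headRay)
        {
            // normalizedOffset.x runs from -1 (full left) to +1 (full right).
            if (normalizedOffset.x < -0.5f)
                TurnPage(+1); // swiped left: next page
            else if (normalizedOffset.x > 0.5f)
                TurnPage(-1); // swiped right: previous page
        }

        void TurnPage(int direction) { /* hypothetical, app-specific */ }
    }
    ```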

  • @Micheal Hi, I'm interested in doing similar things, but one thing is bothering me: how can I get data from the HoloLens? You said in your answer that you were "using the hand position system within interaction manager", but I still don't understand the interface for getting that data. Is it the Device Portal API?

  • @yang sample code and more info on the InteractionManager are here.

    Windows Holographic User Group Redmond

    WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • @HoloSheep Thanks a lot~

    I know some gestures can be detected through Unity. I'm wondering where I can get 3D point clouds and video from the HoloLens so I can train a hand-gesture recognizer myself. Is there any interface for getting this data?

  • @Yang no point cloud data is exposed from the HoloLens.

    Note that the gesture frame is different from the gaze or camera frame; however, one API that might interest you (besides the InteractionManager) is the Locatable Camera. A sketch of grabbing a frame from it is below.

    But, as has been said elsewhere, low level gesture recognition and training is not supported on the HoloLens at this time.
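
    For reference, a hedged sketch of grabbing a single frame via Unity's PhotoCapture API (UnityEngine.VR.WSA.WebCam in this era, UnityEngine.XR.WSA.WebCam later):

    ```csharp
    // Hedged sketch: capture one frame from the Locatable Camera.
    using System.Linq;
    using UnityEngine;
    using UnityEngine.VR.WSA.WebCam;

    public class FrameGrabber : MonoBehaviour
    {
        PhotoCapture capture;

        void Start()
        {
            PhotoCapture.CreateAsync(false, OnCreated);
        }

        void OnCreated(PhotoCapture captureObject)
        {
            capture = captureObject;

            // Pick the highest supported resolution.
            Resolution res = PhotoCapture.SupportedResolutions
                .OrderByDescending(r => r.width * r.height).First();

            CameraParameters parameters = new CameraParameters();
            parameters.hologramOpacity = 0.0f; // RGB only, no holograms
            parameters.cameraResolutionWidth = res.width;
            parameters.cameraResolutionHeight = res.height;
            parameters.pixelFormat = CapturePixelFormat.BGRA32;

            capture.StartPhotoModeAsync(parameters, OnPhotoModeStarted);
        }

        void OnPhotoModeStarted(PhotoCapture.PhotoCaptureResult result)
        {
            if (result.success)
                capture.TakePhotoAsync(OnPhotoCaptured);
        }

        void OnPhotoCaptured(PhotoCapture.PhotoCaptureResult result,
                             PhotoCaptureFrame frame)
        {
            // frame holds the image bytes plus (when available) the camera-to-world
            // and projection matrices that make the camera "locatable".
            capture.StopPhotoModeAsync(r => capture.Dispose());
        }
    }
    ```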

    Windows Holographic User Group Redmond

    WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • @HoloSheep You mention that the gesture frame of the HoloLens is different from the gaze or camera frame. Does that mean the HoloLens uses a 2D frame to recognize gestures?

  • @Yang no, it is a different 3D zone of space, you might say.

    From the docs:

    HoloLens looks for hand input within a cone in front of the device, known as the gesture frame, which extends above, below, left and right of the display frame where holograms appear. This lets you keep your elbow comfortably at your side while providing hand input.

    Windows Holographic User Group Redmond

    WinHUGR.org - - - - - - - - - - - - - - - - - - @WinHUGR
    WinHUGR YouTube Channel -- live streamed meetings

  • @neerajwadhwa are we able to create a new complex gesture with the HoloLens? We are trying to develop a training module for CPR with the HoloLens. My question is: is there a way for the HoloLens to be programmed to recognize chest compressions?
    Also, does it have any depth perception of hand motion? What I mean by that is: does the HoloLens have the capability to recognize how "far down" you're pushing your hand?

  • @Yash, there are no supported APIs for creating custom gestures, and there is no training mechanism either. You do have access to the x, y, and z of the hands using the InteractionManager events; a short sketch is below.
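
    For illustration, a minimal sketch of reading those coordinates, again assuming the era's UnityEngine.VR.WSA.Input namespace; for CPR-style compressions you could watch the y component for repeated downward travel:

    ```csharp
    // Hedged sketch: log the hand's x/y/z from InteractionManager events.
    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    public class HandPositionLogger : MonoBehaviour
    {
        void Awake()
        {
            InteractionManager.SourceUpdated += state =>
            {
                Vector3 pos;
                if (state.properties.location.TryGetPosition(out pos))
                    Debug.LogFormat("Hand at x={0:F2} y={1:F2} z={2:F2}",
                                    pos.x, pos.y, pos.z);
            };
        }
    }
    ```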

  • I know it's not recommended, but with regard to creating your own gestures, there is a Git repository that includes a machine-learning component for creating gestures. You can find it here: https://github.com/nickgillian/grt

  • @neerajwadhwa, this is fine for app development, but it doesn't really work for games. There is no standard set of gamepad commands for Xbox games, or mouse-and-keyboard actions for desktop games, or touch-screen gestures for mobile games, that works universally for every game. Gaming experiences differ so widely that game developers need to be able to define their own custom interactions apart from some commercial standard.

    Microsoft is shooting itself in the foot with the HoloLens as a target game platform at a critical moment in the emergence of games on VR and AR platforms. AAA studios will eventually demand this functionality; in the meantime, game developers are going to use high-level hacks in C++ and C# to get around this limitation, as you have given us no other option. If it isn't forthcoming, developers will just switch to VR platforms like the Vive, because there will be no incentive to make games for the HoloLens when both target the same operating-system ecosystem.

    And if customers have the choice between a $900 peripheral for their existing computer, with lots of robust gaming experiences available, and a $3,000 standalone computer with lower specs than their existing computer and very few gaming experiences, let alone robust ones, I think the market will make the obvious choice. I really hope Microsoft changes its mind about this, because the HoloLens is a tremendous platform, and this could be a deal-breaker for game devs, and hence a platform breaker.

  • @ssedlmayr With x, y, and z position tracking, you can implement your own heuristics for gesture tracking. This is how it was done for the Kinect, which is one of the few other devices out there with sensors for gesture recognition. To get you started, here's a Kinect tutorial on gesture heuristics by a friend of mine. You should be able to adapt the concepts, since it doesn't really depend on any specialized APIs.

    James Ashley
    VS 2017 v5.3.3, Unity 2017.3.0f3, MRTK 2017.1.2, W10 17063
    Microsoft MVP, Freelance HoloLens/MR Developer
    www.imaginativeuniversal.com

  • @neerajwadhwa - One gesture we discussed in the HoloDevelopers group that might be nice to have as a standard would be a gesture for opening and closing an in-game menu. One idea: hand outstretched with palm down, rotated to palm up, to open; and palm up, rotated to palm down, to close. This would save people from having to build what is essentially a "hamburger button" to open and close menu components. There is an argument that you could use voice for that, but for accessibility and for use in noisy environments, I think it would be useful to have an open/close-menu gesture as well.

  • BryanHart
    edited October 2016

    I'm also interested in this, after I saw the Creators Update VR announcement.

    I imagine a game scenario where you have special gestures for things like changing weapons or initiating special moves. An air-tap or a navigation/manipulation gesture on a HUD element would really break the immersion.
    Rolling your own gesture heuristics is certainly an option, but this really seems like a feature that should be baked into the platform if enabling VR gaming is part of their roadmap.

    I'm thinking it could be done by specifying a 3D path for each finger, a spatial error margin, and a temporal error margin... then I could register it as a global event, or attach it to an entity such that the gesture must either start or end on that entity. But I have no experience with training gestures, so there are probably subtleties I haven't considered.

  • utekai ✭✭✭

    You can use the voice commands 'Swipe Left' and 'Swipe Right' for an easy implementation that leaves your hands free; a sketch is below.
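
    For what it's worth, a minimal sketch of that voice approach using Unity's KeywordRecognizer (UnityEngine.Windows.Speech); the log calls stand in for app-specific handlers:

    ```csharp
    // Hedged sketch: trigger swipe actions from voice commands.
    using UnityEngine;
    using UnityEngine.Windows.Speech;

    public class VoiceSwipe : MonoBehaviour
    {
        KeywordRecognizer recognizer;

        void Start()
        {
            recognizer = new KeywordRecognizer(new[] { "Swipe Left", "Swipe Right" });
            recognizer.OnPhraseRecognized += OnPhraseRecognized;
            recognizer.Start();
        }

        void OnPhraseRecognized(PhraseRecognizedEventArgs args)
        {
            if (args.text == "Swipe Left")
                Debug.Log("Swipe left action");  // app-specific handler goes here
            else
                Debug.Log("Swipe right action");
        }
    }
    ```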

  • Pinch in/out to zoom, rotate, swipe left/right. Watch Iron Man; there are many practical gestures.

  • I would like to see a rotate gesture, as all of the applications that have rotation (Actiongram and 3D Viewer) are difficult to use.

  • @neerajwadhwa
    I'm working on an app in which I would LOVE to be able to snap my fingers instead of air tapping. Is there any way for the HoloLens to detect a snap?

  • @neerajwadhwa Follow-up (sorry, really excited): is there a way in the backend to make it so that when you grab a sliding axis (tap and hold), it stays attached to your hand, so you can relax your fingers from the tap-and-hold position? In other words, I want to flatten my fingers, because the tap-and-hold position quickly becomes uncomfortable.

  • @neerajwadhwa And one last one: is there any material that the HoloLens can recognize? As in, I would love to put something onto the tip of a drumstick so that when it came into contact with a hologram, it triggered an action. I would just need a material that could react to crossing holograms.

  • @ChaseBowman currently there are no such recommendations from a platform perspective, but feedback noted.

  • @neerajwadhwa Do the developers not realize that the tap gesture is really, really bad for your tendons? Or that people would enjoy custom gestures, like using finger guns to shoot? Why hold back developers in the name of "less to teach the user"?
