
How does GestureManager redirect the OnSelect message to the projectile launcher in Tutorial 240?

edited November 2016 in Questions And Answers

In Chapter 6 - Real-World Physics, it seems a user can shoot the orb whenever they are gazing at a friend's POLY or at other real-world surfaces.

I have studied GestureManager.cs and ProjectileLauncher.cs for a while, but I did not find any assignment to the public GameObject OverrideFocusedObject in GestureManager.cs for the case where the user is not gazing at any hologram.
I also can't figure out how GestureManager launches the projectile when the user *is* gazing at some hologram. Whatever hologram is gazed at gets assigned to the private GameObject focusedObject, but those holograms have no connection to ProjectileLauncher.cs.
All we have is this one line in GestureManager.cs: focusedObject.SendMessage("OnSelect");
This is really frustrating right now. Can anyone explain, even a little bit? Thanks.

Best Answer


  • @ahillier Thank you so much :smiley: I never realized the change to "OverrideFocusedObject" happened in AppStateManager.cs; I assumed all changes to it and to FocusedObject would live in GestureManager.cs or GazeManager.cs. I'm still confused about where I should put my own code. Am I supposed NOT to modify source files from the HoloToolkit if I'm not prepared to dig deep into the HoloLens API and just want to make a simple game or application?

    And now back to AppStateManager.cs: we know that after placing the hub, OverrideFocusedObject = shootHandler. But I don't understand the syntax of this line:
    shootHandler = GetComponent<ProjectileLauncher>().gameObject; The "gameObject" is apparently not a function or variable belonging to ProjectileLauncher.cs. Why can it be used this way?

    And the third question: when the user gazes at any POLY, the FocusedObject will not be the OverrideFocusedObject but the POLY object. However, I checked the three PlayerAvatar****.cs files and didn't find any OnSelect() that triggers ProjectileLauncher.cs to shoot. How does the message get linked to ProjectileLauncher.cs in this case?

    Thank you for your patience. Your answer will help me a lot.

  • Hi @tep2016920,
    1) You are more than welcome to alter the HoloToolkit and make a version of the code that is customized for your project. This is common practice, and why many people create a branch of the project on GitHub. In our academies, we tend to keep project-specific code out of the HoloToolkit, because it usually does not map well to other projects (including other academy courses). We also want to make it easier on ourselves for updating the courses with the latest HoloToolkit when possible (note: Holograms 240 is somewhat out-of-date with the actual HoloToolkit on GitHub). Setting the overrideFocusedObject is one example of project-specific behavior that is dependent on your application's design. In Holograms 240, we set the override based on the application state. However, in Holograms 101, we have the TapToPlaceParent.cs script set the override. If you dig into the code from our other courses, you will probably find that they each do something different with the focused object.

    2) The following code will find an object in the scene that has the ProjectileLauncher script attached to it, and will return the underlying gameObject:

    shootHandler = GetComponent<ProjectileLauncher>().gameObject;

    So, 'shootHandler' is simply the gameObject that has the ProjectileLauncher script attached to it, which happens to be the 'HologramCollection' object in our scene. Once 'shootHandler' is set to be the overrideFocusedObject, all tapped events will be handled by the 'HologramCollection' object. Since the ProjectileLauncher script is attached to 'HologramCollection', the OnSelect() method in that component will respond to all tapped events, which is what spawns projectiles in the world.
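
    The dispatch logic described above can be sketched roughly like this (a hypothetical reconstruction, not the verbatim toolkit source; field names follow the ones discussed in this thread):

    ```csharp
    // Sketch of how a tapped event picks its target: prefer the override
    // object; otherwise fall back to the currently gazed-at object.
    using UnityEngine;

    public class GestureDispatchSketch : MonoBehaviour
    {
        public GameObject OverrideFocusedObject; // set by app code, e.g. AppStateManager
        private GameObject focusedObject;        // set each frame from the gaze raycast

        void OnTap()
        {
            GameObject target = OverrideFocusedObject != null ? OverrideFocusedObject : focusedObject;
            if (target != null)
            {
                // Calls OnSelect() on any component of the target that defines it,
                // e.g. ProjectileLauncher on the HologramCollection object.
                target.SendMessage("OnSelect", SendMessageOptions.DontRequireReceiver);
            }
        }
    }
    ```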

    3) When the PlayerAvatarStore is generated, the 'AvatarSelector.cs' component will be added to each poly in the list. This is done via code (see PlayerAvatarStore.cs), so you won't see it attached to anything when examining the poly objects in Unity's Inspector. During the 'PickingAvatar' state, the overrideFocusedObject is not set. Instead, the normal gaze/gesture behavior is used, so the OnSelect() function of the currently focused object will be called. Since there is an OnSelect() function in 'AvatarSelector', and each poly has an instance of that script, only the poly that has focus will respond.
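
    Attaching a component in code looks roughly like this (an illustrative sketch, assuming the tutorial's AvatarSelector type; the surrounding class and field are hypothetical):

    ```csharp
    // Sketch of runtime component attachment, mirroring what
    // PlayerAvatarStore.cs is described to do above.
    using UnityEngine;

    public class AvatarStoreSketch : MonoBehaviour
    {
        public GameObject[] polys; // the selectable poly objects

        void SpawnAvatars()
        {
            foreach (GameObject poly in polys)
            {
                // Added in code at runtime, so it won't appear on the prefab
                // in the Inspector while the scene isn't playing.
                poly.AddComponent<AvatarSelector>();
            }
        }
    }
    ```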

    Note: It should not be possible to spawn projectiles while you are picking a poly and/or moving the energyHub, because your application is in the wrong state. Once you have picked a poly and the energyHub's transform has been found (either by placing the EnergyHub or downloading the transform from the server), then 'shootHandler' will become the overrideFocusedObject and any tap events will result in projectiles being launched (as described in #2 above). You can get out of this state by restarting the application or by saying 'Reset Target' to place the energyHub again.

    Personally, I don't really like how Holograms 240 handles gesture input. I much prefer to have each object be responsible for its own behavior, and query application state to determine what that behavior should be. There is always more than one way to do something, and that's the beauty of programming :) I would suggest looking at the earlier academy courses, which are more straightforward with their gesture handling. The key takeaway from this discussion should be: if you have an application with multiple states, create an 'AppStateManager' to help manage those states.

    I hope this helps,

  • @ahillier Thank you, Angela! Such a detailed explanation of the HoloToolkit, the gesture input flow, and holographic app design. Advice from a veteran definitely helps.

    I have been trying to shoot a cube (or whatever) with an air-tap.
    My first version directly inserted code inside GestureRecognizer_TappedEvent():

    private void GestureRecognizer_TappedEvent(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        var position = headRay.origin + headRay.direction * 2.0f;
        Rigidbody cubeInstance;
        cubeInstance = Instantiate(CubePrefab, position, Quaternion.identity) as Rigidbody;
        cubeInstance.AddForce(cubeInstance.transform.forward * 1000);
        CalcFocusedObject();
        OnTap();
    }
    But in the emulator, the cube only got instantiated and then fell; no force was added. Then I tried these lines with the MainCamera in another new scene and ran it only in Unity, with nothing related to HoloLens. This time I could press space to shoot the prefab. So I could not figure out why AddForce did not work in the emulator.
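
    One thing worth checking in that first script (a hedged guess, not a confirmed diagnosis): the force is applied along cubeInstance.transform.forward, which for a Quaternion.identity spawn is world forward, not the gaze direction. A minimal variant that pushes along the head ray instead might look like:

    ```csharp
    // Sketch, assuming CubePrefab is a Rigidbody prefab: spawn in front
    // of the head ray and push along the gaze direction instead of the
    // cube's own (identity-rotation) forward axis.
    private void GestureRecognizer_TappedEvent(InteractionSourceKind source, int tapCount, Ray headRay)
    {
        Vector3 position = headRay.origin + headRay.direction * 2.0f;
        Rigidbody cubeInstance = Instantiate(CubePrefab, position, Quaternion.identity) as Rigidbody;
        cubeInstance.AddForce(headRay.direction * 1000f); // direction of gaze, not transform.forward
    }
    ```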

    Then I remembered there is projectile physics in Tutorial 240 and thought maybe I could find a guide there, so I gave up the above approach. I created a new script and attached it to an empty GameObject. This GameObject also has the Gaze & Gesture Manager scripts attached, similar to "HologramCollection". Inside the new script I wrote the following, in the simplest way I could, just to test whether I can shoot anything at all.
    public GameObject CubePrefab;

    void awake()
    {
        GestureManager.Instance.OverrideFocusedObject = this.gameObject;
    }

    void OnSelect()
    {
        // these lines were copied from Tutorial 240
        Vector3 ProjectilePosition = Camera.main.transform.position + Camera.main.transform.forward * 0.85f;
        Vector3 ProjectileDirection = Camera.main.transform.forward;
        GameObject cubeInstance = (GameObject) Instantiate(CubePrefab, ProjectilePosition, Quaternion.identity);
        Rigidbody rigidBody = GetComponent<Rigidbody>();
        rigidBody.velocity = 4 * ProjectileDirection;
        rigidBody.angularVelocity = Random.onUnitSphere * 20;
    }

    But it did not work either. That's why I am trying to dig deep to see what I missed. I am still struggling with this problem. :'(

  • ahillier (mod)
    edited November 2016

    Hi @tep2016920,
    You might want to change your Awake() method to Start(), to be sure that the GazeManager has been initialized before setting the overrideFocusedObject.
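
    A minimal sketch of the suggested change (class and field names taken from the script above):

    ```csharp
    // Set the override in Start() so the manager singletons have had a
    // chance to initialize in Awake() first. Note the capital S: Unity
    // only calls Start()/Awake() with exact capitalization, so a
    // lowercase awake() is never invoked at all.
    using UnityEngine;

    public class CubeShooter : MonoBehaviour
    {
        public GameObject CubePrefab;

        void Start()
        {
            GestureManager.Instance.OverrideFocusedObject = gameObject;
        }
    }
    ```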

    Your second script looks good (it works on my machine...), but you need to verify that your cubePrefab has gravity disabled. Otherwise, your cube will fall to the ground so fast that you might not realize anything is happening (unless you also have SpatialMapping, in which case you should get a collection of cubes at your feet). To be sure, you can set rigidbody.useGravity = false; right after creating the cubeInstance.

    Finally, you'll probably want to add another script to your cubePrefab that manages the lifetime of your object. After the object is created, you'll want to destroy it after a few seconds. If you get too many of them in your scene, they'll start to impact performance.
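
    Such a lifetime script can be as small as this (a hedged sketch; the class name and delay are illustrative):

    ```csharp
    // Attach to the cube prefab: destroys each spawned instance a few
    // seconds after creation so projectiles don't pile up and hurt
    // performance.
    using UnityEngine;

    public class TimedDestroy : MonoBehaviour
    {
        public float lifetimeSeconds = 5f;

        void Start()
        {
            // Destroy with a delay; Unity runs the timer internally.
            Destroy(gameObject, lifetimeSeconds);
        }
    }
    ```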

  • edited November 2016

    @ahillier said:
    Hi @tep2016920,
    You might want to change your Awake() method to Start(), to be sure that the GazeManager has been initialized before setting the overrideFocusedObject.

    Thank you! @ahillier :smile: Great to know the code works on your machine. Now I know why it did not create anything when I tried the second script: I hadn't kept in mind that the overrideFocusedObject assignment should not be inside Awake(). But may I ask how this is related to GazeManager? overrideFocusedObject is defined in GestureManager.

    I have been using spatial mapping from the beginning. And yes, the cube prefab had gravity enabled in the Unity Inspector before, since my idea is to shoot it like a real projectile, moving forward while falling slowly. Unfortunately, the cube only fell when I tested the first script.

    So now I have disabled gravity, both by unchecking the Inspector box and by typing rigidbody.useGravity = false; separately. But neither way worked either :'( ; the cube remained still in mid-air, and the physics just did not apply. Then I went to check the orb prefab under " \Assets\Holograms\Support " in the 240 Unity project and found that it doesn't even have a RigidBody component added; I have no idea how Rigidbody rigidBody = GetComponent<Rigidbody>(); can function in that case. Anyway, I tried removing the RigidBody from the cube prefab and tested again. The cubes stayed still and could overlap each other without bouncing. Well, of course, right? They were not RigidBodies. This is strange now. I have been using the emulator, by the way, and I don't know whether the problem was caused by it.
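
    One possible culprit in the second script, offered as a guess rather than a confirmed answer: a bare GetComponent<Rigidbody>() runs against the object whose script is executing OnSelect(), not against the freshly spawned cube, so the cube's Rigidbody never receives the velocity. A minimal sketch of the corrected lines (same names as the script above):

    ```csharp
    // Fetch the Rigidbody from the spawned cube itself, not from the
    // launcher object running OnSelect(). These lines would replace the
    // corresponding ones inside OnSelect().
    GameObject cubeInstance = (GameObject)Instantiate(CubePrefab, ProjectilePosition, Quaternion.identity);
    Rigidbody rigidBody = cubeInstance.GetComponent<Rigidbody>(); // was: GetComponent<Rigidbody>()
    rigidBody.velocity = 4f * ProjectileDirection;
    rigidBody.angularVelocity = Random.onUnitSphere * 20f;
    ```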
