Holograms 211

Gestures turn user intention into action. With gestures, users can interact with holograms. In this course, we'll learn how to track the user's hands, respond to user input, and give feedback to the user based on hand state and location.

In Holograms 101, we used a simple air tap gesture to interact with our holograms. Now, we'll move beyond the air tap gesture and explore new concepts to:

  • Detect when the user's hand is being tracked and provide feedback to the user.
  • Use a navigation gesture to rotate our holograms.
  • Provide feedback when the user's hand is about to go out of view.
  • Use manipulation events to allow users to move holograms with their hands.

In this course, we'll revisit the Unity project Model Explorer, which we built in Holograms 210. Our astronaut friend is back to assist us in our exploration of these new gesture concepts.

Prerequisites

Project files

  • Download the files required by the project.
  • Unarchive the files to your desktop or another easy-to-reach location.

Errata and Notes

  • "Enable Just My Code" needs to be disabled (unchecked) in Visual Studio under Tools->Options->Debugging in order to hit breakpoints in your code.

Unity Setup

Instructions

  • Start Unity.
  • Select Open.
  • Navigate to the Gesture folder you previously unarchived.
  • Find and select the Model Explorer folder.
  • Click the Select Folder button.
  • In the Project panel, expand the Scenes folder.
  • Double-click the ModelExplorer scene to load it in Unity.
  • In Unity select File > Build Settings.
  • If Scenes/ModelExplorer is not listed in Scenes In Build, click Add Open Scenes to add the scene.
  • Select Windows Store in the Platform list and click Switch Platform.
  • Set SDK to Universal 10 and UWP Build Type to D3D.
  • Check Unity C# Projects.
  • Click Build.
  • Create a New Folder named "App".
  • Single-click the App folder.
  • Press Select Folder and Unity will start building the project for Visual Studio.
  • When Unity is done building, a File Explorer window will appear.
  • Open the App Folder.
  • Open the ModelExplorer Visual Studio Solution.
  • Using the drop-down options near the top, change Debug to Release and ARM to x86.
  • Click the drop-down arrow on the right of 'Local Machine', and select Remote Machine.
  • Enter your device's IP address and set Authentication Mode to Universal (Unencrypted Protocol). Click Select. If you do not know your device's IP address, look in Settings > Network & Internet > Advanced Options on your HoloLens, or ask Cortana "Hey Cortana, what's my IP address?"
  • Click Debug -> Start Without Debugging in the menu, or press Ctrl + F5. If this is the first time deploying to your device, you will need to pair it with Visual Studio. Follow these instructions: Pairing your HoloLens with Visual Studio
  • Note: you might notice some red errors in the Visual Studio Errors panel. It is safe to ignore them; switch to the Output panel to view the actual build progress. Errors in the Output panel will require a fix (most often they are caused by a mistake in a script).

Chapter 1 - Hand detected feedback

Objectives

  • Subscribe to hand tracking events.
  • Use cursor feedback to show users when a hand is being tracked.

Instructions

  • In the Hierarchy panel, select the Managers object.
  • In the Inspector panel, click the Add Component button.
  • In the menu, type Hands Manager in the search box and select the search result.

The HandsManager.cs script performs these steps (see the sketch after this list):

  1. Subscribes to the SourceDetected and SourceLost events.
  2. Sets the HandDetected state.
  3. Unsubscribes from the SourceDetected and SourceLost events.
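
The completed HandsManager.cs ships with the course and is not reproduced here. As a rough, minimal sketch of the three steps above, assuming the Unity 5.x UnityEngine.VR.WSA.Input API this course targets (the class name below is ours, not the course's):

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

// Illustration only: track whether a hand is currently detected by subscribing to
// the InteractionManager source events. A fuller version would count tracked sources.
public class HandDetectionSketch : MonoBehaviour
{
    // 2. The HandDetected state that other scripts (e.g. cursor feedback) can read.
    public bool HandDetected { get; private set; }

    void Awake()
    {
        // 1. Subscribe to the SourceDetected and SourceLost events.
        InteractionManager.SourceDetected += InteractionManager_SourceDetected;
        InteractionManager.SourceLost += InteractionManager_SourceLost;
    }

    private void InteractionManager_SourceDetected(InteractionSourceState state)
    {
        HandDetected = true;
    }

    private void InteractionManager_SourceLost(InteractionSourceState state)
    {
        HandDetected = false;
    }

    void OnDestroy()
    {
        // 3. Unsubscribe from the SourceDetected and SourceLost events.
        InteractionManager.SourceDetected -= InteractionManager_SourceDetected;
        InteractionManager.SourceLost -= InteractionManager_SourceLost;
    }
}
```
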
  • In the Hierarchy panel, select the Cursor object.
  • In the Inspector panel, click the Add Component button.
  • In the menu, type Cursor Feedback in the search box and select the search result.
  • In the Project panel Holograms folder, find the HandDetectedFeedback asset.
  • Drag and drop the HandDetectedFeedback asset onto the Hand Detected Asset property in the Cursor Feedback (Script) component.
  • In the Hierarchy panel, expand the Cursor object.
  • Drag and drop CursorBillboard onto the Feedback Parent property of the Cursor Feedback (Script) component.

The CursorFeedback.cs script works like this (see the sketch after this list):

  1. Instantiates the HandDetectedFeedback asset.
  2. Parents the HandDetectedFeedback asset to the cursor billboard object.
  3. Activates/deactivates the HandDetectedFeedback asset based on the HandDetected state.
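
Again as a hedged illustration rather than the course's completed file, the pattern could look like the following. The Hands field refers to the HandDetectionSketch component from the previous sketch; the real CursorFeedback.cs reads the state set by HandsManager.cs.

```csharp
using UnityEngine;

// Illustration only: instantiate a feedback prefab, parent it under the cursor
// billboard, and toggle it based on whether a hand is currently tracked.
public class CursorFeedbackSketch : MonoBehaviour
{
    [Tooltip("Prefab shown while a hand is tracked (e.g. HandDetectedFeedback).")]
    public GameObject HandDetectedAsset;

    [Tooltip("Transform the feedback is parented to (e.g. CursorBillboard).")]
    public Transform FeedbackParent;

    [Tooltip("Hand tracking component from the previous sketch.")]
    public HandDetectionSketch Hands;

    private GameObject handDetectedGameObject;

    void Start()
    {
        // 1. Instantiate the HandDetectedFeedback asset once.
        handDetectedGameObject = (GameObject)Instantiate(HandDetectedAsset);

        // 2. Parent it to the cursor billboard so it follows the cursor.
        handDetectedGameObject.transform.SetParent(FeedbackParent, false);
        handDetectedGameObject.SetActive(false);
    }

    void Update()
    {
        // 3. Show the feedback only while a hand is being tracked.
        handDetectedGameObject.SetActive(Hands != null && Hands.HandDetected);
    }
}
```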

Build and Deploy

  • In Unity, use File > Build Settings to rebuild the application.
  • Open the App folder.
  • If it's not already open, open the ModelExplorer Visual Studio Solution.
    • (If you already built/deployed this project in Visual Studio during set-up, then you can open that instance of VS and click 'Reload All' when prompted).
  • In Visual Studio, click Debug -> Start Without Debugging or press Ctrl + F5.
  • After the application deploys to the HoloLens, dismiss the fit box using the air tap gesture.
  • Move your hand into view and point your index finger to the sky to start hand tracking.
  • Move your hand left, right, up and down.
  • Watch how the cursor changes when your hand is detected and then lost from view.

Chapter 2 - Navigation

Objectives

  • Use Navigation gesture events to rotate the astronaut.

Instructions

To use Navigation gestures in our app, we are going to edit four scripts to do the following:

  1. The focused object will be tracked by HandsManager.cs.
  2. Navigation events will be handled by GestureManager.cs.
  3. Rotating objects when the Navigation gesture occurs will be handled by GestureAction.cs.
  4. The cursor will provide Navigation feedback to the user via CursorStates.cs.

Let's get started.

  • Open the HandsManager.cs script in Visual Studio.

We track the focused Interactible object in HandsManager.cs. Copy the completed code below into HandsManager.cs, or you can code this yourself by following the marked coding exercises.

HandsManager.cs
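
The completed HandsManager.cs code is collapsed above and not reproduced here. Purely as an illustration of the idea (not necessarily how the course's script does it), one way to remember which hologram was focused when the finger was pressed is to capture the gaze target on SourcePressed:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

// Illustration only: capture the hologram under the gaze when the finger is pressed,
// so gesture handlers can keep acting on it even if the gaze drifts mid-gesture.
public class FocusTrackingSketch : MonoBehaviour
{
    public GameObject FocusedGameObject { get; private set; }

    void Awake()
    {
        InteractionManager.SourcePressed += InteractionManager_SourcePressed;
        InteractionManager.SourceReleased += InteractionManager_SourceReleased;
    }

    private void InteractionManager_SourcePressed(InteractionSourceState state)
    {
        // Raycast along the gaze and remember whatever it hits.
        RaycastHit hitInfo;
        Transform cameraTransform = Camera.main.transform;
        if (Physics.Raycast(cameraTransform.position, cameraTransform.forward, out hitInfo, 10.0f))
        {
            FocusedGameObject = hitInfo.collider.gameObject;
        }
    }

    private void InteractionManager_SourceReleased(InteractionSourceState state)
    {
        FocusedGameObject = null;
    }

    void OnDestroy()
    {
        InteractionManager.SourcePressed -= InteractionManager_SourcePressed;
        InteractionManager.SourceReleased -= InteractionManager_SourceReleased;
    }
}
```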

Now let's work on rotating the astronaut whenever the user performs the Navigation gesture.

  • In the Hierarchy panel, click on Cursor.
  • In the Holograms folder, find the ScrollFeedback asset.
  • Drag and drop the ScrollFeedback asset onto the Scroll Detected Asset property of the Cursor Feedback (Script) component, which is visible in the Inspector panel.
  • In the Hierarchy panel, select the AstroMan object.
  • In the Inspector panel, click the Add Component button.
  • In the menu, type Gesture Action in the search box and select the search result.
  • In the Hierarchy panel, find and select the Managers object.
  • Double-click on the GestureManager script to open it in Visual Studio.

We need to edit the GestureManager.cs file to perform these steps:

  1. Instantiate the NavigationRecognizer as a new GestureRecognizer.
  2. Use SetRecognizableGestures to recognize NavigationX and Tap gestures.
  3. Handle the NavigationStarted, NavigationUpdated, NavigationCompleted, and NavigationCanceled events.

You can write the code yourself by following the coding-exercise comments in GestureManager.cs, or replace the file contents with the following code block:

GestureManager.cs
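
The completed GestureManager.cs is collapsed above. As a minimal sketch of the three steps, assuming the Unity 5.x GestureRecognizer API (the class and property names here are ours):

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

// Illustration only: set up a GestureRecognizer for Navigation (plus Tap) and expose
// the current navigation offset for a rotation script to consume.
public class NavigationGestureSketch : MonoBehaviour
{
    public GestureRecognizer NavigationRecognizer { get; private set; }
    public Vector3 NavigationPosition { get; private set; }
    public bool IsNavigating { get; private set; }

    void Awake()
    {
        // 1. Instantiate a recognizer dedicated to Navigation and Tap.
        NavigationRecognizer = new GestureRecognizer();

        // 2. Recognize horizontal navigation and tap gestures.
        NavigationRecognizer.SetRecognizableGestures(
            GestureSettings.Tap | GestureSettings.NavigationX);

        // 3. Handle the Navigation lifecycle events.
        NavigationRecognizer.NavigationStartedEvent += OnNavigationStarted;
        NavigationRecognizer.NavigationUpdatedEvent += OnNavigationUpdated;
        NavigationRecognizer.NavigationCompletedEvent += OnNavigationCompleted;
        NavigationRecognizer.NavigationCanceledEvent += OnNavigationCanceled;

        NavigationRecognizer.StartCapturingGestures();
    }

    private void OnNavigationStarted(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        IsNavigating = true;
        NavigationPosition = relativePosition;
    }

    private void OnNavigationUpdated(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        // relativePosition.x runs from -1 to 1 as the hand moves left or right.
        NavigationPosition = relativePosition;
    }

    private void OnNavigationCompleted(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        IsNavigating = false;
    }

    private void OnNavigationCanceled(InteractionSourceKind source, Vector3 relativePosition, Ray ray)
    {
        IsNavigating = false;
    }

    void OnDestroy()
    {
        NavigationRecognizer.NavigationStartedEvent -= OnNavigationStarted;
        NavigationRecognizer.NavigationUpdatedEvent -= OnNavigationUpdated;
        NavigationRecognizer.NavigationCompletedEvent -= OnNavigationCompleted;
        NavigationRecognizer.NavigationCanceledEvent -= OnNavigationCanceled;
    }
}
```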

Next, open GestureAction.cs in Visual Studio. Edit the script to do the following:

  1. Rotate the AstroMan object whenever a Navigation gesture is performed.
  2. Calculate the rotationFactor to control the amount of rotation applied to the object.
  3. Rotate the object around the y-axis when the user moves their hand left or right.

Complete the coding exercises in the script, or replace the code with the completed solution below:

GestureAction.cs
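
The completed GestureAction.cs is collapsed above. A minimal rotation sketch, assuming a RotationSensitivity value of our own choosing and the NavigationGestureSketch component from the previous sketch, might look like this:

```csharp
using UnityEngine;

// Illustration only: rotate this object around the y-axis in proportion to the
// horizontal navigation offset reported by the recognizer sketch above.
public class RotationActionSketch : MonoBehaviour
{
    [Tooltip("Recognizer component from the previous sketch.")]
    public NavigationGestureSketch Gestures;

    [Tooltip("Degrees of rotation applied per frame at full navigation offset.")]
    public float RotationSensitivity = 10.0f;

    void Update()
    {
        if (Gestures == null || !Gestures.IsNavigating)
        {
            return;
        }

        // 2. The x component of the navigation offset (-1..1) scales the rotation.
        float rotationFactor = Gestures.NavigationPosition.x * RotationSensitivity;

        // 3. Moving the hand left or right spins the object around the y-axis.
        transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
    }
}
```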

Build and Deploy

  • Rebuild the application in Unity and then build and deploy from Visual Studio to run it in the HoloLens.
  • Gaze at the astronaut; two arrows should appear on either side of the cursor. This new visual indicates that the astronaut can be rotated.
  • Place your hand in the ready position (index finger pointed towards the sky) so the HoloLens will start tracking your hand.
  • To rotate the astronaut, lower your index finger to a pinch position, and then move your hand left or right to trigger the NavigationX gesture.

Chapter 3 - Hand Guidance

Objectives

  • Use the hand guidance score to help predict when hand tracking will be lost.
  • Provide feedback on the cursor to show when the user's hand nears the edge of the camera's view.

Instructions

  • In the Hierarchy panel, select the Managers object.
  • In the Inspector panel, click the Add Component button.
  • In the menu, type Hand Guidance in the search box and select the search result.
  • In the Project panel Holograms folder, find the HandGuidanceFeedback asset.
  • Drag and drop the HandGuidanceFeedback asset onto the Hand Guidance Indicator property in the Inspector panel.
  • In the Hierarchy panel, expand the Cursor object.
  • In the Hierarchy panel, select the Managers object.
  • Drag and drop CursorBillboard from the Hierarchy panel onto the Indicator Parent property in the Inspector.
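
This chapter only wires up the existing Hand Guidance (Script) component, so there is no coding exercise. For reference, here is a hedged sketch of how the underlying API exposes the guidance score, assuming the Unity 5.x InteractionSourceProperties API; the threshold and orientation logic below are illustrative choices of ours, not taken from HandGuidance.cs:

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

// Illustration only: read the hand guidance score from SourceUpdated and show a
// warning indicator when the hand risks leaving the gesture frame.
public class HandGuidanceSketch : MonoBehaviour
{
    [Tooltip("Indicator object (e.g. an instance of HandGuidanceFeedback).")]
    public GameObject HandGuidanceIndicator;

    [Tooltip("Show the warning when the loss risk exceeds this value (0..1).")]
    public float HandGuidanceThreshold = 0.5f;

    void Awake()
    {
        InteractionManager.SourceUpdated += InteractionManager_SourceUpdated;
    }

    private void InteractionManager_SourceUpdated(InteractionSourceState state)
    {
        // sourceLossRisk approaches 1 as the hand nears the edge of the gesture frame;
        // sourceLossMitigationDirection points back toward safety.
        bool atRisk = state.properties.sourceLossRisk > HandGuidanceThreshold;
        HandGuidanceIndicator.SetActive(atRisk);

        if (atRisk)
        {
            // Spin the indicator (assumed to face the user) so its arrow points the
            // way the hand should move to stay tracked.
            Vector3 direction = state.properties.sourceLossMitigationDirection;
            float angle = Mathf.Atan2(direction.y, direction.x) * Mathf.Rad2Deg;
            HandGuidanceIndicator.transform.localRotation = Quaternion.Euler(0, 0, angle);
        }
    }

    void OnDestroy()
    {
        InteractionManager.SourceUpdated -= InteractionManager_SourceUpdated;
    }
}
```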

Build and Deploy

  • Rebuild the application in Unity and then build and deploy from Visual Studio to experience the app on HoloLens.
  • Bring your hand into view and raise your index finger to get tracked.
  • Start rotating the astronaut with the Navigation gesture (pinch your index finger and thumb together).
  • Move your hand far left, right, up, and down.
  • As your hand nears the edge of the gesture frame, an arrow should appear next to the cursor to warn you that hand tracking will be lost. The arrow indicates which direction to move your hand in order to prevent tracking from being lost.

Chapter 4 - Manipulation

Objectives

  • Use Manipulation events to move the astronaut with your hands.
  • Provide feedback on the cursor to let the user know when Manipulation can be used.

Instructions

GestureManager.cs and AstronautManager.cs will allow us to do the following (a sketch of the recognizer hand-off appears after this list):

  1. Use the speech keyword "Move Astronaut" to enable Manipulation gestures.
  2. Switch to using the Manipulation Gesture Recognizer.
  3. Manage GestureRecognizer transitions when switching between Navigation and Manipulation.
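
As a hedged sketch of the recognizer hand-off in step 3, assuming the Unity 5.x GestureRecognizer API (the class and method names below are ours, not necessarily those used in GestureManager.cs):

```csharp
using UnityEngine;
using UnityEngine.VR.WSA.Input;

// Illustration only: swap the active GestureRecognizer when switching between
// Navigation and Manipulation, e.g. after the user says "Move Astronaut".
public class RecognizerSwitchSketch : MonoBehaviour
{
    public GestureRecognizer ActiveRecognizer { get; private set; }

    public void Transition(GestureRecognizer newRecognizer)
    {
        if (newRecognizer == null || newRecognizer == ActiveRecognizer)
        {
            return;
        }

        if (ActiveRecognizer != null)
        {
            // Cancel any gesture in flight, then stop listening on the old recognizer.
            ActiveRecognizer.CancelGestures();
            ActiveRecognizer.StopCapturingGestures();
        }

        newRecognizer.StartCapturingGestures();
        ActiveRecognizer = newRecognizer;
    }
}
```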

Let's get started.

  • In the Hierarchy panel, select the Managers object.
  • In the Inspector panel, click the Add Component button.
  • In the menu, type Astronaut Manager in the search box and select the search result.
  • In the Hierarchy panel, click on Cursor.
  • In the Project panel Holograms folder, find the PathingFeedback asset.
  • Drag and drop the PathingFeedback asset onto the Pathing Detected Asset property in the Cursor States (Script) component in the Inspector.

Now we need to add code to GestureAction.cs to enable the following:

  1. Add code to the PerformManipulationUpdate function, which will move the astronaut when a Manipulation gesture is detected.
  2. Calculate the movement vector to determine where the astronaut should be moved to based on hand position.
  3. Move the astronaut to the new position.

Complete the coding exercise in GestureAction.cs, or use our completed solution below:

GestureAction.cs
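
The completed GestureAction.cs is collapsed above. A minimal sketch of the manipulation update (the method names are ours; the cumulativeDelta value would come from the GestureRecognizer's ManipulationUpdatedEvent):

```csharp
using UnityEngine;

// Illustration only: move the object by the hand's cumulative offset since the
// Manipulation gesture started.
public class ManipulationActionSketch : MonoBehaviour
{
    private Vector3 manipulationOriginalPosition;

    public void ManipulationStarted()
    {
        // Remember where the object was when the pinch began.
        manipulationOriginalPosition = transform.position;
    }

    public void ManipulationUpdated(Vector3 cumulativeDelta)
    {
        // 2. The movement vector is the hand's offset since the gesture started.
        // 3. Apply it to the original position to get the new position.
        transform.position = manipulationOriginalPosition + cumulativeDelta;
    }
}
```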

Build and Deploy

  • Rebuild in Unity and then build and deploy from Visual Studio to run the app in HoloLens.
  • Move your hand in front of the HoloLens and raise your index finger so that it can be tracked.
  • Focus the cursor over the astronaut.
  • Say 'Move Astronaut' to move the astronaut with a Manipulation gesture.
  • Four arrows should appear around the cursor to indicate that the program will now respond to Manipulation events.
  • Lower your index finger down to your thumb, and keep them pinched together.
  • As you move your hand around, the astronaut will move too (this is Manipulation).
  • Raise your index finger to stop manipulating the astronaut.
  • Note: If you do not say 'Move Astronaut' before moving your hand, then the Navigation gesture will be used instead.

Chapter 5 - Model expansion

Objectives

  • Expand the Astronaut model into multiple, smaller pieces that the user can interact with.
  • Move each piece individually using Navigation and Manipulation gestures.

Instructions

In this section, we will accomplish the following tasks:

  1. Add a new keyword "Expand Model" to expand the astronaut model.
  2. Add a new keyword "Reset Model" to return the model to its original form.

Complete the coding exercise in AstronautManager.cs, or copy and paste the finished code from below:

AstronautManager.cs
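
The completed AstronautManager.cs is collapsed above. As a hedged sketch of the keyword wiring using Unity's KeywordRecognizer (the ExpandModel and ResetModel bodies are placeholders; the real script animates the model's pieces):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Illustration only: register the "Expand Model" and "Reset Model" keywords with
// Unity's KeywordRecognizer and dispatch to placeholder handlers.
public class KeywordSketch : MonoBehaviour
{
    private KeywordRecognizer keywordRecognizer;
    private readonly Dictionary<string, System.Action> keywords =
        new Dictionary<string, System.Action>();

    void Start()
    {
        keywords.Add("Expand Model", ExpandModel);
        keywords.Add("Reset Model", ResetModel);

        keywordRecognizer = new KeywordRecognizer(new List<string>(keywords.Keys).ToArray());
        keywordRecognizer.OnPhraseRecognized += KeywordRecognizer_OnPhraseRecognized;
        keywordRecognizer.Start();
    }

    private void KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        System.Action action;
        if (keywords.TryGetValue(args.text, out action))
        {
            action.Invoke();
        }
    }

    private void ExpandModel()
    {
        Debug.Log("Expand Model heard - spread the model's pieces apart here.");
    }

    private void ResetModel()
    {
        Debug.Log("Reset Model heard - return the pieces to their original transforms.");
    }

    void OnDestroy()
    {
        if (keywordRecognizer != null)
        {
            keywordRecognizer.Stop();
            keywordRecognizer.Dispose();
        }
    }
}
```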

Build and Deploy

  • Try it! Build and deploy the app to the HoloLens.
  • Say Expand Model to see the expanded astronaut model.
  • Use Navigation to rotate individual pieces of the astronaut suit.
  • Say Move Astronaut and then use Manipulation to move individual pieces of the astronaut suit.
  • Say Reset Model to return the astronaut to its original form.

The End

Congratulations! You have now completed Holograms 211 - Gesture.

  • You know how to detect and respond to hand tracking, navigation and manipulation events.
  • You understand the difference between Navigation and Manipulation gestures.
  • You know how to change the cursor to provide visual feedback when a hand is detected, when a hand is about to be lost, and when an object supports different interactions (Navigation vs. Manipulation).