MR Input 211: Gesture
Important
The Mixed Reality Academy tutorials were designed with HoloLens (1st gen), Unity 2017, and Mixed Reality Immersive Headsets in mind. As such, we feel it is important to leave these tutorials in place for developers who are still looking for guidance in developing for those devices. These tutorials will not be updated with the latest toolsets or interactions being used for HoloLens 2 and may not be compatible with newer versions of Unity. They will be maintained to continue working on the supported devices. A new series of tutorials has been posted for HoloLens 2.
Gestures turn user intention into action. With gestures, users can interact with holograms. In this course, we'll learn how to track the user's hands, respond to user input, and give feedback to the user based on hand state and location.
In MR Basics 101, we used a simple air-tap gesture to interact with our holograms. Now, we'll move beyond the air-tap gesture and explore new gesture concepts.
In this course, we'll revisit the Unity project Model Explorer, which we built in MR Input 210. Our astronaut friend is back to assist us in our exploration of these new gesture concepts.
Important
The videos embedded in each of the chapters below were recorded using an older version of Unity and the Mixed Reality Toolkit. While the step-by-step instructions are accurate and current, you may see scripts and visuals in the corresponding videos that are out-of-date. The videos remain included for posterity and because the concepts covered still apply.
| Course | HoloLens | Immersive headsets |
|---|---|---|
| MR Input 211: Gesture | ✔️ | ✔️ |
Note
If you want to look through the source code before downloading, it's available on GitHub.
When Unity is done, a File Explorer window will appear.
If deploying to HoloLens:
If deploying to an immersive headset:
Note
You might notice some red errors in the Visual Studio Errors panel. It is safe to ignore them. Switch to the Output panel to view actual build progress. Errors in the Output panel will require you to make a fix (most often they are caused by a mistake in a script).
Note
On HoloLens 2, the hands-detected event fires whenever hands are visible (not just when a finger is pointing up).
The InteractionInputSource.cs script performs these steps:
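At a high level, the script subscribes to Unity's interaction source events and keeps track of which hands are currently detected. Below is a minimal sketch of that pattern using Unity's InteractionManager API directly (Unity 2017.2+); the class name HandsTrackingExample is hypothetical, and this illustrates the general pattern rather than the toolkit's exact implementation.

    using UnityEngine;
    using UnityEngine.XR.WSA.Input;

    // Illustrative sketch: counts how many hands are currently detected.
    // The toolkit's InteractionInputSource.cs wraps this same Unity API
    // with additional event plumbing.
    public class HandsTrackingExample : MonoBehaviour
    {
        private int detectedHands = 0;

        private void Awake()
        {
            // Subscribe to source detection events raised by Unity's input system.
            InteractionManager.InteractionSourceDetected += OnSourceDetected;
            InteractionManager.InteractionSourceLost += OnSourceLost;
        }

        private void OnDestroy()
        {
            InteractionManager.InteractionSourceDetected -= OnSourceDetected;
            InteractionManager.InteractionSourceLost -= OnSourceLost;
        }

        private void OnSourceDetected(InteractionSourceDetectedEventArgs args)
        {
            // Only count hands, not controllers or voice sources.
            if (args.state.source.kind == InteractionSourceKind.Hand)
            {
                detectedHands++;
            }
        }

        private void OnSourceLost(InteractionSourceLostEventArgs args)
        {
            if (args.state.source.kind == InteractionSourceKind.Hand)
            {
                detectedHands = Mathf.Max(0, detectedHands - 1);
            }
        }
    }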
Next, we'll upgrade our cursor from MR Input 210 into one that shows feedback depending on the user's actions.
The Cursor State Data works like this:
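Conceptually, the cursor's state data is a small state machine driven by hand visibility, gaze focus, and press state. The sketch below is a hypothetical illustration of that idea; the state names and the GetCursorState helper are illustrative, not the toolkit's actual cursor API.

    using UnityEngine;

    // Hypothetical sketch: derive a cursor state from the current input situation.
    public class CursorStateExample : MonoBehaviour
    {
        public enum CursorState { Observe, Interact, Hover, Select }

        public CursorState GetCursorState(bool handVisible, bool hasFocusedObject, bool isPressed)
        {
            if (isPressed) return CursorState.Select;       // air tap / finger pressed
            if (hasFocusedObject) return CursorState.Hover; // gazing at an interactable hologram
            if (handVisible) return CursorState.Interact;   // hand detected and ready
            return CursorState.Observe;                     // idle, no hands in view
        }
    }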
To use Navigation gestures in our app, we are going to edit GestureAction.cs to rotate objects when the Navigation gesture occurs. Additionally, we'll add feedback to the cursor to display when Navigation is available.
Next, open GestureAction.cs in Visual Studio. In coding exercise 2.c, edit the script to do the following:
Complete coding exercise 2.c in the script, or replace the code with the completed solution below:
using HoloToolkit.Unity.InputModule;
using UnityEngine;

/// <summary>
/// GestureAction performs custom actions based on
/// which gesture is being performed.
/// </summary>
public class GestureAction : MonoBehaviour, INavigationHandler, IManipulationHandler, ISpeechHandler
{
    [Tooltip("Rotation max speed controls amount of rotation.")]
    [SerializeField]
    private float RotationSensitivity = 10.0f;

    private bool isNavigationEnabled = true;
    public bool IsNavigationEnabled
    {
        get { return isNavigationEnabled; }
        set { isNavigationEnabled = value; }
    }

    private Vector3 manipulationOriginalPosition = Vector3.zero;

    void INavigationHandler.OnNavigationStarted(NavigationEventData eventData)
    {
        InputManager.Instance.PushModalInputHandler(gameObject);
    }

    void INavigationHandler.OnNavigationUpdated(NavigationEventData eventData)
    {
        if (isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 2.c */

            // 2.c: Calculate a float rotationFactor based on eventData's NormalizedOffset.x multiplied by RotationSensitivity.
            // This will help control the amount of rotation.
            float rotationFactor = eventData.NormalizedOffset.x * RotationSensitivity;

            // 2.c: transform.Rotate around the Y axis using rotationFactor.
            transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
        }
    }

    void INavigationHandler.OnNavigationCompleted(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void INavigationHandler.OnNavigationCanceled(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationStarted(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            InputManager.Instance.PushModalInputHandler(gameObject);
            manipulationOriginalPosition = transform.position;
        }
    }

    void IManipulationHandler.OnManipulationUpdated(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 4.a */

            // 4.a: Make this transform's position be the manipulationOriginalPosition + eventData.CumulativeDelta
        }
    }

    void IManipulationHandler.OnManipulationCompleted(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationCanceled(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void ISpeechHandler.OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.RecognizedText.Equals("Move Astronaut"))
        {
            isNavigationEnabled = false;
        }
        else if (eventData.RecognizedText.Equals("Rotate Astronaut"))
        {
            isNavigationEnabled = true;
        }
        else
        {
            return;
        }

        eventData.Use();
    }
}
You'll notice that the other navigation events are already filled in. We push the GameObject onto the InputManager's modal stack, so the user doesn't have to maintain focus on the astronaut once rotation has begun. Correspondingly, we pop the GameObject off the stack once the gesture is completed.
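To make that modal behavior concrete, here's a minimal conceptual sketch of how a stack-based input router resolves its target. The class and method names are illustrative, not the InputManager's actual implementation.

    using System.Collections.Generic;
    using UnityEngine;

    // Conceptual sketch of a modal input stack: while any handler is pushed,
    // input events are routed to the top of the stack instead of the object
    // currently under the user's gaze.
    public class ModalInputStackExample
    {
        private readonly Stack<GameObject> modalStack = new Stack<GameObject>();

        public void Push(GameObject handler)
        {
            modalStack.Push(handler);
        }

        public void Pop()
        {
            if (modalStack.Count > 0)
            {
                modalStack.Pop();
            }
        }

        // Returns the object that should receive input: the modal handler on
        // top of the stack if one exists, otherwise the currently focused object.
        public GameObject ResolveTarget(GameObject focusedObject)
        {
            return modalStack.Count > 0 ? modalStack.Peek() : focusedObject;
        }
    }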
GestureManager.cs and AstronautManager.cs will allow us to do the following:
Let's get started.
We'll now add the speech commands required to control the interaction state of the astronaut.
Next, we'll set up the manipulation feedback on the cursor.
Now we need to add code to GestureAction.cs to enable the following:
Complete coding exercise 4.a in GestureAction.cs, or use our completed solution below:
using HoloToolkit.Unity.InputModule;
using UnityEngine;

/// <summary>
/// GestureAction performs custom actions based on
/// which gesture is being performed.
/// </summary>
public class GestureAction : MonoBehaviour, INavigationHandler, IManipulationHandler, ISpeechHandler
{
    [Tooltip("Rotation max speed controls amount of rotation.")]
    [SerializeField]
    private float RotationSensitivity = 10.0f;

    private bool isNavigationEnabled = true;
    public bool IsNavigationEnabled
    {
        get { return isNavigationEnabled; }
        set { isNavigationEnabled = value; }
    }

    private Vector3 manipulationOriginalPosition = Vector3.zero;

    void INavigationHandler.OnNavigationStarted(NavigationEventData eventData)
    {
        InputManager.Instance.PushModalInputHandler(gameObject);
    }

    void INavigationHandler.OnNavigationUpdated(NavigationEventData eventData)
    {
        if (isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 2.c */

            // 2.c: Calculate a float rotationFactor based on eventData's NormalizedOffset.x multiplied by RotationSensitivity.
            // This will help control the amount of rotation.
            float rotationFactor = eventData.NormalizedOffset.x * RotationSensitivity;

            // 2.c: transform.Rotate around the Y axis using rotationFactor.
            transform.Rotate(new Vector3(0, -1 * rotationFactor, 0));
        }
    }

    void INavigationHandler.OnNavigationCompleted(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void INavigationHandler.OnNavigationCanceled(NavigationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationStarted(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            InputManager.Instance.PushModalInputHandler(gameObject);
            manipulationOriginalPosition = transform.position;
        }
    }

    void IManipulationHandler.OnManipulationUpdated(ManipulationEventData eventData)
    {
        if (!isNavigationEnabled)
        {
            /* TODO: DEVELOPER CODING EXERCISE 4.a */

            // 4.a: Make this transform's position be the manipulationOriginalPosition + eventData.CumulativeDelta
            transform.position = manipulationOriginalPosition + eventData.CumulativeDelta;
        }
    }

    void IManipulationHandler.OnManipulationCompleted(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void IManipulationHandler.OnManipulationCanceled(ManipulationEventData eventData)
    {
        InputManager.Instance.PopModalInputHandler();
    }

    void ISpeechHandler.OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.RecognizedText.Equals("Move Astronaut"))
        {
            isNavigationEnabled = false;
        }
        else if (eventData.RecognizedText.Equals("Rotate Astronaut"))
        {
            isNavigationEnabled = true;
        }
        else
        {
            return;
        }

        eventData.Use();
    }
}
In this section, we'll add two more keywords to the Speech Input Source from the previous chapter. We'll also demonstrate another way to handle keyword recognition events, as sketched below.
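One common alternative for handling keyword recognition in Unity is to drive a keyword-to-action dictionary with UnityEngine.Windows.Speech.KeywordRecognizer. The sketch below illustrates that approach; the keywords and the SpeechCommandExample class are hypothetical examples, not the exact keywords used in this chapter.

    using System;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Windows.Speech;

    // Illustrative sketch of keyword handling with Unity's KeywordRecognizer,
    // as one alternative to the ISpeechHandler interface used above.
    public class SpeechCommandExample : MonoBehaviour
    {
        private KeywordRecognizer keywordRecognizer;
        private readonly Dictionary<string, Action> keywordActions = new Dictionary<string, Action>();

        private void Start()
        {
            // Example keywords only; register whatever commands your app needs.
            keywordActions.Add("Expand Model", () => Debug.Log("Expanding model"));
            keywordActions.Add("Reset Model", () => Debug.Log("Resetting model"));

            keywordRecognizer = new KeywordRecognizer(new List<string>(keywordActions.Keys).ToArray());
            keywordRecognizer.OnPhraseRecognized += OnPhraseRecognized;
            keywordRecognizer.Start();
        }

        private void OnDestroy()
        {
            if (keywordRecognizer != null)
            {
                keywordRecognizer.Stop();
                keywordRecognizer.Dispose();
            }
        }

        private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
        {
            // Look up the recognized phrase and invoke its action.
            Action action;
            if (keywordActions.TryGetValue(args.text, out action))
            {
                action();
            }
        }
    }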
Congratulations! You have now completed MR Input 211: Gesture.