Motion controllers are hardware accessories that allow users to take action in mixed reality. An advantage of motion controllers over gestures is that the controllers have a precise position in space, allowing for fine grained interaction with digital objects. For Windows Mixed Reality immersive headsets, motion controllers are the primary way that users will take action in their world.
Image: A Windows Mixed Reality motion controller
Feature | HoloLens (1st gen) | HoloLens 2 | Immersive headsets |
---|---|---|---|
Motion controllers | ❌ | ❌ | ✔️ |
Windows Mixed Reality motion controllers offer precise and responsive movement tracking in your field of view using the sensors in the immersive headset. There's no need to install hardware on the walls in your space. These motion controllers will offer the same ease of setup and portability as Windows Mixed Reality immersive headsets. Our device partners plan to market and sell these controllers on retail shelves this holiday.
Get to know your controller
Features:
You'll need:
Check for Windows, Unity, and driver updates
Motion controllers can be paired with the host PC using Windows Settings, like any other Bluetooth device.
After both controllers are successfully paired, they should appear under the “Mouse, keyboard, & pen” category in your Bluetooth settings, like the following:
Image: Motion controllers connected
If the controllers are turned off after pairing, their status will show as Paired. If the controllers remain permanently under the “Other devices” category, pairing may have only partially completed; in that case, run the pairing steps again to get the controllers functional.
Windows Mixed Reality supports two key interaction models: gaze and commit, and point and commit.
Apps that support pointing with motion controllers should also enable gaze-driven interactions where possible, to give users a choice in what input devices they use.
When using motion controllers to point and commit, your users will use the controller to target and interact by pulling its trigger. Users who pull the trigger vigorously may end up aiming the controller higher at the end of their trigger pull than they'd intended.
To manage any such recoil that may occur when users pull the trigger, your app can snap its targeting ray when the trigger's analog axis value rises above 0.0. You can then take action using that targeting ray a few frames later once the trigger value reaches 1.0, as long as the final press occurs within a short time window. When using the higher-level composite Tap gesture, Windows will manage this targeting ray capture and timeout for you.
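The capture-and-commit behavior described above can be sketched as a small state machine. This is an illustrative sketch of the timing logic only, not the Windows API; the 0.2-second commit window is an assumed value for demonstration (when you use the composite Tap gesture, Windows manages the real capture and timeout for you).

```python
# Illustrative sketch (not the actual Windows API): snapshot the targeting ray
# when the trigger's analog value first rises above 0.0, then commit with that
# captured ray once the value reaches 1.0 within a short time window.

COMMIT_WINDOW_S = 0.2  # assumed timeout for illustration


class TriggerRaySnapper:
    def __init__(self):
        self.captured_ray = None
        self.capture_time = None

    def update(self, trigger_value, current_ray, now):
        """Per-frame update. Returns the ray to act on, or None."""
        if trigger_value > 0.0 and self.captured_ray is None:
            # Trigger pull has started: snapshot the ray before recoil sets in.
            self.captured_ray = current_ray
            self.capture_time = now
        if self.captured_ray is not None and trigger_value >= 1.0:
            committed = None
            if now - self.capture_time <= COMMIT_WINDOW_S:
                committed = self.captured_ray  # press completed in time
            self.captured_ray = None
            self.capture_time = None
            return committed
        if trigger_value == 0.0:
            # Trigger released without a full press; discard the snapshot.
            self.captured_ray = None
            self.capture_time = None
        return None
```

The key design point is that the ray returned on commit is the one captured at the start of the pull, so any recoil during the press doesn't move the target.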
Windows Mixed Reality supports motion controllers in different form factors, with each controller's design differing in its relationship between the user's hand position and the natural "forward" direction that apps should use for pointing when rendering the controller.
To better represent these controllers, there are two kinds of poses you can investigate for each interaction source: the grip pose and the pointer pose.
The grip pose represents the location of either the palm of a hand detected by a HoloLens, or the palm holding a motion controller.
On immersive headsets, the grip pose is best used to render the user's hand or an object held in the user's hand, such as a sword or gun. The grip pose is also used when visualizing a motion controller, as the renderable model provided by Windows for a motion controller uses the grip pose as its origin and center of rotation.
The grip pose is defined specifically as follows:
The pointer pose represents the tip of the controller pointing forward.
The system-provided pointer pose is best used to raycast when you're rendering the controller model itself. If you're rendering some other virtual object in place of the controller, such as a virtual gun, you should point with a ray that is most natural for that virtual object, such as a ray that travels along the barrel of the app-defined gun model. Because users can see the virtual object and not the physical controller, pointing with the virtual object will likely be more natural for those using your app.
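Either pose reduces to the same geometry for raycasting: a position plus an orientation quaternion, from which you derive a forward direction. The following is a minimal, engine-agnostic sketch; the choice of +Z as the forward axis is an assumption (Unity-style left-handed coordinates) and should match your engine's convention.

```python
# Illustrative sketch: deriving a pointing ray from a pose (position plus a
# unit orientation quaternion). FORWARD is an assumed engine convention.

FORWARD = (0.0, 0.0, 1.0)  # assumed forward axis; adjust for your engine


def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = v + w*t + (q_vec x t), where t = 2 * (q_vec x v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))


def pointing_ray(pointer_position, pointer_orientation):
    """Origin and direction of the targeting ray from the pointer pose."""
    return pointer_position, rotate(pointer_orientation, FORWARD)
```

If you render a virtual object in place of the controller, you would swap in a ray defined by that object's model (for example, along a gun barrel) instead of the system pointer pose.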
Like the headsets, the Windows Mixed Reality motion controller requires no setup of external tracking sensors. Instead, the controllers are tracked by sensors in the headset itself.
If the user moves the controllers out of the headset's field of view, in most cases Windows will continue to infer controller positions and provide them to the app. When the controller has lost visual tracking for long enough, the controller's positions will drop to approximate-accuracy positions.
At this point, the system will body-lock the controller to the user, tracking the user's position as they move around, while still exposing the controller's true orientation using its internal orientation sensors. Many apps that use controllers to point at and activate UI elements can operate normally while in approximate accuracy without the user noticing.
Apps that wish to treat positions differently based on tracking state may go further and inspect properties on the controller's state, such as SourceLossRisk and PositionAccuracy:
Tracking state | SourceLossRisk | PositionAccuracy | TryGetPosition |
---|---|---|---|
High accuracy | < 1.0 | High | true |
High accuracy (at risk of losing) | == 1.0 | High | true |
Approximate accuracy | == 1.0 | Approximate | true |
No position | == 1.0 | Approximate | false |
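The mapping in the table above can be sketched as a simple classifier. The property and state names below mirror the table; the function signature itself is illustrative, not a platform API.

```python
# Illustrative sketch mapping the controller-state properties from the table
# above onto the four tracking states. The parameters stand in for the
# platform's SourceLossRisk, PositionAccuracy, and TryGetPosition results.

def tracking_state(source_loss_risk, position_accuracy, has_position):
    """Classify tracking quality per the table's rows, checked most-degraded first."""
    if not has_position:
        return "No position"
    if position_accuracy == "Approximate":
        return "Approximate accuracy"
    if source_loss_risk >= 1.0:
        return "High accuracy (at risk of losing)"
    return "High accuracy"
```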
These motion controller tracking states are defined as follows:
The core interactions across hands and motion controllers are Select, Menu, Grasp, Touchpad, Thumbstick, and Home.
Both hand gestures and motion controllers can be tracked over time to detect a common set of high-level composite gestures. This enables your app to detect high-level tap, hold, manipulation and navigation gestures, whether users end up using hands or controllers.
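As a simplified illustration of how a composite gesture layer abstracts over input devices, tap and hold can be distinguished purely by press duration, regardless of whether the press came from a hand gesture or a controller trigger. The threshold below is an assumed value; the platform defines the real one.

```python
# Illustrative sketch: distinguishing composite Tap and Hold gestures from a
# press/release signal, independent of the input device that produced it.

HOLD_THRESHOLD_S = 0.5  # assumed threshold for illustration


def classify_press(press_time, release_time):
    """Return 'tap' for a quick press, 'hold' for a sustained one."""
    duration = release_time - press_time
    return "hold" if duration >= HOLD_THRESHOLD_S else "tap"
```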
3D controller models
Windows makes available to apps a renderable model of each motion controller currently active in the system. By having your app dynamically load and articulate these system-provided controller models at runtime, you can ensure your app is forward-compatible with any future controller designs.
We recommend rendering all renderable models at the grip pose of the controller, as the origin of the model is aligned with this point in the physical world. If you're rendering controller models, you may then wish to raycast into your scene from the pointer pose, which represents the ray along which users will naturally expect to point, given that controller's physical design.
For more information about how to load controller models dynamically in Unity, see the Rendering the motion controller model in Unity section.
2D controller line art
While we recommend attaching in-app controller tips and commands to the in-app controller models themselves, some developers may want to use 2D line art representations of the motion controllers in flat "tutorial" or "how-to" UI. For those developers, we've made .png motion controller line art files available in both black and white below (right-click to save).
Full-resolution motion controller line art in white
Full-resolution motion controller line art in black
Motion controllers support pairing with a single PC. Follow instructions on motion controller setup to pair your controllers.
Motion controller firmware is part of the headset driver and will be updated automatically on connection, if necessary. Firmware updates typically take 1-2 minutes depending on Bluetooth radio and link quality. In rare cases, controller firmware updates may take up to 10 minutes, which can indicate poor Bluetooth connectivity or radio interference. See Bluetooth best practices in the Enthusiast Guide to troubleshoot connectivity issues. After a firmware update, controllers will reboot and reconnect to the host PC (you may notice the LEDs go bright for tracking). If a firmware update is interrupted (for example, the controllers lose power), it will be attempted again the next time the controllers are powered on.
In the Windows Mixed Reality home, you can turn your controller over to see its battery level on the reverse side of the virtual model. There's no physical battery level indicator.
Not for Universal Windows Applications.
See motion controller troubleshooting in the Enthusiast Guide.
Give us feedback in Feedback Hub, using the "Mixed Reality -> Input" category.