Gaze is the first form of input and reveals the user's intent and awareness. You will add contextual awareness to your cursor and holograms, taking full advantage of what your app knows about the user's gaze.
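At its core, gaze input is a raycast from the head's position along its forward vector; the cursor sits wherever that ray hits a surface. A minimal sketch of the math (plain Python for illustration, not the actual Unity/HoloLens API; the function and parameter names are hypothetical):

```python
def gaze_hit_plane(head_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with a plane; return the hit point or None.

    head_pos / gaze_dir model the head's position and forward vector.
    """
    denom = sum(d * n for d, n in zip(gaze_dir, plane_normal))
    if abs(denom) < 1e-6:
        return None  # gaze is parallel to the surface
    t = sum((p - h) * n
            for p, h, n in zip(plane_point, head_pos, plane_normal)) / denom
    if t < 0:
        return None  # surface is behind the user
    return tuple(h + t * d for h, d in zip(head_pos, gaze_dir))

# Place a cursor where the user is looking on a wall 2 m straight ahead:
hit = gaze_hit_plane((0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, -1))
print(hit)  # (0, 0, 2)
```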
Gestures turn user intention into action. With gestures, users can interact with holograms. In this course, you will learn to track the user's hands, respond to user input, and give feedback based on hand state and location.
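The "hand state" feedback above boils down to tracking pressed/released transitions over time; an air tap, for example, is a quick press followed by a release. A sketch of that logic (illustrative event names; real hand tracking comes from the platform's gesture APIs):

```python
def detect_taps(events, max_tap_duration=0.3):
    """events: list of (timestamp_seconds, 'pressed' | 'released').

    Returns the timestamps at which a tap (quick press + release) completed.
    """
    taps = []
    press_time = None
    for t, state in events:
        if state == "pressed":
            press_time = t
        elif state == "released" and press_time is not None:
            if t - press_time <= max_tap_duration:
                taps.append(t)
            press_time = None  # consume the press either way
    return taps

events = [(0.0, "pressed"), (0.1, "released"),   # quick tap
          (1.0, "pressed"), (2.0, "released")]   # held too long: not a tap
print(detect_taps(events))  # [0.1]
```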
Voice allows us to interact with our holograms in an easy and natural way. In this course, you will learn to make users aware of available voice commands, give feedback that a voice command was heard, and use dictation to understand what the user is saying.
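Conceptually, a keyword recognizer is a mapping from recognized phrases to actions, with feedback when a phrase is or isn't understood. A minimal sketch (illustrative only; a real app would use the platform's speech APIs for recognition and dictation):

```python
def handle_voice_command(recognized_text, commands):
    """Map a recognized phrase to an action, or report it wasn't understood."""
    action = commands.get(recognized_text.strip().lower())
    if action is None:
        return "Sorry, I didn't catch that."  # feedback on an unknown phrase
    return action()  # feedback that the command was heard and acted on

# Hypothetical command set for a hologram app:
commands = {
    "reset world": lambda: "Holograms reset.",
    "expand model": lambda: "Model expanded.",
}
print(handle_voice_command("Reset World", commands))  # Holograms reset.
```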
Spatial sound breathes life into holograms and gives them presence. In this course, you will learn to use spatial sound to ground holograms in the real world, give feedback during interactions, and use audio to find your holograms.
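Two of the cues that make a hologram sound "grounded" are simple to state: gain falls off with distance, and left/right balance follows the source's azimuth. A sketch of that math (illustrative only; real spatialization uses head-related transfer functions, not plain panning):

```python
import math

def spatialize(listener_pos, source_pos):
    """Return (gain, pan) for a sound source; listener assumed to face +z."""
    dx = [s - l for s, l in zip(source_pos, listener_pos)]
    dist = math.sqrt(sum(d * d for d in dx))
    gain = 1.0 / max(dist, 1.0)          # inverse-distance rolloff, capped
    azimuth = math.atan2(dx[0], dx[2])   # angle left/right of forward (+z)
    pan = math.sin(azimuth)              # -1 = full left, +1 = full right
    return gain, pan

# A hologram 2 m to the listener's right: quieter, panned hard right.
gain, pan = spatialize((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
```

The same cues work in reverse, which is why the course can use audio to help you find holograms that are out of view.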
Spatial mapping brings the real world and virtual world together. You'll explore shaders and use them to visualize your space. Then you'll learn to convert the room mesh into simple planes, give feedback on placing holograms on real-world surfaces, and explore occlusion visual effects.
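A first step in turning a room mesh into simple planes is classifying surface normals: triangles whose normals point up are floor candidates, down are ceiling, and roughly horizontal normals belong to walls. A sketch of that classification (illustrative thresholds; real plane-finding also merges and fits coplanar regions):

```python
def classify_surface(normal, up=(0.0, 1.0, 0.0), cos_tolerance=0.9):
    """Bucket a unit surface normal into floor / ceiling / wall / other."""
    dot = sum(n * u for n, u in zip(normal, up))
    if dot >= cos_tolerance:
        return "floor"      # normal points up: a surface you can place on
    if dot <= -cos_tolerance:
        return "ceiling"    # normal points down
    if abs(dot) <= 1 - cos_tolerance:
        return "wall"       # roughly vertical surface
    return "other"          # slanted surfaces fall through

print(classify_surface((0.0, 1.0, 0.0)))  # floor
print(classify_surface((1.0, 0.0, 0.0)))  # wall
```

Feedback for hologram placement then reduces to checking whether the surface under the gaze cursor is in the bucket the hologram needs.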
Our //Build 2016 project! We will walk you through a complete project in which devices share coordinate systems, creating a shared experience that lets multiple users take part in the same holographic world.
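The idea behind sharing coordinate systems is that each device knows the pose of a common anchor in its own frame, so a hologram placed by one device can be re-expressed in another's frame via the anchor. A 2-D, yaw-only sketch of those transforms (illustrative; real anchors carry a full 3-D rotation):

```python
import math

def to_anchor(point, anchor_pos, anchor_yaw):
    """Express a device-frame point relative to the shared anchor's frame."""
    dx, dz = point[0] - anchor_pos[0], point[1] - anchor_pos[1]
    c, s = math.cos(-anchor_yaw), math.sin(-anchor_yaw)
    return (c * dx - s * dz, s * dx + c * dz)

def from_anchor(point, anchor_pos, anchor_yaw):
    """Express an anchor-frame point in a device's own frame."""
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    return (anchor_pos[0] + c * point[0] - s * point[1],
            anchor_pos[1] + s * point[0] + c * point[1])

# Device A places a hologram 1 m past the anchor; device B, which sees the
# anchor at a different pose in its frame, recovers the same physical spot.
shared = to_anchor((1.0, 3.0), anchor_pos=(1.0, 2.0), anchor_yaw=0.0)
in_device_b = from_anchor(shared, anchor_pos=(5.0, 5.0), anchor_yaw=0.0)
print(in_device_b)  # (5.0, 6.0)
```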