Academy: code, tutorials, and lessons
The Mixed Reality Academy is a set of online step-by-step tutorials with corresponding project files:
- The tutorials cover 100-, 200-, and 300-level topics: 100-level covers project basics, 200-level covers core MR building blocks, and 300-level covers cloud service integration.
- Most courses cover concepts applicable to both HoloLens and immersive (VR) headsets.
- Each tutorial is organized by chapter, and most include video demonstrations of the key concepts.
- A Windows 10 PC with the correct tools installed is a common prerequisite for completing each tutorial.
We also have an in-person Mixed Reality Academy at the Reactor space in San Francisco. If you're looking for information about the physical Academy space or upcoming events, scroll to the bottom of this page.
MR Basics 100: Getting started with Unity
Create a basic mixed reality app with Unity. This project can then serve as a starting template for any MR app you might want to build in Unity.
MR Basics 101: Complete project with device
Set up a complete project, introducing core mixed reality features (gaze, gesture, voice, spatial sound, and spatial mapping) using a HoloLens device.
MR Basics 101E: Complete project with emulator
Set up a complete project, introducing core mixed reality features (gaze, gesture, voice, spatial sound, and spatial mapping) using the HoloLens emulator.
MR Input 210: Gaze
Gaze is the first form of input, and reveals the user's intent and awareness. You will add contextual awareness to your cursor and holograms, taking full advantage of what your app knows about the user's gaze.
MR Input 211: Gesture
Gestures turn user intention into action. With gestures, users can interact with holograms. In this course, you will learn to track the user's hands, respond to user input, and give feedback based on hand state and location.
MR Input 212: Voice
Voice allows us to interact with our holograms in an easy and natural way. In this course, you will learn to make users aware of available voice commands, give feedback that a voice command was heard, and use dictation to understand what the user is saying.
MR Input 213: Motion controllers
This course will explore ways of visualizing motion controllers in immersive (VR) headsets, handling input events, and attaching custom UI elements to the controllers.
MR Spatial 220: Spatial sound
Spatial sound breathes life into holograms and gives them presence. In this course, you will learn to use spatial sound to ground holograms in the surrounding world, give feedback during interactions, and use audio to find your holograms.
MR Spatial 230: Spatial mapping
Spatial mapping brings the real world and virtual world together. You'll explore shaders and use them to visualize your space. Then you'll learn to simplify the room mesh into simple planes, give feedback on placing holograms on real-world surfaces, and explore occlusion visual effects.
MR Sharing 240: Multiple HoloLens devices
Our //Build 2016 project! Set up a complete project with coordinate systems shared between HoloLens devices, allowing users to take part in a shared holographic world.
MR Sharing 250: HoloLens and immersive headsets
In our //Build 2017 project, we demonstrate building an app that leverages the unique strengths of HoloLens and immersive (VR) headsets within a shared, cross-device experience.
MR and Azure 301: Language translation
Using the Azure Translator Text API, your mixed reality app can translate speech to text in another language. Learn how in this course!
MR and Azure 302: Computer vision
Use Azure Computer Vision APIs in a mixed reality app for image processing and analysis, without training a model.
MR and Azure 302b: Custom vision
Learn how to train a machine learning model, and use the trained model for image processing and analysis.
MR and Azure 303: Natural language understanding
This course will teach you how to use the Azure Language Understanding (LUIS) service to add natural language understanding into your mixed reality app.
MR and Azure 304: Face recognition
Learn how to use the Azure Face API to perform face detection and recognition in your mixed reality app.
MR and Azure 305: Functions and storage
In this course, you will learn how to create and use Azure Functions and store data with Azure Storage, all within a mixed reality app.
MR and Azure 306: Streaming video
Learn how to use Azure Media Services to stream 360-degree video within a Windows Mixed Reality immersive (VR) experience.
MR and Azure 307: Machine learning
Leverage Azure Machine Learning Studio within your mixed reality app to deploy a large number of machine learning (ML) algorithms.
MR and Azure 308: Cross-device notifications
In this course, you'll learn how to use several Azure services to deliver push notifications and scene changes from a PC app to a mixed reality app.
MR and Azure 309: Application insights
Use the Azure Application Insights service to collect analytics on user behavior within a mixed reality app.
MR and Azure 310: Object detection
Train a machine learning model, and use the trained model to recognize similar objects and their positions in the physical world.
MR and Azure 311: Microsoft Graph
Learn how to connect to Microsoft Graph services from within a mixed reality app.
MR and Azure 312: Bot integration
Create and deploy a bot using Microsoft Bot Framework v4, and communicate with it in a mixed reality app.
MR and Azure 313: IoT Hub Service
Learn how to implement Azure IoT Hub service on a virtual machine, and visualize the data on HoloLens.
The Microsoft Reactor in San Francisco, located at 680 Folsom in SOMA, serves as the flagship location for the Mixed Reality Capture Studio and the Mixed Reality Academy. It is a place where developers and creators can begin their journey building mixed reality experiences for Microsoft HoloLens and Windows Mixed Reality headsets.