Native development overview
3D engines like Unity or Unreal aren't the only mixed reality development paths open to you. You can also create mixed reality apps by coding directly against the Windows Mixed Reality APIs with DirectX 11 or DirectX 12. By targeting the platform directly, you're essentially building your own middleware or framework.
If you have an existing WinRT project that you'd like to maintain, head over to our main WinRT documentation.
Use the following checkpoints to bring your games and applications into the world of mixed reality.
1. Getting started
Windows Mixed Reality supports two kinds of apps:
- Mixed-reality applications (UWP or Win32) that use the HolographicSpace API or OpenXR API to render an immersive view to the user that fills the headset display
- 2D apps (UWP) that use DirectX, XAML, or another framework to render 2D views on slates in the Windows Mixed Reality home
The differences between DirectX development for 2D views and immersive views primarily concern holographic rendering and spatial input. Your UWP application's IFrameworkView or your Win32 application's HWND is still required and remains largely the same, as do the WinRT APIs available to your app. However, you must use a different subset of these APIs to take advantage of holographic features. For example, the swap chain and frame presentation are managed by the system for holographic applications, to enable a pose-predicted frame loop.
| Checkpoint | Description |
| --- | --- |
| What is OpenXR? | Begin your native development journey by getting acquainted with OpenXR and what it has to offer |
| Install the latest tools | Download and install the latest native development tools |
| Set up for HoloLens 2 | Configure your device and environment for HoloLens 2 development |
| Set up for immersive headsets | Configure your device and environment for Windows Mixed Reality development |
| Try a sample app | Explore a UWP and Win32 version of the same basic OpenXR app on your device |
| Learn the OpenXR API | Watch a 60-minute walkthrough video that tours all key components of the OpenXR API in Visual Studio |
| Add the OpenXR loader | Add the OpenXR loader to an existing native project to get started developing |
2. Core building blocks
Windows Mixed Reality applications use the following APIs to build mixed-reality experiences for HoloLens and other immersive headsets:
| Feature | Description |
| --- | --- |
| Gaze | Let users target holograms by looking at them |
| Gesture | Add spatial actions to your apps |
| Holographic rendering | Draw a hologram at a precise location in the world around your users |
| Motion controller | Let your users take action in your mixed reality environments |
| Spatial mapping | Map your physical space with a virtual mesh overlay to mark the boundaries of your environment |
| Voice | Capture spoken keywords, phrases, and dictation from your users |
You can find upcoming and in-development core features in the OpenXR roadmap documentation.
3. Deploying and testing
You can develop using OpenXR on a HoloLens 2 or Windows Mixed Reality immersive headset on the desktop. If you don't have access to a headset, you can use the HoloLens 2 Emulator or the Windows Mixed Reality Simulator instead.
A developer's job is never done, especially when learning a new tool or SDK. The following sections can take you beyond the beginner-level material you've already completed, along with helpful resources if you get stuck. These topics and resources aren't in any sequential order, so feel free to jump around and explore!
If you're looking to level up your OpenXR game, check out the links below: