DirectX development overview
Windows Mixed Reality apps use the holographic rendering, gaze, gesture, motion controller, voice, and spatial mapping APIs to build mixed reality experiences for HoloLens and immersive headsets. You can create mixed reality apps with a 3D engine such as Unity, or you can code directly against the Windows Mixed Reality APIs using DirectX 11 or DirectX 12. If you target the platform directly, you'll essentially be building your own middleware or framework. The Windows APIs support apps written in both C++ and C#. If you'd like to use C#, your application can leverage the SharpDX open-source library.
Windows Mixed Reality supports two kinds of apps:
- Mixed reality apps (UWP or Win32), which use the HolographicSpace API to render an immersive view to the user that fills the headset display.
- 2D apps (UWP), which use DirectX, XAML or other frameworks to render 2D views on slates in the Windows Mixed Reality home.
The differences between DirectX development for 2D views and immersive views are primarily related to holographic rendering and spatial input. Your UWP app's IFrameworkView or your Win32 app's HWND are still required and remain largely the same, as do the WinRT APIs available to your app. However, you use a different subset of these APIs to take advantage of holographic features. For example, the swap chain is managed by the system for holographic apps, and you work with the HolographicSpace API rather than DXGI to present frames.
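To make the swap-chain difference concrete, here's a minimal sketch of one iteration of a holographic render loop using C++/WinRT. It assumes a `HolographicSpace` was already created at app startup; the scene-drawing step is elided because it depends on your renderer.

```cpp
#include <winrt/Windows.Graphics.Holographic.h>

using namespace winrt::Windows::Graphics::Holographic;

// One iteration of the render loop for a holographic app.
void RenderLoopIteration(HolographicSpace const& holographicSpace)
{
    // Instead of acquiring a DXGI back buffer, ask the system for the next frame.
    HolographicFrame frame = holographicSpace.CreateNextFrame();

    // The frame's prediction carries the camera poses to render from.
    HolographicFramePrediction prediction = frame.CurrentPrediction();

    for (HolographicCameraPose const& pose : prediction.CameraPoses())
    {
        // System-managed back buffer and viewport for this camera.
        HolographicCameraRenderingParameters parameters =
            frame.GetRenderingParameters(pose);

        // ... bind parameters.Direct3D11BackBuffer() and draw the scene here ...
    }

    // Present through the HolographicFrame rather than a DXGI swap chain.
    frame.PresentUsingCurrentPrediction();
}
```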
To begin developing immersive apps:
- For UWP apps, create a new UWP project using the templates in Visual Studio. Depending on your language (Visual C++ or Visual C#), you'll find the UWP templates under Windows Universal > Holographic.
- For Win32 apps, start from the BasicHologram Win32 sample.
The templates and sample are a great way to get the code you need to add holographic rendering support to an existing app or engine. Code and concepts are presented in a way that's familiar to any developer of real-time interactive software.
The following topics discuss the base requirements of adding Windows Mixed Reality support to DirectX-based middleware:
- Creating a holographic DirectX project: The holographic app template, coupled with the documentation, will show you the differences between the development you're used to and the special requirements introduced by a device that's designed to function while resting on your head.
- Getting a HolographicSpace: You'll first need to create a HolographicSpace, which will provide your app the sequence of HolographicFrame objects that represent each head position from which you'll render.
- Rendering in DirectX: Since a holographic swap chain has two render targets, you'll likely need to make some changes to the way your application renders.
- Coordinate systems in DirectX: Windows Mixed Reality learns and updates its understanding of the world as the user walks around, providing spatial coordinate systems that apps use to reason about the user's surroundings, including spatial anchors and the user's defined spatial stage.
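The first two steps above can be sketched together in C++/WinRT: binding a `HolographicSpace` to the app's window, then creating a stationary frame of reference whose coordinate system anchors rendering. This assumes a UWP app's `IFrameworkView::SetWindow`; a Win32 app would instead use the HWND-based interop path shown in the BasicHologram sample.

```cpp
#include <winrt/Windows.Graphics.Holographic.h>
#include <winrt/Windows.Perception.Spatial.h>
#include <winrt/Windows.UI.Core.h>

using namespace winrt::Windows::Graphics::Holographic;
using namespace winrt::Windows::Perception::Spatial;
using namespace winrt::Windows::UI::Core;

// Called from IFrameworkView::SetWindow in a UWP app.
void SetUpHolographicSpace(CoreWindow const& window)
{
    // Bind the holographic space to the app's core window.
    HolographicSpace holographicSpace =
        HolographicSpace::CreateForCoreWindow(window);

    // A stationary frame of reference supplies a coordinate system that
    // stays fixed relative to the user's surroundings.
    SpatialLocator locator = SpatialLocator::GetDefault();
    SpatialStationaryFrameOfReference referenceFrame =
        locator.CreateStationaryFrameOfReferenceAtCurrentLocation();

    // Each frame, pass referenceFrame.CoordinateSystem() to the frame's
    // camera poses when computing view and projection transforms.
}
```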
Adding mixed reality capabilities and inputs
To enable the best possible experience for users of your immersive apps, you'll want to support the following key building blocks:
- Head and eye gaze in DirectX
- Hands and motion controllers in DirectX
- Voice input in DirectX
- Spatial sound in DirectX
- Spatial mapping in DirectX
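As a hedged illustration of how the hand and motion controller building blocks surface to a DirectX app, the sketch below subscribes to spatial interaction sources with C++/WinRT. The event body is a placeholder; what you do with a press depends on your app.

```cpp
#include <winrt/Windows.UI.Input.Spatial.h>

using namespace winrt::Windows::UI::Input::Spatial;

// Hands, motion controllers, and the voice "Select" command all arrive
// as spatial interaction sources through a single manager.
void RegisterForSpatialInput()
{
    SpatialInteractionManager manager =
        SpatialInteractionManager::GetForCurrentView();

    manager.SourcePressed(
        [](auto const&, SpatialInteractionSourceEventArgs const& args)
        {
            SpatialInteractionSource source = args.State().Source();
            if (source.Kind() == SpatialInteractionSourceKind::Hand)
            {
                // An air tap occurred; handle selection here.
            }
        });
}
```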
There are other key features that many immersive apps will want to use, which are also exposed to DirectX apps:
- Shared spatial anchors in DirectX
- Locatable camera in DirectX
- Keyboard, mouse, and controller input in DirectX