DirectX development overview
Windows Mixed Reality apps are Windows 10 Universal Windows Platform (UWP) apps that use the holographic rendering, gaze, gesture, motion controller, and voice APIs. You can create immersive apps with a game engine such as Unity, or you can use the Windows Mixed Reality APIs directly with DirectX 11. Note that DirectX 12 isn't currently supported. If you target the platform directly, you'll essentially be building your own middleware or framework. The Windows APIs support apps written in both C++ and C#; if you'd like to use C#, your app can leverage the SharpDX open-source library.
Windows Mixed Reality supports two kinds of Universal Windows Platform apps: desktop apps, which appear as 2D views in the Windows Mixed Reality home, and mixed reality apps, which use the HolographicSpace API to render an immersive view to the user.
The differences between DirectX UWP development for 2D views and immersive views are primarily related to holographic rendering and spatial input. The IFrameworkView is still required and remains largely the same, as do the WinRT APIs available to your app. However, you use a different subset of these APIs to take advantage of holographic features. For example, the swap chain is managed by the system for holographic apps, and you work with the HolographicSpace API rather than DXGI to present frames.
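To make the system-managed present path concrete, here is a minimal sketch of one iteration of a holographic render loop. The method names `CreateNextFrame`, `CurrentPrediction`, and `PresentUsingCurrentPrediction` mirror the real Windows::Graphics::Holographic API, but the types below are simplified stand-ins (the actual WinRT types require the Windows SDK); treat this as an illustration of the call pattern, not a working implementation.

```cpp
#include <cassert>

// Simplified stand-ins for the real WinRT types in
// Windows::Graphics::Holographic. The method names mirror the actual API,
// but these classes exist only to illustrate the call pattern.
struct HolographicFramePrediction {
    int cameraPoseCount = 2;  // one pose per holographic camera
};

struct HolographicFrame {
    HolographicFramePrediction CurrentPrediction() const { return {}; }
    bool PresentUsingCurrentPrediction() {
        presented = true;  // the system, not DXGI, presents the frame
        return true;
    }
    bool presented = false;
};

struct HolographicSpace {
    // In the real API, you obtain a HolographicSpace with
    // HolographicSpace::CreateForCoreWindow(window).
    HolographicFrame CreateNextFrame() { return {}; }
};

// One frame: ask the system for the next HolographicFrame, render using its
// pose prediction, then present through the frame itself rather than through
// an IDXGISwapChain.
bool RenderOneFrame(HolographicSpace& space) {
    HolographicFrame frame = space.CreateNextFrame();
    auto prediction = frame.CurrentPrediction();
    // ... render the scene once per camera pose in the prediction ...
    (void)prediction;
    return frame.PresentUsingCurrentPrediction();
}
```

The key structural difference from a 2D DXGI app is that presentation flows through the frame object the system handed you, so the system can late-stage reproject the image against its latest head-pose prediction.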
To begin developing immersive apps, create a new project using the templates in Visual Studio. Depending on your language (Visual C++ or Visual C#), you'll find the templates under Windows > Universal > Holographic. This is a great way to get the code you need to add holographic rendering support to an existing app or engine. Code and concepts are presented in the template in a way that's familiar to any developer of real-time interactive software.
The following topics discuss the base requirements of adding Windows Mixed Reality support to DirectX-based middleware:
- Creating a holographic DirectX project: The holographic app template, together with this documentation, shows you the differences between what you're used to and the special requirements introduced by a head-mounted device.
- Getting a HolographicSpace: You'll first need to create a HolographicSpace, which provides your app with the sequence of HolographicFrame objects that represent each predicted head position you'll render from.
- Rendering in DirectX: Since a holographic swap chain has two render targets, you'll likely need to make some changes to the way your application renders.
- Coordinate systems in DirectX: Windows Mixed Reality learns and updates its understanding of the world as the user walks around, providing spatial coordinate systems that apps use to reason about the user's surroundings, including spatial anchors and the user's defined spatial stage.
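The rendering and coordinate-system points above interact: because the holographic swap chain has two render targets, you produce two views of the scene per frame, each positioned in an app-chosen spatial coordinate system. The sketch below derives hypothetical per-eye camera positions from a head pose by offsetting along the head's right vector by half the interpupillary distance (IPD); in a real app these view transforms come directly from the camera pose the system supplies, not from manual IPD math.

```cpp
#include <cassert>
#include <cmath>

// Minimal 3-component vector for the sketch.
struct Vec3 {
    float x, y, z;
};

Vec3 Add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 Scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Hypothetical helper: given a head position and the head's right vector
// (both expressed in an app-chosen spatial coordinate system), compute the
// left- and right-eye camera positions used to render the two stereo views.
void EyePositions(Vec3 headPos, Vec3 headRight, float ipdMeters,
                  Vec3& leftEye, Vec3& rightEye) {
    leftEye  = Add(headPos, Scale(headRight, -0.5f * ipdMeters));
    rightEye = Add(headPos, Scale(headRight,  0.5f * ipdMeters));
}
```

A renderer then draws the scene once per eye (or once with instanced stereo rendering), using each eye's view matrix against the shared coordinate system.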
Adding mixed reality capabilities and inputs
To enable the best possible experience for users of your immersive apps, you'll want to support the following key building blocks:
- Gaze, gestures, and motion controllers in DirectX
- Voice input in DirectX
- Spatial sound in DirectX
- Spatial mapping in DirectX
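At its core, gaze-based interaction means intersecting a ray, built from the user's head position and forward direction (exposed by SpatialPointerPose in the real API), with scene geometry. The following hypothetical helper, written with plain C++ math rather than the WinRT types, intersects such a gaze ray with a plane defined by a point and a normal, which is a common way to place a hologram where the user is looking:

```cpp
#include <cassert>
#include <cmath>

// Minimal 3-component vector for the sketch.
struct Vec3 { float x, y, z; };

float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 Add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 Scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Intersect the gaze ray (headPos + t * gazeDir) with a plane. Returns false
// when the user is looking parallel to the plane or away from it.
bool IntersectGazeWithPlane(Vec3 headPos, Vec3 gazeDir,
                            Vec3 planePoint, Vec3 planeNormal, Vec3& hit) {
    float denom = Dot(gazeDir, planeNormal);
    if (std::fabs(denom) < 1e-6f) return false;  // gaze parallel to plane
    float t = Dot(Add(planePoint, Scale(headPos, -1.0f)), planeNormal) / denom;
    if (t < 0.0f) return false;                  // plane is behind the user
    hit = Add(headPos, Scale(gazeDir, t));
    return true;
}
```

In a real app you would feed the head position and forward direction from the current frame's pose prediction into a routine like this, then place or highlight the hologram at the returned intersection point.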
There are other key features that many immersive apps will want to use, which are also exposed to DirectX apps:
- Shared spatial anchors in DirectX
- Locatable camera in DirectX
- Keyboard, mouse, and controller input in DirectX