Camera in Unity
When you wear a mixed reality headset, it becomes the center of your holographic world. The Unity Camera component will automatically handle stereoscopic rendering and will follow your head movement and rotation when your project has "Virtual Reality Supported" selected with "Windows Mixed Reality" as the device (in the Other Settings section of the Windows Store Player Settings). This may be listed as "Windows Holographic" in older versions of Unity.
However, to fully optimize visual quality and hologram stability, you should set the camera settings described below.
These settings need to be applied to the Camera in each scene of your app.
By default, when you create a new scene in Unity, it will contain a Main Camera GameObject in the Hierarchy which includes the Camera component, but does not have the settings below properly applied.
Holographic vs. immersive headsets
The default settings on the Unity Camera component are for traditional 3D applications which need a skybox-like background as they don't have a real world.
- When running on an immersive headset, you are rendering everything the user sees, and so you'll likely want to keep the skybox.
- However, when running on a holographic headset like HoloLens, the real world should appear behind everything the camera renders. To do this, set the camera background to be transparent (in HoloLens, black renders as transparent) instead of a Skybox texture:
- Select the Main Camera in the Hierarchy panel
- In the Inspector panel, find the Camera component and change the Clear Flags dropdown from Skybox to Solid Color
- Select the Background color picker and change the RGBA values to (0, 0, 0, 0)
You can use script code to determine at runtime whether the headset is immersive or holographic by checking HolographicSettings.IsDisplayOpaque.
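As a minimal sketch of that runtime check (assuming Unity's built-in Windows Mixed Reality XR support and the `UnityEngine.XR.WSA` namespace; the component name is hypothetical), a script attached to the Main Camera could choose the appropriate background automatically:

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;

public class CameraBackgroundSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        if (HolographicSettings.IsDisplayOpaque)
        {
            // Immersive headset: you render everything the user sees,
            // so keep the skybox background.
            cam.clearFlags = CameraClearFlags.Skybox;
        }
        else
        {
            // Transparent display such as HoloLens: black renders as
            // transparent, letting the real world show through.
            cam.clearFlags = CameraClearFlags.SolidColor;
            cam.backgroundColor = new Color(0, 0, 0, 0);
        }
    }
}
```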
Positioning the Camera
It will be easier to lay out your app if you imagine the starting position of the user as (X: 0, Y: 0, Z: 0). Since the Main Camera is tracking movement of the user's head, the starting position of the user can be set by setting the starting position of the Main Camera.
- Select the Main Camera in the Hierarchy panel
- In the Inspector panel, find the Transform component and change the Position from (X: 0, Y: 1, Z: -10) to (X: 0, Y: 0, Z: 0)
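If you would rather do this from script than in the Inspector, a sketch of the same reset might look like this (the component name is hypothetical; attach it to the Main Camera):

```csharp
using UnityEngine;

public class CameraStartingPosition : MonoBehaviour
{
    void Awake()
    {
        // Place the head-tracked camera at the origin so the user's
        // starting position is (X: 0, Y: 0, Z: 0) in scene coordinates.
        transform.position = Vector3.zero;
    }
}
```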
Rendering content too close to the user can be uncomfortable in mixed reality. You can adjust the near and far clip planes on the Camera component.
- Select the Main Camera in the Hierarchy panel
- In the Inspector panel, find the Camera component's Clipping Planes and change the Near textbox from 0.3 to 0.85. Content rendered closer than this can cause user discomfort and should be avoided per the render distance guidelines.
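The same near-plane change can also be made in code, for example in a setup component attached to the camera (a sketch; the component name is hypothetical):

```csharp
using UnityEngine;

public class CameraClipPlanes : MonoBehaviour
{
    void Awake()
    {
        // A 0.85 m near clip plane keeps content outside the range
        // that the render distance guidelines flag as uncomfortable.
        var cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.85f;
    }
}
```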
When there are multiple Camera components in the scene, Unity knows which camera to use for stereoscopic rendering and head tracking by checking which GameObject has the MainCamera tag.
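As a quick illustration, `Camera.main` resolves to the first enabled camera tagged MainCamera, and the tag can also be assigned from script (a hypothetical setup sketch):

```csharp
using UnityEngine;

public class EnsureMainCamera : MonoBehaviour
{
    void Awake()
    {
        // Unity's stereoscopic rendering and head tracking follow the
        // camera whose GameObject carries the "MainCamera" tag;
        // Camera.main returns the first enabled camera with that tag.
        gameObject.tag = "MainCamera";
    }
}
```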
Reprojection modes
Both HoloLens and immersive headsets will reproject each frame your app renders to adjust for any misprediction of the user's actual head position when photons are emitted.
- Immersive headsets will perform positional reprojection, adjusting your holograms for misprediction in both position and orientation, if the app provides a depth buffer for a given frame. If a depth buffer is not provided, the system will only correct mispredictions in orientation.
- Holographic headsets like HoloLens will perform positional reprojection whether the app provides its depth buffer or not. Positional reprojection is possible without depth buffers on HoloLens as rendering is often sparse with a stable background provided by the real world.
If you know that you are building an orientation-only experience with rigidly body-locked content (e.g. 360-degree video content), you can explicitly set the reprojection mode to be orientation only by setting HolographicSettings.ReprojectionMode to HolographicReprojectionMode.OrientationOnly.
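A sketch of setting that mode (assuming the `UnityEngine.XR.WSA` namespace, where `HolographicReprojectionMode` is nested inside `HolographicSettings`; the component name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;

public class OrientationOnlyReprojection : MonoBehaviour
{
    void Start()
    {
        // For rigidly body-locked content such as 360-degree video,
        // skip positional correction and reproject for orientation only.
        HolographicSettings.ReprojectionMode =
            HolographicSettings.HolographicReprojectionMode.OrientationOnly;
    }
}
```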
Sharing your depth buffers with Windows
Sharing your app's depth buffer to Windows each frame will give your app one of two boosts in hologram stability, based on the type of headset you're rendering for:
- Immersive headsets can perform positional reprojection when a depth buffer is provided, adjusting your holograms for misprediction in both position and orientation.
- Holographic headsets have a few different methods. HoloLens (1st gen) will automatically select a focus point when a depth buffer is provided, optimizing hologram stability along the plane that intersects the most content. HoloLens 2 will stabilize content using depth-based late-stage reprojection (Depth LSR).
To set whether your Unity app will provide a depth buffer to Windows:
- Go to Edit > Project Settings > Player > Universal Windows Platform tab > XR Settings.
- Expand the Windows Mixed Reality SDK item.
- Check or uncheck the Enable Depth Buffer Sharing check box. This will be checked by default in new projects created since this feature was added to Unity and will be unchecked by default for older projects that were upgraded.
Providing a depth buffer to Windows can improve visual quality, so long as Windows can accurately map the normalized per-pixel depth values in your depth buffer back to distances in meters, using the near and far planes you've set in Unity on the main camera. If your render passes handle depth values in typical ways, you should generally be fine, though translucent render passes that write to the depth buffer while showing through to existing color pixels can confuse the reprojection. If you know that your render passes will leave many of your final depth pixels with inaccurate depth values, you are likely to get better visual quality by unchecking Enable Depth Buffer Sharing.
Automatic Scene and Camera Setup with Mixed Reality Toolkit v2
Follow the step-by-step guide to add Mixed Reality Toolkit v2 to your Unity project and it will configure your project automatically. You can also configure the project manually without MRTK, using the camera settings described above.
Next Development Checkpoint
If you're following the Unity development checkpoint journey we've laid out, you're in the midst of exploring the MRTK core building blocks. From here, you can proceed to the next building block:
Or jump to Mixed Reality platform capabilities and APIs:
You can always go back to the Unity development checkpoints at any time.