Exercise - Azure Cloud Services for HoloLens 2

Each consecutive chapter adds new Azure Cloud services that expand the application's features and user experience, while teaching you the fundamentals of each Azure Cloud service.

Note

This module series focuses on the HoloLens 2, but due to the cross-platform nature of Unity, most of these lessons also apply to desktop and mobile applications.

Application goals

In this module series, you'll build a HoloLens 2 application that can detect objects from images and find their spatial locations. We will refer to these objects as Tracked Objects.

The user can create a Tracked Object and associate with it a set of images for computer vision, a spatial location, or both. All data must be persisted to the cloud. Furthermore, some aspects of the application can optionally be controlled by a bot with natural language assistance.
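As a rough sketch, a Tracked Object could be modeled as a simple data class like the one below. This is illustrative only; the tutorial project defines its own TrackedObject type, and all names and fields here are assumptions:

```csharp
// Hypothetical sketch of a Tracked Object data model; the actual
// tutorial project defines its own class with different members.
using System.Collections.Generic;

public class TrackedObject
{
    public string Name { get; set; }              // user-given label
    public string Description { get; set; }

    // URLs of images uploaded to cloud storage for computer vision training.
    public List<string> ImageUrls { get; } = new List<string>();

    // Azure Spatial Anchors identifier, if a spatial location was saved.
    public string SpatialAnchorId { get; set; }

    // Tag used by the image training/detection service.
    public string CustomVisionTagId { get; set; }
}
```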

Features

  • Basic management of data and images
  • Image training and detection
  • Storing a spatial location and guidance to it
  • Bot assistant to use some features via natural language

Azure Cloud Services

You'll use the following Azure Cloud services to implement the above features:

Azure Storage

You will use Azure Storage to persist data. Azure Storage allows you to store data in a table and upload large binaries like images.
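The table-plus-blob split can be sketched as follows, assuming the classic WindowsAzure.Storage .NET SDK; the tutorial's DataManager prefab wraps similar operations, but the entity and method names here are illustrative assumptions:

```csharp
// Sketch only, assuming the classic WindowsAzure.Storage .NET SDK;
// the tutorial's DataManager prefab encapsulates similar calls.
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Table entities require a PartitionKey/RowKey pair.
public class TrackedObjectEntity : TableEntity
{
    public TrackedObjectEntity() { }

    public TrackedObjectEntity(string name)
    {
        PartitionKey = "TrackedObjects"; // group all tracked objects together
        RowKey = name;                   // must be unique within the partition
    }

    public string ThumbnailBlobUrl { get; set; } // large binaries go to blob storage
}

public static class TableStorageSketch
{
    public static async Task SaveAsync(string connectionString, TrackedObjectEntity entity)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var table = account.CreateCloudTableClient().GetTableReference("trackedobjects");
        await table.CreateIfNotExistsAsync();
        await table.ExecuteAsync(TableOperation.InsertOrReplace(entity));
    }
}
```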

Azure Custom Vision

With Azure Custom Vision (part of the Azure Cognitive Services) you can associate a set of images to Tracked Objects, train a machine learning model on the set you created, and detect the Tracked Objects.

Azure Spatial Anchors

To store a Tracked Object's location and provide guided directions to find it, you'll use Azure Spatial Anchors.

Azure Bot Service

Your application will mainly be driven by traditional UI, so you'll use the Azure Bot Service to add some personality and act as a new interaction method.

Create and prepare the Unity project

In this section, you'll create a new Unity project and get it ready for MRTK development.

First, follow the steps in Initializing your project and first application (excluding the Build your application to your device instructions), which include the following:

  1. Creating the Unity project and giving it a suitable name, for example, Azure Cloud Tutorials
  2. Switching the build platform
  3. Importing the TextMeshPro Essential Resources
  4. Importing the Mixed Reality Toolkit
  5. Configuring the Unity project
  6. Creating and configuring the scene, and giving the scene a suitable name, for example, AzureCloudServices

Then follow the Changing the Spatial Awareness Display Option instructions to ensure the MRTK configuration profile for your scene is DefaultXRSDKConfigurationProfile and change the display options for the spatial awareness mesh to Occlusion.

Install in-built Unity packages

  • In the Unity menu, select Window > Package Manager to open the Package Manager window, then verify that AR Foundation version 4.1.7 is installed.

    Screenshot of Unity Package Manager with AR Foundation selected.

    Note

    You are installing the AR Foundation package because the Azure Spatial Anchors SDK, which you will import in the next section, requires it.

Importing the tutorial assets

  1. Add the Azure Spatial Anchors SDK V2.10 to your project. To add the packages, please follow this tutorial.

  2. Download and import the following Unity custom packages in the order they are listed:

    After you have imported the tutorial assets, your Project window should look similar to this:

    Screenshot of Unity Hierarchy, Scene, and Project windows after importing the tutorial assets.

Prepare the scene

In this section, you will prepare the scene by adding some of the tutorial prefabs.

  1. In the Project window, navigate to the Assets > MRTK.Tutorials.AzureCloudServices > Prefabs > Manager folder. While holding down the CTRL key, select the SceneController, RootMenu, and DataManager prefabs:

    Screenshot of Unity with SceneController, RootMenu, and DataManager prefabs selected.

  2. The SceneController (prefab) contains two scripts, SceneController (script) and UnityDispatcher (script). The SceneController script component contains several UX functions and facilitates the photo-capture functionality, while UnityDispatcher is a helper class that allows actions to be executed on the Unity main thread.

    The RootMenu (prefab) is the primary UI prefab; it holds all the UI windows, which are connected to each other through various small script components and control the general UX flow of the application.

    The DataManager (prefab) is responsible for communicating with Azure Storage, and will be explained further in the next tutorial.

  3. Now with the three prefabs still selected, drag them into the Hierarchy window to add them to the scene:

    Screenshot of Unity with newly added SceneController, RootMenu and DataManager prefabs still selected.

  4. To focus on the objects in the scene, you can double-click the RootMenu object, then zoom out slightly:

    Screenshot of Unity with RootMenu object selected.

    Tip

    If you find the large icons in your scene (for example, the large framed 'T' icons) distracting, you can hide them by toggling Gizmos off.
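The UnityDispatcher mentioned above typically follows a simple queue-draining pattern: background threads (for example, Azure SDK callbacks) enqueue actions, and the Unity main thread runs them in Update. A minimal sketch of the idea, assuming a MonoBehaviour placed once in the scene; this is not the tutorial's exact implementation:

```csharp
// Illustrative sketch of a main-thread dispatcher pattern; the
// tutorial's UnityDispatcher may differ in detail.
using System;
using System.Collections.Concurrent;
using UnityEngine;

public class MainThreadDispatcherSketch : MonoBehaviour
{
    private static readonly ConcurrentQueue<Action> Actions = new ConcurrentQueue<Action>();

    // Safe to call from any thread (e.g. a cloud service callback).
    public static void Enqueue(Action action) => Actions.Enqueue(action);

    // Update runs on the Unity main thread, so queued actions can
    // safely touch GameObjects, components, and UI here.
    private void Update()
    {
        while (Actions.TryDequeue(out var action))
        {
            action();
        }
    }
}
```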

Configuring the scene

In this section, you will connect the SceneController, DataManager, and RootMenu objects to have a working scene for the next Integrating Azure storage tutorial.

Connect the objects

  1. In the Hierarchy window, select the DataManager object:

    Screenshot of Unity with DataManager object selected.

  2. In the Inspector window, locate the DataManager (Script) component and you will see an empty slot on the On Data Manager Ready () event. Drag the SceneController object from the Hierarchy window to the On Data Manager Ready () event.

    Screenshot of Unity with DataManager event listener added.

  3. The event's dropdown menu is now active. Select the dropdown menu, navigate to SceneController, and select the Init () option in the submenu:

    Screenshot of Unity with DataManager event action added.

  4. From the Hierarchy window, select the SceneController object. You will find the SceneController (script) component in the Inspector.

    Screenshot of Unity with SceneController selected.

  5. There are now several unpopulated fields; let's change that. Drag the DataManager object from the Hierarchy window into the Data Manager field, then drag the RootMenu object from the Hierarchy window into the Main Menu field.

    Screenshot of Unity with SceneController configured.

  6. Now your scene is ready for the upcoming tutorials. Don't forget to save it to your project.
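The Inspector wiring above amounts to registering a UnityEvent listener; in code, it could be sketched as follows. The stub class names and the OnDataManagerReady/Init members are assumptions based on the steps above, not the tutorial's actual scripts:

```csharp
// Sketch of what the Inspector wiring is equivalent to in code.
// DataManagerStub and SceneControllerStub are hypothetical stand-ins
// for the tutorial's DataManager and SceneController scripts.
using UnityEngine;
using UnityEngine.Events;

public class DataManagerStub : MonoBehaviour
{
    // Corresponds to the On Data Manager Ready () event in the Inspector.
    public UnityEvent OnDataManagerReady = new UnityEvent();
}

public class SceneControllerStub : MonoBehaviour
{
    public void Init() => Debug.Log("Scene initialized");
}

public class SceneWiringSketch : MonoBehaviour
{
    [SerializeField] private DataManagerStub dataManager;
    [SerializeField] private SceneControllerStub sceneController;

    private void Awake()
    {
        // Equivalent of dragging SceneController onto the event slot
        // and selecting SceneController > Init () in the dropdown.
        dataManager.OnDataManagerReady.AddListener(sceneController.Init);
    }
}
```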

Prepare project build pipeline

Note

Building and testing on HoloLens 2 is not mandatory. You can test on the HoloLens 2 Emulator if you don't have a HoloLens device. You can purchase devices at HoloLens.com.

While the project still has to be filled with content, there is some preparation to do before the project is ready to build for the HoloLens 2.

1. Add additional required capabilities

  1. In the Unity menu, select Edit > Project Settings... to open the Project Settings window:

    Screenshot of Unity open Project Settings.

  2. In the Project Settings window, select Player and then Publishing Settings:

    Screenshot of Unity Publishing Settings.

  3. In Publishing Settings, scroll down to the Capabilities section and double-check that the InternetClient, Microphone, and SpatialPerception capabilities (which you enabled when you created the project at the beginning of the tutorial) are enabled. Then, enable the InternetClientServer, PrivateNetworkClientServer, and Webcam capabilities:

    Screenshot of Unity Capabilities.
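For reference, these settings end up as capability declarations in the Package.appxmanifest of the UWP project that Unity generates. A fragment similar to the following is produced (namespace prefixes such as uap2 may vary with the manifest version):

```xml
<!-- Package.appxmanifest fragment corresponding to the capabilities above;
     generated by Unity in the built UWP project, shown here for reference. -->
<Capabilities>
  <Capability Name="internetClient" />
  <Capability Name="internetClientServer" />
  <Capability Name="privateNetworkClientServer" />
  <uap2:Capability Name="spatialPerception" />
  <DeviceCapability Name="microphone" />
  <DeviceCapability Name="webcam" />
</Capabilities>
```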

2. Deploy the app to your HoloLens 2

You won't be able to run all the features that you will use in this tutorial series inside the Unity editor. Therefore, you need to be familiar with deploying the application to your HoloLens 2 device.

Tip

For a reminder on how to build and deploy your Unity project to HoloLens 2, you can refer to the Getting started tutorials - Build your application to your device instructions.

3. Run the app on your HoloLens 2 and follow the in-app instructions

Caution

All Azure services use the internet, so make sure your device is connected to the internet.

When the application is running on your device, grant access to the following requested capabilities:

  • Microphone
  • Camera

These capabilities are required for services like Chat Bot and Custom Vision to function properly.