Writing a Holographic Remoting remote app using the HolographicSpace API

Important

This document describes the creation of a remote application for HoloLens 2 using the HolographicSpace API. Remote applications for HoloLens (1st gen) must use NuGet package version 1.x.x. This implies that remote applications written for HoloLens 2 are not compatible with HoloLens 1 and vice versa. The documentation for HoloLens 1 can be found here.

Holographic Remoting apps can stream remotely rendered content to HoloLens 2 and Windows Mixed Reality immersive headsets. You can also access more system resources and integrate remote immersive views into existing desktop PC software. A remote app receives an input data stream from HoloLens 2, renders content in a virtual immersive view, and streams content frames back to HoloLens 2. The connection is made using standard Wi-Fi. Holographic Remoting is added to a desktop or UWP app via a NuGet package. Additional code is required to handle the connection and to render in an immersive view. A typical remoting connection has latency as low as 50 ms. The player app can report the latency in real time.

All code on this page and working projects can be found in the Holographic Remoting samples GitHub repository.

Prerequisites

A good starting point is a working DirectX-based desktop or UWP app that targets the HolographicSpace API. For details, see the DirectX development overview. The C++ holographic project template is a good starting point.

Important

Any app using Holographic Remoting should be authored to use a multi-threaded apartment. The use of a single-threaded apartment is supported but leads to sub-optimal performance and possibly stuttering during playback. When using C++/WinRT, winrt::init_apartment defaults to a multi-threaded apartment.
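For illustration, a minimal sketch of making the multi-threaded apartment explicit in a desktop app's entry point (the wWinMain signature shown is just the usual desktop entry point and not taken from the sample):

// Initialize COM/WinRT with a multi-threaded apartment before any
// Holographic Remoting or Windows Mixed Reality API is used.
#include <windows.h>
#include <winrt/base.h>

int __stdcall wWinMain(HINSTANCE, HINSTANCE, PWSTR, int)
{
    // Explicit, but equivalent to the default of winrt::init_apartment().
    winrt::init_apartment(winrt::apartment_type::multi_threaded);

    // ... create the remote context, holographic space, and run the app ...
    return 0;
}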

Get the Holographic Remoting NuGet package

The following steps are required to add the NuGet package to a project in Visual Studio.

  1. Open the project in Visual Studio.
  2. Right-click the project node and select Manage NuGet Packages...
  3. In the panel that appears, select Browse and then search for "Holographic Remoting".
  4. Select Microsoft.Holographic.Remoting, ensure you pick the latest 2.x.x version, and select Install.
  5. If the Preview dialog appears, select OK.
  6. Select I Accept when the license agreement dialog pops up.

Note

Version 1.x.x of the NuGet package is still available for developers who want to target HoloLens 1. For details, see Add Holographic Remoting (HoloLens (1st gen)).

Create the remote context

As a first step, the application should create a remote context.

// class declaration
#include <winrt/Microsoft.Holographic.AppRemoting.h>

...

private:
    // RemoteContext used to connect with a Holographic Remoting player and display rendered frames
    winrt::Microsoft::Holographic::AppRemoting::RemoteContext m_remoteContext = nullptr;
// class implementation
#include <HolographicAppRemoting\Streamer.h>

...

CreateRemoteContext(m_remoteContext, 20000, false, PreferredVideoCodec::Default);

Warning

Holographic Remoting works by replacing the Windows Mixed Reality runtime that is part of Windows with a remoting-specific runtime. This happens during the creation of the remote context. For that reason, any call to any Windows Mixed Reality API before creating the remote context can result in unexpected behavior. The recommended approach is to create the remote context as early as possible, before interacting with any Mixed Reality API. Never mix objects created or retrieved through any Windows Mixed Reality API before the call to CreateRemoteContext with objects created or retrieved afterwards.

Next, the holographic space needs to be created. Specifying a CoreWindow isn't required; desktop apps that don't have a CoreWindow can just pass nullptr.

m_holographicSpace = winrt::Windows::Graphics::Holographic::HolographicSpace::CreateForCoreWindow(nullptr);

Connect to the device

When the remote app is ready to render content, a connection to the player device can be established.

The connection can be made in one of two ways.

  1. The remote app connects to the player running on the device.
  2. The player running on the device connects to the remote app.

To establish a connection from the remote app to the player device, call the Connect method on the remote context, specifying the hostname and port. The port used by the Holographic Remoting Player is 8265.

try
{
    m_remoteContext.Connect(m_hostname, m_port);
}
catch(winrt::hresult_error& e)
{
    DebugLog(L"Connect failed with hr = 0x%08X", e.code());
}

Important

As with any C++/WinRT API, Connect might throw a winrt::hresult_error, which needs to be handled.

Tip

To avoid using the C++/WinRT language projection, you can include the file build\native\include\<windows sdk version>\abi\Microsoft.Holographic.AppRemoting.h located inside the Holographic Remoting NuGet package. It contains declarations of the underlying COM interfaces. The use of C++/WinRT is recommended, though.

Listening for incoming connections on the remote app can be done by calling the Listen method. Both the handshake port and transport port can be specified during this call. The handshake port is used for the initial handshake; the data is then sent over the transport port. By default, 8265 and 8266 are used.

try
{
    m_remoteContext.Listen(L"0.0.0.0", m_port, m_port + 1);
}
catch(winrt::hresult_error& e)
{
    DebugLog(L"Listen failed with hr = 0x%08X", e.code());
}

Important

The build\native\include\HolographicAppRemoting\Microsoft.Holographic.AppRemoting.idl inside the NuGet package contains detailed documentation for the API exposed by Holographic Remoting.

Handling Remoting-specific events

The remote context exposes three events that are important for monitoring the state of a connection.

  1. OnConnected: Triggered when a connection to the device has been successfully established.
winrt::weak_ref<winrt::Microsoft::Holographic::AppRemoting::IRemoteContext> remoteContextWeakRef = m_remoteContext;

m_onConnectedEventRevoker = m_remoteContext.OnConnected(winrt::auto_revoke, [this, remoteContextWeakRef]() {
    if (auto remoteContext = remoteContextWeakRef.get())
    {
        // Update UI state
    }
});
  2. OnDisconnected: Triggered if an established connection is closed or a connection couldn't be established.
m_onDisconnectedEventRevoker =
    m_remoteContext.OnDisconnected(winrt::auto_revoke, [this, remoteContextWeakRef](ConnectionFailureReason failureReason) {
        if (auto remoteContext = remoteContextWeakRef.get())
        {
            DebugLog(L"Disconnected with reason %d", failureReason);
            // Update UI

            // Reconnect if this is a transient failure.
            if (failureReason == ConnectionFailureReason::HandshakeUnreachable ||
                failureReason == ConnectionFailureReason::TransportUnreachable ||
                failureReason == ConnectionFailureReason::ConnectionLost)
            {
                DebugLog(L"Reconnecting...");

                ConnectOrListen();
            }
            // Failure reason None indicates a normal disconnect.
            else if (failureReason != ConnectionFailureReason::None)
            {
                DebugLog(L"Disconnected with unrecoverable error, not attempting to reconnect.");
            }
        }
    });
  3. OnListening: Triggered when listening for incoming connections starts.
m_onListeningEventRevoker = m_remoteContext.OnListening(winrt::auto_revoke, [this, remoteContextWeakRef]() {
    if (auto remoteContext = remoteContextWeakRef.get())
    {
        // Update UI state
    }
});

Additionally, the connection state can be queried using the ConnectionState property on the remote context.

auto connectionState = m_remoteContext.ConnectionState();
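For illustration, the state can be used to guard a reconnect attempt. A minimal sketch, assuming the ConnectionState enumeration exposes Disconnected, Connecting, and Connected values (verify against the Microsoft.Holographic.AppRemoting.idl mentioned above):

// Only start a new connection attempt if we are currently disconnected.
if (connectionState == winrt::Microsoft::Holographic::AppRemoting::ConnectionState::Disconnected)
{
    ConnectOrListen();
}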

Handling speech events

Using the remote speech interface, it's possible to register speech triggers with HoloLens 2 and have them remoted to the remote application.

This additional member is required to track the state of the remote speech:

winrt::Microsoft::Holographic::AppRemoting::IRemoteSpeech::OnRecognizedSpeech_revoker m_onRecognizedSpeechRevoker;

First, the remote speech interface needs to be retrieved.

if (auto remoteSpeech = m_remoteContext.GetRemoteSpeech())
{
    InitializeSpeechAsync(remoteSpeech, m_onRecognizedSpeechRevoker, weak_from_this());
}

Using an asynchronous helper method, you can then initialize the remote speech. This should be done asynchronously because initializing might take a considerable amount of time. Concurrency and asynchronous operations with C++/WinRT explains how to author asynchronous functions with C++/WinRT.

winrt::Windows::Foundation::IAsyncOperation<winrt::Windows::Storage::StorageFile> LoadGrammarFileAsync()
{
    const wchar_t* speechGrammarFile = L"SpeechGrammar.xml";
    auto rootFolder = winrt::Windows::ApplicationModel::Package::Current().InstalledLocation();
    return rootFolder.GetFileAsync(speechGrammarFile);
}

winrt::fire_and_forget InitializeSpeechAsync(
    winrt::Microsoft::Holographic::AppRemoting::IRemoteSpeech remoteSpeech,
    winrt::Microsoft::Holographic::AppRemoting::IRemoteSpeech::OnRecognizedSpeech_revoker& onRecognizedSpeechRevoker,
    std::weak_ptr<SampleRemoteMain> sampleRemoteMainWeak)
{
    onRecognizedSpeechRevoker = remoteSpeech.OnRecognizedSpeech(
        winrt::auto_revoke, [sampleRemoteMainWeak](const winrt::Microsoft::Holographic::AppRemoting::RecognizedSpeech& recognizedSpeech) {
            if (auto sampleRemoteMain = sampleRemoteMainWeak.lock())
            {
                sampleRemoteMain->OnRecognizedSpeech(recognizedSpeech.RecognizedText);
            }
        });

    auto grammarFile = co_await LoadGrammarFileAsync();

    std::vector<winrt::hstring> dictionary;
    dictionary.push_back(L"Red");
    dictionary.push_back(L"Blue");
    dictionary.push_back(L"Green");
    dictionary.push_back(L"Default");
    dictionary.push_back(L"Aquamarine");

    remoteSpeech.ApplyParameters(L"", grammarFile, dictionary);
}

There are two ways of specifying phrases to be recognized.

  1. Specification inside a speech grammar XML file. See How to create a basic XML Grammar for details.
  2. Specification by passing the phrases inside the dictionary vector to ApplyParameters.

Inside the OnRecognizedSpeech callback, the speech events can then be processed:

void SampleRemoteMain::OnRecognizedSpeech(const winrt::hstring& recognizedText)
{
    bool changedColor = false;
    DirectX::XMFLOAT4 color = {1, 1, 1, 1};

    if (recognizedText == L"Red")
    {
        color = {1, 0, 0, 1};
        changedColor = true;
    }
    else if (recognizedText == L"Blue")
    {
        color = {0, 0, 1, 1};
        changedColor = true;
    }
    else if (recognizedText == L"Green")
    {
        ...
    }

    ...
}

Preview streamed content locally

To display the same content in the remote app that is sent to the device, the OnSendFrame event of the remote context can be used. The OnSendFrame event is triggered every time the Holographic Remoting library sends the current frame to the remote device. This is the ideal time to take the content and blit it into the desktop or UWP window.

#include <windows.graphics.directx.direct3d11.interop.h>

...

m_onSendFrameEventRevoker = m_remoteContext.OnSendFrame(
    winrt::auto_revoke, [this](const winrt::Windows::Graphics::DirectX::Direct3D11::IDirect3DSurface& texture) {
        winrt::com_ptr<ID3D11Texture2D> texturePtr;
        {
            winrt::com_ptr<ID3D11Resource> resource;
            winrt::com_ptr<::IInspectable> inspectable = texture.as<::IInspectable>();
            winrt::com_ptr<Windows::Graphics::DirectX::Direct3D11::IDirect3DDxgiInterfaceAccess> dxgiInterfaceAccess;
            winrt::check_hresult(inspectable->QueryInterface(__uuidof(dxgiInterfaceAccess), dxgiInterfaceAccess.put_void()));
            winrt::check_hresult(dxgiInterfaceAccess->GetInterface(__uuidof(resource), resource.put_void()));
            resource.as(texturePtr);
        }

        // Copy / blit texturePtr into the back buffer here.
    });
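For reference, the blit itself can be a plain Direct3D 11 resource copy. A minimal sketch, assuming the app owns a DXGI swap chain (m_swapChain is a hypothetical member here) whose back buffer matches the streamed texture in size and format:

// Inside the OnSendFrame handler, after texturePtr has been obtained:
winrt::com_ptr<ID3D11Texture2D> backBuffer;
winrt::check_hresult(m_swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), backBuffer.put_void()));

m_deviceResources->UseD3DDeviceContext([&](ID3D11DeviceContext3* context) {
    // CopyResource requires identical dimensions and a compatible format;
    // otherwise use CopySubresourceRegion or a shader-based blit.
    context->CopyResource(backBuffer.get(), texturePtr.get());
});

m_swapChain->Present(1, 0);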

Depth Reprojection

Starting with version 2.1.0, Holographic Remoting supports Depth Reprojection. This requires both the color buffer and the depth buffer to be streamed from the remote application to the HoloLens 2. By default, depth buffer streaming is enabled and configured to use half the resolution of the color buffer. This can be changed as follows:

// class implementation
#include <HolographicAppRemoting\Streamer.h>

...

CreateRemoteContext(m_remoteContext, 20000, false, PreferredVideoCodec::Default);

// Configure for half-resolution depth.
m_remoteContext.ConfigureDepthVideoStream(DepthBufferStreamResolution::Half_Resolution);

Note that if the default values shouldn't be used, ConfigureDepthVideoStream must be called before establishing a connection to the HoloLens 2. The best place is right after you have created the remote context. Possible values for DepthBufferStreamResolution are:

  • Full_Resolution
  • Half_Resolution
  • Quarter_Resolution
  • Disabled (added with version 2.1.3; if used, no additional depth video stream is created)

Keep in mind that using a full-resolution depth buffer also affects bandwidth requirements and needs to be accounted for in the maximum bandwidth value you provide to CreateRemoteContext.
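As an illustration, a full-resolution depth stream can be paired with a larger bandwidth budget when creating the remote context. The 40000 value below is purely illustrative, not a recommendation from the samples:

// Request a larger maximum bandwidth via the second parameter (20000 in the
// earlier examples) to leave room for the full-resolution depth video stream.
CreateRemoteContext(m_remoteContext, 40000, false, PreferredVideoCodec::Default);

// Stream the depth buffer at the full resolution of the color buffer.
m_remoteContext.ConfigureDepthVideoStream(DepthBufferStreamResolution::Full_Resolution);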

Besides configuring the resolution, you also have to commit a depth buffer via HolographicCameraRenderingParameters.CommitDirect3D11DepthBuffer.


void SampleRemoteMain::Render(HolographicFrame holographicFrame)
{
    ...

    m_deviceResources->UseHolographicCameraResources([this, holographicFrame](auto& cameraResourceMap) {
        
        ...

        for (auto cameraPose : prediction.CameraPoses())
        {
            DXHelper::CameraResources* pCameraResources = cameraResourceMap[cameraPose.HolographicCamera().Id()].get();

            ...

            m_deviceResources->UseD3DDeviceContext([&](ID3D11DeviceContext3* context) {
                
                ...

                // Commit depth buffer if available and enabled.
                if (m_canCommitDirect3D11DepthBuffer && m_commitDirect3D11DepthBuffer)
                {
                    auto interopSurface = pCameraResources->GetDepthStencilTextureInteropObject();
                    HolographicCameraRenderingParameters renderingParameters = holographicFrame.GetRenderingParameters(cameraPose);
                    renderingParameters.CommitDirect3D11DepthBuffer(interopSurface);
                }
            });
        }
    });
}

To verify that depth reprojection is working correctly on HoloLens 2, you can enable a depth visualizer via the Device Portal. See Verifying Depth is Set Correctly for details.

Optional: Custom data channels

Custom data channels can be used to send user data over the already established remoting connection. See custom data channels for more information.
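As a rough orientation only, a minimal sketch of setting up such a channel; the names and signatures used here (OnDataChannelCreated, CreateDataChannel, DataChannelPriority, SendData) are recalled from the remoting samples and should be verified against the custom data channels documentation and the IDL file mentioned above:

// Register for channel creation before connecting; the player side creates the
// counterpart channel once CreateDataChannel has been called.
m_onDataChannelCreatedEventRevoker = m_remoteContext.OnDataChannelCreated(
    winrt::auto_revoke, [this](const IDataChannel& dataChannel, uint8_t channelId) {
        // Keep a reference to the channel and subscribe to its OnDataReceived /
        // OnClosed events here.
        m_customDataChannel = dataChannel;
    });

// After the connection has been established (for example inside the OnConnected
// handler), create a channel with a user-defined id.
m_remoteContext.CreateDataChannel(0, DataChannelPriority::Low);

// Later, send a small user-defined payload with guaranteed delivery.
uint8_t data[] = {1};
m_customDataChannel.SendData(data, true);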

See Also