Gaze Input

Gaze input in mixed reality apps is all about finding out what your users are looking at. When the eye tracking cameras on your device match up with rays in Unreal's world space, your user's line of sight data becomes available. Gaze can be used in both blueprints and C++, and is a core feature for mechanics like object interaction, wayfinding, and camera controls.

Enabling eye tracking

  • In Project Settings > HoloLens, enable the Gaze Input capability:

Screenshot of the HoloLens project settings capabilities with Gaze Input highlighted

  • Create a new actor and add it to your scene

Note

HoloLens eye tracking in Unreal only has a single gaze ray for both eyes. Stereoscopic tracking, which requires two rays, isn't supported.

Using eye tracking

First, check that your device supports eye tracking with the IsEyeTrackerConnected function. If the function returns true, call GetGazeData to find where the user's eyes are looking in the current frame:

Blueprint of the Is Eye Tracker Connected function

Note

The fixation point and the confidence value aren't available on HoloLens.

Use the gaze origin and direction in a line trace to find out exactly where your users are looking. The gaze value is a vector that starts at the gaze origin and ends at the origin plus the gaze direction multiplied by the line trace distance:

Blueprint of the Get Gaze Data function
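
The same origin-plus-direction calculation in C++ is a short sketch like the one below; TraceDistance is a hypothetical value you'd pick to suit your scene, and the complete, runnable version of this logic appears in the Using C++ section later on:

// Sketch: compute the two endpoints of the gaze line trace from the gaze data.
// TraceDistance is a hypothetical value; use whatever range fits your scene.
FEyeTrackerGazeData GazeData;
if (UEyeTrackerFunctionLibrary::GetGazeData(GazeData))
{
    const float TraceDistance = 1000.0f;
    const FVector Start = GazeData.GazeOrigin;
    const FVector End = GazeData.GazeOrigin + GazeData.GazeDirection * TraceDistance;
    // Start and End can now be passed to a line trace.
}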

Getting head orientation

You can also use the rotation of the Head Mounted Display (HMD) to represent the direction of the user's head. You can get the user's head direction without enabling the Gaze Input capability, but you won't get any eye tracking information. Add a reference to the blueprint as the world context to get the correct output data:

Note

Getting HMD Data is only available in Unreal 4.26 and later.

Blueprint of the Get HMDData function
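
If you'd rather read the head orientation in C++, one option (a sketch, not an exact equivalent of the Get HMDData node above) is UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition; the LogHeadOrientation helper name is hypothetical, and the call assumes the HeadMountedDisplay module is listed in your build.cs dependencies:

// Sketch: read the HMD rotation in C++ and treat its forward vector as the head direction.
// Assumes "HeadMountedDisplay" has been added to PublicDependencyModuleNames in build.cs.
#include "HeadMountedDisplayFunctionLibrary.h"

void AEyeTracker::LogHeadOrientation() // hypothetical helper on the EyeTracker actor
{
    FRotator DeviceRotation;
    FVector DevicePosition;
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(DeviceRotation, DevicePosition);

    // The rotation's forward vector approximates the direction the user's head is facing.
    const FVector HeadDirection = DeviceRotation.Vector();
    UE_LOG(LogTemp, Log, TEXT("Head direction: %s"), *HeadDirection.ToString());
}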

Using C++

  • In your game's build.cs file, add EyeTracker to the PublicDependencyModuleNames list:
PublicDependencyModuleNames.AddRange(
    new string[] {
        "Core",
        "CoreUObject",
        "Engine",
        "InputCore",
        "EyeTracker"
});
  • In File > New C++ Class, create a new C++ actor called EyeTracker
    • A Visual Studio solution opens with the new EyeTracker class. Build and run to open the Unreal game with the new EyeTracker actor. Search for "EyeTracker" in the Place Actors window and drag and drop the class into the game window to add it to the project:

Screenshot of the Place Actors window open in the editor

  • In EyeTracker.cpp, add includes for EyeTrackerFunctionLibrary and DrawDebugHelpers:
#include "EyeTrackerFunctionLibrary.h"
#include "DrawDebugHelpers.h"

Check that your device supports eye tracking with UEyeTrackerFunctionLibrary::IsEyeTrackerConnected before trying to get any gaze data. If eye tracking is supported, find the start and end of a ray for a line trace from UEyeTrackerFunctionLibrary::GetGazeData. From there, you can construct a gaze vector and pass it to LineTraceSingleByChannel to debug any ray hit results:

void AEyeTracker::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // Only query gaze data when an eye tracker is connected.
    if (UEyeTrackerFunctionLibrary::IsEyeTrackerConnected())
    {
        FEyeTrackerGazeData GazeData;
        if (UEyeTrackerFunctionLibrary::GetGazeData(GazeData))
        {
            // Build a line trace from the gaze origin along the gaze direction.
            FVector Start = GazeData.GazeOrigin;
            FVector End = GazeData.GazeOrigin + GazeData.GazeDirection * 100;

            FHitResult HitResult;
            if (GWorld->LineTraceSingleByChannel(HitResult, Start, End, ECollisionChannel::ECC_Visibility))
            {
                // Draw a small coordinate system at the hit location for debugging.
                DrawDebugCoordinateSystem(GWorld, HitResult.Location, FQuat::Identity.Rotator(), 10);
            }
        }
    }
}
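
For reference, the matching EyeTracker.h is the standard actor header the C++ Class Wizard generates; the sketch below is trimmed to the parts used here, and the wizard also inserts a module-specific _API macro before the class name:

// EyeTracker.h - sketch of the actor header generated by the C++ Class Wizard.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "EyeTracker.generated.h"

UCLASS()
class AEyeTracker : public AActor // the wizard prefixes this with a <ProjectName>_API macro
{
    GENERATED_BODY()

public:
    AEyeTracker();

    // Called every frame; the gaze line trace shown above runs here.
    virtual void Tick(float DeltaTime) override;

protected:
    virtual void BeginPlay() override;
};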

Next Development Checkpoint

If you're following the Unreal development journey we've laid out, you're in the midst of exploring the MRTK core building blocks. From here, you can continue to the next building block:

Or jump to Mixed Reality platform capabilities and APIs:

You can always go back to the Unreal development checkpoints at any time.

See also