Hand tracking in Unreal

The hand tracking system uses a person's palms and fingers as input. Data on the position and rotation of every finger, the entire palm, and hand gestures is available. Starting in Unreal 4.26, hand tracking is based on the Unreal HeadMountedDisplay plugin and uses a common API across all XR platforms and devices. Functionality is the same for both Windows Mixed Reality and OpenXR systems.

Hand pose

Hand pose lets you track and use your users' hands and fingers as input, which can be accessed in both Blueprints and C++. The Unreal API sends the data as a coordinate system, with ticks synchronized with the Unreal Engine.

Basic hand skeleton structure

The hierarchy is described by the EHandKeypoint enum:

Image of hand keypoint blueprint options
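For reference, here's an abbreviated sketch of what the enum looks like; the full definition lives in the Unreal HeadMountedDisplay module and continues through every joint of every finger:

enum class EHandKeypoint : uint8
{
    Palm,
    Wrist,
    ThumbMetacarpal,
    ThumbProximal,
    ThumbDistal,
    ThumbTip,
    IndexMetacarpal,
    IndexProximal,
    IndexIntermediate,
    IndexDistal,
    IndexTip,
    // ...Middle, Ring, and Little fingers follow the same pattern, ending with LittleTip
};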

You can get all of this data from a user's hands using the Get Motion Controller Data function. That function returns an XRMotionControllerData structure. Below is a sample Blueprint script that parses the XRMotionControllerData structure to get hand joint locations and draws a debug coordinate system at each joint's location.

Blueprint of get gaze data function connected to line trace by channel function
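The same query is available in C++. Below is a minimal sketch of the equivalent logic (AMyHandActor and DrawHandJoints are placeholder names) that checks the data is a valid hand before touching the arrays, then draws a debug coordinate system at every joint:

#include "HeadMountedDisplayFunctionLibrary.h"
#include "DrawDebugHelpers.h"

void AMyHandActor::DrawHandJoints()
{
    FXRMotionControllerData ControllerData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
        GetWorld(), EControllerHand::Left, ControllerData);

    // Only touch the arrays if the data is valid and actually describes a hand.
    if (!ControllerData.bValid || ControllerData.DeviceVisualType != EXRVisualType::Hand)
    {
        return;
    }

    for (int32 i = 0; i < ControllerData.HandKeyPositions.Num(); ++i)
    {
        DrawDebugCoordinateSystem(
            GetWorld(),
            ControllerData.HandKeyPositions[i],
            ControllerData.HandKeyRotations[i].Rotator(),
            2.0f /*Scale*/);
    }
}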

It's important to check that the structure is valid and that it describes a hand. Otherwise, you may get undefined behavior when accessing the position, rotation, and radii arrays.

Hand poses are exposed to Animation using the Live Link plugin.

If the Windows Mixed Reality and Live Link plugins are enabled:

  1. Select Window > Live Link to open the Live Link editor window.
  2. Select Source and enable Windows Mixed Reality Hand Tracking Source.

Live Link Source

After you enable the source and open an animation asset, expand the Animation section in the Preview Scene tab to see additional options.

Live Link Animation

The hand animation hierarchy is the same as in EWMRHandKeypoint. Animation can be retargeted using WindowsMixedRealityHandTrackingLiveLinkRemapAsset:

Live Link Animation 2

It can also be subclassed in the editor:

Live Link Remap

Hand Mesh

Hand Mesh as a Tracked Geometry

Important

Getting hand meshes as a tracked geometry in OpenXR requires you to call Set Use Hand Mesh with Enabled Tracking Geometry.

To enable that mode, you should call Set Use Hand Mesh with Enabled Tracking Geometry:

Blueprint of event begin play connected to set use hand mesh function with enabled tracking geometry mode
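The equivalent C++ call goes through the plugin's function library. A minimal sketch, assuming the Microsoft OpenXR plugin is in your module's dependencies and exposes UMicrosoftOpenXRFunctionLibrary::SetUseHandMesh with an EHandMeshStatus parameter (AMyHandActor is a placeholder name):

#include "MicrosoftOpenXRFunctionLibrary.h"

void AMyHandActor::BeginPlay()
{
    Super::BeginPlay();

    // Ask the runtime to deliver hand meshes as tracked geometry.
    UMicrosoftOpenXRFunctionLibrary::SetUseHandMesh(EHandMeshStatus::EnabledTrackingGeometry);
}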

Note

It's not possible for both modes to be enabled at the same time. If you enable one, the other is automatically disabled.

Accessing Hand Mesh Data

Hand mesh

Before you can access hand mesh data, you'll need to:

  • Select your ARSessionConfig asset, expand the AR Settings -> World Mapping settings, and check Generate Mesh Data from Tracked Geometry.

Below are the default mesh parameters:

  1. Use Mesh Data for Occlusion
  2. Generate Collision for Mesh Data
  3. Generate Nav Mesh for Mesh Data
  4. Render Mesh Data in Wireframe (debug parameter that shows the generated mesh)

These parameter values are used as the spatial mapping mesh and hand mesh defaults. You can change them at any time in Blueprints or code for any mesh.
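As a quick sanity check, here's a hedged C++ sketch of reading these defaults back from the active session config; the Should... accessor names are assumptions based on ARSessionConfig.h:

#include "ARBlueprintLibrary.h"
#include "ARSessionConfig.h"

void LogMeshDataDefaults()
{
    const UARSessionConfig& Config = UARBlueprintLibrary::GetSessionConfig();

    // Log the four mesh-data defaults listed above.
    UE_LOG(LogTemp, Log, TEXT("Occlusion: %d Collision: %d NavMesh: %d Wireframe: %d"),
        Config.ShouldUseMeshDataForOcclusion(),
        Config.ShouldGenerateCollisionForMeshData(),
        Config.ShouldGenerateNavMeshForMeshData(),
        Config.ShouldRenderMeshDataInWireframe());
}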

C++ API Reference

Use EARObjectClassification to find hand mesh values in all trackable objects.

enum class EARObjectClassification : uint8
{
    // Other types
    HandMesh,
};

The following delegates are called when the system detects any trackable object, including a hand mesh.

class FARSupportInterface
{
    public:
    // Other params
    DECLARE_AR_SI_DELEGATE_FUNCS(OnTrackableAdded)
    DECLARE_AR_SI_DELEGATE_FUNCS(OnTrackableUpdated)
    DECLARE_AR_SI_DELEGATE_FUNCS(OnTrackableRemoved)
};

Make sure your delegate handlers follow the function signature below:

void UARHandMeshComponent::OnTrackableAdded(UARTrackedGeometry* Added)

You can access mesh data through UARTrackedGeometry::GetUnderlyingMesh:

UMRMeshComponent* UARTrackedGeometry::GetUnderlyingMesh()
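Putting these pieces together, a handler can filter for hand meshes and grab the underlying mesh component. A minimal sketch, using the standard UARTrackedGeometry::GetObjectClassification check:

void UARHandMeshComponent::OnTrackableAdded(UARTrackedGeometry* Added)
{
    // Ignore planes, feature points, and other non-hand trackables.
    if (!Added || Added->GetObjectClassification() != EARObjectClassification::HandMesh)
    {
        return;
    }

    if (UMRMeshComponent* Mesh = Added->GetUnderlyingMesh())
    {
        // Work with the hand mesh here, for example assign a custom material.
    }
}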

Blueprint API Reference

To work with Hand Meshes in Blueprints:

  1. Add an ARTrackableNotify Component to a Blueprint actor

ARTrackable Notify

  2. Go to the Details panel and expand the Events section.

ARTrackable Notify 2

  3. Overwrite On Add/Update/Remove Tracked Geometry with the following nodes in your Event Graph:

ARTrackable Notify 3

Hand Mesh visualization in OpenXR

The recommended way to visualize hand meshes is to use Epic's XRVisualization plugin together with the Microsoft OpenXR plugin.

Then in the Blueprint editor, you should use the Set Use Hand Mesh function from the Microsoft OpenXR plugin with Enabled XRVisualization as a parameter:

Blueprint of event begin play connected to set use hand mesh function with enabled XRVisualization mode

To manage the rendering process, you should use Render Motion Controller from XRVisualization:

Blueprint of get motion controller data function connected to render motion controller function
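In C++ the per-frame flow looks roughly like the sketch below; the RenderMotionController signature (controller data plus a left/right flag) is an assumption about the XRVisualization function library, and AMyHandActor is a placeholder name:

#include "HeadMountedDisplayFunctionLibrary.h"
#include "XRVisualizationFunctionLibrary.h"

void AMyHandActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    FXRMotionControllerData ControllerData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
        GetWorld(), EControllerHand::Right, ControllerData);

    if (ControllerData.bValid)
    {
        // Hand the fresh hand data to XRVisualization for rendering this frame.
        UXRVisualizationFunctionLibrary::RenderMotionController(ControllerData, /*bRight=*/true);
    }
}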

The result:

Image of digital hands overlaid on a real human hand

If you need anything more complicated, such as drawing a hand mesh with a custom shader, you need to get the meshes as a tracked geometry.

Hand rays

Getting hand pose works for close interactions like grabbing objects or pressing buttons. However, sometimes you need to work with holograms that are far away from your users. This can be accomplished with hand rays, which can be used as pointing devices in both C++ and Blueprints. You can draw a ray from your hand to a far point and, with some help from Unreal ray tracing, select a hologram that would otherwise be out of reach.

Important

Since all function results change every frame, they're all made callable. For more information about pure and impure or callable functions, see the Blueprint user guide on functions.

To get the data for the hand rays, you should use the Get Motion Controller Data function from the previous section. The returned structure contains two parameters you can use to create a hand ray: Aim Position and Aim Rotation. These parameters form a ray directed by your elbow. You should take them and find the hologram being pointed at.
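In C++, the aim transform can feed a standard line trace. A minimal sketch (the 500 cm ray length and the visibility channel are arbitrary choices for illustration, and AMyHandActor is a placeholder name):

void AMyHandActor::TraceHandRay()
{
    FXRMotionControllerData ControllerData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
        GetWorld(), EControllerHand::Right, ControllerData);

    if (!ControllerData.bValid)
    {
        return;
    }

    // Build a ray from the aim pose and trace it into the scene.
    const FVector Start = ControllerData.AimPosition;
    const FVector End = Start + ControllerData.AimRotation.GetForwardVector() * 500.0f;

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        // Hit.GetActor() is the hologram the hand ray is pointing at.
    }
}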

Below is an example of determining whether a hand ray hits a Widget and setting a custom hit result:

Blueprint of get motion controller data function

Gestures

The HoloLens 2 tracks spatial gestures, which means you can capture those gestures as input. Gesture tracking is based on a subscription model. You should use the "Configure Gestures" function to tell the device which gestures you want to track. You can find more details about gestures in the HoloLens 2 Basic Usage document.

Windows Mixed Reality

Blueprint of event begin play connected to configure gestures function

Then you should add code to subscribe to the following events:

Blueprint of Windows spatial input hold, tap, and left manipulation gestures
Screenshot of Windows spatial input tap gesture options in the Details panel

OpenXR

In OpenXR, gesture events are tracked through the input pipeline. Using hand interaction, the device can automatically recognize Tap and Hold gestures, but not the others. They're named the OpenXRMsftHandInteraction Select and Grip mappings. You don't need to enable a subscription; you should declare the events in Project Settings/Engine/Input, just like this:

Screenshot of OpenXR action mappings
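Once declared, these mappings bind like any other Unreal input action. A minimal sketch, assuming you created an Action Mapping named "HandSelect" for the OpenXRMsftHandInteraction Select key (the mapping name and AMyPawn are placeholders):

void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // "HandSelect" is whatever name you gave the mapping in
    // Project Settings > Engine > Input.
    PlayerInputComponent->BindAction("HandSelect", IE_Pressed, this, &AMyPawn::OnHandSelectPressed);
}

void AMyPawn::OnHandSelectPressed()
{
    // React to the pinch / air-tap gesture here.
}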

Next Development Checkpoint

If you're following the Unreal development journey we've laid out, you're in the midst of exploring the MRTK core building blocks. From here, you can continue to the next building block:

Or jump to Mixed Reality platform capabilities and APIs:

You can always go back to the Unreal development checkpoints at any time.