HoloLens Photo/Video Camera in Unreal

Overview

The HoloLens has a Photo/Video (PV) Camera that is used for Mixed Reality Capture (MRC) and can also be used by an app to access real-world visuals.

Important

The PV Camera isn't supported with Holographic Remoting, but it's possible to use a webcam attached to your PC to simulate the HoloLens PV Camera functionality.

Render from the PV Camera for MRC

Note

This requires Unreal Engine 4.25 or newer.

The system, as well as custom MRC recorders, creates mixed reality captures by combining the PV Camera feed with the holograms rendered by the immersive app.

By default, mixed reality capture uses the right eye's holographic output. If an immersive app chooses to render from the PV Camera, that output is used instead, which improves the mapping between the real world and the holograms in the MRC video.

To opt in to rendering from the PV Camera:

  1. Call SetEnabledMixedRealityCamera and ResizeMixedRealityCamera (a hedged C++ sketch follows the screenshot below).
    • Use the Size X and Size Y values to set the video dimensions.

Third-person camera
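
The Blueprint nodes above can also be called from C++. Below is a minimal sketch, assuming SetEnabledMixedRealityCamera and ResizeMixedRealityCamera map to static functions on a Windows Mixed Reality function library; the header path, class name, and exact signatures are assumptions to verify against the plugin in your engine version.

```cpp
// Minimal sketch: opting in to rendering from the PV Camera for MRC.
// ASSUMPTION: the Blueprint nodes are exposed by UWindowsMixedRealityFunctionLibrary;
// check the header and class names in the Windows Mixed Reality plugin you're using.
#include "WindowsMixedRealityFunctionLibrary.h"

void OptInToPVCameraRendering()
{
    // Ask MRC to render from the PV Camera's perspective.
    UWindowsMixedRealityFunctionLibrary::SetEnabledMixedRealityCamera(true);

    // Size X / Size Y: the dimensions of the MRC video. 1920x1080 is only an example value.
    UWindowsMixedRealityFunctionLibrary::ResizeMixedRealityCamera(FIntPoint(1920, 1080));
}
```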

Unreal will then handle requests from MRC to render from the PV Camera's perspective.

Note

Only when Mixed Reality Capture is triggered will the app be asked to render from the photo/video camera's perspective.

Using the PV Camera

The webcam texture can be retrieved in the game at runtime, but it needs to be enabled in the editor's Edit > Project Settings:

  1. Go to Platforms > HoloLens > Capabilities and check Webcam.
    • Use the StartCameraCapture function to use the webcam at runtime and the StopCameraCapture function to stop recording (see the sketch after the screenshot below).

Camera start/stop
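
For reference, here's what the start/stop calls might look like from C++, again assuming the StartCameraCapture and StopCameraCapture nodes are exposed as static functions on the same Windows Mixed Reality function library; the class name, header, and parameterless signatures are assumptions.

```cpp
// Minimal sketch: starting and stopping the webcam feed at runtime.
// ASSUMPTION: class name, header, and signatures; verify against your plugin source.
#include "WindowsMixedRealityFunctionLibrary.h"

void BeginWebcamFeed()
{
    // Requires the Webcam capability checked under Platforms > HoloLens > Capabilities.
    UWindowsMixedRealityFunctionLibrary::StartCameraCapture();
}

void EndWebcamFeed()
{
    UWindowsMixedRealityFunctionLibrary::StopCameraCapture();
}
```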

Rendering an image

To render the camera image:

  1. Create a dynamic material instance based on a material in the project, named PVCamMat in the screenshot below.
  2. Store the dynamic material instance in a Material Instance Dynamic Object Reference variable.
  3. Set the material of the object in the scene that will render the camera feed to this new dynamic material instance.
    • Start a timer that will be used to bind the camera image to the material (see the sketch after the screenshot below).

Camera render
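
The same setup can be sketched in C++. The actor class APVCamActor and its members (PVCamMaterial, MeshComp, PVCamMID, MaterialTimerHandle) are hypothetical names introduced for illustration; only the engine calls (UMaterialInstanceDynamic::Create, SetMaterial, SetTimer) are standard Unreal API.

```cpp
// Minimal sketch of steps 1-3: create the dynamic material instance, apply it to a
// mesh, and start the timer that will bind the camera image to the material.
// APVCamActor and its member names are hypothetical.
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "TimerManager.h"
#include "PVCamActor.generated.h"

UCLASS()
class APVCamActor : public AActor
{
    GENERATED_BODY()

public:
    // The project material to instance (PVCamMat in the screenshots).
    UPROPERTY(EditAnywhere)
    UMaterialInterface* PVCamMaterial = nullptr;

    // The scene object that will render the camera feed.
    UPROPERTY(EditAnywhere)
    UStaticMeshComponent* MeshComp = nullptr;

protected:
    // Step 2: the Material Instance Dynamic reference variable.
    UPROPERTY()
    UMaterialInstanceDynamic* PVCamMID = nullptr;

    FTimerHandle MaterialTimerHandle;

    virtual void BeginPlay() override;
    void MaterialTimer();
};

void APVCamActor::BeginPlay()
{
    Super::BeginPlay();

    // Step 1: create the dynamic material instance and (step 2) store the reference.
    PVCamMID = UMaterialInstanceDynamic::Create(PVCamMaterial, this);

    // Step 3: set the object's material to the new dynamic material instance.
    MeshComp->SetMaterial(0, PVCamMID);

    // Start the timer that binds the camera image to the material (MaterialTimer, below).
    GetWorldTimerManager().SetTimer(MaterialTimerHandle, this, &APVCamActor::MaterialTimer, 0.1f, false);
}
```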

  4. Create a new function for this timer, in this case MaterialTimer, and call GetARCameraImage to get the texture from the webcam.
  5. If the texture is valid, set a texture parameter in the shader to the image. Otherwise, start the material timer again (see the sketch after the screenshot below).

Camera texture from the webcam
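
Continuing the hypothetical APVCamActor sketch, steps 4 and 5 might look like the following. GetARCameraImage is the node named above; the class it's called on, its header, and its return type here are assumptions, so check the node's definition in your plugin source.

```cpp
// Continuation of the APVCamActor sketch: steps 4-5.
// ASSUMPTION: GetARCameraImage's owning class, header, and UTexture* return type.
#include "WindowsMixedRealityFunctionLibrary.h"

void APVCamActor::MaterialTimer()
{
    // Step 4: fetch the current texture from the webcam.
    UTexture* CameraImage = UWindowsMixedRealityFunctionLibrary::GetARCameraImage();

    if (CameraImage != nullptr)
    {
        // Step 5: bind the image to the material. The parameter name must match a
        // texture parameter in the material that's bound to a color entry
        // ("PVCamTexture" is an illustrative name -- see step 6 below).
        PVCamMID->SetTextureParameterValue(TEXT("PVCamTexture"), CameraImage);
    }
    else
    {
        // No image yet: restart the material timer and try again.
        GetWorldTimerManager().SetTimer(MaterialTimerHandle, this, &APVCamActor::MaterialTimer, 0.1f, false);
    }
}
```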

  6. Make sure the material has a parameter matching the name used in SetTextureParameterValue and that it's bound to a color entry. Without this, the camera image can't be displayed properly.

Camera texture

Next Development Checkpoint

If you're following the Unreal development checkpoint journey we've laid out, you're in the midst of exploring the Mixed Reality platform capabilities and APIs. From here, you can proceed to the next topic:

Or jump directly to deploying your app on a device or emulator:

You can go back to the Unreal development checkpoints at any time.

See also