D3D12_SHADER_MIN_PRECISION_SUPPORT Enumeration

Describes minimum precision support options for shaders in the current graphics driver.

Syntax

typedef enum D3D12_SHADER_MIN_PRECISION_SUPPORT {
  D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE = 0,
  D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT = 0x1,
  D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT = 0x2
} D3D12_SHADER_MIN_PRECISION_SUPPORT;

Constants

D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE (0)
The driver supports only full 32-bit precision for all shader stages.

D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT (0x1)
The driver supports 10-bit minimum precision.

D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT (0x2)
The driver supports 16-bit minimum precision.

Remarks

This enum is used by the MinPrecisionSupport member of the D3D12_FEATURE_DATA_D3D12_OPTIONS structure.

The returned value indicates only that the graphics hardware can perform HLSL operations at a precision lower than the standard 32-bit float precision; it does not guarantee that the hardware will actually run at a lower precision. Because the constants are distinct bit flags, a driver can report support for both 10-bit and 16-bit minimum precision by combining them with a bitwise OR.
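For example, you can retrieve these flags by calling ID3D12Device::CheckFeatureSupport with D3D12_FEATURE_D3D12_OPTIONS and examining the MinPrecisionSupport member. The following is a minimal sketch; the helper name ReportMinPrecisionSupport and the device pointer it takes are illustrative, not part of the API.

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Illustrative helper: reports which minimum precision levels the driver
// supports, given an already-created ID3D12Device.
void ReportMinPrecisionSupport(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
    {
        // MinPrecisionSupport is a bitwise OR of
        // D3D12_SHADER_MIN_PRECISION_SUPPORT values, so test each flag.
        if (options.MinPrecisionSupport & D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT)
            std::printf("Driver supports 10-bit minimum precision.\n");
        if (options.MinPrecisionSupport & D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT)
            std::printf("Driver supports 16-bit minimum precision.\n");
        if (options.MinPrecisionSupport == D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE)
            std::printf("Driver supports only full 32-bit precision.\n");
    }
}

Note that even when a flag is reported, using lower precision remains an optimization hint to the driver; shaders that request minimum precision must still produce results correct at 32-bit precision.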

Requirements

Header: d3d12.h

See Also

Core Enumerations