D3D12_SHADER_MIN_PRECISION_SUPPORT enumeration (d3d12.h)
Describes minimum precision support options for shaders in the current graphics driver.
Syntax
typedef enum D3D12_SHADER_MIN_PRECISION_SUPPORT {
D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE = 0,
D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT = 0x1,
D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT = 0x2
} D3D12_SHADER_MIN_PRECISION_SUPPORT;
Constants
Constant | Value | Description
---|---|---
D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE | 0 | The driver supports only full 32-bit precision for all shader stages.
D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT | 0x1 | The driver supports 10-bit minimum precision.
D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT | 0x2 | The driver supports 16-bit minimum precision.
Remarks
This enum is used by the MinPrecisionSupport member of the D3D12_FEATURE_DATA_D3D12_OPTIONS structure.
The returned value indicates only that the graphics hardware can perform HLSL operations at a lower precision than the standard 32-bit float precision; it does not guarantee that the hardware will actually run at that lower precision.
Requirements
Requirement | Value
---|---
Header | d3d12.h
See also
D3D12_FEATURE_DATA_D3D12_OPTIONS