Describes minimum precision support options for shaders in the current graphics driver.
Syntax
```cpp
typedef enum D3D12_SHADER_MIN_PRECISION_SUPPORT {
  D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE = 0,
  D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT = 0x1,
  D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT = 0x2
} D3D12_SHADER_MIN_PRECISION_SUPPORT;
```
Constants
Constant | Value | Description
---|---|---
D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE | 0 | The driver supports only full 32-bit precision for all shader stages.
D3D12_SHADER_MIN_PRECISION_SUPPORT_10_BIT | 0x1 | The driver supports 10-bit precision.
D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT | 0x2 | The driver supports 16-bit precision.
Remarks
This enum is used by the D3D12_FEATURE_DATA_D3D12_OPTIONS structure.
The returned value indicates only that the graphics hardware can perform HLSL operations at a precision lower than the standard 32-bit float precision; it doesn't guarantee that the hardware will actually run at that lower precision.
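The driver's reported support can be retrieved through ID3D12Device::CheckFeatureSupport, which fills in the MinPrecisionSupport member of D3D12_FEATURE_DATA_D3D12_OPTIONS. The following is a minimal sketch; the `device` pointer and the `QueryMinPrecisionSupport` helper name are illustrative, not part of the API.

```cpp
#include <d3d12.h>

// Sketch: query the driver's minimum-precision support.
// Assumes `device` is a valid ID3D12Device* created elsewhere.
D3D12_SHADER_MIN_PRECISION_SUPPORT QueryMinPrecisionSupport(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
    {
        // On failure, assume only full 32-bit precision is available.
        return D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE;
    }
    return options.MinPrecisionSupport;
}
```

The individual bits can then be tested with a bitwise AND, for example `(QueryMinPrecisionSupport(device) & D3D12_SHADER_MIN_PRECISION_SUPPORT_16_BIT) != 0` to check for 16-bit support.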
Requirements
Requirement | Value
---|---
Header | d3d12.h