Indicates the precision of the history buffer data used by the display miniport driver.


  UINT32 PrecisionBits;



The number of valid bits that are used in each time stamp. This number doesn't include bits used for junk values.
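The structure declaration itself isn't reproduced in this topic. Based on the single member listed, a minimal stand-in sketch might look like the following (the typedef name is taken from the OutputDataSize expression in the Remarks section; the authoritative declaration is in D3dkmddi.h, and UINT32 is represented here by the standard uint32_t):

```c
#include <stdint.h>

/* Stand-in sketch only; see D3dkmddi.h for the authoritative declaration. */
typedef struct _DXGKARG_HISTORYBUFFERPRECISION {
    uint32_t PrecisionBits;  /* Number of valid bits in each time stamp */
} DXGKARG_HISTORYBUFFERPRECISION;
```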

This precision value has three valid ranges:

Value 0: No bits contain useful data, and the DirectX graphics kernel subsystem will call the DxgkDdiFormatHistoryBuffer function to provide valid data to output to the Event Tracing for Windows (ETW) facility. When the driver processes this call, it sets a new precision value as the output parameter of the function.

Value 32: The driver should log 32-bit time stamps, using the full 32 bits of precision.

Values 33–64: The driver should log 64-bit time stamps. The value defines the number of bits that hold valid data in each time stamp. To reduce the cost of formatting the data, the driver can include junk values in the 64-bit time stamps. For example, the driver could write 64-bit time stamps with 55 valid bits of precision; in that case the upper 9 bits are considered junk values and are stripped off by ETW.

Values from 1 through 31 are unsupported and invalid.
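The junk-value stripping described above (for example, 55 valid bits with the upper 9 bits discarded) amounts to masking off the high bits. The helper below is a hypothetical illustration, not part of the DDI; it mirrors what ETW effectively does on the consumer side:

```c
#include <stdint.h>

/* Hypothetical helper: keep only the low precision_bits bits of a
   64-bit time stamp, discarding the junk bits above them. */
static uint64_t strip_junk_bits(uint64_t timestamp, uint32_t precision_bits)
{
    if (precision_bits >= 64)
        return timestamp;  /* all 64 bits are valid; nothing to strip */
    return timestamp & ((1ull << precision_bits) - 1);
}
```

With a precision of 55, a raw value of 0xFFFFFFFFFFFFFFFF masks down to 0x007FFFFFFFFFFFFF.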

If the hardware supports 64-bit time stamps but only 32 of those bits are usable, the driver must ensure that the data is presented correctly to the DirectX graphics kernel subsystem. If the driver has no other way to present the data, it should supply the precision value the next time the DxgkDdiFormatHistoryBuffer function is called.


In a call to the DxgkDdiQueryAdapterInfo function, the output data size, DXGKARG_QUERYADAPTERINFO.OutputDataSize, is:

sizeof(DXGKARG_HISTORYBUFFERPRECISION) * m_DriverCaps.GpuEngineTopology.NbAsymetricProcessingNodes
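That size relationship, one entry per asymmetric processing node, can be sketched as follows. The stand-in type and the helper name are illustrative only (they are not part of the DDI); in a real driver the buffer and node count come from the DXGKARG_QUERYADAPTERINFO arguments:

```c
#include <stddef.h>
#include <stdint.h>

/* Stand-in for the structure this topic describes. */
typedef struct {
    uint32_t PrecisionBits;
} DXGKARG_HISTORYBUFFERPRECISION;

/* Hypothetical helper: write one precision entry per processing node and
   return the total number of bytes written, matching
   sizeof(DXGKARG_HISTORYBUFFERPRECISION) * node_count. */
static size_t fill_history_buffer_precision(
    DXGKARG_HISTORYBUFFERPRECISION *out,
    uint32_t node_count,
    uint32_t precision_bits)
{
    for (uint32_t i = 0; i < node_count; ++i)
        out[i].PrecisionBits = precision_bits;
    return sizeof(DXGKARG_HISTORYBUFFERPRECISION) * node_count;
}
```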


Minimum supported client: Windows 8.1 (WDDM 1.3 and later)
Minimum supported server: Windows Server 2012 R2
Header: d3dkmddi.h (include D3dkmddi.h)

See also

DxgkDdiFormatHistoryBuffer

DxgkDdiQueryAdapterInfo