IReferenceClockTimerControl::SetDefaultTimerResolution method (strmif.h)
The SetDefaultTimerResolution method sets the minimum timer resolution.
Syntax
HRESULT SetDefaultTimerResolution(
[in] REFERENCE_TIME timerResolution
);
Parameters
[in] timerResolution
Minimum timer resolution, in 100-nanosecond units (1 millisecond = 10,000 units). If the value is zero, the reference clock cancels its previous request.
Return value
Returns an HRESULT value. Possible values include the following.
| Return code | Description |
|---|---|
| S_OK | Success. |
Remarks
The reference clock attempts to set the period of the timer to timerResolution. The actual period of the timer might differ, depending on the hardware. To find the minimum and maximum timer resolution, call the timeGetDevCaps function. The reference clock sets the timer resolution by calling timeBeginPeriod. If timerResolution is 0, the method cancels the previous timer request by calling timeEndPeriod. (When the reference clock is destroyed, it automatically cancels any previous request.)
If this method is never called, the reference clock sets the timer resolution to 1 millisecond. For the best power-management behavior, call this method with a value of zero, which overrides the clock's default 1-millisecond setting. If any filter in the graph requires a higher timer resolution, it can call timeBeginPeriod individually. Typically, only renderers require a particular timer resolution.
Requirements
| Minimum supported client | Windows Vista [desktop apps only] |
| Minimum supported server | Windows Server 2008 [desktop apps only] |
| Target Platform | Windows |
| Header | strmif.h (include Dshow.h) |
| Library | Strmiids.lib |