The GetInputMaxLatency method retrieves the maximum latency on a specified input stream.
```cpp
HRESULT GetInputMaxLatency(
  DWORD          dwInputStreamIndex,
  REFERENCE_TIME *prtMaxLatency
);
```
*dwInputStreamIndex*

Zero-based index of an input stream on the DMO.

*prtMaxLatency*

Pointer to a variable that receives the maximum latency, in 100-nanosecond units.
Returns an HRESULT value. Possible values include those in the following table.
| Return code | Description |
|---|---|
| DMO_E_INVALIDSTREAMINDEX | Invalid stream index. |
| E_NOTIMPL | Not implemented. Assume zero latency. |
The latency is the difference between a time stamp on the input stream and the corresponding time stamp on the output stream. The maximum latency is the largest possible difference in the time stamps. To determine the maximum latency for a DMO, do the following:
- Process input buffers until the DMO can produce output.
- Process as many output buffers as possible.
- The maximum latency is the largest delta between input time stamps and output time stamps, taken as an absolute value; a sketch of this computation follows the list.
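As a concrete illustration, the following sketch computes that largest delta from time stamps collected during such a measurement run. The function name and the assumption that input and output buffers pair one-to-one are illustrative, not part of the DMO API:

```cpp
#include <dmo.h>      // REFERENCE_TIME (LONGLONG, 100-ns units)
#include <vector>

// Illustrative helper: given time stamps from corresponding input and
// output buffers (assumed to pair one-to-one), return the largest
// absolute difference, which is the maximum latency.
REFERENCE_TIME ComputeMaxLatency(const std::vector<REFERENCE_TIME>& rtIn,
                                 const std::vector<REFERENCE_TIME>& rtOut)
{
    REFERENCE_TIME rtMax = 0;
    const size_t n = rtIn.size() < rtOut.size() ? rtIn.size() : rtOut.size();
    for (size_t i = 0; i < n; ++i)
    {
        REFERENCE_TIME delta = rtOut[i] - rtIn[i];
        if (delta < 0)
            delta = -delta;   // take the absolute value
        if (delta > rtMax)
            rtMax = delta;    // keep the largest delta seen
    }
    return rtMax;
}
```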
For the special case where a DMO processes exactly one sample at a time, the maximum latency is simply the difference in time stamps.
Latency is defined only when samples have time stamps, and the time stamps increase or decrease monotonically. Maximum latency might depend on the media types for the input and output streams.
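For callers, retrieving the reported value is a single call. The following is a minimal sketch; it assumes pDMO is a valid IMediaObject pointer that the caller has already created, and the function and variable names are illustrative:

```cpp
#include <dmo.h>     // IMediaObject
#include <cstdio>

// Query the maximum latency on input stream 0 and report the result.
void PrintInputMaxLatency(IMediaObject* pDMO)
{
    REFERENCE_TIME rtMaxLatency = 0;
    HRESULT hr = pDMO->GetInputMaxLatency(0, &rtMaxLatency);
    if (SUCCEEDED(hr))
    {
        // REFERENCE_TIME is expressed in 100-nanosecond units.
        printf("Maximum latency: %lld (100-ns units)\n",
               static_cast<long long>(rtMaxLatency));
    }
    else if (hr == E_NOTIMPL)
    {
        // The DMO does not implement the method; assume zero latency.
        printf("Latency not reported; assuming zero.\n");
    }
    else
    {
        // For example, DMO_E_INVALIDSTREAMINDEX for a bad stream index.
        printf("GetInputMaxLatency failed: 0x%08lX\n",
               static_cast<unsigned long>(hr));
    }
}
```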
| Requirement | Value |
|---|---|
| Header | mediaobj.h (include Dmo.h) |