The GetGlyphIndices function translates a string into an array of glyph indices. The function can be used to determine whether a glyph exists in a font.
```c
DWORD GetGlyphIndicesA( HDC hdc, LPCSTR lpstr, int c, LPWORD pgi, DWORD fl );
```
*hdc*: A handle to the device context.

*lpstr*: A pointer to the string to be converted.

*c*: Specifies both the length of the string pointed to by lpstr and the size (in WORDs) of the buffer pointed to by pgi.

*pgi*: A pointer to a buffer of c WORDs. On successful return, it contains an array of glyph indices corresponding to the characters in the string.
*fl*: Specifies how glyphs should be handled if they are not supported. This parameter can be the following value.

| Value | Meaning |
|---|---|
| GGI_MARK_NONEXISTING_GLYPHS | Marks unsupported glyphs with the hexadecimal value 0xffff. |
If the function succeeds, it returns the number of bytes (for the ANSI function) or WORDs (for the Unicode function) converted.
If the function fails, the return value is GDI_ERROR.
This function attempts to identify a single-glyph representation for each character in the string pointed to by lpstr. While this is useful for certain low-level purposes (such as manipulating font files), higher-level applications that need to map a string to glyphs should typically use the Uniscribe functions.
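As a sketch of the glyph-existence check described above, the helper below (a hypothetical function, not part of the API) passes GGI_MARK_NONEXISTING_GLYPHS so that unsupported characters come back as 0xffff, then scans the resulting index array. It assumes an HDC with the font of interest already selected; this is untested illustrative code, not a definitive implementation.

```c
#include <windows.h>
#include <stdlib.h>

/* Hypothetical helper: returns TRUE if every character in text has a
 * glyph in the font currently selected into hdc, FALSE otherwise
 * (including on allocation or GDI failure). */
BOOL FontSupportsString(HDC hdc, LPCWSTR text, int len)
{
    BOOL supported = TRUE;

    /* The index buffer must hold one WORD per character (dimension c). */
    LPWORD indices = (LPWORD)malloc((size_t)len * sizeof(WORD));
    if (indices == NULL)
        return FALSE;

    /* GGI_MARK_NONEXISTING_GLYPHS marks missing glyphs with 0xffff. */
    if (GetGlyphIndicesW(hdc, text, len, indices,
                         GGI_MARK_NONEXISTING_GLYPHS) == GDI_ERROR) {
        free(indices);
        return FALSE;
    }

    for (int i = 0; i < len; i++) {
        if (indices[i] == 0xffff) {
            supported = FALSE;
            break;
        }
    }

    free(indices);
    return supported;
}
```

The Unicode variant (GetGlyphIndicesW) is used here to avoid ANSI code-page ambiguity; with the ANSI variant, characters outside the current code page could fail for reasons unrelated to the font.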
| Requirement | Value |
|---|---|
| Minimum supported client | Windows 2000 Professional [desktop apps only] |
| Minimum supported server | Windows 2000 Server [desktop apps only] |
| Header | wingdi.h (include Windows.h) |