UIDLGLOGFONT.lfQuality Field

Definition

Specifies the output quality.

C++/CLI
public: System::Byte lfQuality;

C++/CX
public: byte lfQuality;

C++/WinRT
byte lfQuality;

C#
[System.Runtime.InteropServices.ComAliasName("Microsoft.VisualStudio.TextManager.Interop.BYTE")]
public byte lfQuality;

F#
[<System.Runtime.InteropServices.ComAliasName("Microsoft.VisualStudio.TextManager.Interop.BYTE")>]
val mutable lfQuality : byte

VB
Public lfQuality As Byte
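
As an illustration, here is a minimal C# sketch of populating this field on a UIDLGLOGFONT value. It assumes a project reference to Microsoft.VisualStudio.TextManager.Interop; the literal 2 is the GDI PROOF_QUALITY value from wingdi.h, and any other members of the structure would be filled in the same way as the corresponding LOGFONT fields.

using Microsoft.VisualStudio.TextManager.Interop;

class DialogFontExample
{
    static UIDLGLOGFONT CreateDialogFont()
    {
        // Only lfQuality is shown here; the remaining members mirror
        // the Win32 LOGFONT structure and are left at their defaults.
        var logFont = new UIDLGLOGFONT();

        // 2 == PROOF_QUALITY in wingdi.h (see the Remarks table).
        logFont.lfQuality = 2;
        return logFont;
    }
}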

Field Value

Byte

Attributes

ComAliasNameAttribute

Remarks

The output quality defines how carefully the graphics device interface (GDI) must attempt to match the logical-font attributes to those of an actual physical font. It can be one of the following values.

Value Meaning
ANTIALIASED_QUALITY See note following table.
DEFAULT_QUALITY Appearance of the font does not matter.
DRAFT_QUALITY Appearance of the font is less important than when PROOF_QUALITY is used. For GDI raster fonts, scaling is enabled, which means that more font sizes are available, but the quality may be lower. Bold, italic, underline, and strikeout fonts are synthesized if necessary.
NONANTIALIASED_QUALITY Font is never antialiased.
PROOF_QUALITY Character quality of the font is more important than exact matching of the logical-font attributes. For GDI raster fonts, scaling is disabled and the font closest in size is chosen. Although the chosen font size may not be mapped exactly when PROOF_QUALITY is used, the quality of the font is high and there is no distortion of appearance. Bold, italic, underline, and strikeout fonts are synthesized if necessary.

If neither ANTIALIASED_QUALITY nor NONANTIALIASED_QUALITY is selected, the font is antialiased only if the user chooses smooth screen fonts in Control Panel.
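
The named values above are the standard GDI quality constants declared in the Win32 header wingdi.h; the interop assembly itself does not expose managed names for them. A sketch of the corresponding numeric values in C#, taken from the Win32 headers, might look like this (the FontQuality class name is illustrative only):

// GDI lfQuality constants from wingdi.h, restated for managed callers.
static class FontQuality
{
    public const byte DEFAULT_QUALITY        = 0;
    public const byte DRAFT_QUALITY          = 1;
    public const byte PROOF_QUALITY          = 2;
    public const byte NONANTIALIASED_QUALITY = 3;
    public const byte ANTIALIASED_QUALITY    = 4;
}

For example, logFont.lfQuality = FontQuality.ANTIALIASED_QUALITY; requests antialiased rendering when the font and size allow it.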

COM Signature

From uilocale.idl.

[C++]

BYTE lfQuality;

Applies to