Touch (Windows Embedded CE 6.0)
User input is the means by which the user communicates with a device. The OEM determines the specific combination of input devices that a Windows Embedded CE-based device supports, and that combination varies from device to device. For example, some devices support a touch screen for text entry instead of a keyboard. Other platforms might include handwriting recognition software in place of, or in addition to, a keyboard.
Windows Embedded CE powered devices support touch screens. Users can touch the screen with either a stylus or their fingers. Touch is used to move to new screens, interact with common controls, and input text by using an on-screen keyboard or handwriting recognition.
Windows Embedded CE recognizes two kinds of touch events. The first is a standard touch event, in which you use your finger or stylus to select, drag, or manipulate an object on the screen. Examples include typing by using a Software-based Input Panel (SIP), clicking a link, opening an e-mail message by selecting it, or scrolling by using a scroll bar. The second kind of touch event is a gesture. Specific, short, directional motions with the finger or stylus, called Touch Gestures, are mapped to different kinds of behavior, such as pan and select-and-hold.
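The mapping from a recognized gesture to a behavior can be sketched as a simple dispatch. This is a portable illustration only: the identifier names mirror gesture IDs such as GID_PAN, GID_SELECT, and GID_HOLD that the operating system's gesture support delivers, but the numeric values and the `describe_gesture` helper here are placeholders, not the real winuser.h definitions.

```c
#include <stdio.h>

/* Illustrative gesture identifiers. A real application receives these
   with a gesture message from the recognizer; the enum values below
   are placeholders, not the actual header values. */
enum gesture_id { GID_PAN = 1, GID_SELECT, GID_HOLD };

/* Hypothetical dispatcher showing how short, directional motions map
   to distinct behaviors. */
const char *describe_gesture(enum gesture_id gid)
{
    switch (gid) {
    case GID_PAN:    return "pan: scroll the content with the motion";
    case GID_SELECT: return "select: activate the item at the touch point";
    case GID_HOLD:   return "hold: invoke press-and-hold behavior";
    default:         return "unrecognized gesture";
    }
}
```

In a real window procedure, the same switch shape appears in the gesture-message handler; the Touch Gestures topic below describes the actual APIs.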
The stylus or finger generates an input event when the user touches the screen. To an application, touch input is a subset of mouse input. When the user presses and releases the stylus on the screen, the application processes these events as a click of the left mouse button. When the user moves the stylus across the screen, the application processes this action as mouse movement.
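Because touch arrives as mouse input, an application handles a tap with the same left-button messages it would use for a mouse. The sketch below is a simplified, portable stand-in for a window procedure's message switch: the message constants mirror the real WM_LBUTTONDOWN, WM_LBUTTONUP, and WM_MOUSEMOVE values from winuser.h, but `handle_touch_message` is an illustrative helper, not an operating system API.

```c
#include <stdio.h>

/* Stand-ins for the Win32 mouse message constants a window procedure
   receives when the user touches the screen (values match winuser.h). */
#define WM_MOUSEMOVE   0x0200
#define WM_LBUTTONDOWN 0x0201
#define WM_LBUTTONUP   0x0202

/* Minimal handler mimicking how a WndProc treats stylus input:
   press/release arrive as left-button messages, dragging as movement. */
const char *handle_touch_message(unsigned int msg)
{
    switch (msg) {
    case WM_LBUTTONDOWN: return "stylus down: left button pressed";
    case WM_LBUTTONUP:   return "stylus up: left button released (click)";
    case WM_MOUSEMOVE:   return "stylus drag: mouse movement";
    default:             return "unhandled message";
    }
}
```

A real window procedure would also extract the touch coordinates from the message parameters before acting on the event.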
In This Section
- Touch OS Design Development
Provides information about the modules and components that implement the Touch Screen (Stylus), and related Sysgen information.
- Touch Application Development
Describes how the operating system processes input events generated by the stylus, and provides information about APIs commonly used in touch screen applications.
- Touch Gestures
Describes touch gestures and provides information about APIs you can use to recognize and process touch gestures.
- Touch Reference
Provides a description of the Touch Screen (Stylus) programming elements.
- Touch Registry Settings
Provides information necessary to configure the behavior of the touch screen.
- User Interface
Provides information on the ways that a user can interact with a Windows Embedded CE-based device and its applications.
- Mouse
Provides information about support for mouse input in Windows Embedded CE.
- Software-based Input Panel
Provides information about how to input data through the software-based input panel (SIP).