Touch Gestures (Windows Embedded CE 6.0)

1/6/2010

A touch gesture is a movement of a finger or stylus over a control or object on the screen. Windows Embedded CE 6.0 R3 supports single-touch gestures only.

Windows Embedded CE 6.0 R3 supports the following gestures.


Flick

The user presses a finger on the screen, moves it in any direction, and then lifts it, initiating scrolling that continues for a short time after screen contact ends.

The gesture recognizer sends the application a single GID_SCROLL gesture message upon finger-up. A flick frequently occurs after a pan (one or more GID_PAN gesture messages followed by a GID_END message immediately before the flick).

Pan

The user presses and holds a finger on the screen and then drags the finger in any direction. This represents a mouse move event.

The gesture recognizer sends the application one or more GID_PAN gesture messages as the position changes, followed by a GID_END when the user lifts the finger. Mouse messages are interleaved for backward compatibility, but the gesture messages are always received before the corresponding mouse events.

Panning can occur after a hold gesture.

Tap

A tap represents the left click of a mouse.

The application receives a single GID_SELECT gesture message when the finger-down and finger-up events occur within a defined time period and within a specified distance of each other. Several WM_MOUSEMOVE messages can arrive after the WM_LBUTTONDOWN event and before the GID_SELECT message.

Double Tap

A double-tap represents the left double-click of a mouse.

The application receives a GID_DOUBLESELECT message when two finger-up events occur within a defined time period and within a specified distance of each other.

Hold

The user presses and holds a finger on the screen. This represents the right-click of a mouse.

The application receives a single GID_HOLD message when the finger remains down longer than a defined time period without moving more than a specified distance. The GID_HOLD message is followed by a GID_END message on finger-up or when the hold time threshold expires.

For more information about GID_* gesture messages, see GESTUREINFO.

When the user touches the screen, the touch screen driver recognizes a touch event and passes an array of touch points to the gesture engine. The gesture engine then passes the array of touch points to each registered gesture recognizer. When a recognizer identifies a gesture, it posts the gesture event to the same message queue that receives the touch events.

The gesture engine always delivers gesture events before any final finger-up event so that the window procedure can process the gesture by calling the GetGestureInfo function and by canceling any action planned for the finger-up touch event.

Physics Engine

You can use the physics engine APIs to calculate a stream of animation points that animates a window's response to a gesture. The physics engine handles the following:

  • One-dimensional and two-dimensional scrolling in response to a gesture.
  • Boundary animations (including a rubber band style animation and a bounce animation).
  • Calculating animation stopping points so that window controls are displayed correctly when the animation is complete.

For more information about the physics engine, see Physics Engine Overview.

Window Auto Gestures

You can enable automatic handling of pan and flick gestures in any standard scrollable Win32 window by using the window auto gesture API set. Window auto gestures work only with windows created with the WS_HSCROLL or WS_VSCROLL style.

The window auto gesture APIs use the physics engine to calculate animation points for pan and flick gestures.

Requirements

Header: pwinuser.h

See Also

Other Resources

Touch
Gesture Reference
Physics Engine Reference
WindowAutoGesture Reference