Developing a Native Win32 Application for Windows 7 Touch
In a previous post, I discussed the pros and cons of different development platforms for Windows 7 touch applications. In this post, I'll continue elaborating on each development environment and discuss common development considerations and issues for a touch application written in native C++. If you haven't read the Windows Touch Guidance whitepaper, review it for good general guidance on developing a touch application. Note: some of this base material is repeated from my WPF post.
How Much “Touch” is Enough?
One benefit of Windows touch is that single-touch and multitouch gestures are converted to mouse messages if your application doesn't handle them explicitly. Therefore, your application may be somewhat touch friendly already. Depending on the touch scenario you want to support, you may only need to tweak your application in a few areas to provide a good touch user experience. A good reference explaining what makes an application touch "friendly", "enabled", or "optimized" is the MSDN Touch Guidelines.
For the remainder of this post, I'll focus on key touch considerations and suggest ways to implement them in native C++.
Touch Input Panel (TIP)
The TIP can be very handy if you want to allow users to enter text without a keyboard. Several "slate" PCs now exist that either use an external keyboard or allow the user to hide the keyboard entirely. The TIP has existed since the Tablet PC editions of Windows and is available in all editions of Windows 7. For native code, the TIP is displayed whenever the system caret is displayed.
Here is sample code that displays the system caret in a window when you click the left mouse button (for example, in the WM_LBUTTONDOWN handler of your window procedure):

```cpp
case WM_LBUTTONDOWN:
{
    // Create a 5x20-pixel caret and show it at the click position.
    // While the system caret is visible, the TIP becomes available.
    CreateCaret(hWnd, NULL, 5, 20);
    int xPos = LOWORD(lParam);   // x coordinate of the click
    int yPos = HIWORD(lParam);   // y coordinate of the click
    SetCaretPos(xPos, yPos);
    ShowCaret(hWnd);
    break;
}
```
Since the TIP is displayed when the caret is displayed, most native controls that allow text entry will display the TIP. Note, however, that the TIP will not display over a full-screen DirectX application; you will need to implement your own text entry for full-screen mode.
Screen Orientation and Rotation
The Windows 7 Engineering Guidance for Slate PCs recommends that PCs enable screen rotation automatically through sensors or via a manual hardware button. If you plan on targeting mobile touch PCs, you should consider supporting screen rotation in your touch application. To support rotation, you generally detect that the screen dimensions have changed and then adjust the layout. If your application already supports window resizing, you can leverage that same functionality for screen orientation and rotation.
For native C++ applications, you can determine landscape or portrait orientation by using the GetSystemMetrics function to retrieve the height and width of the full screen. For screen rotation, you can listen for the WM_DISPLAYCHANGE message in your WindowProc function's switch statement. For code samples on detecting screen orientation and rotation, see Detecting Screen Orientation and Screen Rotation in Tablet PC Applications.
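Putting those two pieces together, a minimal sketch might look like the following window-procedure fragment. The `OnOrientationChanged` helper is hypothetical; it stands in for whatever layout logic your application uses.

```cpp
// Inside your WindowProc switch statement:
case WM_DISPLAYCHANGE:
{
    // Query the full-screen dimensions after the display mode changed.
    int width  = GetSystemMetrics(SM_CXSCREEN);
    int height = GetSystemMetrics(SM_CYSCREEN);

    // Taller than wide means portrait; otherwise landscape.
    BOOL isPortrait = (height > width);

    // Hypothetical helper: re-layout your UI for the new orientation.
    OnOrientationChanged(isPortrait, width, height);
    break;
}
```

If your application already handles WM_SIZE for window resizing, the same layout code can usually serve both paths.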
Gestures, Manipulations, and Inertia
Developing in native C++ gives you full access to all of the Windows 7 touch APIs. Therefore, you can implement anything from simple gestures to very complex custom manipulations. Here's a quick summary of each option.
- For raw touch information, you can register your window to receive WM_TOUCH messages. A single WM_TOUCH message carries one or more touch points, each flagged with its state (down, up, move, and so on).
- If you plan on just implementing well-known gestures such as pinch-to-zoom, rotate, pan, etc., you can use the WM_GESTURE message, which is generated automatically for common gestures.
- For manipulations, the application needs to implement the IManipulationProcessor interface.
- For inertia, the IInertiaProcessor interface can be used in conjunction with the IManipulationProcessor interface.
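As a concrete example of the second option, here is a sketch of handling WM_GESTURE for a zoom gesture. Note that a window receives WM_GESTURE by default only if it has not been registered for WM_TOUCH (you get one or the other); the comment markers show where your application-specific logic would go.

```cpp
// Inside your WindowProc switch statement:
case WM_GESTURE:
{
    GESTUREINFO gi = { 0 };
    gi.cbSize = sizeof(GESTUREINFO);

    if (GetGestureInfo((HGESTUREINFO)lParam, &gi))
    {
        switch (gi.dwID)
        {
        case GID_ZOOM:
            // gi.ullArguments holds the distance between the touch points;
            // compare it to the value from the previous GID_ZOOM message
            // to compute a zoom factor for your content.
            break;

        case GID_PAN:
            // gi.ptsLocation gives the current gesture position;
            // scroll your content by the delta from the last message.
            break;
        }
        // Close the handle once the gesture information has been consumed.
        CloseGestureInfoHandle((HGESTUREINFO)lParam);
        return 0;
    }
    break; // fall through to DefWindowProc for unhandled gestures
}
```

Gestures you don't handle should still reach DefWindowProc so the system can provide its default behavior (for example, converting them to mouse messages).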