Microsoft Edge implements the touch event interfaces so that web applications can interpret touch interaction. Users can interact with a touch screen by finger or stylus, or with a trackpad. These interfaces also support multi-touch, so applications can recognize more complex gestures.
A touch, represented by the Touch object, is a single touch point. It contains information about the position of the touch point relative to the browser viewport, as well as its x,y coordinates relative to the screen.
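As a minimal sketch of these coordinate properties, the helper below formats the viewport-relative and screen-relative positions of a touch point. `TouchPoint` mirrors a subset of the standard Touch interface, and `describeTouch` is a hypothetical helper, not part of the DOM API; in a browser you would receive real Touch objects from a touch event.

```typescript
// TouchPoint mirrors a subset of the standard Touch interface.
interface TouchPoint {
  identifier: number; // unique ID for this point of contact
  clientX: number;    // x relative to the browser viewport
  clientY: number;    // y relative to the browser viewport
  screenX: number;    // x relative to the screen
  screenY: number;    // y relative to the screen
}

// describeTouch is a hypothetical helper for illustration only.
function describeTouch(t: TouchPoint): string {
  return `touch ${t.identifier}: viewport (${t.clientX}, ${t.clientY}), ` +
         `screen (${t.screenX}, ${t.screenY})`;
}
```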
During a touch interaction, an application receives touch events during the start, move, and end phases. A touch interaction ends when the last finger is removed from the surface.
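The lifecycle above can be sketched as a small state machine: an interaction is active while at least one touch point is down, and ends when the last one lifts. `TouchTracker` is a hypothetical helper, not part of the DOM API; in a browser you would drive it from touchstart, touchmove, and touchend listeners on an element.

```typescript
// Tracks active touch points by identifier; an interaction ends
// when the set of active points becomes empty.
class TouchTracker {
  private active = new Set<number>();

  // Called on touchstart: register the new point of contact.
  touchstart(id: number): void {
    this.active.add(id);
  }

  // Called on touchmove: returns true if the point is being tracked.
  touchmove(id: number): boolean {
    return this.active.has(id);
  }

  // Called on touchend: returns true when the interaction has ended
  // (no fingers remain on the surface).
  touchend(id: number): boolean {
    this.active.delete(id);
    return this.active.size === 0;
  }
}
```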
Microsoft Edge supports the following touch interfaces:
|Interface|Description|
|---|---|
|Touch|Represents an individual touch point for a touch event.|
|TouchEvent|Represents an event sent when the state of contact with a touch-sensitive surface changes; used for the touchstart, touchmove, touchend, and touchcancel events.|
|TouchList|Provides the list of individual points of contact for a touch event.|
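A TouchList exposes a `length` property and an `item(index)` method rather than ordinary array methods, so iterating it looks slightly different from iterating an array. The sketch below collects the identifiers of every point of contact; `TouchListLike` and `activeIds` are hypothetical names that mirror the shape of the standard TouchList interface.

```typescript
// TouchLike / TouchListLike mirror the shape of the standard
// Touch and TouchList interfaces for illustration.
interface TouchLike {
  identifier: number;
}

interface TouchListLike {
  length: number;
  item(index: number): TouchLike | null;
}

// Collect the identifier of every point of contact in the list.
function activeIds(list: TouchListLike): number[] {
  const ids: number[] = [];
  for (let i = 0; i < list.length; i++) {
    const t = list.item(i);
    if (t) {
      ids.push(t.identifier);
    }
  }
  return ids;
}
```

In a browser, the same loop would typically run over `event.touches` or `event.changedTouches` inside a touch event handler.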