# Touch Support
Touch support is divided into low-level support in the form of the Touchscreen class, and high-level support in the form of the Touch class.

Touch input is supported on Android, iOS, UWP, and Windows.
## Touchscreen Device

At the lowest level, a touch screen is represented by a Touchscreen device, which captures the raw state of the touchscreen. Touch screens are based on the Pointer layout.
The last used or last added touch screen can be queried with Touchscreen.current.
### Controls

In addition to the controls inherited from Pointer, touch screen devices implement the following controls:
Control | Type | Description |
---|---|---|
primaryTouch | TouchControl | A touch control representing the primary touch of the screen (the primary touch is the touch driving the Pointer representation of the device). |
touches | ReadOnlyArray&lt;TouchControl&gt; | An array of touch controls representing all the touches on the device. |
As you can see, a touch screen device consists of multiple TouchControls, each representing a potential finger touching the device. The primaryTouch control represents the touch that is currently driving the Pointer representation, and which should be used for interacting with the UI. This is usually the first finger to have touched the screen. primaryTouch is always identical to one of the entries in the touches array. The touches array contains all touches that the system can track. Note that this array has a fixed size of TouchscreenState.MaxTouches, regardless of how many fingers are currently active. If you need an API that only represents active touches, see the higher-level Touch class.
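As a sketch of how the fixed-size touches array can be polled from a script (the class name and log message are illustrative, not part of the API):

```C#
using UnityEngine;
using UnityEngine.InputSystem;

public class TouchPoller : MonoBehaviour
{
    void Update()
    {
        var touchscreen = Touchscreen.current;
        if (touchscreen == null)
            return; // No touch screen connected.

        // touches always has TouchscreenState.MaxTouches entries;
        // check the press control to find the ones actually in contact.
        var activeCount = 0;
        foreach (var touch in touchscreen.touches)
            if (touch.press.isPressed)
                ++activeCount;

        Debug.Log($"Currently active touches: {activeCount}");
    }
}
```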
Each TouchControl on the device (including primaryTouch) is made up of the following child controls:
Control | Type | Description |
---|---|---|
position | Vector2Control | Absolute position on the touch surface. |
delta | Vector2Control | The difference in position since the last frame. |
startPosition | Vector2Control | The position where the finger first touched the surface. |
startTime | DoubleControl | The time when the finger first touched the surface. |
press | ButtonControl | Whether the finger is pressed down. |
pressure | AxisControl | Normalized pressure with which the finger is currently pressed while in contact with the pointer surface. |
radius | Vector2Control | The size of the area where the finger touches the surface. |
touchId | IntegerControl | The ID of the touch, used to distinguish individual touches. |
phase | TouchPhaseControl | A control reporting the current TouchPhase of the touch. |
tap | ButtonControl | A button control which reports whether the OS recognizes a "tap" gesture from this touch. |
tapCount | IntegerControl | If tap is reported as pressed, tapCount reports the number of consecutive taps as recognized by the OS. Can be used to detect double- and multi-tap gestures. |
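These child controls can be read individually. A minimal sketch of polling the primary touch (the class name and log output are illustrative):

```C#
using UnityEngine;
using UnityEngine.InputSystem;

public class PrimaryTouchReader : MonoBehaviour
{
    void Update()
    {
        var touchscreen = Touchscreen.current;
        if (touchscreen == null)
            return;

        var primary = touchscreen.primaryTouch;
        if (!primary.press.isPressed)
            return;

        // Read individual child controls of the touch.
        Vector2 position = primary.position.ReadValue();
        Vector2 delta = primary.delta.ReadValue();
        float pressure = primary.pressure.ReadValue();

        Debug.Log($"Primary touch at {position} (delta {delta}, pressure {pressure})");
    }
}
```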
### Using Touch with Actions
Touch input can be used with actions like any other Pointer device, by binding to &lt;Pointer&gt;/press, &lt;Pointer&gt;/delta, and so on. This gets you input from the primary touch (as well as from any other non-touch pointer devices). However, if you care about getting input from multiple touches in your action, you can bind to individual touches with bindings such as &lt;Touchscreen&gt;/touch3/press, or use a wildcard binding to bind one action to all touches: &lt;Touchscreen&gt;/touch*/press. If you bind a single action to input from multiple touches like that, you will likely want to set the action type to Pass-Through, so that the action gets callbacks for each touch instead of from just one.
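A minimal sketch of setting up such a Pass-Through action in code (the action name and callback body are illustrative):

```C#
using UnityEngine;
using UnityEngine.InputSystem;

public class MultiTouchActions : MonoBehaviour
{
    InputAction touchPressAction;

    void OnEnable()
    {
        // Pass-Through, so the action fires for every bound touch,
        // not just the single control currently driving the action.
        touchPressAction = new InputAction(
            name: "TouchPress",
            type: InputActionType.PassThrough,
            binding: "<Touchscreen>/touch*/press");
        touchPressAction.performed += ctx => Debug.Log($"Press from {ctx.control.path}");
        touchPressAction.Enable();
    }

    void OnDisable()
    {
        touchPressAction.Disable();
        touchPressAction.Dispose();
    }
}
```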
## Touch Class
Enhanced touch support is provided by the EnhancedTouch.Touch class. To enable it, call EnhancedTouchSupport.Enable().
```C#
using UnityEngine.InputSystem.EnhancedTouch;
// ...
// Can be called from MonoBehaviour.Awake(), for example, or from any
// RuntimeInitializeOnLoadMethod code.
EnhancedTouchSupport.Enable();
```
>NOTE: Touchscreen does NOT require EnhancedTouchSupport to be enabled. This also means that touch in combination with actions works fine without EnhancedTouchSupport. Calling EnhancedTouchSupport.Enable() is only required if you want to use the EnhancedTouch.Touch API.
The touch API is designed to provide access to touch information along two dimensions:

* **By finger.** Each finger is defined as the Nth contact source on a Touchscreen. You can use Touch.activeFingers to get an array of all currently active fingers.
* **By touch.** Each touch is a single finger contact with at least a beginning point (PointerPhase.Began) and an endpoint (PointerPhase.Ended or PointerPhase.Cancelled). In between those two points, there may be arbitrarily many PointerPhase.Moved and/or PointerPhase.Stationary records. All records in a touch have the same touchId. You can use Touch.activeTouches to get an array of all currently active touches. This lets you track how a specific touch has moved over the screen, which is useful if you want to implement, for example, recognition of specific gestures.
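Both dimensions can be sketched with a small polling script (the class name and log output are illustrative; the alias avoids a clash with UnityEngine.Touch):

```C#
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Alias so that "Touch" refers to the EnhancedTouch API, not UnityEngine.Touch.
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class EnhancedTouchLogger : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        // activeTouches only contains touches that are currently ongoing.
        foreach (var touch in Touch.activeTouches)
            Debug.Log($"Touch {touch.touchId}: {touch.phase} at {touch.screenPosition}");

        // activeFingers tracks contact sources rather than individual touches.
        Debug.Log($"Active fingers: {Touch.activeFingers.Count}");
    }
}
```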
See the Scripting API Reference for the EnhancedTouch.Touch API for more info.
>NOTE: The Touch and Finger APIs are written in a way that does not generate GC garbage and does not require object pooling. The bulk of the data is stored in unmanaged memory that is indexed by wrapper structs. All arrays are pre-allocated.