Version: 2021.3
Set up an XR scene
XR Origin

XR input options

The main options to handle input in an XR game or application include:

  • The XR Interaction Toolkit
  • OpenXR interaction profiles
  • “Traditional” input through the Input System or Input Manager
  • The XR.InputDevice and XR.Node APIs
  • Third-party input libraries

In some cases, you might use more than one of these options at the same time. For example, you could use the XR Interaction Toolkit to allow the user to pick up objects in the environment, use the Input System to bind a pause function to a controller button, and use the XR.Node API to read the hardware state so that you can animate the controller GameObject.
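As a sketch of the last part of that example, the snippet below reads the right-hand controller's trigger value through the XR.InputDevice API each frame and feeds it to an Animator. The Animator field and the "Trigger" parameter name are illustrative assumptions, not part of any Unity-defined setup.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: drive a controller model's animation from the physical trigger state.
public class ControllerAnimator : MonoBehaviour
{
    // Assumed to be assigned in the Inspector; the parameter name is hypothetical.
    public Animator animator;

    void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue))
        {
            // Mirror the hardware trigger (0..1) in the animated controller model.
            animator.SetFloat("Trigger", triggerValue);
        }
    }
}
```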

Note: The OpenXR plug-in, which supports many different XR devices and controllers, provides its own, additional way to access XR input and tracking data. You can still use the XR Interaction Toolkit, the Input System, or the XR.InputDevice and XR.Node APIs. (The legacy Input Manager is not supported when you use the OpenXR plug-in.) See Input in Unity OpenXR for more information.

XR Interaction Toolkit

The XR Interaction Toolkit builds on the Input System and the base UnityEngine.XR API to support XR input. It provides a near ready-to-use set of components for handling XR input and for defining interactions between the user, the environment, and the scene UI. Even if you choose not to use the toolkit’s interaction system, you can use the input components as a starting point to save setup effort. The toolkit provides two basic approaches to handling input:

  • Action-based input provides the most flexibility and makes it easier to support multiple controller and input schemes. The XR Interaction Toolkit defines actions for common XR input needs. To use these actions, add the XR Controller (Action-Based) component to your controller GameObjects. This component is already part of the XR Origin configuration that you can add to your XR scene (unless you choose the device-based option).
  • Device-based input is simpler to set up than action-based input, but is not as flexible. To use device-based input with the XR Interaction Toolkit, add the XR Controller (Device-Based) component to your controller GameObjects. This component is already part of the XR Origin configuration when you add it to the scene with the GameObject > XR > Device-based > XR Origin menu command.

See Actions for more information about Input System Actions.

Tip: The Starter Assets available in the XR Interaction Toolkit provide presets and input actions that remove most of the rote setup work involved in using action-based input.
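For input needs the toolkit's predefined actions don't cover, such as the pause-button example above, you can declare your own Input System action alongside them. The sketch below assumes an action bound to a controller button is assigned in the Inspector; the class name, field, and pause behavior are illustrative.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: bind a custom pause action next to the XR Interaction Toolkit's actions.
public class PauseInput : MonoBehaviour
{
    // Assign in the Inspector, e.g. an action bound to a controller's menu button.
    // The action reference itself is a hypothetical example.
    public InputActionProperty pauseAction;

    void OnEnable()
    {
        pauseAction.action.performed += OnPause;
        pauseAction.action.Enable();
    }

    void OnDisable()
    {
        pauseAction.action.performed -= OnPause;
        pauseAction.action.Disable();
    }

    void OnPause(InputAction.CallbackContext context)
    {
        // Toggle between paused and running.
        Time.timeScale = Time.timeScale == 0f ? 1f : 0f;
    }
}
```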

The XR Interaction Toolkit defines the following base interactions:

  • Select: intended for selecting Interactables in the environment
  • Activate: intended for activating Interactables in the environment
  • UI Press: intended for pressing a control, such as a button, in a UI
  • Rotate anchor: intended for rotating an object held at a distance (with the XR Ray Interactor)
  • Translate anchor: intended for moving an object held at a distance (with the XR Ray Interactor)

The XR Interaction Toolkit uses these interactions with additional components to let the user interact with the environment. For example, if you add an XR Grab Interactable component to an object, the user can trigger Select to grab it. By default, Select is bound to the grip button of an XR controller, but you can change the binding as you see fit. You can also use your own actions and bindings alongside those defined by the toolkit.
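To react to a Select interaction from your own script, you can listen to the interactable's select events. The sketch below assumes the XR Interaction Toolkit package is installed and an XRGrabInteractable component is on the same GameObject; the logging behavior is illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: log when the user grabs this object via the Select interaction.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    XRGrabInteractable interactable;

    void OnEnable()
    {
        interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.AddListener(OnSelectEntered);
    }

    void OnDisable()
    {
        interactable.selectEntered.RemoveListener(OnSelectEntered);
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Fires when an interactor (by default, via the grip button) selects this object.
        Debug.Log($"{name} grabbed");
    }
}
```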

Input System or Input Manager

You can access the controls of XR controllers, such as buttons and joysticks, in the same ways you would access any game controller. To access tracking data, use the XR.InputTracking API in code. You can also use the Input System TrackedPoseDriver component to control a GameObject with a tracked device such as an HMD or controller. The TrackedPoseDriver component is also available from the Legacy Input Helpers package, in case you are not using the Input System.
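As a minimal sketch of reading tracking data in code with the XR.InputTracking API, the component below makes its GameObject follow the headset's pose. Attaching it to a child of the XR Origin is an assumption for illustration; in most projects the TrackedPoseDriver component covers this without custom code.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: follow the head-mounted display's local position and rotation.
public class HeadTracker : MonoBehaviour
{
    void Update()
    {
        // Poses are reported in tracking space, so this GameObject is assumed
        // to be parented under the XR Origin.
        transform.localPosition = InputTracking.GetLocalPosition(XRNode.Head);
        transform.localRotation = InputTracking.GetLocalRotation(XRNode.Head);
    }
}
```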

Note: When you use the OpenXR plug-in, you must use the Input System. The Input Manager is not supported.

XR Input APIs

The XR Input APIs provide direct access to XR input. The API lets you find connected XR devices and read their tracking data and the state of their input hardware.

See Unity XR Input for more information about the XR input API.
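A brief sketch of that API: the component below enumerates connected right-hand controllers and reads their primary button state. The filtering characteristics and the logging are illustrative choices, not the only way to query devices.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: find connected XR devices and read input hardware state directly.
public class DeviceLister : MonoBehaviour
{
    void Start()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.Controller | InputDeviceCharacteristics.Right,
            devices);

        foreach (InputDevice device in devices)
        {
            if (device.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed))
            {
                Debug.Log($"{device.name}: primary button pressed = {pressed}");
            }
        }
    }
}
```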

Third-party input APIs

Device makers and other third parties often provide their own input and interaction APIs that you can use instead of or in addition to those provided by Unity.
