The main options to handle input in an XR game or application include:

- The XR Interaction Toolkit
- The Input System
- The XR input APIs (XR.InputDevice and XR.Node)
- The XR Hands package, for hand tracking
- Third-party input and interaction APIs
In some cases, you might use more than one of these options at the same time. For example, you could use the XR Interaction Toolkit to allow the user to pick up objects in the environment, use the Input System to bind a pause function to a controller button, and use the XR.Node API to read the hardware state so that you can animate the controller GameObject.
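As a minimal sketch of that last technique, the following script polls the right-hand controller through the XR.InputDevice and XR.Node APIs each frame; the class name and the logging are illustrative, not part of any Unity API:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: reads right-hand controller state each frame so you
// could drive controller-model animation from it.
public class ControllerStateReader : MonoBehaviour
{
    InputDevice rightHand;

    void Update()
    {
        // Re-query until the device is valid; devices can connect after startup.
        if (!rightHand.isValid)
            rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (rightHand.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue))
        {
            // For example, feed triggerValue into an Animator parameter here.
            Debug.Log($"Trigger: {triggerValue:F2}");
        }

        if (rightHand.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed) && pressed)
        {
            Debug.Log("Primary button pressed");
        }
    }
}
```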
Note: The OpenXR plug-in, which supports many different XR devices and controllers, provides its own, additional way to access XR input and tracking data. You can still use the XR Interaction Toolkit, the Input System, or the XR.InputDevice and XR.Node APIs. (The legacy Input Manager is not supported when you use the OpenXR plug-in.) Refer to Input in Unity OpenXR for more information.
The XR Interaction Toolkit builds on the Input System and the base UnityEngine.XR API to support XR input and interaction. It provides a near ready-to-use set of components for handling XR input and for defining interactions between the user, the environment, and the scene UI.
The XR Interaction Toolkit provides:

- Cross-platform XR controller input
- Interaction components for hovering over, selecting, and grabbing objects in the environment
- Locomotion components for moving around a scene
- Components for interacting with scene UI using XR controllers
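As an illustrative sketch (assuming the XR Interaction Toolkit package is installed, using its 2.x namespaces), you can respond to toolkit interaction events from your own script instead of polling input directly:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative sketch: logs when an XRGrabInteractable on the same GameObject
// is grabbed or released. Assumes XR Interaction Toolkit 2.x.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    void OnEnable()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.AddListener(OnGrabbed);
        interactable.selectExited.AddListener(OnReleased);
    }

    void OnDisable()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.RemoveListener(OnGrabbed);
        interactable.selectExited.RemoveListener(OnReleased);
    }

    void OnGrabbed(SelectEnterEventArgs args) => Debug.Log("Object grabbed");
    void OnReleased(SelectExitEventArgs args) => Debug.Log("Object released");
}
```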
You can access the controls of XR controllers, such as buttons and joysticks, in the same ways you would access any game controller. To access tracking data, use the XR.InputTracking API in code. You can also use the Input System TrackedPoseDriver component to control a GameObject with a tracked device such as an HMD or controller. The TrackedPoseDriver component is also available from the Legacy Input Helpers package, in case you are not using the Input System.
Note: When you use the OpenXR plug-in, you must use the Input System. The Input Manager is not supported.
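For example, a minimal sketch that mirrors the headset pose onto a GameObject might look like the following. It uses the XRNode-based InputDevices API (a close relative of XR.InputTracking), and assumes an XR provider plug-in is active; a TrackedPoseDriver component achieves the same result without code:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: copies the headset's tracked pose onto this
// GameObject using the XRNode-based device API.
public class HeadPoseFollower : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (!head.isValid)
            return;

        if (head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position))
            transform.localPosition = position;

        if (head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
            transform.localRotation = rotation;
    }
}
```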
The XR Hands package provides access to hand tracking data from XR devices that support it. To access this data, you must also use an XR provider plug-in that has been updated to support hand tracking, such as OpenXR version 1.12.
The XR Hands package provides the following:

- An API for accessing hand tracking data, such as the positions and rotations of hand joints
- A standard hand data model that works across supported devices and providers
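As a sketch of what reading that data can look like (assuming the XR Hands package is installed and a provider plug-in with hand tracking support is active), a script can query the running hand subsystem directly:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative sketch: finds the running XRHandSubsystem and logs the pose
// of the right index fingertip each frame, when tracked.
public class IndexTipReader : MonoBehaviour
{
    XRHandSubsystem handSubsystem;

    void Update()
    {
        if (handSubsystem == null || !handSubsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                handSubsystem = subsystems[0];
            return;
        }

        XRHand rightHand = handSubsystem.rightHand;
        if (!rightHand.isTracked)
            return;

        XRHandJoint indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```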
Your input options on visionOS depend on whether your app is running in windowed mode, mixed reality mode, or virtual reality mode.
In windowed mode, the user’s gaze and pinch gestures are translated into touch events by the operating system. Your app doesn’t have access to the raw input data. visionOS reports a maximum of two touch points.
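Because these gestures arrive as ordinary touch events, you can read them with the standard Input System touch APIs. As a minimal sketch (assuming the Input System package; any touch-handling code path works the same way):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

// Illustrative sketch: logs active touches, which in visionOS windowed mode
// correspond to gaze-and-pinch gestures (at most two at a time).
public class TouchLogger : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
            Debug.Log($"Touch {touch.touchId} at {touch.screenPosition}");
    }
}
```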
In mixed reality mode, the input options further depend on whether your app is running in a shared space with other apps or in an immersive space. In a shared space, the situation is similar to that of a windowed app: the operating system translates the user's gaze and hand movements into touch gestures. In this case, you can use the Spatial Pointer Device to access 3D touch data rather than just 2D. In an immersive space, you also have access to the 3D origin and direction of the user's gaze ray at the start of the gesture. In addition, you have access to ARKit data such as head and hand tracking, plane detection, scene reconstruction meshes, and image tracking.
In virtual reality mode, you have access to ARKit data such as head and hand tracking, plane detection, scene reconstruction meshes, and image tracking. (You do not have access to the Spatial Pointer Device or other PolySpatial-specific components.)
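As one hedged example of consuming such data (assuming the AR Foundation package, which is how Unity typically exposes ARKit-style plane detection), a script might subscribe to plane changes like this:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: logs planes as plane detection finds them. Assumes an
// ARPlaneManager on the same GameObject (AR Foundation package).
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId} centered at {plane.center}");
    }
}
```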
For more information, refer to PolySpatial visionOS: Input.
The XR input APIs provide direct access to XR input. These APIs let you find connected XR devices and read their tracking data and the state of their input hardware.
Refer to Unity XR Input for more information about the XR input API.
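For instance, a short sketch that enumerates the currently connected XR devices and then filters for hand-held controllers (using only documented UnityEngine.XR calls; the class name is illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: lists connected XR input devices, then filters for
// held-in-hand controllers using device characteristics.
public class DeviceLister : MonoBehaviour
{
    void Start()
    {
        var allDevices = new List<InputDevice>();
        InputDevices.GetDevices(allDevices);
        foreach (var device in allDevices)
            Debug.Log($"{device.name}: {device.characteristics}");

        var controllers = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HeldInHand | InputDeviceCharacteristics.Controller,
            controllers);
        Debug.Log($"Found {controllers.Count} hand-held controller(s)");
    }
}
```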
Device makers and other third parties often provide their own input and interaction APIs that you can use instead of or in addition to those provided by Unity.