VR development shares common workflows and design considerations with any real-time 3D development in Unity. However, distinguishing factors include:
To get started with VR development, use the XR Plug-in Management system to install and enable XR provider plug-ins for the devices you want to support. See XR Project set up for more information.
A basic VR scene should contain an XR Origin, which defines the 3D origin for tracking data. This collection of GameObjects and components also contains the main scene Camera and the GameObjects representing the user’s controllers. See Set up an XR scene for instructions on setting up a basic VR scene.
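Once the scene contains an XR Origin, other scripts can locate it at runtime through the XROrigin component from the XR Core Utilities package. The following is a minimal sketch, assuming the XR Origin was created in the editor and the XR Core Utilities package is installed; the XROriginInfo class name is illustrative.

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

// Minimal sketch: finds the XR Origin in the scene and logs its main Camera
// and requested tracking origin mode. Assumes an XR Origin already exists
// in the scene (for example, added via GameObject > XR).
public class XROriginInfo : MonoBehaviour
{
    void Start()
    {
        var origin = FindObjectOfType<XROrigin>();
        if (origin == null)
        {
            Debug.LogWarning("No XR Origin found in the scene.");
            return;
        }

        Debug.Log($"XR Origin camera: {origin.Camera.name}");
        Debug.Log($"Requested tracking origin mode: {origin.RequestedTrackingOriginMode}");
    }
}
```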
Beyond the basics, you typically need a way for the user to move around and to interact with the 3D world you have created. The XR Interaction Toolkit provides components for creating interactions like selecting and grabbing objects. It also provides a customizable locomotion system. You can use the Input System in addition to, or instead of, the XR Interaction Toolkit.
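As an illustration, the sketch below subscribes to the select (grab) events of an XRGrabInteractable. It is a minimal example assuming the XR Interaction Toolkit 2.x namespace (UnityEngine.XR.Interaction.Toolkit) and a GameObject that already has an XRGrabInteractable, a Collider, and a Rigidbody; the GrabLogger class name is illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch: logs when the user grabs or releases this object.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabLogger : MonoBehaviour
{
    void OnEnable()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.AddListener(OnGrabbed);
        interactable.selectExited.AddListener(OnReleased);
    }

    void OnDisable()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.RemoveListener(OnGrabbed);
        interactable.selectExited.RemoveListener(OnReleased);
    }

    void OnGrabbed(SelectEnterEventArgs args) => Debug.Log("Object grabbed");
    void OnReleased(SelectExitEventArgs args) => Debug.Log("Object released");
}
```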
Most of the features and APIs used for VR development in Unity are provided through packages. These packages include:
To build VR apps in Unity, use the XR Plug-in Management system to add and enable provider plug-ins for the devices you want to support. See XR Project set up for instructions.
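Provider plug-ins are normally initialized automatically at startup through the XR Plug-in Management settings. If you disable Initialize XR on Startup, you can drive the loader from a script instead. The following is a minimal sketch based on the XR Plug-in Management runtime API (XRGeneralSettings, InitializeLoader, StartSubsystems); the ManualXRStart class name is illustrative.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Minimal sketch: initializes and starts the active XR loader manually.
// Assumes "Initialize XR on Startup" is disabled in
// Project Settings > XR Plug-in Management.
public class ManualXRStart : MonoBehaviour
{
    IEnumerator Start()
    {
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.LogError("Failed to initialize an XR loader. Check that a provider plug-in is enabled.");
            yield break;
        }

        XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    void OnDestroy()
    {
        var manager = XRGeneralSettings.Instance.Manager;
        if (manager != null && manager.isInitializationComplete)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```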
The VR provider plug-ins supported by Unity include:
Note: Many headset makers are working toward using the OpenXR runtime as a standard. However, this process is not complete, and there can be feature discrepancies between OpenXR and a headset maker’s own provider plug-in or SDK.
The XR Interaction Toolkit can make it easier and faster to develop VR applications. It provides:
The XR Core Utilities package contains software utilities used by other Unity XR plug-ins and packages. Typically, this package gets installed in your project as a dependency of other XR packages.
The Unity Input System package not only supports accessing user input from VR controller buttons and joysticks, but also provides access to XR tracking data and haptics. The Input System package is required if you use the XR Interaction Toolkit or the OpenXR provider plug-in.
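For instance, the sketch below reads the right-hand trigger value and the headset position through Input System actions created in code. The binding paths (<XRController>{RightHand}/trigger and <XRHMD>/centerEyePosition) come from the Input System's XR control layouts; in a real project you would typically define these actions in an Input Actions asset instead, and the XRInputReader class name is illustrative.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: reads a controller trigger and the HMD position through
// Input System actions defined in code rather than in an Input Actions asset.
public class XRInputReader : MonoBehaviour
{
    InputAction trigger;
    InputAction headPosition;

    void OnEnable()
    {
        trigger = new InputAction(binding: "<XRController>{RightHand}/trigger");
        headPosition = new InputAction(binding: "<XRHMD>/centerEyePosition");
        trigger.Enable();
        headPosition.Enable();
    }

    void OnDisable()
    {
        trigger.Disable();
        headPosition.Disable();
    }

    void Update()
    {
        float triggerValue = trigger.ReadValue<float>();
        Vector3 headPos = headPosition.ReadValue<Vector3>();

        if (triggerValue > 0.5f)
            Debug.Log($"Trigger pressed; head at {headPos}");
    }
}
```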
Unity’s VR Project Template provides a starting point for virtual reality development in Unity. The template configures project settings, pre-installs the right packages, and includes a sample scene with various pre-configured example assets to demonstrate how to set up a project that is ready for VR. Access the VR template through the Unity Hub when you create a new project. Refer to Create a new project for information about creating a project with the template.
For more information about the template assets and how the sample scene is set up, refer to About the VR Project Template.