Running an XR app developed in Unity works the same as running any other type of app on a device. (XR is an umbrella term encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) applications; devices that support these forms of interactive application are referred to as XR devices.) In most cases, you can use the Unity Build And Run function to both build and run an app on a connected device.
Note: The method you use to install an already built app on a device varies by platform, and you should refer to the platform’s documentation for detailed information.
To build your game or application and run it on a device:

1. Connect the device to your computer.
2. Open the Build Settings window (menu: File > Build Settings).
3. Select the platform that corresponds to your device and, if necessary, click Switch Platform.
4. Click Build And Run.
Tip: After you have configured your build settings, you can build and run your project directly with the File > Build And Run menu command.
See Publishing Builds for more information about building Unity projects.
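If you want to automate this process rather than use the menu command, the Unity Editor's BuildPipeline API can perform the same build-and-run step from an editor script. The sketch below assumes an Android-based XR device and placeholder scene and output paths; the menu item name, scene list, and paths are illustrative, not part of the Unity documentation.

```csharp
using UnityEditor;

public static class BuildScript
{
    // Hypothetical menu item; replace the scene list and output path
    // with values from your own project.
    [MenuItem("Build/Build And Run (Android XR)")]
    public static void BuildAndRunAndroid()
    {
        var buildOptions = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" },
            locationPathName = "Builds/MyXRApp.apk",
            target = BuildTarget.Android,
            // AutoRunPlayer deploys and launches the build on the
            // connected device, mirroring File > Build And Run.
            options = BuildOptions.AutoRunPlayer,
        };
        BuildPipeline.BuildPlayer(buildOptions);
    }
}
```

Because this uses the UnityEditor namespace, the script must live in an Editor folder and runs only inside the Unity Editor, not in a built player.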
On the Windows platform, some XR provider plug-ins support a "hybrid" Play mode in which the project runs on a connected XR device when you enter Play mode. The Unity Game view mirrors the headset display, and you can choose how the Game view presents the headset output.
Tip: When developing for the Meta Quest 2 or Quest Pro, you can switch to the Windows platform and use Quest Link to take advantage of the faster iteration turnaround provided by Play mode.
The Mock HMD package is a provider plug-in that simulates the presence of a head-mounted display (HMD) in the Unity Game view during Play mode. Enable Mock HMD in the XR Plug-in Management section of the Project Settings window. Note that input and tracking are not simulated.
See the Mock HMD documentation for more information.
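Because Play mode behavior depends on which provider plug-in (Mock HMD or a device plug-in) is enabled in XR Plug-in Management, it can be useful to log which XR loader, if any, initialized at startup. This is a minimal sketch using the XR Plug-in Management runtime API; the class name is an assumption, not a Unity-defined type.

```csharp
using UnityEngine;
using UnityEngine.XR.Management;

// Attach to any GameObject in the startup scene to report XR status.
public class XrStatusLogger : MonoBehaviour
{
    void Start()
    {
        // XRGeneralSettings exposes the manager configured in
        // Project Settings > XR Plug-in Management.
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager != null && manager.activeLoader != null)
        {
            Debug.Log($"XR provider active: {manager.activeLoader.name}");
        }
        else
        {
            Debug.Log("No XR provider initialized. " +
                "Check that Mock HMD or a device plug-in is enabled.");
        }
    }
}
```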
Tip: The XR Interaction Toolkit package provides an XR Device Simulator that translates keyboard and mouse input into movement and interaction.
Most XR devices run on one of the existing OS platforms. For additional information about running projects on a given platform, see the documentation for that platform.