Starting a new visionOS project from scratch
This page describes how to start a project from scratch using one or more of the available modes.
Requirements
Before starting, ensure you meet the Hardware and Software Requirements.
Windowed App
- Open the Build Profiles window (menu: File > Build Profiles).
- Select the visionOS platform.
- If necessary, click Switch Platform to change to the visionOS platform.
- Add and select any Scenes you want to include in the build. (For example, SampleScene.)
- Click the Build button.
By default, Unity builds that target visionOS run in windowed mode. If you install XR or PolySpatial support (by following steps 1-8 from Fully Immersive Virtual Reality below), you must manually configure your App Mode in order to build and deploy a 2D windowed application:
- Open Project Settings.
- Change the app mode under XR Plug-in Management > Apple visionOS > App Mode to Windowed - 2D Window.
Windowed Apps use Unity's own rendering pipeline, such as the Built-in Render Pipeline or Universal Render Pipeline. See Windowed Apps for details.
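If you prefer to script the build steps above rather than use the Build Profiles window, a minimal editor script might look like the following. This is a sketch: it assumes Unity's standard `BuildPipeline` API with visionOS build support installed, and the scene path and output folder are hypothetical placeholders.

```csharp
// Editor-only sketch (place under an Editor/ folder in your project).
// Assumes Unity 2022.3+ with the visionOS build module installed.
using UnityEditor;
using UnityEditor.Build.Reporting;

public static class VisionOSWindowedBuild
{
    [MenuItem("Build/visionOS Windowed App")]
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/SampleScene.unity" }, // hypothetical scene path
            locationPathName = "Builds/VisionOS",                 // output Xcode project folder
            target = BuildTarget.VisionOS,
            options = BuildOptions.None,
        };

        BuildReport report = BuildPipeline.BuildPlayer(options);
        UnityEngine.Debug.Log($"Build result: {report.summary.result}");
    }
}
```

As with a manual build, the result is an Xcode project that you then build and deploy from Xcode.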
Metal Rendering with Compositor Services (Fully Immersive Virtual and Mixed Reality)
- Open the Project Settings window (menu: Edit > Project Settings).
- Select the XR Plug-in Management section.
- If necessary, click the button to Install XR Plug-in Management.
- Select the tab for the visionOS target build platform.
- Enable the Apple visionOS Plug-in Provider.
- Open the Package Manager to ensure that the latest version of com.unity.xr.visionos is installed. This should be X.Y.Z. If not, upgrade to this version by using Install Package By Name... and specifying the version.
- Return to the Project Settings window and select the Apple visionOS settings section under XR Plug-in Management.
- Set the App Mode to Metal Rendering with Compositor Services.
- Open the Build Profiles window (menu: File > Build Profiles).
- Select the visionOS platform.
- If necessary, click Switch Platform to change to the visionOS platform.
- Add and select any Scenes you want to include in the build. (For example, SampleScene.)
- Under Platform Settings, set Target SDK to Device SDK to run on the Apple Vision Pro device or Simulator SDK to run on the simulator.
- Click the Build button.
Your app will render a full immersive space and you should see the Unity skybox (or your app) running in the Apple Vision Pro simulator.
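The Target SDK choice in the build step above can also be set from an editor script. The sketch below assumes `PlayerSettings.VisionOS.sdkVersion` and the `VisionOSSdkVersion` enum, which mirror Unity's analogous iOS API; verify the exact names against the scripting reference for your Unity version.

```csharp
// Editor sketch: choose Device vs Simulator SDK before building.
// PlayerSettings.VisionOS.sdkVersion is assumed to follow the iOS pattern
// (PlayerSettings.iOS.sdkVersion); check your Unity version's docs.
using UnityEditor;

public static class VisionOSTargetSdk
{
    [MenuItem("Build/visionOS/Use Simulator SDK")]
    public static void UseSimulator() =>
        PlayerSettings.VisionOS.sdkVersion = VisionOSSdkVersion.Simulator;

    [MenuItem("Build/visionOS/Use Device SDK")]
    public static void UseDevice() =>
        PlayerSettings.VisionOS.sdkVersion = VisionOSSdkVersion.Device;
}
```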
Refer to Metal-based Apps on visionOS docs for more information.
RealityKit with PolySpatial (Shared and Immersive MR Spaces)
For bounded apps, your app can exist alongside other apps in the shared space. For unbounded apps, your app will be the only content visible.
- Follow steps from Metal Rendering with Compositor Services above up until setting the App Mode.
- Reopen the Package Manager. Using Install Package By Name..., install com.unity.polyspatial.visionos, specifying the latest version (X.Y.Z). Alternatively, you can use the following link: com.unity.polyspatial.visionos. Installing this package automatically installs the other required packages at matching versions (com.unity.polyspatial, com.unity.polyspatial.visionos, and com.unity.polyspatial.xr).
- In Project Settings, under XR Plug-in Management, select the Apple visionOS settings section and switch the App Mode to RealityKit with PolySpatial.
- Create a Volume Camera in your scene.
- From the GameObject > XR > Setup menu or the XR Building Blocks overlay, click Volume Camera.
- Add a VolumeCameraWindowConfiguration asset to your project with Create > PolySpatial > Volume Camera Window Configuration. You must store this asset in one of your project's Resources folders. (Refer to Special Folders for more information about Resources folders.)
- Assign the volume camera window configuration to the Volume Window Configuration of the volume camera.
- Configure the volume camera window configuration for bounded or unbounded mode and adjust the output dimensions (if bounded).
- Output Dimensions adjust the rendering scale of your content.
- For bounded apps, make sure something is visible within the dimensions of the volume camera.
- Depending on which mode you would like to use when your app starts, set the Default Volume Camera Window Config in Project Settings to the VolumeCameraWindowConfiguration which should be used at startup.
- Open the Project Settings window (menu: Edit > Project Settings) and select the PolySpatial section.
- Drag the desired VolumeCameraWindowConfiguration asset into the Default Volume Camera Window Config field (or use the asset browser).
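The volume camera setup above can also be performed from script. The following runtime sketch assumes the `Unity.PolySpatial.VolumeCamera` component and its `WindowConfiguration` property as exposed by the PolySpatial package, and a hypothetical asset name; check the PolySpatial scripting reference for your installed package version.

```csharp
// Runtime sketch: create a Volume Camera and assign a window configuration
// loaded from Resources. Type and property names are assumptions based on
// the PolySpatial package; verify against its scripting reference.
using UnityEngine;
using Unity.PolySpatial;

public class VolumeCameraSetup : MonoBehaviour
{
    void Start()
    {
        var go = new GameObject("Volume Camera");
        var volumeCamera = go.AddComponent<VolumeCamera>();

        // The asset must live in a Resources folder to be loadable at runtime.
        var config = Resources.Load<VolumeCameraWindowConfiguration>(
            "BoundedConfig"); // hypothetical asset name
        volumeCamera.WindowConfiguration = config;
    }
}
```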
- Open the Build Profiles window (menu: File > Build Profiles).
- Select the visionOS platform.
- If necessary, click Switch Platform to change to the visionOS platform.
- Add and select any Scenes you want to include in the build. (For example, SampleScene.)
- Under Platform Settings, set Target SDK to Device SDK to run on the Apple Vision Pro device or Simulator SDK to run on the simulator.
- Click the Build button.
Unbounded apps
For unbounded apps that use ARKit features, add the com.unity.xr.arfoundation package to your project. To use skeletal hand tracking data, add the com.unity.xr.hands package to your project. Refer to XR packages for more information about Unity's XR packages.
Note
The Apple Vision Pro simulator does not provide any ARKit data, so planes, meshes, tracked hands, etc. do not work in the simulator.
Refer to RealityKit apps on visionOS docs for more information.
Hybrid apps
Hybrid apps combine the capabilities of Metal and RealityKit apps. To create a Hybrid app, follow the same steps as for RealityKit with PolySpatial above, but set App Mode to Hybrid - Switch between Metal and RealityKit.
To switch to Metal mode, use a VolumeCameraWindowConfiguration with Mode set to Metal. If you would like the app to start in Metal mode, set a VolumeCameraWindowConfiguration with Mode set to Metal as the Default Volume Camera Window Config in Project Settings.
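Switching modes at runtime amounts to assigning a different configuration to the active volume camera. The sketch below assumes the `VolumeCamera.WindowConfiguration` property from the PolySpatial package; treat the names as assumptions and confirm them in the package's scripting reference.

```csharp
// Runtime sketch for a Hybrid app: switch to Metal rendering by assigning
// a VolumeCameraWindowConfiguration whose Mode is set to Metal.
// Property names are assumptions based on the PolySpatial package.
using UnityEngine;
using Unity.PolySpatial;

public class HybridModeSwitcher : MonoBehaviour
{
    // Assign in the Inspector: a configuration asset with Mode = Metal.
    [SerializeField] VolumeCameraWindowConfiguration metalConfig;

    public void SwitchToMetal()
    {
        var volumeCamera = FindAnyObjectByType<VolumeCamera>();
        if (volumeCamera != null)
            volumeCamera.WindowConfiguration = metalConfig;
    }
}
```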
Refer to PolySpatial Hybrid Apps docs for more information.