Starting a new visionOS project from scratch
This page describes how to start a project from scratch using one or more of the available modes.
Requirements
Before starting, ensure you meet the Hardware and Software Requirements.
Windowed App
- Open the Build Settings window (menu: File > Build Settings).
- Select the visionOS platform.
- If necessary, click Switch Platform to change to the visionOS platform.
- Add and select any Scenes you want to include in the build. (For example, SampleScene.)
- Click the Build button.
By default, Unity builds that target visionOS run in windowed mode. If you install XR or PolySpatial support (by following steps 1-8 of Fully Immersive Virtual Reality below), you must manually configure your App Mode in order to build and deploy a 2D windowed application:
- Open Project Settings.
- Change the app mode under XR Plug-in Management > Apple visionOS > App Mode to Windowed - 2D Window.
Windowed apps use Unity's own render pipelines, such as the Built-in Render Pipeline or the Universal Render Pipeline. See Windowed Apps for details.
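If you automate builds, the menu steps above can also be driven from an editor script. The following is a minimal sketch, assuming a Unity version with visionOS support where the build target is exposed as BuildTarget.VisionOS; the menu path, scene path, and output folder are placeholders for your project.

```csharp
// Editor/BuildVisionOS.cs
// Sketch: switch to the visionOS platform and build the listed scenes,
// mirroring the Build Settings steps above.
using UnityEditor;
using UnityEditor.Build.Reporting;

public static class BuildVisionOS
{
    [MenuItem("Build/Build visionOS (Windowed)")]
    public static void Build()
    {
        // Equivalent to clicking Switch Platform in the Build Settings window.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.VisionOS, BuildTarget.VisionOS);

        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/SampleScene.unity" }, // scenes to include in the build
            locationPathName = "Builds/visionOS",                 // output folder for the generated Xcode project
            target = BuildTarget.VisionOS,
        };

        BuildReport report = BuildPipeline.BuildPlayer(options);
        UnityEngine.Debug.Log($"visionOS build finished: {report.summary.result}");
    }
}
```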
Fully Immersive Virtual Reality
- Open the Project Settings window (menu: Edit > Project Settings).
- Select the XR Plug-in Management section.
- If necessary, click the button to Install XR Plug-in Management.
- Select the tab for the visionOS target build platform.
- Enable the Apple visionOS Plug-in Provider.
- Select the Apple visionOS settings section under XR Plug-in Management.
- Set the App Mode to Virtual Reality - Fully Immersive Space.
- Set the Target SDK to run on the device or the simulator:
- Open the Project Settings window (menu: Edit > Project Settings) and select the Player section.
- Under Other Settings > Configuration, set Target SDK to Device SDK to run on the Apple Vision Pro device, or to Simulator SDK to run on the simulator.
- Open the Build Settings window (menu: File > Build Settings).
- Select the visionOS platform.
- If necessary, click Switch Platform to change to the visionOS platform.
- Add and select any Scenes you want to include in the build. (For example, SampleScene.)
- Click the Build button.
Your app will render a fully immersive space, and you should see the Unity skybox (or your app's content) running in the Apple Vision Pro simulator.
Refer to the Fully Immersive VR documentation for more information.
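If you prefer to set the Target SDK from a script (for example, in a CI build pipeline), a sketch like the following may work. The property and enum names (PlayerSettings.VisionOS.sdkVersion, VisionOSSdkVersion.Device, VisionOSSdkVersion.Simulator) are assumptions modeled on the iOS equivalents; verify them against the scripting reference for your Editor version.

```csharp
// Editor/SetVisionOSTargetSdk.cs
// Sketch: set the Target SDK (Player > Other Settings > Configuration) from code.
using UnityEditor;

public static class SetVisionOSTargetSdk
{
    [MenuItem("Build/visionOS/Use Simulator SDK")]
    public static void UseSimulator()
    {
        // Assumption: Simulator maps to "Simulator SDK" in the Player settings UI.
        PlayerSettings.VisionOS.sdkVersion = VisionOSSdkVersion.Simulator;
    }

    [MenuItem("Build/visionOS/Use Device SDK")]
    public static void UseDevice()
    {
        // Assumption: Device maps to "Device SDK" in the Player settings UI.
        PlayerSettings.VisionOS.sdkVersion = VisionOSSdkVersion.Device;
    }
}
```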
Mixed Reality and Shared Space
For bounded apps, your app can exist alongside other apps in the shared space. For unbounded apps, your app will be the only content visible.
Follow steps 1-8 from the Fully Immersive Virtual Reality section above, this time setting the App Mode to Mixed Reality - Volume or Immersive Space.
This should automatically install the required packages: com.unity.polyspatial, com.unity.polyspatial.visionos, and com.unity.polyspatial.xr.
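If the packages are not installed automatically, you can add them through the Package Manager window, or with a small editor script like the sketch below, which uses the UnityEditor.PackageManager Client API (the menu path is a placeholder).

```csharp
// Editor/AddPolySpatialPackages.cs
// Sketch: install the PolySpatial packages via the Package Manager API.
using UnityEditor;
using UnityEditor.PackageManager;

public static class AddPolySpatialPackages
{
    [MenuItem("Tools/Add PolySpatial Packages")]
    public static void Add()
    {
        // AddAndRemove queues a single request that installs all listed packages.
        Client.AddAndRemove(new[]
        {
            "com.unity.polyspatial",
            "com.unity.polyspatial.visionos",
            "com.unity.polyspatial.xr",
        });
    }
}
```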
Create a Volume Camera in your scene.
- From the GameObject > XR > Setup menu or the XR Building Blocks overlay, click Volume Camera.
- Add a VolumeCameraWindowConfiguration asset to your project with Create > PolySpatial > Volume Camera Window Configuration. You must store this asset in one of your project's Resources folders. (Refer to Special Folders for more information about Resources folders.)
- Assign the volume camera window configuration to the Volume Window Configuration of the volume camera.
Configure the volume camera for bounded or unbounded mode and adjust the dimensions (if bounded); a scripting sketch follows this list.
- Dimensions adjust the rendering scale of your content.
- For bounded apps, make sure something is visible within the dimensions of the volume camera.
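As a rough illustration of the same setup in code, the sketch below adds a volume camera to a GameObject and assigns a configuration loaded from a Resources folder. The type and member names (Unity.PolySpatial.VolumeCamera, VolumeCameraWindowConfiguration, WindowConfiguration, Dimensions) are assumptions based on the inspector labels, and "BoundedConfig" is a hypothetical asset name; check the PolySpatial scripting reference for your package version.

```csharp
// Sketch: create and configure a volume camera at runtime instead of via the menus.
using UnityEngine;
using Unity.PolySpatial;

public class VolumeCameraSetup : MonoBehaviour
{
    void Awake()
    {
        var volumeCamera = gameObject.AddComponent<VolumeCamera>();

        // The configuration asset must live in a Resources folder so it can be
        // found at runtime. "BoundedConfig" is a placeholder asset name.
        var config = Resources.Load<VolumeCameraWindowConfiguration>("BoundedConfig");
        volumeCamera.WindowConfiguration = config;

        // For bounded volumes, Dimensions controls how much of the scene is
        // mapped into the visible volume (the rendering scale of your content).
        volumeCamera.Dimensions = new Vector3(1f, 1f, 1f);
    }
}
```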
Open the Build Settings window (menu: File > Build Settings).
- Select the visionOS platform.
- If necessary, click Switch Platform to change to the visionOS platform.
- Add and select any Scenes you want to include in the build. (For example, SampleScene.)
- Click the Build button.
Unbounded apps
For unbounded apps that use ARKit features, add the com.unity.xr.arfoundation package to your project. To use skeletal hand tracking data, add the com.unity.xr.hands package to your project. Refer to XR packages for more information about Unity's XR packages.
Note
The Apple Vision Pro simulator does not provide any ARKit data, so planes, meshes, tracked hands, etc. do not work in the simulator.
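For unbounded apps, a quick way to confirm whether ARKit data is actually flowing at runtime is to watch the AR session state from ARFoundation (the com.unity.xr.arfoundation package mentioned above). This is a minimal sketch; attach it to any GameObject in an AR-enabled scene.

```csharp
// Sketch: log AR session state changes so you can tell whether ARKit data is
// available (it will not be in the visionOS simulator).
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSessionStateLogger : MonoBehaviour
{
    void OnEnable()
    {
        ARSession.stateChanged += OnStateChanged;
        Debug.Log($"Initial AR session state: {ARSession.state}");
    }

    void OnDisable()
    {
        ARSession.stateChanged -= OnStateChanged;
    }

    static void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        // In the simulator, expect the session to never reach a tracking state,
        // since no ARKit data (planes, meshes, hands) is provided.
        Debug.Log($"AR session state changed: {args.state}");
    }
}
```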
Refer to the PolySpatial MR Apps documentation for more information.