Starting a new visionOS project from scratch
Make sure to switch the build platform to visionOS (experimental)
Fully Immersive Virtual Reality
Make sure you have the com.unity.xr.visionos package installed
- Select Edit > Project Settings…
- Open the XR Plug-in Management menu
- Check the visionOS checkbox
- Select File > Build Settings…
- Add Scenes (SampleScene)
- Select Build.
Your app will render in a fully immersive space, and you should see the Unity skybox (or your content) running in the Apple Vision Pro simulator.
Refer to Fully Immersive VR docs for more information
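If the build launches but nothing renders, it can help to confirm at runtime that the visionOS XR loader actually initialized. The sketch below is one way to do that, assuming XR Plug-in Management is installed; the script name and log messages are illustrative, not part of any package.

```csharp
// Minimal sketch: log which XR loader is active at startup.
// Assumes XR Plug-in Management (com.unity.xr.management) is in the project.
using UnityEngine;
using UnityEngine.XR.Management;

public class XRLoaderCheck : MonoBehaviour
{
    void Start()
    {
        var manager = XRGeneralSettings.Instance != null
            ? XRGeneralSettings.Instance.Manager
            : null;

        if (manager == null || manager.activeLoader == null)
        {
            Debug.LogWarning("No active XR loader. Check that visionOS is enabled in XR Plug-in Management.");
            return;
        }

        // With the visionOS loader enabled, this should name that loader
        // when running on device or in the simulator.
        Debug.Log($"Active XR loader: {manager.activeLoader.name}");
    }
}
```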
Mixed Reality and Shared Space
Make sure you have the com.unity.polyspatial, com.unity.polyspatial.visionos, and com.unity.polyspatial.xr packages installed
- Create a Volume Camera in your scene
  1. Open the scene tooling / XR Building Blocks menu and click Volume Camera, or
  2. Create an empty GameObject and add a Volume Camera component (a scripting sketch follows this list)
- Configure the volume camera for bounded or unbounded mode and adjust the dimensions
  1. Dimensions adjust the rendering scale of your content
  2. For bounded apps, make sure something is visible within the dimensions of the volume camera
- Open Project Settings > PolySpatial
- Check the Enable PolySpatial Runtime box
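The same setup can be done from script. This is only a sketch: it assumes PolySpatial's VolumeCamera component lives in the Unity.PolySpatial namespace and exposes a Dimensions property, as recent package versions do; check the component in the Inspector if the names differ in your version.

```csharp
// Minimal sketch: create the volume camera from script instead of the
// XR Building Blocks menu (names assumed from recent PolySpatial versions).
using UnityEngine;
using Unity.PolySpatial;

public class VolumeCameraSetup : MonoBehaviour
{
    void Awake()
    {
        var go = new GameObject("Volume Camera");
        var volumeCamera = go.AddComponent<VolumeCamera>();

        // Dimensions control how much of the scene is mapped into the volume;
        // for bounded apps, keep your content inside this box.
        volumeCamera.Dimensions = new Vector3(1f, 1f, 1f);

        // Bounded vs. unbounded mode is chosen on the Volume Camera Window
        // Configuration asset assigned to the component in the Inspector.
    }
}
```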
Unbounded apps
For unbounded apps that want to use ARKit features, you will need to enable visionOS in the XR Plug-in Management settings and make sure the AR Foundation package is in your project. For ARKit hand tracking, make sure the XR Hands package is also in your project.
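As an example of consuming that ARKit data, the sketch below reacts to detected planes through AR Foundation. It uses the AR Foundation 5.x planesChanged event (newer versions rename it to trackablesChanged) and assumes your scene already contains an ARSession and an ARPlaneManager; the serialized field and log text are illustrative.

```csharp
// Minimal sketch: log ARKit planes surfaced through AR Foundation in an unbounded app.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    // Assumed setup: an ARPlaneManager somewhere in the scene, assigned here.
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Plane added: {plane.trackableId}, alignment {plane.alignment}");
    }
}
```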
- Select File > Build Settings…
- Add Scenes (SampleScene)
- Select Build.
Bounded apps can exist alongside other apps in the shared space; unbounded apps will be the only content visible.
Note: the Apple Vision Pro simulator does not provide any ARKit data, so planes, meshes, tracked hands, etc. will not work.
Refer to PolySpatial MR Apps docs for more information