Meshing sample scenes
Meshing samples demonstrate AR Foundation meshing functionality. You can open these samples in Unity from the `Assets/Scenes/Meshing` folder.
Tip
AR Foundation samples are supported on platforms that support the demonstrated feature. To check whether a feature is supported on your chosen platform, refer to the Platform support table and the documentation for that feature.
To understand each of the meshing sample scenes, refer to the following sections:
| Sample | Description |
| :--- | :--- |
| Normal meshes | Renders an overlay on top of the scanned real-world geometry, illustrating the surface normals. |
| Classification meshes (ARKit) | Demonstrates mesh classification functionality. |
| Occlusion meshes | Demonstrates how to use meshes of real-world geometry to occlude virtual content. |
Requirements
The meshing sample scenes rely on device-specific hardware to construct meshes from scanned real-world surfaces, so they will not work on all devices.
ARKit requirements
On ARKit, this functionality requires iOS 13.4 or newer or iPadOS 13.4 or newer, and is supported only on devices with a LiDAR scanner.
Note
To check whether a device has a LiDAR scanner, refer to Apple's Tech Specs.
Normal meshes scene
The Normal Meshes scene renders an overlay on top of the scanned real-world geometry, illustrating the surface normals.
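A scene like this depends on the generated meshes containing per-vertex normals. The following is a minimal sketch, assuming an `ARMeshManager` component and a placeholder mesh prefab; the component and field names here are illustrative, not the sample's actual script.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: configure an ARMeshManager so that generated meshes include
// per-vertex normals, which the mesh prefab's material can then visualize.
public class NormalMeshSetup : MonoBehaviour
{
    // Placeholder: a prefab with a MeshFilter and a MeshRenderer whose
    // material colors fragments by surface normal.
    [SerializeField] MeshFilter meshPrefab;

    void Start()
    {
        // ARMeshManager must live on a child GameObject of the XR Origin.
        var meshManager = GetComponent<ARMeshManager>();
        meshManager.meshPrefab = meshPrefab;
        meshManager.normals = true;   // request vertex normals in generated meshes
        meshManager.density = 1.0f;   // highest mesh density
    }
}
```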
Classification meshes scene (ARKit)
The Classification Meshes scene demonstrates mesh classification functionality.
With mesh classification enabled, ARKit identifies each triangle in the mesh surface as one of several surface types, such as floor, wall, or table. This sample scene creates a submesh for each classification type and renders each type with a different color.
This scene is available on ARKit only, and requires iOS 13.4 or newer or iPadOS 13.4 or newer, plus a device with a LiDAR scanner.
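The per-triangle classification values themselves come from platform-specific ARKit data, but the submesh-splitting step can be sketched with the standard Unity `Mesh` API. This is an illustrative helper, not the sample's actual implementation; the array layout is an assumption stated in the comments.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: split a mesh into one submesh per classification type.
// Assumption: `classifications[i]` classifies triangle i, whose vertex
// indices are triangles[3i..3i+2]; values are 0..classificationCount-1.
public static class MeshClassificationSplitter
{
    public static void ApplyClassifiedSubmeshes(
        Mesh mesh, int[] triangles, int[] classifications, int classificationCount)
    {
        // Bucket triangle vertex indices by classification type.
        var buckets = new List<int>[classificationCount];
        for (int c = 0; c < classificationCount; c++)
            buckets[c] = new List<int>();

        for (int i = 0; i < classifications.Length; i++)
        {
            var bucket = buckets[classifications[i]];
            bucket.Add(triangles[3 * i]);
            bucket.Add(triangles[3 * i + 1]);
            bucket.Add(triangles[3 * i + 2]);
        }

        // One submesh per classification type. Assigning a differently
        // colored material per submesh on the MeshRenderer then renders
        // each surface type in its own color.
        mesh.subMeshCount = classificationCount;
        for (int c = 0; c < classificationCount; c++)
            mesh.SetTriangles(buckets[c], c);
    }
}
```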
Occlusion meshes scene
The Occlusion Meshes scene demonstrates how to use meshes of real-world geometry to occlude virtual content.
At first, this scene might appear to do nothing. However, it renders a depth texture on top of the scene based on the real-world geometry, which allows the real world to occlude virtual content. The scene includes a script that fires a red ball into the scene when you tap. To observe occlusion working, fire some red balls into a space, then move the device's camera behind a real-world object: the virtual red balls are occluded by the real-world object.
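The tap-to-fire behavior described above can be sketched as a small Unity script. This is a minimal illustration using standard Unity APIs, not the sample's actual script; the class name and launch speed are placeholders. Note that the occlusion itself comes from the depth rendering on the scanned meshes, not from this script.

```csharp
using UnityEngine;

// Sketch: on each tap, spawn a red ball at the camera and launch it
// along the camera's view direction.
public class BallLauncher : MonoBehaviour // illustrative name
{
    [SerializeField] float speed = 5f; // placeholder launch speed

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            var cam = Camera.main;

            // Create a small red sphere at the camera's position.
            var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            ball.transform.localScale = Vector3.one * 0.1f;
            ball.transform.position = cam.transform.position;
            ball.GetComponent<Renderer>().material.color = Color.red;

            // Give it physics and launch it forward.
            var body = ball.AddComponent<Rigidbody>();
            body.velocity = cam.transform.forward * speed;
        }
    }
}
```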
Apple and ARKit are trademarks of Apple Inc., registered in the U.S. and other countries and regions.