
    About XR SDK for Windows MR

    This package provides an XR SDK implementation of Windows Mixed Reality support for Unity.

    Supported XR SDK Subsystems

    Please see the XR SDK documentation for information on all subsystems implemented here.

    Session

    This subsystem implementation provides for initialization and management of the global Windows Mixed Reality state. It also handles pause/resume state changes.

    This subsystem must initialize successfully before any of the other subsystems defined below are initialized.

    Display

    Implementation of the Display subsystem to allow rendering of the VR view in an HMD or on a HoloLens device.

    Input

    Implementation of the Input subsystem to allow tracking of device position/orientation, controller data, etc. Information the provider supplies to the subsystem is surfaced through the Tracked Pose Driver.
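
    For reference, the sketch below reads a tracked pose manually through Unity's generic XR input API (UnityEngine.XR), which this provider feeds. In most projects the Tracked Pose Driver component does this for you, so treat this as illustrative only:

        using UnityEngine;
        using UnityEngine.XR;

        // Illustrative only: polls the right-hand controller pose each frame
        // and applies it to this GameObject's transform.
        public class ControllerPoseReader : MonoBehaviour
        {
            void Update()
            {
                InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
                if (device.isValid
                    && device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position)
                    && device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
                {
                    transform.SetPositionAndRotation(position, rotation);
                }
            }
        }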

    Experience

    This subsystem implementation provides access to information about the current user experience, including the following:

    • Boundary Points - The boundary, as defined by the user, that describes the safe play area.
    • Experience Type - What type of reference frame the user is currently running within. The return values are:
      • Local - Seated, device relative coordinates.
      • Bounded - Standing experience limited to a bounded play area. Coordinates are relative to the "floor".
      • Unbounded - Standing experience with no bounded play area limits. Coordinates are relative to the "floor".
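
    As an illustration of reading boundary data, the sketch below uses the cross-platform XRInputSubsystem.TryGetBoundaryPoints API rather than a Windows Mixed Reality-specific type, since this page does not spell out the Experience subsystem's C# surface; treat it as an assumption-laden example:

        using System.Collections.Generic;
        using UnityEngine;
        using UnityEngine.XR;

        // Illustrative only: logs the user-defined boundary points, if any.
        public class BoundaryReader : MonoBehaviour
        {
            void Start()
            {
                var inputSubsystems = new List<XRInputSubsystem>();
                SubsystemManager.GetInstances(inputSubsystems);

                foreach (var subsystem in inputSubsystems)
                {
                    var points = new List<Vector3>();
                    if (subsystem.TryGetBoundaryPoints(points))
                        Debug.Log($"Boundary has {points.Count} points.");
                }
            }
        }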

    Gesture

    Subsystem implementation to provide recognition and tracking of gestures reported by the appropriate device. This subsystem relies on the com.unity.xr.interactionsubsystem package for its core implementation (see that package's documentation for further details and types).

    The WindowsMRGestureSubsystem component manages a low-level interface for polling for Windows Mixed Reality gesture changes. If this component is added to a scene, it is possible to poll for any gesture events each frame. The following gestures and gesture data are provided:

    • WindowsMRHoldGestureEvent - Event that fires when the state of a hold gesture changes. A hold gesture indicates that an air tap or button has been held down by the user.
      • id - Unique GestureId that identifies this gesture.
      • state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled or Discrete).
    • WindowsMRTappedGestureEvent - Event that fires when the state of a tapped gesture changes. Tapped gestures indicate an air tap or button has been tapped by the user.
      • id - Unique GestureId that identifies this gesture.
      • state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled or Discrete).
      • tappedCount - Number of times that the tap has occurred.
    • WindowsMRManipulationGestureEvent - Manipulation gestures can be used to move, resize or rotate an object when it should react 1:1 to the user's hand movements. One use for such 1:1 movements is to let the user draw or paint in the world. The initial targeting for a manipulation gesture should be done by gaze or pointing. Once the tap and hold starts, any manipulation of the object is then handled by hand movements, freeing the user to look around while they manipulate.
      • id - Unique GestureId that identifies this gesture.
      • state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled or Discrete).
      • cumulativeDelta - Vector3 indicating the total distance moved since the beginning of the manipulation gesture.
    • WindowsMRNavigationGestureEvent - Navigation gestures operate like a virtual joystick, and can be used to navigate UI widgets, such as radial menus. You tap and hold to start the gesture and then move your hand within a normalized 3D cube, centered around the initial press. You can move your hand along the X, Y or Z axis from a value of -1 to 1, with 0 being the starting point.
      • id - Unique GestureId that identifies this gesture.
      • state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled or Discrete).
      • normalizedOffset - Vector3 indicating the normalized offset, since the navigation gesture began, of the input within the unit cube for the navigation gesture.

    Additionally, the WindowsMRGestures component provides a simpler, event-based interface for listening to gestures, handling the polling for you. This component exposes a number of events that can be hooked into to detect gestures when they occur (see the sketch after this list):

    • onHoldChanged - Occurs whenever a hold gesture changes state.
    • onManipulationChanged - Occurs whenever a manipulation gesture changes state.
    • onNavigationChanged - Occurs whenever a navigation gesture changes state.
    • onTappedChanged - Occurs whenever a tapped gesture changes state.
    • onActivate - Occurs whenever the cross-platform activate gesture occurs. See the com.unity.xr.interactionsubsystems package documentation for more details.
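
    A minimal sketch of subscribing to two of these events follows. The event and type names come from this page; the namespace and exact handler signatures are assumptions, so check the package API reference:

        using UnityEngine;
        using UnityEngine.XR.WindowsMR;   // assumed namespace for the gesture types

        public class GestureListener : MonoBehaviour
        {
            WindowsMRGestures gestures;

            void OnEnable()
            {
                // Assumes a WindowsMRGestures component exists somewhere in the scene.
                gestures = FindObjectOfType<WindowsMRGestures>();
                if (gestures == null)
                    return;

                gestures.onTappedChanged += OnTapped;
                gestures.onHoldChanged += OnHold;
            }

            void OnDisable()
            {
                if (gestures == null)
                    return;

                gestures.onTappedChanged -= OnTapped;
                gestures.onHoldChanged -= OnHold;
            }

            void OnTapped(WindowsMRTappedGestureEvent evt)
            {
                Debug.Log($"Tap {evt.id}: state {evt.state}, count {evt.tappedCount}");
            }

            void OnHold(WindowsMRHoldGestureEvent evt)
            {
                Debug.Log($"Hold {evt.id}: state {evt.state}");
            }
        }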

    Also see the relevant Microsoft documentation about Gestures for supported device information.

    Reference Point

    Subsystem implementation provides support for ephemeral (non-stored) reference points, known as Anchors in official Windows Mixed Reality documentation.

    Once the subsystem has successfully initialized and started, the user can add reference points, remove reference points, and query for all known reference points. The current subsystem definition does not provide for storage or retrieval of persisted reference points.
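
    As a sketch of adding a reference point, the snippet below follows the cross-platform XR SDK reference point interface; the type, namespace, and TryAddReferencePoint signature are assumptions rather than anything confirmed by this page, so verify them against the package API reference:

        using UnityEngine;
        using UnityEngine.XR.ARSubsystems;   // assumed home of XRReferencePointSubsystem

        public static class ReferencePointExample
        {
            // 'subsystem' would typically come from the active XR loader or from
            // SubsystemManager. The signature used here is an assumption.
            public static void AddPointAt(XRReferencePointSubsystem subsystem, Pose pose)
            {
                if (subsystem != null && subsystem.running
                    && subsystem.TryAddReferencePoint(pose, out XRReferencePoint point))
                {
                    Debug.Log($"Added reference point {point.trackableId}");
                }
            }
        }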

    See the relevant Microsoft documentation about Anchors for supported device information.

    Meshing

    Subsystem implementation provides access to the meshing constructs that the HoloLens hardware produces. This subsystem only works on devices that actually support meshing (HoloLens) and should either be null or in a non-running state on other devices.
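
    A minimal sketch of polling the mesh subsystem for mesh information, using the generic UnityEngine.XR.XRMeshSubsystem API (how you obtain the subsystem instance may differ in your project):

        using System.Collections.Generic;
        using UnityEngine;
        using UnityEngine.XR;

        public class MeshInfoLogger : MonoBehaviour
        {
            XRMeshSubsystem meshSubsystem;

            void Start()
            {
                // Grab the first available mesh subsystem instance, if any.
                var subsystems = new List<XRMeshSubsystem>();
                SubsystemManager.GetInstances(subsystems);
                if (subsystems.Count > 0)
                    meshSubsystem = subsystems[0];
            }

            void Update()
            {
                if (meshSubsystem == null || !meshSubsystem.running)
                    return;

                var infos = new List<MeshInfo>();
                if (meshSubsystem.TryGetMeshInfos(infos))
                {
                    foreach (var info in infos)
                        Debug.Log($"Mesh {info.MeshId}: {info.ChangeState}");
                }
            }
        }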

    See the relevant Microsoft documentation about Spatial Mapping for supported device information, as well as what to expect from this subsystem's data.

    Additional support outside of XR SDK

    Windows Mixed Reality supports a number of features that are not provided for in XR SDK. These are made available through the following extensions:

    Meshing Subsystem Extensions

    The Meshing subsystem provides only one means of setting a bounding volume for spatial mapping: SpatialBoundingVolumeBox. This API sets a bounding volume as an Axis Aligned Bounding Box at a given position with specific extents. Windows Mixed Reality additionally provides for setting a bounding volume as an Oriented Bounding Box, a Sphere, or a Frustum.

    • SetBoundingVolumeOrientedBox - Similar to SpatialBoundingVolumeBox but also allows for setting a given orientation to the volume.
    • SetBoundingVolumeSphere - Set a bounding volume to a sphere at some origin point and with the given radius.
    • SetBoundingVolumeFrustum - Set the bounding volume to the frustum defined by the 6 planes passed in. Each plane is defined as a point offset from the head, with a given orientation. The easiest way to set this is to use the GeometryUtility.CalculateFrustumPlanes Unity API and use that to populate the data for this call. The plane ordering passed in matches the plane ordering from this API.
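
    A sketch of using the frustum variant together with GeometryUtility.CalculateFrustumPlanes is shown below; the SetBoundingVolumeFrustum parameter list and namespace shown here are assumptions, so verify them against the package API reference:

        using UnityEngine;
        using UnityEngine.XR;
        using UnityEngine.XR.WindowsMR;   // assumed namespace for the meshing extensions

        public static class MeshBoundsHelper
        {
            // Restrict spatial mapping to the given camera's view frustum.
            public static void BoundToCameraFrustum(XRMeshSubsystem meshSubsystem, Camera camera)
            {
                // CalculateFrustumPlanes returns six planes whose ordering matches
                // what SetBoundingVolumeFrustum expects, per the documentation above.
                Plane[] planes = GeometryUtility.CalculateFrustumPlanes(camera);
                meshSubsystem.SetBoundingVolumeFrustum(planes);
            }
        }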

    Reference Points Subsystem Extensions

    XR SDK Management support

    While not required in order to use the Windows Mixed Reality XR SDK provider, integration with XR SDK Management provides a simpler and easier way of using this (and other) providers within Unity. This package provides the following XR SDK Management support:

    • Runtime Settings - Provides for setting runtime settings to be used by the provider instance. These settings are per-supported platform.
    • Build Settings - Provides for setting build settings to be used by the Unity build system. These settings are platform specific and are used to enable boot time settings as well as copy appropriate data to the build target.
    • Lifecycle management - This package provides a default XR SDK Loader instance that can be used either directly or with the XR Management global manager. It provides for automated (or manual) lifetime management for all the subsystems that are implemented by this provider.
    • Integration with Unity Settings UI - Custom editors and placement in the Unity Unified Settings UI, within the top-level XR settings area.

    Build Settings

    • Use Primary Window - Toggle to set the provider instance to immediately initialize XR SDK using the primary UWP window spawned by Unity. Enabled by default. WSA/UWP only.

    Runtime Settings

    • Shared Depth Buffer - Enable or disable support for using a shared depth buffer. This lets Unity and the Windows Mixed Reality system use a common depth buffer, which allows the system to provide better stabilization and integrated overlay support. Disabled by default.

    • Depth Buffer Format - Switch to determine the bit depth of the depth buffer when sharing is enabled. Possible depth formats are 16-bit and 24-bit.

    XR SDK Management Loader

    The default loader provided by the Windows Mixed Reality XR SDK implementation is set up to use all the subsystems provided by this implementation. The only required subsystem is Session, which means that failure to initialize Session will cause the loader to fail initialization and fall through to the next expected loader.

    Even if Session initializes successfully, starting the subsystem can still fail. If starting fails, the loader will clear all the subsystems and the app will fall back to the standard Unity non-VR view.

    All other subsystems depend on Session but, unlike Session, their failure to initialize or start will not cause the whole provider to fail.
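
    A minimal sketch of driving this lifecycle manually with the XR SDK Management API (com.unity.xr.management) follows; automatic management performs the same work for you, so this is illustrative only:

        using System.Collections;
        using UnityEngine;
        using UnityEngine.XR.Management;

        public class ManualXRStartup : MonoBehaviour
        {
            IEnumerator Start()
            {
                var manager = XRGeneralSettings.Instance.Manager;

                // Tries each configured loader in order; the Windows MR loader fails
                // here if its Session subsystem cannot be initialized.
                yield return manager.InitializeLoader();

                if (manager.activeLoader == null)
                {
                    Debug.Log("No XR loader initialized; continuing as a non-VR app.");
                    yield break;
                }

                // Start the subsystems created by the active loader.
                manager.StartSubsystems();
            }

            void OnDisable()
            {
                var manager = XRGeneralSettings.Instance != null ? XRGeneralSettings.Instance.Manager : null;
                if (manager != null && manager.activeLoader != null)
                {
                    manager.StopSubsystems();
                    manager.DeinitializeLoader();
                }
            }
        }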
