About XR SDK for Windows MR
Notice: Microsoft has transitioned support of Windows MR devices to OpenXR in Unity 2021, and recommends using Unity's OpenXR plugin. As such, this Windows XR plugin is marked as deprecated and will be removed in the 2021.2 release. It will continue to be supported in the 2020 LTS.
This package provides an XR SDK implementation of Windows Mixed Reality support for Unity.
Supported rendering modes
Windows Mixed Reality supports only one rendering mode for XR SDK: Single Pass Instancing (SPI). This is a limitation of the current underlying Windows API. Supporting any other rendering mode would require generating intermediate textures and blitting them to the current SPI buffers, introducing very expensive intermediate copy operations. For this reason, there is no way to change the rendering mode for the XR SDK Windows MR plug-in.
Viewport scaling limitations
Unity's XR system provides the ability to change the render viewport scale factor between 0.0 (exclusive) and 1.0 (inclusive). This is done through the XRSettings.renderViewportScale API. Windows Mixed Reality supports viewport scaling, with the following limitations:
- Windows Mixed Reality - Immersive: lower limit of 0.58
- Windows Mixed Reality - HoloLens: lower limit of 0.5
- Windows Mixed Reality - HoloLens 2: lower limit of 0.44
The XRSettings.renderViewportScale API won't stop you from setting lower values, but internally the implementation and the OS clamp the viewport scale to the lower bounds outlined above.
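For example, a script can drive the scale at run time (a minimal sketch; the 0.44 shown in the comment is the HoloLens 2 lower bound from the list above):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class ViewportScaler : MonoBehaviour
{
    // Requested scale; values below the device's lower bound are clamped internally.
    [Range(0.1f, 1.0f)]
    public float scale = 1.0f;

    void Update()
    {
        // On HoloLens 2, anything below 0.44 is raised to 0.44 by the implementation/OS.
        XRSettings.renderViewportScale = scale;
    }
}
```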
For more info see https://docs.microsoft.com/en-us/windows/mixed-reality/develop/platform-capabilities-and-apis/rendering#supported-resolutions-on-hololens-2.
Supported XR SDK Subsystems
Please see the XR SDK documentation for information on all subsystems implemented here.
Session
Subsystem implementation that provides initialization and management of the global Windows Mixed Reality state, as well as handling of pause/resume state changes.
This subsystem must initialize successfully before any of the other subsystems described below can be initialized.
Session also provides two properties, Tracking State and Not Tracking Reason, that reflect tracking state information. These properties are driven by the SpatialLocatability state of the SpatialLocator instance and map as follows:
| SpatialLocatability | Tracking State | Not Tracking Reason |
| --- | --- | --- |
| Unavailable | Limited | None |
| PositionalTrackingActivating | None | Initializing |
| OrientationOnly | Tracking | Initializing |
| PositionalTrackingInhibited | None | Relocalizing |
| PositionalTrackingActive | Tracking | None |
| Anything else | None | None |
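A minimal sketch of polling these properties each frame, assuming the session subsystem is retrieved from the active loader as an XRSessionSubsystem:

```csharp
using UnityEngine;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.Management;

public class TrackingStateLogger : MonoBehaviour
{
    XRSessionSubsystem m_Session;

    void Start()
    {
        // Assumes XR Management has already initialized a loader.
        m_Session = XRGeneralSettings.Instance?.Manager?.activeLoader?
            .GetLoadedSubsystem<XRSessionSubsystem>();
    }

    void Update()
    {
        if (m_Session == null)
            return;

        // These properties follow the SpatialLocatability mapping in the table above.
        Debug.Log($"Tracking State: {m_Session.trackingState}, " +
                  $"Not Tracking Reason: {m_Session.notTrackingReason}");
    }
}
```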
Display
Subsystem implementation of the Display subsystem to allow for rendering of the VR view in the HMD or on a HoloLens device.
Input
Subsystem implementation of the Input subsystem to allow for tracking of device position/orientation, controller data, etc. Information provided to the subsystem from the provider is surfaced through Tracked Pose Driver.
Hand tracking for HoloLens is implemented differently than it was in Unity Legacy VR. With the advent of HoloLens 2 and true hand tracking, there is little need or desire to maintain inferred hand tracking. Inferred hand tracking has been a source of issues for Unity and developers, and while it provided some benefits, the issues significantly outweighed them. For XR SDK, the decision was made to rely on the system to report which hands are being tracked and, if handedness information is missing, to simply produce a generic tracked hand device with no handedness inference.
To use the new hand model, you need to talk to the input system directly because the current Tracked Pose Driver does not natively support non-handed hands. You can do this by providing an implementation of BasePoseProvider from the Interaction package to a Tracked Pose Driver component and using that as the pose provider for unhanded situations. The Interaction package can be installed either through the Package Manager UI or through the XR Plugin Management/Input Helpers settings pane in Project Settings.
An example implementation of a hand pose provider is shown below. This example is limited to tracking one of two hands, taken from the first two devices that report hand tracking support through the input subsystem. It is specifically intended for hand-tracked devices that do not report handedness, since the current Tracked Pose Driver already recognizes handed devices.
```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Experimental.XR.Interaction;
using UnityEngine.SpatialTracking;
using UnityEngine.XR;

public class HandTrackingProvider : BasePoseProvider
{
    // Choose between the first or second hand-tracked device.
    [Range(0, 1)]
    public int controllerIndex;

    public override PoseDataFlags GetPoseFromProvider(out Pose output)
    {
        output = default(Pose);
        var flags = PoseDataFlags.NoData;

        // Gather all devices that report hand tracking.
        List<InputDevice> devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.HandTracking, devices);
        if (devices.Count < controllerIndex + 1)
            return flags;

        var device = devices.ElementAt(controllerIndex);
        if (!device.isValid)
            return flags;

        // Check for handedness:
        // bool leftHand = (device.characteristics & InputDeviceCharacteristics.Left) == InputDeviceCharacteristics.Left;
        // bool rightHand = (device.characteristics & InputDeviceCharacteristics.Right) == InputDeviceCharacteristics.Right;
        // Do something here with handedness if you need to.

        // Report position and rotation if the device can provide them.
        if (device.TryGetFeatureValue(CommonUsages.devicePosition, out output.position))
            flags |= PoseDataFlags.Position;
        if (device.TryGetFeatureValue(CommonUsages.deviceRotation, out output.rotation))
            flags |= PoseDataFlags.Rotation;
        return flags;
    }
}
```
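Once the provider is compiled, it can be assigned to a Tracked Pose Driver, for example from a script on the same GameObject (a minimal sketch; the provider can also be assigned in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

public class HandProviderBinder : MonoBehaviour
{
    void Start()
    {
        // Assumes both a TrackedPoseDriver and the HandTrackingProvider
        // above are attached to this GameObject.
        var driver = GetComponent<TrackedPoseDriver>();
        driver.poseProviderComponent = GetComponent<HandTrackingProvider>();
    }
}
```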
Gesture
Subsystem implementation that provides recognition and tracking of gestures reported by the appropriate device. This subsystem relies on the com.unity.xr.interactionsubsystems package for its core implementation (see that package's documentation for further details and types).
The WindowsMRGestureSubsystem component manages a low-level interface for polling for Windows Mixed Reality gesture changes. If this component is added to a scene, it is possible to poll for any gesture events each frame. The following gestures and gesture data are provided:
- WindowsMRHoldGestureEvent - Event that fires when the state of a hold gesture changes. A hold gesture indicates that an air tap or button has been held down by the user.
  - id - Unique GestureId that identifies this gesture.
  - state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled, or Discrete).
- WindowsMRTappedGestureEvent - Event that fires when the state of a tapped gesture changes. Tapped gestures indicate an air tap or button has been tapped by the user.
  - id - Unique GestureId that identifies this gesture.
  - state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled, or Discrete).
  - tappedCount - Number of times that the tap has occurred.
- WindowsMRManipulationGestureEvent - Manipulation gestures can be used to move, resize, or rotate an object when it should react 1:1 to the user's hand movements. One use for such 1:1 movements is to let the user draw or paint in the world. The initial targeting for a manipulation gesture should be done by gaze or pointing. Once the tap and hold starts, any manipulation of the object is then handled by hand movements, freeing the user to look around while they manipulate.
  - id - Unique GestureId that identifies this gesture.
  - state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled, or Discrete).
  - cumulativeDelta - Vector3 indicating the total distance moved since the beginning of the manipulation gesture.
- WindowsMRNavigationGestureEvent - Navigation gestures operate like a virtual joystick, and can be used to navigate UI widgets, such as radial menus. You tap and hold to start the gesture and then move your hand within a normalized 3D cube, centered around the initial press. You can move your hand along the X, Y, or Z axis from a value of -1 to 1, with 0 being the starting point.
  - id - Unique GestureId that identifies this gesture.
  - state - GestureState that indicates the state of this gesture (Started, Updated, Completed, Canceled, or Discrete).
  - normalizedOffset - Vector3 indicating the normalized offset, since the navigation gesture began, of the input within the unit cube for the navigation gesture.
Additionally, the WindowsMRGestures component provides a simpler, event-based interface for listening to gestures. This component exposes a number of events that can be hooked into to detect gesture events when they occur (see the example after this list):
- onHoldChanged - Occurs whenever a hold gesture changes state.
- onManipulationChanged - Occurs whenever a manipulation gesture changes state.
- onNavigationChanged - Occurs whenever a navigation gesture changes state.
- onTappedChanged - Occurs whenever a tapped gesture changes state.
- onActivate - Occurs whenever the cross-platform activate gesture occurs. See the com.unity.xr.interactionsubsystems package documentation for more details.
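For example (a minimal sketch, assuming a WindowsMRGestures component on the same GameObject and that each event passes the corresponding event struct described above):

```csharp
using UnityEngine;
using UnityEngine.XR.WindowsMR;

public class GestureLogger : MonoBehaviour
{
    void Start()
    {
        var gestures = GetComponent<WindowsMRGestures>();

        // Log taps and holds as their state changes.
        gestures.onTappedChanged += evt =>
            Debug.Log($"Tapped {evt.id}: state {evt.state}, count {evt.tappedCount}");
        gestures.onHoldChanged += evt =>
            Debug.Log($"Hold {evt.id}: state {evt.state}");
    }
}
```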
Also see the relevant Microsoft documentation about Gestures for supported device information.
Anchor
Subsystem implementation that provides support for anchors within the Microsoft anchor system.
Once the subsystem initializes and starts successfully, you can add anchors, remove anchors, and query for all known anchors. The current subsystem definition does not provide for storage or retrieval of persisted anchors.
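For example, adding and then removing an anchor through the generic XR SDK anchor API might look like this (a sketch; anchorSubsystem stands in for the running XRAnchorSubsystem instance):

```csharp
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

public static class AnchorExample
{
    public static void AddThenRemove(XRAnchorSubsystem anchorSubsystem)
    {
        // Try to add an anchor one meter in front of the session origin.
        var pose = new Pose(Vector3.forward, Quaternion.identity);
        if (anchorSubsystem.TryAddAnchor(pose, out XRAnchor anchor))
        {
            // ... later, remove it again by its trackable id.
            anchorSubsystem.TryRemoveAnchor(anchor.trackableId);
        }
    }
}
```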
See the relevant Microsoft documentation about Anchors for supported device information.
The subsystem provides the ability to get the actual native Windows MR object that backs an anchor. Below is an example of getting that data:
```csharp
// anchorSubsystem is the running XRAnchorSubsystem instance.
TrackableChanges<XRAnchor>? currentChanges = anchorSubsystem?.GetChanges(Allocator.Temp);
if (currentChanges.HasValue)
{
    foreach (var anchor in currentChanges.Value.added)
    {
#if ENABLE_WINMD_SUPPORT // Necessary since the Windows types are only valid through WinMD projection.
        AnchorData data = Marshal.PtrToStructure<AnchorData>(anchor.nativePtr);
        SpatialAnchor spatialAnchor = data.spatialAnchor as SpatialAnchor;
        if (spatialAnchor != null)
        {
            // Do something with the native anchor here.
        }
#endif
    }
}
```
NOTE: The data returned is only valid between calls to XRAnchorSubsystem.GetChanges!
Meshing
Subsystem implementation that provides access to the meshing constructs that the HoloLens hardware produces. This subsystem only works on devices that actually support meshing (HoloLens); on other devices it should either be null or in a non-running state.
See the relevant Microsoft documentation about Spatial Mapping for supported device information, as well as what to expect from the data this subsystem provides.
Additional support outside of XR SDK
There are a number of features that Windows Mixed Reality supports that are not provided for in XR SDK. These are provided for use through the following extensions:
Input Subsystem Extensions
Specific to this package is the ability to get the native WindowsMR SpatialInteractionSourceState that backs the input data.
- GetCurrentSourceStates - Retrieves all of the source states the input system currently knows about. These are returned as a list of System.Object instances that can be cast to the appropriate SpatialInteractionSourceState to use.
Here is an example:
```csharp
// inputSubsystem is the running WindowsMR input subsystem instance.
List<System.Object> states = new List<System.Object>();
...
inputSubsystem?.GetCurrentSourceStates(states);

#if ENABLE_WINMD_SUPPORT
foreach (var s in states)
{
    SpatialInteractionSourceState sourceState = s as SpatialInteractionSourceState;
    if (sourceState == null)
    {
        // Could not convert.
        continue;
    }

    string sourceInfo = "Source Info for state\n";
    sourceInfo += $"\tGrasped: {sourceState.IsGrasped}\n";
    sourceInfo += $"\tMenu Pressed: {sourceState.IsMenuPressed}\n";
    sourceInfo += $"\tPressed: {sourceState.IsPressed}\n";
    sourceInfo += $"\tSelect Pressed: {sourceState.IsSelectPressed}\n";
    sourceInfo += $"\tSelect Press Value: {sourceState.SelectPressedValue}\n";
    Debug.Log(sourceInfo);
}
#endif
```
Meshing Subsystem Extensions
The meshing subsystem provides only one built-in means of setting a bounding volume for spatial mapping: SpatialBoundingVolumeBox. This API sets the bounding volume as an axis-aligned bounding box at a given position with the given extents. Windows Mixed Reality additionally provides for setting the bounding volume as an oriented bounding box, a sphere, or a frustum (a sketch of the sphere variant follows this list):
- SetBoundingVolumeOrientedBox - Similar to SpatialBoundingVolumeBox but also allows for setting a given orientation to the volume.
- SetBoundingVolumeSphere - Set a bounding volume to a sphere at some origin point and with the given radius.
- SetBoundingVolumeFrustum - Set the bounding volume to the frustum defined by the 6 planes passed in. Each plane is defined as a point offset from the head, with a given orientation. The easiest way to set this is to use the GeometryUtility.CalculateFrustumPlanes Unity API and use that to populate the data for this call. The plane ordering passed in matches the plane ordering from this API.
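As an illustration, the sphere variant might be driven from a transform like this (a sketch; the parameter layout of SetBoundingVolumeSphere is assumed from the description above):

```csharp
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.WindowsMR;

public class MeshBoundsSetter : MonoBehaviour
{
    XRMeshSubsystem meshSubsystem; // set this to the running meshing subsystem
    public float radius = 5.0f;

    void Update()
    {
        // Assumption: the extension takes an origin point and a radius,
        // as the description above suggests.
        meshSubsystem?.SetBoundingVolumeSphere(transform.position, radius);
    }
}
```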
Specific to this package is the ability to get the native WindowsMR SpatialSurfaceMesh and SpatialSurfaceInfo that back the generated mesh data. In the future, meshing will support the same native pointer access that anchors have, but for now you can use this package-specific mechanism to get the same data.
WindowsMRExtensions.MeshingData - Container struct that holds the native mesh information. Contains the following data members:
- int version - The version of this struct. This lets you know which version the provider is returning, so you know what to expect.
- System.Object surfaceInfo - A pointer to the native SpatialSurfaceInfo for this mesh. Must be cast to SpatialSurfaceInfo to be useful.
- System.Object surfaceMesh - A pointer to the native SpatialSurfaceMesh for this mesh. Must be cast to SpatialSurfaceMesh to be useful.

Two extension methods on XRMeshSubsystem work with this struct:
- GetMeshingDataForMesh - Given a mesh id, populates the passed-in instance of MeshingData with the native mesh information.
- ReleaseMeshingData - Used to release the data in a MeshingData instance returned from GetMeshingDataForMesh.
Here is an example:
var meshing = m_XrManagerSettings.activeLoader.GetLoadedSubsystem<XRMeshSubsystem>();
WindowsMRExtensions.MeshingData meshData = new WindowsMRExtensions.MeshingData();
...
/// Get mesh info instance
...
meshing.GetMeshingDataForMesh(meshInfo.MeshId, out meshData);
#if ENABLE_WINMD_SUPPORT
SpatialSurfaceInfo surfaceInfo = meshData.surfaceInfo as SpatialSurfaceInfo;
SpatialSurfaceMesh surfaceMesh = meshData.surfaceMesh as SpatialSurfaceMesh;
Debug.Log($"Spatial Surface Info: ID:{surfaceInfo.Id} UpdateTime: {surfaceInfo.UpdateTime}");
Debug.Log($"Spatial Surface Mesh Info: Vertex Count:{surfaceMesh.VertexPositions.ElementCount} Normals Count:{surfaceMesh.VertexNormals.ElementCount} Indices Count: {surfaceMesh.TriangleIndices.ElementCount}");
#endif
meshing.ReleaseMeshingData(ref meshData);
Anchor Subsystem Extensions
XR Management support
While not required to use the Windows Mixed Reality XR Plugin, integration with XR Management provides a simpler way of using this (and other) providers within Unity. This package provides the following XR Management support:
- Runtime Settings - Provides for setting runtime settings to be used by the provider instance. These settings are per-supported platform.
- Build Settings - Provides for setting build settings to be used by the Unity build system. These settings are platform specific and are used to enable boot time settings as well as copy appropriate data to the build target.
- Lifecycle management - This package provides a default XR SDK Loader instance that can be used either directly or with the XR Management global manager. It provides automated (or manual) lifetime management for all the subsystems implemented by this provider (see the sketch after this list).
- Integration with Unity Settings UI - Custom editors and placement within the Unity Unified Settings UI within the top level XR settings area.
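When managing the lifecycle manually, the standard XR Management pattern applies (this sketch uses the general XRGeneralSettings API, nothing package-specific):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

public class ManualXRLifecycle : MonoBehaviour
{
    IEnumerator Start()
    {
        // Initialize the first loader that succeeds (Windows MR if configured first).
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();
        if (XRGeneralSettings.Instance.Manager.activeLoader != null)
            XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    void OnDestroy()
    {
        if (XRGeneralSettings.Instance.Manager.activeLoader != null)
        {
            XRGeneralSettings.Instance.Manager.StopSubsystems();
            XRGeneralSettings.Instance.Manager.DeinitializeLoader();
        }
    }
}
```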
Installing the Windows Mixed Reality XR Plugin
To install the Windows Mixed Reality XR Plugin, do the following:
1. Install the XR Management package from the Package Manager.
2. Follow the instructions for Installing an XR Plugin using XR Management in the End User Documentation section.
Note that the XR settings tab now has a dropdown for "Windows Mixed Reality". Navigate to the XR Plugin Management -> Windows Mixed Reality settings window in Project Settings to create a Windows MR XR Plugin-specific settings asset. This asset is editable from the Windows Mixed Reality window and can toggle settings such as shared depth buffer support.
Windows Standalone Settings
Build Settings
- Holographic Remoting - Enable or disable support for Holographic Remoting in the built application.
Runtime Settings
- Shared Depth Buffer - Enable or disable support for using a shared depth buffer. This allows Unity and the Windows Mixed Reality system to use a common depth buffer, which lets the system provide better stabilization and integrated overlay support. Disabled by default.
- Depth Buffer Format - Switch that determines the bit depth of the depth buffer when sharing is enabled. Possible depth formats are 16-bit and 24-bit.
Windows UWP Settings
Build Settings
- Use Primary Window - Toggle to make the provider instance immediately initialize XR SDK using the primary UWP window spawned by Unity. Enabled by default. WSA/UWP only.
- Holographic Remoting - Enable or disable support for Holographic Remoting in the built application.
Runtime Settings
- Shared Depth Buffer - Enable or disable support for using a shared depth buffer. This allows Unity and the Windows Mixed Reality system to use a common depth buffer, which lets the system provide better stabilization and integrated overlay support. Disabled by default.
- Depth Buffer Format - Switch that determines the bit depth of the depth buffer when sharing is enabled. Possible depth formats are 16-bit and 24-bit.
XR Management Loader
The default loader provided by the Windows Mixed Reality XR Plugin is set up to use all the subsystems provided by this implementation. The only required subsystem is Session; failure to initialize Session causes the loader to fail initialization and fall through to the next expected loader.
Even if Session initializes successfully, starting the subsystems can still fail. If starting fails, the loader clears all the subsystems and the app falls through to the standard Unity non-VR view.
All other subsystems depend on Session but, unlike Session, their failure to initialize or start will not cause the whole provider to fail.
Remoting
Remoting allows you to connect to a HoloLens 2 device running the "Holographic Remoting Player", which serves as the head-mounted display for your application.
Editor
For Play in Editor remoting support, the Emulation Window is the fastest option. It can be opened using the menu item Window -> XR -> Windows XR Plugin Remoting. A manual connection can also be established in the editor by using the method described in the Runtime section below.
With the Emulation Window open, select the Emulation Mode "Remote to Device". You can use this window to set the emulation mode, device name to attach to, and the remaining settings. A connection attempt will automatically be made when entering play mode and the connection will be automatically closed when you exit play mode. The "Connection Status" will automatically update during the connection process.
- Emulation Mode - Toggle what mode of emulation you are targeting.
- Remote Machine - The IP address of the HoloLens 2 device you are connecting to.
- Enable Video - Allow video to be sent to the device.
- Enable Audio - Allow audio to be sent to the device.
- Max Bitrate - The bitrate at which to send data to the HoloLens 2 device.
- Connection Status - The current state of the connection to the specified remote machine.
Runtime
Runtime/Application remoting is possible with x64 Desktop and UWP applications. For Runtime/Application support, you must use the WindowsMRRemoting scripting API to establish a remoting connection prior to starting the Windows Mixed Reality XR Plug-in. For an example of how to use the WindowsMRRemoting scripting API, please see the "RemotingConnect" script in the Remoting sample found in the package. The example script shows you how to establish a connection and start the XR Plug-in as well as stop the XR Plug-in and close the connection. The sample can be accessed through the Package Manager Window by navigating to the Windows Mixed Reality package.
The remoting connection must be established prior to starting the Windows Mixed Reality XR Plug-in. The "Initialize XR on Startup" toggle must be disabled in the XR Management Settings in order to accomplish this. Enabling remoting in the Windows Mixed Reality Settings pane will automatically switch this setting off for you.
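In outline, the sample's approach looks like this (a simplified sketch of the WindowsMRRemoting API combined with manual XR Management startup; the IP address is a placeholder):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;
using UnityEngine.XR.WindowsMR;

public class RemotingStarter : MonoBehaviour
{
    public string remoteMachineAddress = "192.168.0.2"; // placeholder device IP

    IEnumerator Start()
    {
        // Configure and open the remoting connection first...
        WindowsMRRemoting.remoteMachineName = remoteMachineAddress;
        WindowsMRRemoting.isAudioEnabled = true;
        WindowsMRRemoting.isVideoEnabled = true;
        WindowsMRRemoting.Connect();

        // ...wait for the connection (a real app should add a timeout)...
        while (!WindowsMRRemoting.isConnected)
            yield return null;

        // ...then initialize and start the XR Plug-in.
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();
        if (XRGeneralSettings.Instance.Manager.activeLoader != null)
            XRGeneralSettings.Instance.Manager.StartSubsystems();
    }
}
```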
If you are using the runtime/scripting API in a built application to remote to a device, additional settings are required. To enable remoting in built applications, you must enable "Holographic Remoting" in the Windows Mixed Reality Build Settings and disable Use Primary Window (see the UWP Build Settings above). When remoting is enabled in the UI, Use Primary Window is automatically disabled, and it remains disabled as long as remoting is enabled.
For applications targeting the UWP platform, you will also need to enable the networking capabilities that allow network connections to and from the device. Capabilities can be set from Project Settings -> Player -> Publishing Settings -> Capabilities. Make sure to enable the relevant networking capabilities (typically InternetClient, InternetClientServer, and PrivateNetworkClientServer).