Namespace Mechatronics.SensorSDK
Classes
ACESToneMapping
The role of this class is to apply the Unity implementation of ACES tone mapping to convert an HDR image to an LDR image, which includes color correction and gamma correction.
AsyncRequestStatus
Represents the status of an asynchronous request for a GPU resource.
AzimuthCorrection
Corrects the azimuth offset in an image using a pixel shift.
AzimuthCorrectionNode
BeamLight
Provides a description for the light source used when sampling the simulation.
BlueFilter
The role of this class is to keep only the blue channel information from the incoming render texture.
BrownConradyLens
Lens that implements post-processing distortion following the Brown-Conrady model.
ButtonBinding
Trigger binding that exposes a pressable button in the Inspector.
CameraBuilder
This class is responsible for setting up a Unity camera corresponding to a sensor description. This is how the scene is sampled for a sensor array.
CameraController
The role of this controller is to send triggers to a photosensor array at the rate defined by the FPS input port.
CameraSensingComponent
This class implements the sensing part of the photosensor.
This implementation relies on a Unity camera to measure depth and intensity. Thus, all incoming rays have to converge to a single point, the camera's effective pinhole. Consequences:
- Origin of individual beams (e.g. for multi-beam lidars) will be ignored and assumed to be at the camera pinhole.
- Photosensor translation cannot be interpolated; if the lidar is moving in the scene, all samples taken during a Unity update will use the final lidar position, introducing translational error.
- Lidars usually have constant angular difference between consecutive beams, while cameras have constant translational difference between consecutive pixels. Thus, some approximation must be made, leading to slight depth inaccuracies compared to a ray tracing implementation.
For the last point, use the Max Approx Error field to configure the maximum approximation angular error, that is, the maximum angle between the desired lidar beam and the closest camera ray going through a pixel center. This is used to compute the texture size used by the camera.
The code also uses a few assumptions:
- Beams in beamPoses are roughly oriented along the positive Z axis.
- Photosensor angular speed is constant, which avoids having to look at all the timestamps.
- Looking at the photosensor pose at the previous frame and at the current frame is enough to compute a bounding box.
- Photosensor turns less than 120 degrees between consecutive updates, otherwise some lidar points may be missing.
Warning: this will not work for lidars that use beam patterns (e.g. MEMS) instead of physically moving the photosensor, since that violates the first assumption.
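The texture-sizing constraint implied by the Max Approx Error field can be sketched numerically. The following is an illustrative approximation only; the function name and exact formula are assumptions, not the SDK's actual computation:

```python
import math

def min_texture_size(fov_deg: float, max_error_deg: float) -> int:
    """Estimate the minimum texture width (in pixels) of a pinhole camera
    with horizontal field of view fov_deg such that every beam direction
    inside the FOV is within max_error_deg of the ray through the nearest
    pixel center. Illustrative sketch, not the SDK's formula.

    Adjacent pixel-center rays are most widely separated at the image
    center, where their angular spacing is about atan(2*tan(fov/2) / n).
    Requiring half of that spacing to stay below the allowed error gives
    n >= 2*tan(fov/2) / tan(2*max_error).
    """
    fov = math.radians(fov_deg)
    err = math.radians(max_error_deg)
    return math.ceil(2.0 * math.tan(fov / 2.0) / math.tan(2.0 * err))
```

Tightening the allowed error grows the texture roughly linearly per axis (so quadratically in pixel count), which is why this field trades depth accuracy against rendering cost.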
CameraTrigger
A structure that indicates the photosensor array must render the simulation on the next Update. Empty for now.
CatPattern
The role of this class is to describe a sweep pattern for a lidar following a cat pattern shape.
CirclePattern
The role of this class is to describe a sweep pattern for a lidar following a circular pattern shape.
CircularStructuredBuffer<T>
Represents a StructuredBuffer<T>, along with the index of the next entry to be written to.
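The wrap-around write behavior can be illustrated with a minimal CPU-side analogue. This is a plain Python sketch for intuition only; the real class wraps a GPU StructuredBuffer&lt;T&gt;:

```python
class CircularBuffer:
    """Minimal CPU-side sketch of a fixed-capacity buffer plus a
    'next write' cursor that wraps, analogous in spirit to
    CircularStructuredBuffer<T> (illustrative, not the SDK class)."""

    def __init__(self, capacity: int):
        self.data = [None] * capacity
        self.next_index = 0  # index of the next entry to be written

    def push(self, value):
        # Write at the cursor, then advance it modulo the capacity.
        self.data[self.next_index] = value
        self.next_index = (self.next_index + 1) % len(self.data)
```

Pushing more items than the capacity silently overwrites the oldest entries, which matches the typical use of such a buffer as a rolling window of the most recent GPU samples.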
CommandBufferExtensions
Provides extension methods to CommandBuffer.
ComputeShaderInstance
This class encapsulates what composes a compute shader program.
CorrectPointCloudMotion
Represents a transcoder taking an input point cloud distorted by motion, and generating the corrected output point cloud.
CorrectPointCloudMotionNode
Exposes as a node in SystemGraph the process of correcting a point cloud for motion distortion.
CubemapSampler
Implements a way to sample the scene to a cube map.
CustomPassCallback
This class is used to hook rendering commands to a camera.
CustomPassDescriptor
Provides a way to define a custom pass. Useful to build one or list them.
CustomPassVolumeBuilder
This class is responsible for setting up a Unity custom pass volume.
CustomRendererSampler
This sampler provides a way to render the scene by bypassing the normal camera rendering steps.
DCMotor
The role of this class is to emulate an axial DC Motor rotating on the Y axis (Up vector).
DefaultSampler
Implements a way to sample the scene using the Unity standard camera.
DemosaickingBoxFilter
A box filter demosaicking algorithm is a digital image process used to reconstruct a full-color image from the incomplete color samples output by a PhotosensorArray. Reference: https://en.wikipedia.org/wiki/Demosaicing
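As a rough illustration of the box-filter idea (not the SDK's shader, and the Bayer handling here is a simplification): each output channel at a pixel is the average of the same-channel samples inside a 3x3 window of the mosaic.

```python
def demosaic_box(raw, pattern):
    """Box-filter demosaicking sketch. For each pixel and each color
    channel, average all samples of that channel inside a 3x3 window
    of the single-channel mosaic 'raw'. Illustrative only.

    raw:     2D list of floats (one color sample per pixel)
    pattern: 2x2 list of channel indices, e.g. RGGB = [[0, 1], [1, 2]]
    Returns a 2D list of (r, g, b) tuples.
    """
    h, w = len(raw), len(raw[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            sums, counts = [0.0, 0.0, 0.0], [0, 0, 0]
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        c = pattern[ny % 2][nx % 2]  # channel of that mosaic cell
                        sums[c] += raw[ny][nx]
                        counts[c] += 1
            row.append(tuple(s / n if n else 0.0
                             for s, n in zip(sums, counts)))
        out.append(row)
    return out
```

At image borders the window is clipped, so edge pixels average over fewer samples; for an RGGB tiling every clipped window still contains all three channels.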
ExchangeBuffer<T>
Represents a synchronized collection in a CPU list and a GPU ComputeBuffer.
FishEyeLens
Lens that implements the equidistant fisheye mapping model.
FrameBundler
A node responsible for collecting incoming photosensor samples corresponding to a full lidar sweep and packaging them into a sensor frame.
FrustumLaserConfig
Defines the position and orientation of each beam.
GameObjectExtensions
Provides extension methods for class GameObject.
GreenFilter
The role of this class is to keep only the green channel information from the incoming render texture.
GridLaserConfig
Defines the position and orientation of each beam.
IntensityBufferToTexture
Provides a way to take an intensity buffer stored in a compute buffer and copy it to a 2D texture.
IntensityBufferToTextureNode
Exposes the process of copying an intensity buffer into a render texture in SystemGraph.
LaserConfigAsset
Description of the laser distribution.
LaserConfigAssetImporter
Imports .laserconfig files containing laser distribution.
LaserConfigMonoBehaviour
A base class used to implement ILaserConfig as a MonoBehaviour, to allow changing options in the Inspector.
LensComponent
Provides a common interface for all lenses that can be bound to a photosensor array. This is used to delegate and allow the definition of new lens models for the photosensor array.
LensComponentBinding
Declares LensComponent as a type to be used as a binding.
LensSystemDesc
Used to describe a lens system.
LerpTrajectoryGenerator
Provides the default trajectory generator for the Trajectory class.
LidarController
A LidarController is a device that controls an electric motor and a photosensor to fire laser beams that measure distance and/or intensity.
ListExtensions
Provides extension methods to List.
LuminanceFilter
The role of this class is to compute the luminance from the RGB channels and store it in the output RenderTexture.
LutMapping
Represents a node that transforms an input texture by converting each pixel via a lookup table.
MEMS
Represents a micro-electromechanical system (MEMS) that moves its bound game object according to a bound scan pattern.
MakeLaserConfigAsset
Adds the option to create a laser config in the asset creation menu.
MathUtils
Provides methods for miscellaneous mathematical operations.
MosaickingFilter
The role of this class is to keep only the color channel information that matches the provided mosaic pattern.
ParallelStereo
The role of this class is to extract PointCloud data from two incoming RenderTextures taken from parallel cameras.
ParallelStereoNode
Provides a node to expose the ParallelStereo feature in the SystemGraph.
PathtraceSampler
Implements a way to sample the scene using the Unity path tracing camera. This is not real-time sampling, so one output frame is produced over several frames of simulation.
PathtracedPhotosensors
Serves to execute the sampling of all path-traced photosensors.
PathtracedSensingComponent
This class implements the sensing part of the photosensor.
Limitations: the current implementation of the photosensor uses the DX12 and DXR APIs through Unity interfaces to measure the scene. For this reason, SensorSDK requires hardware that supports DXR. Otherwise, the user will see no output coming from the photosensor.
Photosensor
Requirement: this node uses DXR, so an NVIDIA GTX 1060 card or above is required.
A Photosensor is an electronic component that detects the presence of visible light, infrared (IR), and/or ultraviolet (UV) energy. This node is responsible for taking measurements from the simulation as a laser paired with a photosensor would do. In the photosensor, there is an implicit laser firing a beam originating from the photosensor. If no beam light node is specified in the graph, the photosensor will treat the laser as white light and take only the depth; no intensity will be measured. See the beam light node for more information on how to define the laser light beam.
There is a heuristic inside this node's implementation that assumes it rotates on itself. The sampling shader interpolates the rotation of the photosensor to accommodate its use in a lidar-type sensor.
PhotosensorArray
The role of this class is to emulate the aggregation of multiple photosensors grouped into a 2D array. This is also called a CCD or CMOS sensor. Used to capture light on each photosensor one by one.
Here are some interesting references to better understand the implementation: the path tracing code in HDRP: https://blog.selfshadow.com/publications/s2016-shading-course/unity/s2016_pbs_unity_hdri_notes.pdf
The method of tracing rays with a thick lens model is referenced here: https://www.cs.utexas.edu/~fussell/courses/cs395t/lens.pdf
PhotosensorArray.PhotosensorArrayDescription
Describes an array of photosensors on a plane, just like what is used for cameras.
PhotosensorArrayNode
Exposes the photosensor array implementation, in deferred rendering mode, as a node in SystemGraph.
PhotosensorController
A PhotosensorController is a device that controls a non-rotating photosensor to throw laser beams to measure the distance and/or intensity.
PhotosensorData
Used to collect all information required for the sampling on the GPU.
PhotosensorEncoder
The PhotosensorEncoder is a specialized transcoder used when doing data processing on the output of the photosensor. Most photosensor encoders will pack or format the samples measured by the photosensor in a way that corresponds to what a device API/firmware would do.
Every time it receives an encodedPeriod signal, which is the time elapsed since the last encoding, it reads the output buffer of the photosensor and fills its output buffer according to the format selected by the developer.
PhotosensorOutputBuffer
Represents the raw output of a photosensor, as a circular structured buffer of PhotosensorRawData.
PhotosensorOutputBufferProperty
Provides capability to add the PhotosensorOutputBuffer type as a property in a SystemGraph.
PhotosensorSensingComponent
Provides a common interface for all photosensor sensing components. This is used to delegate and allow the definition of new sensing components for the photosensor node.
PhotosensorSensingComponentBinding
Declares PhotosensorSensingComponent as a bindable type.
PhotosensorToInstanceSegmentationNode
Maps object IDs to a specific IdLabel configuration. Instance segmentation will assign a new ID per game object instance.
PhotosensorToInstanceSegmentationV2
Maps object IDs to label IDs from an IdLabelConfig configuration. Instance segmentation assigns a distinct label ID per game object instance.
PhotosensorToIntensityBuffer
The role of this class is to interpret the data incoming from the photosensor into an intensity buffer. The internal shader extracts the intensity information and formats it into a compute buffer according to the frame information.
PhotosensorToIntensityBufferNode
Exposes the conversion of photosensor sampling data to an intensity buffer as a node in SystemGraph.
PhotosensorToIntensityBufferV2
The role of this class is to interpret the data incoming from the photosensor into an intensity buffer. The internal shader extracts the intensity information and formats it into a compute buffer according to the frame information.
PhotosensorToIntensityNode
Exposes the conversion of photosensor sampling data to an intensity 2D texture as a node in SystemGraph.
PhotosensorToIntensityV2
Exposes the conversion of photosensor sampling data to an intensity 2D texture as a node in SystemGraph.
PhotosensorToPointCloud
The role of this class is to interpret the data incoming from the photosensor into a PointCloud data format to be consumed by the point cloud viewer node.
PhotosensorToPointCloudNode
Exposes the conversion of photosensor sampling data to a point cloud buffer as a node in SystemGraph.
PhotosensorToPointCloudV2
The role of this class is to interpret the data incoming from the photosensor into a PointCloud data format to be consumed by the point cloud viewer node.
PhotosensorToRangeNode
Exposes the conversion of photosensor sampling data to a range image as a node in SystemGraph.
PhotosensorToRangeV2
Exposes the conversion of photosensor sampling data to a range image as a node in SystemGraph.
PhotosensorToSegmentation
Exposes the conversion of photosensor sampling data to a segmentation 2D texture as a node in SystemGraph.
PhotosensorToSegmentationV2
Exposes the conversion of photosensor sampling data to a segmentation 2D texture as a node in SystemGraph.
PhotosensorToSemanticSegmentationNode
Maps object IDs to a specific SemanticSegmentationLabel configuration. Semantic segmentation maps the objects to semantic class groupings; multiple objects may be mapped to the same class.
PhotosensorToSemanticSegmentationV2
Maps object IDs to colors of semantic groups from a SemanticSegmentationLabelConfig configuration. Semantic segmentation assigns the same color to all instances of the same semantic group.
PhotosensorToTexture
The role of this class is to interpret a component of the data incoming from the photosensor into a texture. The internal shader extracts the information and formats it into a 2D texture according to the frame information.
PhotosensorToTextureV2
The role of this class is to interpret a component of the data incoming from the photosensor into a texture. The internal shader extracts the information and formats it into a 2D texture according to the frame information.
PhotosensorTranscoder
The PhotosensorTranscoder is a specialized transcoder used when doing data processing on the output of the photosensor. Most photosensor transcoders will pack or format the samples measured by the photosensor in a way that corresponds to what a device API/firmware would do.
Every time it receives an inTranscode signal, it reads the output buffer of the photosensor and fills its output buffer according to the format selected by the developer.
PhotosensorV2
Requirement: this node uses DXR, so an NVIDIA GTX 1060 card or above is required.
A Photosensor is an electronic component that detects the presence of visible light, infrared (IR), and/or ultraviolet (UV) energy. This node is responsible for taking measurements from the simulation as a laser paired with a photosensor would do. In the photosensor, there is an implicit laser firing a beam originating from the photosensor. If no beam light node is specified in the graph, the photosensor will treat the laser as white light and take only the depth; no intensity will be measured. See the beam light node for more information on how to define the laser light beam.
There is a heuristic inside this node's implementation that assumes it rotates on itself. The sampling shader interpolates the rotation of the photosensor to accommodate its use in a lidar-type sensor.
PointCloud
Represents a point cloud as a GPU 1d buffer of point structures, with (x, y, z) coordinates and intensity.
PointCloudProperty
Provides capability to add the PointCloud type as a property in a SystemGraph.
PointCloudToDepth
The role of this class is to project the PointCloud data inside a frustum described by the input port onto a 2D texture.
PointCloudToDepthNode
Exposes the process of converting a point cloud into a depth buffer as a node in SystemGraph.
PointCloudToFile
Provides shared functionality for saving a point cloud to a file.
PointCloudToFileNode
Exposes functionality to save a point cloud into a file in SystemGraph.
PointCloudViewer
The role of this class is to instantiate a VFX on the main camera that will display the PointCloud data received on its input port.
PoseExtensions
Provides extension methods for the Pose struct.
RandomUtils
This class provides different random distribution functions.
RedFilter
The role of this class is to keep only the red channel information from the incoming render texture.
RegisterLensModelAttribute
This attribute provides a way to register different lens models.
RenderTextureToFile
The role of this class is to take an incoming render texture and save it into an image file.
RenderTextureToFileNode
Expose functionality to save a render texture into a file in SystemGraph.
RenderTextureTranscoder
The role of this class is to share everything common to all transcoder that take a RenderTexture as input to produce a RenderTexture as output.
One must understand when the gamma correction should be applied.The image processing pipeline works with linear color by default.
The only place where the gamma correction is automatically applied is in the tone mapping node. If one output a render texture, he must
consider to create an output RenderTexture with a sRGB support. (see color space parameter on PrepareOutputBuffer())
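For reference, the "gamma correction" in question is the standard linear-to-sRGB transfer function, applied either by the tone mapping node or by the GPU when the output RenderTexture has sRGB support. A Python sketch of that function, for illustration only:

```python
def linear_to_srgb(c: float) -> float:
    """Standard linear-to-sRGB transfer function (IEC 61966-2-1) for a
    single channel value in [0, 1]. Sketch for illustration; in the SDK
    this conversion happens on the GPU, not in user code."""
    if c <= 0.0031308:
        return 12.92 * c               # linear segment near black
    return 1.055 * c ** (1.0 / 2.4) - 0.055  # power-law segment
```

Applying this twice (e.g. tone mapping into an sRGB texture and then converting again) visibly washes out the image, which is why the pipeline applies it in exactly one place.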
RenderTextureViewer
The role of this class is to instantiate a raw image viewer to display on screen the RenderTexture received on its input port.
SamplingRequest
Provides the necessary information to generate all the rays that were fired between the previous Unity Update and the current Unity Update.
ScanPatterns
The role of this class is to provide common tools used by all scan patterns.
SceneSampler
This class provides a common interface and functionality for all types of samplers.
SegmentationAsBuffer
The role of this class is to take an incoming render texture, extract the index from the color channels, and output an index buffer accessible on the CPU side.
SegmentationAsBufferNode
Provides SegmentationAsBuffer features as a node.
SegmentationBuffer
Defines a segmentation buffer.
SegmentationLens
Implements a lens component to extract segmentation data from the simulation.
SensorUtils
Provides some useful ad hoc functions.
ShotNoise
Provides custom noise on the lidar intensity.
ShotNoiseNode
Exposes the application of the noise process on the intensity as a node in SystemGraph.
SingleForwardLaserConfig
Used as the default laser configuration when no configuration is set. By default, there is only a single laser looking forward (Z+) with 0 seconds of delay before firing.
SphericalLaserConfig
Defines the position and orientation of each beam.
StructuredBuffer<T>
Provides a user-friendly and safe interface over ComputeBuffer.
TextureExtensions
Provides extension methods for class Texture.
ThinLens
Implements a lens component that reproduces a thin lens model. This camera model is commonly used in game development.
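The underlying relation in a thin lens model is the classic lens equation 1/f = 1/s_o + 1/s_i. A sketch of solving it for the image distance (the function is illustrative, not part of the SDK API):

```python
def thin_lens_image_distance(focal_length: float, object_distance: float) -> float:
    """Solve the thin lens equation 1/f = 1/s_o + 1/s_i for the image
    distance s_i, given the focal length f and object distance s_o
    (both in the same units, e.g. meters). Illustrative sketch only.
    Objects at or inside the focal length have no real image."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)
```

Objects nearer than where the sensor plane's conjugate distance lands focus behind the sensor and appear blurred, which is how this model produces depth of field when rays are sampled across the lens aperture.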
Trajectory
Represents a high-resolution trajectory, that is, a series of poses that occur in between two frames.
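A trajectory generator of this kind typically interpolates between the two poses that bracket a query time. A position-only sketch (illustrative; a full pose generator such as LerpTrajectoryGenerator must also interpolate rotation, which needs a quaternion slerp rather than a lerp):

```python
def lerp_pose_position(p0, p1, t0, t1, t):
    """Linearly interpolate a position between two trajectory samples:
    p0 at time t0 and p1 at time t1, for a query time t in [t0, t1].
    Positions are (x, y, z) tuples. Illustrative sketch only."""
    alpha = (t - t0) / (t1 - t0)  # normalized position of t between the samples
    return tuple(a + alpha * (b - a) for a, b in zip(p0, p1))
```

Evaluating poses at sub-frame times like this is what lets motion-correction nodes undo the distortion introduced when lidar samples are taken while the sensor moves.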
Transcoder
Provides a way to share everything common to all transcoders, mainly the compute shader management.
TransformExtensions
Provides extension methods for the Transform class.
VelodynePhotosensorToPointCloud
The role of this class is to interpret the data incoming from the photosensor into a PointCloud data format to be consumed by the point cloud viewer node, and also into a PointCloud in the Velodyne output format.
VelodynePhotosensorToPointCloudNode
Expose the conversion of photosensor sampling data to Velodyne point cloud buffer as a node in SystemGraph.
VelodynePhotosensorToPointCloudV2
The role of this class is to interpret the data incoming from the photosensor into a PointCloud data format to be consumed by the point cloud viewer node, and also into a PointCloud in the Velodyne output format.
Visualization
Class that encapsulates read-only values about the sensor visualization layer.
WhiteBalance
This node applies the white balancing in linear color space.
Structs
BeamIndices
Identifies a specific lidar beam in a series of lidar frames.
CameraDesc
Describes the required parameters to build a camera.
EmptyType
A struct used as a placeholder to inherit from a binding.
FishEyeLens.DistortionParameters
Distortion Parameters.
OrientedPoint
Provides a description of a point with a direction in space.
PhotosensorRawData
Stores the measurement done by the photosensor during the sampling.
PointXYZI
Represents a point in a point cloud.
ProjectedImageOnSensorRequest
Provides a graphic context and the return value of a request made by the photosensor array to the lens component.
SceneSamplerDesc
Used to describe the size and format of the sampling output.
Signal
Type used purely for change events, not meant to carry any data.
Interfaces
ILaserConfig
Interface to get the photosensor's beam configuration (i.e. position, orientation, and when it is fired).
IPhotosensorArrayDescription
Describes an array of photosensors on a plane, just like what is used for cameras.
ITrajectoryGenerator
Provides an algorithm to the Trajectory class to evaluate poses at high frequency.
Enums
FovAxis
Enumeration of possible axes to define a FOV.
PointCloudFileFormat
Enumeration of supported file formats to save a PointCloud.
PointCloudOutputMode
Enumeration of ways to save the data.
RenderTextureToFileEncoder
Enumeration of available file formats to save a render texture.
ResizeOptions
Options for functions that change the logical size (count) of a buffer.
SphericalLaserConfig.TimeBetweenFiring
Possible ways to define the timing between laser firings.
Delegates
CustomPassCallback.ExecuteCustomPassAction<T1>
Declares a delegate for a callback function to execute the photosensor array image processing pipeline.
IPhotosensorArrayDescription.ExposureTimeChangedEventHandler
Type of event handler used when exposure time changes.
IPhotosensorArrayDescription.GainChangedEventHandler
Type of event handler used when gain changes.
PathtracedPhotosensors.ExecuteSamplingAction
Declares a delegate type for photosensor sampling.