docs.unity3d.com

    Namespace Mechatronics.SensorSDK

    Classes

    ACESToneMapping

    The role of this class is to apply the Unity implementation of ACES tone mapping to convert an HDR image to an LDR image, which includes color correction and gamma correction.

    BeamLight

    Provides ports to describe a laser light used inside a sensor device.

    BilateralDenoising

    The role of this class is to apply a simple bilateral blur to the incoming texture to reduce the noise in a path-traced image.

    BlueFilter

    The role of this class is to keep only the blue channel information from the incoming render texture.

    BrownConradyLens

    Lens that implements post-processing distortion following the Brown-Conrady model.
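
    As an illustrative sketch (hypothetical helper names, not the SDK's shader code), the Brown-Conrady model applies radial (k1, k2) and tangential (p1, p2) distortion to normalized image coordinates:

```python
def brown_conrady_distort(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to
    normalized image coordinates (x, y), per the Brown-Conrady model."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d
```

    With all coefficients at zero, the mapping is the identity; increasing k1 produces the familiar barrel/pincushion effect.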

    BusDriver

    Base class for a bus driver using a port filter.

    CameraBuilder

    This class is responsible for setting up a Unity camera corresponding to a sensor description. This is how the scene is sampled for a sensor array.

    CameraSensingComponent

    This class implements the sensing part of the photosensor.

    This implementation relies on a Unity camera to measure depth and intensity. Thus, all incoming rays have to converge to a single point, the camera's effective pinhole. Consequences:

    • Origin of individual beams (e.g. for multi-beam lidars) will be ignored and assumed to be at the camera pinhole.
    • Photosensor translation cannot be interpolated; if the lidar is moving in the scene and multiple sampling requests are resolved in the same Unity update, they will all use the final lidar position, introducing translational error.
    • Lidars usually have constant angular difference between consecutive beams, while cameras have constant translational difference between consecutive pixels. Thus, some approximation must be made, leading to slight depth inaccuracies compared to a ray tracing implementation.

    For the last point, use the Max Approx Error field to configure the maximum approximation angular error, that is, the maximum angle between the desired lidar beam and the closest camera ray going through a pixel center. This is used to compute the texture size used by the camera.
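
    For intuition, a rough bound on the texture width follows from requiring the angular pixel pitch (largest at the image center, since camera pixels are uniformly spaced in tangent space) to be at most twice the maximum approximation error. The helper below is an illustration under that assumption, not the exact formula the SDK uses:

```python
import math

def min_texture_width(fov_deg: float, max_err_deg: float) -> int:
    """Rough lower bound on camera texture width so that any beam inside
    the FOV lies within max_err_deg of a pixel-center ray.

    Pixels are uniformly spaced in tan(angle); the angular pitch is
    largest at the image center, so the error is bounded there.
    """
    half_fov = math.radians(fov_deg) / 2.0
    max_err = math.radians(max_err_deg)
    # Angular pitch at center ~= 2*tan(half_fov)/N; require pitch <= 2*max_err.
    return math.ceil(math.tan(half_fov) / max_err)
```

    Halving the allowed error roughly doubles the required texture width, which is why a tight Max Approx Error increases rendering cost.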

    The code also uses a few assumptions:

    • Beams in samplingOffsets are roughly oriented along the positive Z axis.
    • Photosensor angular speed is constant, which avoids having to look at all the stamps.
    • Photosensor has no roll rotation, otherwise bounding boxes will be incorrect.
    • Photosensor turns less than 120 degrees between consecutive updates, otherwise some lidar points may be missing.

    Warning: this will not work for lidars using beam patterns (e.g. MEMS) instead of physically moving the photosensor, since that violates the first assumption.

    CatPattern

    The role of this class is to describe a sweep pattern for a lidar following a cat-pattern shape.

    CirclePattern

    The role of this class is to describe a sweep pattern for a lidar following a circular pattern shape.

    CommBusDriver

    Driver specification for handling CommMessage.

    CorrectPointCloudMotion

    Represents a transcoder taking an input point cloud distorted by motion, and generating the corrected output point cloud.

    CorrectPointCloudMotionNode

    Exposes as a node in SystemGraph the process of correcting a point cloud for motion distortion.

    CustomPassCallback

    This class is used to hook rendering commands to a camera.

    CustomPassDescriptor

    Provides a way to define a custom pass. Useful to build one or to list them.

    CustomPassVolumeBuilder

    This class is responsible for setting up a Unity custom pass volume.

    CustomRendererSampler

    This sampler provides a way to render the scene by bypassing the normal steps of camera rendering.

    DCMotor

    The role of this class is to emulate an axial DC Motor rotating on the Y axis (Up vector).

    DefaultSampler

    Implements a way to sample the scene using the Unity standard camera.

    DemoCameraController

    DemosaickingBoxFilter

    A box filter demosaicking algorithm is a digital image process used to reconstruct a full-color image from the incomplete color samples output by a PhotosensorArray. Reference: https://en.wikipedia.org/wiki/Demosaicing
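
    A minimal, illustrative sketch of box-filter demosaicking (hypothetical helper names, not SensorSDK code): each output channel is the average of that channel's samples within the 3x3 neighborhood of the pixel, according to the mosaic pattern.

```python
def demosaic_box(mosaic, pattern):
    """Box-filter demosaicking: for each pixel, each color channel is the
    average of the samples of that color in the 3x3 neighborhood.
    `mosaic` is a 2D list of intensities; `pattern` maps (row % 2, col % 2)
    to a channel index 0/1/2 (R/G/B)."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[None] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = [0.0, 0.0, 0.0]
            cnt = [0, 0, 0]
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        c = pattern[(ni % 2, nj % 2)]
                        acc[c] += mosaic[ni][nj]
                        cnt[c] += 1
            out[i][j] = [acc[c] / cnt[c] if cnt[c] else 0.0 for c in range(3)]
    return out

# RGGB Bayer pattern: R at (0,0), G at (0,1) and (1,0), B at (1,1)
RGGB = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 2}
```

    A production implementation would run this per pixel in a compute shader; the structure is the same.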

    DepthToPointCloud

    The role of this class is to convert a depth texture into PointCloud data that can be consumed by the point cloud viewer node.

    DepthToPointCloudNode

    Exposes the process of converting a depth buffer into a point cloud in SystemGraph.

    FrustumDistribution

    The role of this class is to describe the distribution of OrientedPoints like a camera frustum. Useful to define how light beams are positioned for a time-of-flight (ToF) camera.

    GenericRGBCamera

    The role of this controller is to create a sampling request for a photosensor array at the rate defined by the FPS input port.

    GreenFilter

    The role of this class is to keep only the green channel information from the incoming render texture.

    GridDistribution

    The role of this class is to describe the distribution of OrientedPoints like a grid pattern. Useful to define how light beams are positioned to scan a specific area.

    GridScanController

    A GridScanController is a device that scans the distance and/or intensity using range finder devices scattered in a grid shape.

    HTTPService

    IntensityBufferToTexture

    Provides a way to take an intensity buffer stored in a compute buffer and copy it into a 2D texture.

    IntensityBufferToTextureNode

    Exposes the process of copying an intensity buffer into a render texture in SystemGraph.

    IScanPattern

    The role of this class is to provide a common interface for all types of scan patterns so the user can set the desired scan pattern on an input port.

    LaserScanCamera

    A LaserScanCamera is a device that scans using a frustum distribution, just like a camera, to sample the intensity and depth.

    LensComponent

    Provides a common interface for all lenses that can be bound to a photosensor array. This is used to delegate and allow the definition of new lens models for the photosensor array.

    LensComponentBinding

    Declares LensComponent as a type to be used as a binding.

    LensSystemDesc

    Used to describe a lens system.

    LuminanceFilter

    The role of this class is to compute the luminance from the RGB channels and store it in the output RenderTexture.
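
    As a sketch, luminance is typically a weighted sum of the linear RGB channels. The Rec. 709 coefficients below are an assumption for illustration; the exact weights used by the internal shader are not documented here.

```python
def luminance(r, g, b):
    """Relative luminance from linear RGB, using Rec. 709 coefficients."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```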

    MathUtils

    Provides methods for miscellaneous mathematical operations.

    MechanicalLidarController

    A MechanicalLidarController is a device that controls an electric motor and a photosensor to fire laser beams that measure the distance and/or intensity.

    MEMSLidarController

    A MEMSLidarController is a device that controls a photosensor using micro-electro-mechanical systems to fire laser beams that measure the distance and/or intensity.

    MosaickingFilter

    The role of this class is to keep only the color channel information that match the mosaic pattern provided.

    OusterOS0Controller

    This controller emulates the internal behavior of the Ouster OS0 lidar device to measure the distance and/or intensity.

    OusterOS0StreamingController

    This controller emulates the internal behavior of the Ouster OS0 lidar device to measure the distance and/or intensity.

    ParallelStereo

    The role of this class is to extract PointCloud data from two incoming RenderTextures taken from parallel cameras.
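
    For intuition, with rectified parallel cameras depth follows from triangulation: depth = focal length × baseline / disparity. A minimal sketch (hypothetical names, not SDK code):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth for rectified parallel stereo: z = f * B / d, where f is the
    focal length in pixels, B the camera baseline in meters, and d the
    pixel disparity between the two images."""
    return focal_px * baseline_m / disparity_px
```

    The hard part in practice is computing the disparity map itself (matching pixels between the two images); the depth conversion above is the easy final step.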

    ParallelStereoNode

    Provides a node exposing the ParallelStereo feature in SystemGraph.

    PathtracedSensingComponent

    This class implements the sensing part of the photosensor.

    Limitations: the current implementation of the photosensor uses DX12 and the DXR API through Unity interfaces to measure the scene. For this reason, SensorSDK requires hardware that supports DXR. Otherwise, the user will get no output from the photosensor.

    PathtraceSampler

    Implements a way to sample the scene using the Unity path tracing camera. This is not real-time sampling, so a frame is produced over several simulation frames.

    Photosensor

    Requirement: this node uses DXR, so an NVIDIA GTX 1060 card or above is required.

    A Photosensor is an electronic component that detects the presence of visible light, infrared (IR), and/or ultraviolet (UV) energy. This node is responsible for taking measurements from the simulation as a laser paired with a photosensor would do. In the photosensor, there is an implicit laser firing a beam originating from the photosensor. If no beam light node is specified in the graph, the photosensor considers the laser as if it were white light and takes only the depth; no intensity is measured. See the beam light node for more information on how to define the laser light beam.

    There is a heuristic inside this node's implementation that assumes it rotates on itself. The sampling shader interpolates the rotation of the photosensor to accommodate its usage in a lidar-type sensor.

    PhotosensorArray

    The role of this class is to emulate the aggregation of multiple photosensors grouped into a 2D array, also called a CCD or CMOS sensor. Used to capture light on each photosensor one by one.

    Here are some interesting references to better understand the implementation. The path tracing code in HDRP: https://blog.selfshadow.com/publications/s2016-shading-course/unity/s2016_pbs_unity_hdri_notes.pdf

    The method of tracing rays with a thick lens model is described here: https://www.cs.utexas.edu/~fussell/courses/cs395t/lens.pdf

    PhotosensorArray.PhotosensorArrayDescription

    Describes an array of photosensors on a plane, just like what is used for cameras.

    PhotosensorArrayNode

    Exposes the photosensor array implementation, in deferred rendering mode, as a node in SystemGraph.

    PhotosensorData

    Used to collect all information required for the sampling on the GPU.

    PhotosensorEncoder

    The PhotosensorEncoder is a specialized transcoder used when doing data processing on the output of the photosensor. Most photosensor encoders will pack or format samples measured by the photosensor in a way that corresponds to what a device API/firmware would do.

    Every time it receives an encodedPeriod signal, which is the time elapsed since the last encoding, it reads the output buffer of the photosensor and fills its output buffer according to the format selected by the developer.

    PhotosensorSensingComponent

    Provides a common interface for all photosensor sensing components. This is used to delegate and allow the definition of new sensing components for the photosensor node.

    PhotosensorSensingComponentBinding

    Declares PhotosensorSensingComponent as a bindable type.

    PhotosensorSharedCustomPassVolume

    Serves as a hook for a photosensor to sample the scene simulation. The photosensor attaches to the main camera, and its sampling process initializes on BeforePostProcess.

    PhotosensorToDepth

    The role of this class is to interpret the data incoming from the photosensor as a depth texture. The internal shader does a simple conversion of the z-axis value between [minRange, maxRange] -> [0.0, 1.0].
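
    The conversion can be sketched as a simple normalization (illustrative only; the actual work happens in a shader):

```python
def normalize_depth(z, min_range, max_range):
    """Map a z value in [min_range, max_range] to [0.0, 1.0], clamped."""
    t = (z - min_range) / (max_range - min_range)
    return min(max(t, 0.0), 1.0)
```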

    PhotosensorToDepthNode

    Exposes the conversion of photosensor sampling data to a depth buffer as a node in SystemGraph.

    PhotosensorToIntensity

    The role of this class is to interpret the data incoming from the photosensor as an intensity texture. The internal shader extracts the intensity information and formats it into a 2D texture according to the frame information.

    PhotosensorToIntensityBuffer

    The role of this class is to interpret the data incoming from the photosensor as an intensity buffer. The internal shader extracts the intensity information and formats it into a compute buffer according to the frame information.

    PhotosensorToIntensityBufferNode

    Exposes the conversion of photosensor sampling data to an intensity buffer as a node in SystemGraph.

    PhotosensorToIntensityNode

    Exposes the conversion of photosensor sampling data to an intensity 2D texture as a node in SystemGraph.

    PhotosensorToPointCloud

    The role of this class is to interpret the data incoming from the photosensor into a PointCloud data format to be consumed by the point cloud viewer node.

    PhotosensorToPointCloudNode

    Exposes the conversion of photosensor sampling data to a point cloud buffer as a node in SystemGraph.

    PointCloudToDepth

    The role of this class is to project the PointCloud data inside a frustum described by the input port onto a 2D texture.

    PointCloudToDepthNode

    Exposes the process of converting a point cloud into a depth buffer in SystemGraph.

    PointCloudToFile

    Provides a shared way to save a point cloud to a file.

    PointCloudToFileNode

    Exposes functionality to save a point cloud into a file in SystemGraph.

    PointCloudViewer

    The role of this class is to instantiate a VFX on the main camera that will display the PointCloud data received on its input port.

    PolarDistribution

    The role of this class is to describe the distribution of OrientedPoints on a sphere surface.

    PoseExtensions

    Provides extension methods for class Transform and struct Pose.

    RandomUtils

    This class provides different random distribution functions.

    RangeFinderController

    A RangeFinder is a device that fires a single laser beam to measure the distance and/or intensity.

    RedFilter

    The role of this class is to keep only the red channel information from the incoming render texture.

    RegisterLensModelAttribute

    This attribute provides a way to register different lens models.

    RenderTextureToFile

    The role of this class is to take an incoming render texture and save it into an image file.

    RenderTextureToFileNode

    Exposes functionality to save a render texture into a file in SystemGraph.

    RenderTextureTranscoder

    The role of this class is to share everything common to all transcoders that take a RenderTexture as input and produce a RenderTexture as output.

    One must understand when gamma correction should be applied. The image processing pipeline works with linear color by default.
    The only place where gamma correction is automatically applied is in the tone mapping node. If one outputs a render texture, one should consider creating an output RenderTexture with sRGB support (see the color space parameter on PrepareOutputBuffer()).
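
    For reference, the standard linear-to-sRGB encoding that such an sRGB output texture applies per channel can be sketched as:

```python
def linear_to_srgb(c):
    """Standard linear-to-sRGB encoding for one channel value in [0, 1]:
    a linear segment near black, then a 1/2.4 power curve."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055
```

    Applying this twice (once manually and once via an sRGB texture) is the classic double-gamma mistake, which is why the pipeline applies it only in the tone mapping node.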

    RenderTextureViewer

    The role of this class is to instantiate a raw image viewer to display on screen the RenderTexture received on its input port.

    SamplingRequest

    The sampling request class describes the sampling origin and how many samples are taken. The sampling origin is the current photosensor position if isSynchronized is set. If isSynchronized is not set, the sampling origin is interpolated between the photosensor's previous and current positions. In all cases, an offset from the samplingOffsets list is added.
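
    The origin selection can be sketched as follows (hypothetical helper operating on 3-component position tuples, not SDK code):

```python
def sample_origin(prev_pos, curr_pos, t, offset, is_synchronized):
    """Sampling origin: the current position if synchronized, otherwise a
    linear interpolation between previous and current position (t in [0, 1]).
    In all cases the per-beam offset is added."""
    if is_synchronized:
        base = curr_pos
    else:
        base = tuple(p + (c - p) * t for p, c in zip(prev_pos, curr_pos))
    return tuple(b + o for b, o in zip(base, offset))
```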

    SamplingStamps

    Manages a list of sample stamps.

    ScanPatterns

    The role of this class is to provide common tools used by all scan patterns.

    SceneSampler

    This class provides a common interface and functionality for all types of samplers.

    SegmentationAsBuffer

    The role of this class is to take an incoming render texture, extract the index from the color channel, and output an index buffer accessible on the CPU side.

    SegmentationAsBufferNode

    Provides SegmentationAsBuffer features as a node.

    SegmentationBuffer

    Defines a segmentation buffer.

    SegmentationLens

    Implements a lens component to extract segmentation data from the simulation.

    SensorUtils

    Provides some useful ad hoc functions.

    ShotNoise

    Provides custom noise on the lidar intensity.

    ShotNoiseNode

    Exposes the application of the noise process on the intensity as a node in SystemGraph.

    TextureExtensions

    Provides extension methods for class Texture.

    ThinLens

    Implements a lens component that reproduces a thin lens model. This camera model is commonly used in game development.

    Transcoder

    Provides a way to share everything common to all transcoders, mainly the compute shader management.

    UDPStream

    The role of this class is to stream incoming data from a datagram socket into a RawData type to be output inside the SystemGraph. The data is copied from the internal circular buffer into the RawData when a signal event happens on the Read input port.

    VelodynePhotosensorToPointCloud

    The role of this class is to interpret the data incoming from the photosensor into a PointCloud data format to be consumed by the point cloud viewer node, and also into a PointCloud in the Velodyne output format.

    VelodynePhotosensorToPointCloudNode

    Exposes the conversion of photosensor sampling data to a Velodyne point cloud buffer as a node in SystemGraph.

    VelodynePuckController

    This controller emulates the internal behavior of the Velodyne Puck lidar device to measure the distance and/or intensity.

    Visualization

    Class that encapsulates read-only values about the sensor visualization layer.

    WhiteBalance

    This node applies white balancing in the HDR color space.

    YDLidarX4ToPhotosensorOutputBuffer

    This transcoder transforms a raw buffer of samples, stored in the raw YDLidar data format, into a buffer compatible with the SensorSDK point cloud viewer.

    YDLidarX4ToPhotosensorOutputBufferNode

    Exposes the conversion from the YDLidarX4 raw data into a PhotosensorOutputBuffer as a node in SystemGraph.

    YDLidarX4ToPointCloud

    This transcoder transforms a raw buffer of samples, stored in the raw YDLidar data format, into a buffer compatible with the SensorSDK point cloud viewer.

    YDLidarX4ToPointCloudNode

    Exposes the conversion from the YDLidarX4 raw data into a point cloud as a node in SystemGraph.

    YDLidarX4VizController

    This controller emulates the internal behavior of the YDLidar X4 device to measure the distance.

    Structs

    CameraDesc

    Describes the required parameters to build a camera.

    CommMessage

    Message type used to pass IP communication around.

    OusterCommandDescriptor

    PhotosensorRawData

    Stores the measurements taken by the photosensor during sampling.

    PointXYZI

    Represents a point in a point cloud.

    ProjectedImageOnSensorRequest

    Provides a graphic context and the return value of a request made by the photosensor array to the lens component.

    SampleStamp

    Used to store sampling request internally.

    SceneSamplerDesc

    Used to describe the size and format of the sampling output.

    Interfaces

    IPhotosensorArrayDescription

    Describes an array of photosensors on a plane, just like what is used for cameras.

    Enums

    FovAxis

    Enumeration of possible axes to define a FOV.

    OusterCommands

    PointCloudFileFormat

    Enumeration of supported file formats to save a PointCloud.

    PointCloudOutputMode

    Enumeration of ways to save the data.

    RenderTextureToFileEncoder

    Enumeration of available file formats to save a render texture.

    ScanMode

    Enumeration of scan mode on the Ouster OS0-128 lidar.

    Delegates

    BusDriver.MessageHandler

    Message handling delegate used to send callbacks for listened messages.

    CommBusDriver.CommMessageHandler

    Special handler for the comm message type.

    CustomPassCallback.ExecuteCustomPassAction<T1>

    Declares a delegate for a callback function to execute the photosensor array image processing pipeline.

    HTTPService.Listener

    Delegate used to handle HTTP requests on specific URIs. This delegate will always be called from a different thread!

    IPhotosensorArrayDescription.ExposureTimeChangedEventHandler

    Type of event handler used when exposure time changes.

    IPhotosensorArrayDescription.GainChangedEventHandler

    Type of event handler used when gain changes.

    PhotosensorData.ExecuteSamplingAction<T1, T2, T3>

    Declares a delegate type for photosensor sampling.

    Copyright © 2023 Unity Technologies
    Generated by DocFX on 18 October 2023