Class Photosensor
Requirement: This node uses DXR (DirectX Raytracing), so an NVIDIA GTX 1060 card or above is required.
A photosensor is an electronic component that detects the presence of visible light, infrared (IR), and/or ultraviolet (UV) energy. This node is responsible for taking measurements from the simulation, as a laser paired with a photosensor would. Inside the photosensor there is an implicit laser firing a beam that originates from the photosensor. If no beam light node is specified in the graph, the photosensor treats the laser as white light and records only the depth; no intensity is measured. See the beam light node for more information on how to define the laser light beam.
A heuristic inside this node's implementation assumes that the sensor rotates on itself. The sampling shader interpolates the rotation of the photosensor to accommodate its use in a lidar-style sensor.
Inherited Members
Namespace: Mechatronics.SensorSDK
Syntax
[NodeCategory("Sensor", "Photosensor", NodeTick.Asynchronous, LifeCycle.Update, 0F, NodeMode.Standard, false)]
[HelpURL("https://docs.unity3d.com/Packages/com.unity.sensorsdk@1.0/manual/DeveloperGuide/PhotosensorNode.html")]
public class Photosensor : NodeRuntime
Constructors
Photosensor()
Constructs the photosensor and initializes the relative error curve to be displayed on the node.
Declaration
public Photosensor()
Fields
maxRange
Define the maximum range of measurement. This acts as a maximum range clamp in addition to the sensitivity threshold. Every sample beyond the maximum range will have a distance of 0 meters.
Declaration
[Tooltip("Define the maximum range of measurement. This act as a maximum range clamp over the sensitivity threshold one.")]
[Field("Max range", PortDirection.Left, FieldExtra.Read)]
[SerializeField]
protected PortType<float> maxRange
Field Value
Type | Description |
---|---|
PortType<Single> |
minRange
Define the minimum range of measurement. The measurement will start at this distance from the photosensor object in the scene. Everything before this range won't be perceived by the device.
Declaration
[Tooltip("Define the minimum range of measurement. The measurement will start at this distance from the photosensor object in the scene.")]
[Field("Min range", PortDirection.Left, FieldExtra.Read)]
[SerializeField]
protected PortType<float> minRange
Field Value
Type | Description |
---|---|
PortType<Single> |
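Together, minRange and maxRange bound the interval in which measurements are valid. A minimal sketch of that gating behavior (an illustration of the documented semantics, not the SDK implementation):

```python
# Illustrative sketch: how minRange and maxRange gate a measured distance.
# Per the field descriptions, samples outside [min_range, max_range] are
# reported with a distance of 0 meters.
def clamp_to_range(distance, min_range, max_range):
    """Return the distance if it falls inside the sensor's range, else 0.0."""
    if distance < min_range or distance > max_range:
        return 0.0
    return distance
```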
outputData
A compute buffer structured as follows: Vector3 sphericalCoordinates, float intensity, uint cellIndex.
Declaration
[Tooltip("Photosensor samples.")]
[Field("RawData", PortDirection.Right, FieldExtra.Write)]
[SerializeField]
protected PortType<PhotosensorOutputBuffer> outputData
Field Value
Type | Description |
---|---|
PortType<PhotosensorOutputBuffer> |
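A sketch of reading one sample of this buffer back on the CPU, assuming the documented layout is tightly packed little-endian (3 floats + 1 float + 1 uint = 20 bytes per sample); the component names for the spherical coordinates are assumptions for illustration:

```python
import struct

# Illustrative sketch: decode one photosensor sample from the raw bytes of
# the outputData compute buffer, assuming a tightly packed layout of
# Vector3 sphericalCoordinates, float intensity, uint cellIndex.
SAMPLE_FORMAT = "<3f f I"                     # little-endian, no padding
SAMPLE_SIZE = struct.calcsize(SAMPLE_FORMAT)  # 20 bytes per sample

def parse_sample(raw, index=0):
    """Decode the sample at the given index from raw compute-buffer bytes."""
    offset = index * SAMPLE_SIZE
    x, y, z, intensity, cell_index = struct.unpack_from(
        SAMPLE_FORMAT, raw, offset)
    return {"spherical": (x, y, z),
            "intensity": intensity,
            "cellIndex": cell_index}
```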
outTranscode
This structure contains a command buffer used to execute the sampling. It is required to apply any GPU processing pipelined right after the sampling. It can also be seen as a latch signal for the next node to add its processing to the GPU command buffer.
Declaration
[Field("OutTranscode", PortDirection.Right, FieldExtra.Write)]
[SerializeField]
protected PortType<CustomPassContext> outTranscode
Field Value
Type | Description |
---|---|
PortType<CustomPassContext> |
referenceToWorldTransform
Transform used to return points acquired by the photosensor.
Declaration
[Binding("Reference frame")]
[SerializeField]
protected Binding<Transform> referenceToWorldTransform
Field Value
Type | Description |
---|---|
Binding<Transform> |
relativeDepthError
The relative depth error input is a curve that provides the relative error over distance. The lidar range [minDistance, maxDistance] is normalized on the X axis between [0, 1]. This curve is uniformly sampled by the photosensor and sent to the shader as a 1D texture of 2048 samples.
To compute the error, the shader samples the relative depth error texture mentioned above based on the distance. It takes a normally distributed random number, scales it according to the relative error, and then applies the error to the measured distance.
Declaration
[Tooltip("Apply the relative error corresponding to the curve with a normal distribution. (0 means no relative error).")]
[Field("Relative depth error", PortDirection.Left, FieldExtra.Read)]
[SerializeField]
public PortType<AnimationCurve> relativeDepthError
Field Value
Type | Description |
---|---|
PortType<AnimationCurve> |
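The error computation described above can be sketched as follows. This is a CPU-side illustration under stated assumptions (nearest-sample lookup into the baked table, noise scaled relative to the measured distance), not the shader code:

```python
import random

# Illustrative sketch of the relative depth error: the curve is baked into
# a 1D table of 2048 samples over the normalized range [0, 1]; a normally
# distributed random number is scaled by the sampled relative error and
# applied to the measured distance.
CURVE_SAMPLES = 2048

def apply_depth_error(distance, min_range, max_range, error_table, rng=random):
    # Normalize the distance into [0, 1] over the lidar range.
    t = (distance - min_range) / (max_range - min_range)
    t = min(max(t, 0.0), 1.0)
    # Nearest-sample lookup into the baked curve (the shader samples a texture).
    relative_error = error_table[int(t * (CURVE_SAMPLES - 1))]
    # Scale a normally distributed random number by the relative error
    # (relative to the measured distance), then apply it.
    noise = rng.gauss(0.0, 1.0) * relative_error * distance
    return distance + noise
```

With an all-zero curve the distance passes through unchanged, matching the tooltip's note that 0 means no relative error.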
samplingRequest
The input structure is stored in the node until the next simulation rendering, when the node executes the sampling on the simulation.
The sampling request structure can be found in photosensor.cs. You will find the list of variables inside the struct and a description of what they are used for.
Declaration
[Tooltip("List of orientations where to take a sample from.")]
[Field("SamplingRequest", PortDirection.Left, FieldExtra.Read | FieldExtra.ChangeEvent)]
[SerializeField]
protected PortType<SamplingRequest> samplingRequest
Field Value
Type | Description |
---|---|
PortType<SamplingRequest> |
sensingComponent
Defines what sampler to use for the simulation. It allows the user to specify a sampler, or create their own.
Declaration
[Binding("Sensing")]
[SerializeField]
protected Binding<PhotosensorSensingComponent> sensingComponent
Field Value
Type | Description |
---|---|
Binding<PhotosensorSensingComponent> |
sensitivity
Minimum radiant energy threshold (watts/m^2). Under this threshold, the sampling fails, which means the returned position and intensity will be 0.0. The default value for this parameter is 0, which means the sampling always succeeds if the beam bounced back to the photosensor.
Declaration
[Tooltip("Minimum radiant energy threshold (watts/m^2). Under this threshold, the sampling fails. (0 (default) means always succeed if bounced back to photo detector).")]
[Field("Sensitivity", PortDirection.Left, FieldExtra.Read)]
[SerializeField]
protected PortType<float> sensitivity
Field Value
Type | Description |
---|---|
PortType<Single> |
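A minimal sketch of the sensitivity gate described above, assuming (for illustration) that the reported intensity is the returned radiant energy itself:

```python
# Illustrative sketch: below the sensitivity threshold the sample fails and
# both position and intensity are reported as 0.0; a threshold of 0 (the
# default) accepts any energy that bounced back to the photosensor.
def gate_sample(radiant_energy, position, sensitivity):
    """Return (position, intensity) after applying the sensitivity threshold."""
    if radiant_energy < sensitivity:
        return ((0.0, 0.0, 0.0), 0.0)
    return (position, radiant_energy)
```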
wavelength
This defines the wavelength perceived by the photosensor. If several sensors in the scene point toward the same spot on an object, or toward each other, the photosensor will read the contribution made by the other sensors; interference across sensors is thereby emulated.
Declaration
[Tooltip("Wavelength perceived by the photosensor.")]
[Field("Wavelength (nm)", PortDirection.Left, FieldExtra.Read)]
[SerializeField]
protected PortType<uint> wavelength
Field Value
Type | Description |
---|---|
PortType<UInt32> |
Methods
Disable()
Unregisters the photosensor from the main camera used to sample the scene, releases graphics resources, and clears sampling requests.
Declaration
public override void Disable()
Overrides
Enable(Scheduler.ClockState)
Registers the photosensor with the main camera to sample the scene, initializes the random buffer for the depth measurement error, and sets up the sensor and the sampling request list.
Declaration
public override void Enable(Scheduler.ClockState clockState)
Parameters
Type | Name | Description |
---|---|---|
Scheduler.ClockState | clockState | The parameters of the waveform associated with this node |
Overrides
OnSamplingRequested()
Callback invoked when a controller sends a new sampling request.
Declaration
public void OnSamplingRequested()
Update()
Updates sensor parameters and resizes and/or updates the buffers required for executing the sampling on the GPU. Note: no emulation code should be executed here.
Declaration
public override void Update()