GPU Sampling: GPU sampling is the act of collecting information from the simulated scene at render time. Two nodes in the SensorSDK do exactly that: the photosensor and the photosensor array. The sampling rate is tied to the simulation rendering speed (the frame rate, or FPS) and can be controlled through the Global System Graph component.
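Because the sampling rate follows the rendering frame rate, bounding the frame rate also bounds how often the photosensor nodes are evaluated. The sketch below is a minimal illustration of that relationship using only standard Unity APIs (Application.targetFrameRate, QualitySettings.vSyncCount); it is not the Global System Graph component's own rate controls, which are configured in the editor.

```csharp
using UnityEngine;

// Illustrative sketch only: caps the simulation frame rate, which in turn bounds
// how often GPU sampling nodes (photosensor, photosensor array) can sample.
// The Global System Graph component exposes its own rate settings; this is a
// generic Unity-level example of the same idea.
public class FixedSimulationRate : MonoBehaviour
{
    [SerializeField] int targetFps = 30; // desired rendering (and thus sampling) rate

    void Awake()
    {
        QualitySettings.vSyncCount = 0;          // disable vsync so the target applies
        Application.targetFrameRate = targetFps; // request a fixed rendering rate
    }
}
```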
Sensor: A sensor is a component-based system built on top of SystemGraph that models a real-world sensor, such as a lidar or a camera. It retrieves data from a synthetic scene built in Unity, closely following the format and nature of the data generated by its real-world counterpart.
Sensor Developer: (a.k.a. System Developer) A sensor developer is an engineer who understands how sensors work and uses the SensorSDK to create a digital twin or a prototype of an existing sensor. Their work consists of:
Assembling available nodes;
Creating new nodes (see the sketch after this list);
Validating the emulated sensor to match the characteristics of the real sensor.
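To give a concrete feel for what creating a new node can involve, here is a purely hypothetical sketch of a processing step that adds Gaussian noise to range samples. The class name, properties, and Process() entry point are invented for illustration and are not the SystemGraph node API.

```csharp
using UnityEngine;

// Hypothetical sketch of a custom processing node for a sensor model.
// None of these type or member names come from the SystemGraph API; they only
// illustrate the general shape: typed inputs, typed outputs, and a per-tick step.
public class GaussianNoiseNode
{
    // Hypothetical ports: range samples in, noisy range samples out.
    public float[] InputRanges  { get; set; }
    public float[] OutputRanges { get; private set; }

    public float StandardDeviationMeters = 0.02f; // example noise characteristic

    // Hypothetical entry point, called once per simulation tick.
    public void Process()
    {
        if (InputRanges == null) { OutputRanges = new float[0]; return; }

        OutputRanges = new float[InputRanges.Length];
        for (int i = 0; i < InputRanges.Length; i++)
        {
            // Box-Muller transform: draw Gaussian noise around each sample.
            float u1 = Mathf.Max(Random.value, 1e-6f);
            float u2 = Random.value;
            float gaussian = Mathf.Sqrt(-2f * Mathf.Log(u1)) * Mathf.Cos(2f * Mathf.PI * u2);
            OutputRanges[i] = InputRanges[i] + gaussian * StandardDeviationMeters;
        }
    }
}
```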
Sensor User: (a.k.a. Application Developer) A sensor user is a system developer who wants to gather information from a scene or simulation in the same form that an existing sensor on the market would provide it. Their main roles are:
Adding existing sensors to their scene or simulation by dragging and dropping them from the SensorSDK library;
Changing the specifications of an existing sensor if needed (see the sketch below);
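Specifications are normally edited on the sensor's components in the Inspector; the following is a hypothetical script-based sketch of the same idea. The LidarSpecs component and its fields are invented to keep the example self-contained and are not part of the SensorSDK.

```csharp
using UnityEngine;

// Hypothetical example: the component and field names (LidarSpecs,
// verticalChannels, maxRangeMeters) are illustrative, not the SensorSDK API.
public class AdjustLidarSpecs : MonoBehaviour
{
    void Start()
    {
        var specs = GetComponent<LidarSpecs>(); // hypothetical specification component
        if (specs != null)
        {
            specs.verticalChannels = 64;   // e.g. switch from 32 to 64 channels
            specs.maxRangeMeters   = 120f; // extend the maximum detection range
        }
    }
}

// Hypothetical specification holder, defined here only so the sketch compiles.
public class LidarSpecs : MonoBehaviour
{
    public int verticalChannels = 32;
    public float maxRangeMeters = 100f;
}
```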