Deploy locally to run individual tests and visualizations
After creating the simulation in Unity, you can turn it into a local build: a standalone executable that runs the simulation instance. We will walk through how to create it. You can optionally use multiple GPUs for distributed rendering if the scene has complex geometry or you need multiple display outputs, but all of that computing hardware must be available locally. Running the simulation locally lets you:
- Share this simulation with other members of your team to use locally.
- Visualize live results in the simulator as they happen.
- Interact with the simulation in real time (e.g. flight or driving simulators).
The Scenario
Let’s assume we want to test the performance of a localization algorithm. One way to analyze a specific localization algorithm is to use it in a navigation task and compare the robot's ground-truth position and orientation against the estimate computed by the algorithm.
Here, a navigation task refers to positioning the robot at a specified starting pose, sending velocity commands to the robot controller to drive the robot toward the desired goal pose, and tracking the state of the navigation task, such as goal reached, stuck at an obstacle, or moving.
Executing such tasks to test the localization algorithm on a real robot would require running the full autonomous-driving stack on the robot in a designated environment that reflects the dynamics of a real-world use case. This setup has performance dependencies on both the software (the different layers of the robot stack) and the robot's hardware (sensors, wheels). We would also need an external sensor, or some other approach, to produce the ground-truth data.
Instead, we can isolate a specific algorithm to test and analyze through a robot simulation. In a Unity project, we can simulate the interaction between the robot and the environment (using the URDF, Sensor, and Perception packages). This allows testing the robustness of an algorithm under various settings of driving hyperparameters. For example, some variables specific to navigation tasks are the starting and goal positions, the locations, sizes, and shapes of obstacles in the environment, and the thresholds that define the state of the robot throughout the task.
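For illustration, such driving hyperparameters might be collected in a small configuration file like the sketch below. All field names and values here are hypothetical; your own task definitions will differ.

```json
{
  "start_pose": { "x": 0.0, "y": 0.0, "yaw": 0.0 },
  "goal_pose": { "x": 5.0, "y": 5.0, "yaw": 1.57 },
  "obstacles": [
    { "shape": "box", "size": [1.0, 1.0, 0.5], "position": [2.5, 2.0] }
  ],
  "goal_reached_threshold_m": 0.25,
  "stuck_timeout_s": 30.0
}
```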
A Docker container encapsulates the programs that would run on the robot PC. For this demonstration, let’s assume we are building the robot stack on ROS 1. The robot PC would contain 1) the robot application under test and 2) a test rig, so it could look like two separate ROS launch files running in the Docker container. The robot application comprises all ROS nodes necessary to run the robot and execute the navigation tasks. This includes, but is not limited to, the controller/driver for the robot and its embedded sensors, the path planner, and simultaneous localization and mapping (SLAM).
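Inside the container, those two halves could be started with two separate roslaunch invocations. The package and launch-file names below are hypothetical placeholders for your own stack:

```shell
# Hypothetical package/launch-file names -- substitute your own.
# Terminal 1: the robot application under test (controller, planner, SLAM).
roslaunch my_robot_app navigation.launch

# Terminal 2: the test rig (data recording and analysis nodes).
roslaunch my_test_rig test_rig.launch
```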
Note that the robot stack will look different when running in simulation with Unity versus on the physical robot with real sensors. In simulation, the robot controller (such as a differential driver for a Jackal) and sensory feedback (such as camera modules) are simulated in Unity; in real test cases, you would also have to launch drivers for the robot and sensors.
The test rig refers to the other ROS nodes or programs needed to store the data of interest and run the analysis. It is expected to be configurable and may run alongside either simulated or real tests.
Creating your local custom test is as simple as:
1) Creating a scene and robot instances in Unity (following the authoring sections provided as part of this guide).
2) Putting together a Docker container with the robot and test stacks.
3) Writing a bash script that runs the former two processes (in a loop if desired) and their communication protocol. In our case of using ROS, Unity provides the ROS-TCP-Connector package for the Unity side and ROS-TCP-Endpoint for the ROS side.
The bash script automates launching and quitting each Unity and Docker session. It can also automate test sessions with varying values of the driving parameters; for example, a JSON configuration file could store the hyperparameter values.
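As a sketch, such a script could look like the following. Everything here is illustrative: the build name, image tag, and parameter fields are hypothetical, and the commands are only echoed so the sketch can be dry-run safely; remove the echoes to launch for real.

```shell
#!/usr/bin/env bash
# Hypothetical orchestration sketch: loops over trial parameters and prints
# the Unity-build and Docker commands it would run (drop the echoes to execute).
set -euo pipefail

# One trial per line: "start_x start_y goal_x goal_y" (illustrative fields;
# in practice these could be read from a JSON configuration file instead).
TRIALS=(
  "0.0 0.0 5.0 5.0"
  "1.0 2.0 4.0 -3.0"
)

run_trial() {
  local params="$1"
  # Launch the headless Unity build, then the containerized ROS stack.
  echo "./SimBuild.x86_64 -batchmode -- $params"
  echo "docker run --rm --network host my-robot-stack:latest"
}

for t in "${TRIALS[@]}"; do
  run_trial "$t"
done
```

Reading the parameters from the array keeps the loop independent of how the values are stored; swapping in a JSON parser would only change how `TRIALS` is populated.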
Start the simulation - Local Play Mode for quick testing
Once you have the simulation authored in Unity, you can hit the 'Play' button at the top of your scene to start the simulation from within the Editor. Once it is running with the necessary robotics packages, you can start your ROS stack, as detailed below.
Launching the ROS stack
The optimal approach here is to have a containerized version of your ROS stack ready to go.
- Create a Dockerfile that sets up your own ROS application using your launch/config files (setting up your own ROS stack is beyond the scope of this guide).
- Build the container image using the docker build command, with the right tag for your needs.
- Run the container, and wait for it to start without errors.
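The two steps above might look like this on the command line; the image tag `my-robot-stack` is a hypothetical example.

```shell
# Build the image from the Dockerfile in the current directory (tag is illustrative).
docker build -t my-robot-stack:latest .

# Run it; --network host lets the ROS-TCP-Endpoint reach the Unity process on localhost.
docker run --rm -it --network host my-robot-stack:latest
```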
At this stage, the Unity Simulator should be able to listen for commands from the ROS stack, react, and provide state updates back.
Log files can capture the data flowing back and forth for later replay or analysis.
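With a ROS 1 stack, rosbag is the usual tool for this kind of capture; the topic names below are hypothetical examples.

```shell
# Record ground-truth and estimated poses plus velocity commands
# (topic names are placeholders -- substitute your own).
rosbag record -O trial_01.bag /ground_truth/odom /amcl_pose /cmd_vel

# Replay the captured data later for analysis.
rosbag play trial_01.bag
```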