Welcome to Unity Simulation
Here we have a bunch of usage guides that teach you—the simulation engineer, robotics engineer, or AV engineer—how to develop simulations that allow you to test, analyze, generate data, and more.
Your simulation workflow involves three main steps:
1. Author a simulation
Check out this guide to learn how to create an accurate simulation. This usually involves:
- importing and tuning a vehicle model (robot, car, drone, autonomous vehicle, etc.)
- setting up controllers so that the vehicle model can be commanded
- setting up sensors so that the vehicle model can make observations
- importing or creating a virtual environment that your vehicle model can move around in and interact with
- finally, connecting your application
2. Build your simulation
Check out this next section of the guide to learn how to build your simulation to run on Linux without an X server, if that fits your use case; a sketch of a headless build script appears after this list.
3. Deploy your simulation
Some simulations are intended to run on desktops as part of a development workflow, while others are destined for a CI system and require some kind of job scheduling. See the deployment guide to learn more.
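As a concrete starting point for step 2, here is a minimal sketch of an editor build script for a headless Linux player. The scene path, output location, and menu item are placeholders for your own project.

```csharp
// Editor/HeadlessLinuxBuild.cs
// Minimal sketch: builds the project as a standalone Linux player.
using UnityEditor;
using UnityEngine;

public static class HeadlessLinuxBuild
{
    [MenuItem("Simulation/Build Headless Linux Player")]
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Simulation.unity" }, // placeholder scene path
            locationPathName = "Builds/Linux/simulation.x86_64", // placeholder output path
            target = BuildTarget.StandaloneLinux64,
            options = BuildOptions.None,
        };

        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log($"Build finished with result: {report.summary.result}");
    }
}
```

Once built, you might launch the player with `./simulation.x86_64 -batchmode -nographics` so it runs without initializing graphics or requiring a display.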
If you want to jump right in, go ahead and click into the guides above! Otherwise, read on to learn the mental model for Unity Simulation.
Unity Simulation can be broken into three core pieces:
- Your application
This is the system you are testing, analyzing, or generating data for. This is the reason you came to simulation in the first place.
- Unity Simulation Authoring Tools
These tools enable engineers to build accurate simulations. We currently have libraries for sensors, controllers, and environment randomizers, each designed to help you create useful simulations in Unity.
- Unity Simulation Engine
These core pieces—Physics, Rendering, and Scripting—form the foundation that everything else sits on. Unity Simulation is built with accurate and scalable physics and rendering, suitable for industrial and professional simulation use.
Your Application
Unity Simulation excels at machine-to-machine simulation. These simulations are built so that engineers can answer questions about their autonomous systems—think robots, drones, autonomous vehicles, or manufacturing cells.
- Will my mobile robot localize correctly if the stack of boxes is moved 3" after the initial mapping?
- Can the autonomous forklift handle tall ceilings?
- Will my service robot navigate well in crowded settings?
- How well will my autonomous vehicle's perception system perform under various lighting conditions?
- Will this fleet of mobile robots behave well in a large-scale environment?
We call these "what-if" questions. Engineers can test their software against many more "what-ifs" in simulation than in the real world. Simulation can't (yet) fully match the resolution and fidelity of the real world, but it's still great for gaining confidence that a system will operate correctly before deploying it to the real world.
The Unity Simulation Engine
The Unity Simulation Engine has three core pieces:
Rendering
Rendering keeps track of all 3D geometry, lighting, and materials so that Unity can realistically render the simulated world. The Unity Simulation Engine can run very large simulation workloads by deploying across multiple GPUs and by running in headless mode.
See the LHS & DR documentation to get started with rendering acceleration in Unity Simulation Pro.
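For example, a small script like the following (a sketch; the class name is ours) can log at startup whether the player is running headless or with full rendering, which is handy when one build is deployed both to desktops and to display-less CI machines:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Logs whether the player was launched in batch mode and whether a
// graphics device was initialized (GraphicsDeviceType.Null means no GPU).
public class HeadlessCheck : MonoBehaviour
{
    void Start()
    {
        bool noGraphics = SystemInfo.graphicsDeviceType == GraphicsDeviceType.Null;
        Debug.Log($"Batch mode: {Application.isBatchMode}, " +
                  $"graphics device: {SystemInfo.graphicsDeviceType}" +
                  (noGraphics ? " (running headless, rendering disabled)" : ""));
    }
}
```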
Physics
Physics engines use numerical solvers to compute the state of each body in space given the forces and torques applied to those bodies. The Unity Simulation Engine is built on the PhysX 4.1 physics engine, which scales from a single laptop to a large server while handling a huge number of calculations across a large number of bodies.
PhysX is maintained and documented by NVIDIA. In Unity, you can create GameObjects to represent the physical objects in the Scene, and then add Components to each of the objects. There is a set of components, including Cloth, Mesh Collider, and Articulation Body, that, when added to a GameObject, cause that object's state to be updated by the physics engine.
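As a minimal sketch of this pattern (class and field names are ours), the following script creates a dynamic body at startup and applies a force each physics step; once the Rigidbody is attached, the physics engine takes over integrating the object's position and orientation:

```csharp
using UnityEngine;

public class PushedBox : MonoBehaviour
{
    Rigidbody body;

    void Start()
    {
        // A primitive cube already has a BoxCollider; adding a Rigidbody
        // registers it with the physics engine as a dynamic body.
        var box = GameObject.CreatePrimitive(PrimitiveType.Cube);
        body = box.AddComponent<Rigidbody>();
        body.mass = 2.0f; // kilograms
    }

    void FixedUpdate()
    {
        // Forces are applied in FixedUpdate so they align with physics steps.
        body.AddForce(Vector3.forward * 10.0f); // newtons
    }
}
```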
Warning
Not all physics components play nicely together.
Be sure to check out our guides on how to set up vehicle models. There are several "gotchas" to be aware of. For example, ArticulationBody and Rigidbody components should never be used in the same hierarchy.
The most commonly used component for machine-to-machine simulation is the Articulation Body. You can use this component to link several GameObjects together to create articulated systems. In simulation, modeling systems as a series of bodies with constraints between them (revolute, planar, linear, etc.) is a really powerful tool! Autonomous vehicles, mobile robots, and robot arms can all be modeled using joints as a core concept.
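Here is a minimal sketch of a two-link articulated chain built from ArticulationBody components (the names and drive gains are illustrative). Child ArticulationBodies are jointed to the nearest ArticulationBody ancestor, so the hierarchy itself defines the kinematic chain:

```csharp
using UnityEngine;

public class TwoLinkArm : MonoBehaviour
{
    void Start()
    {
        // The root of the chain, fixed to the world.
        var baseLink = new GameObject("base").AddComponent<ArticulationBody>();
        baseLink.immovable = true;

        // A child link connected to the base by a revolute (hinge) joint.
        var upperLink = new GameObject("upper").AddComponent<ArticulationBody>();
        upperLink.transform.SetParent(baseLink.transform);
        upperLink.jointType = ArticulationJointType.RevoluteJoint;

        // Drive the joint toward a target angle (degrees). ArticulationDrive
        // is a struct, so it must be read, modified, and written back.
        var drive = upperLink.xDrive;
        drive.stiffness = 100f;
        drive.damping = 10f;
        drive.target = 45f;
        upperLink.xDrive = drive;
    }
}
```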
Scripting
Scripting is the final piece of the puzzle. Scripting in Unity is done in C# and lets you control virtually any behavior in the simulation. You can make people move and objects interact, add buttons to elevators, and more. For an intro to scripting in Unity, check out the roll-a-ball tutorial.
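As a minimal example (ours, not from the tutorial), this script spins whatever GameObject it is attached to:

```csharp
using UnityEngine;

// Unity calls Update once per rendered frame, so the rotation
// is scaled by Time.deltaTime to be frame-rate independent.
public class Spinner : MonoBehaviour
{
    public float degreesPerSecond = 90f;

    void Update()
    {
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }
}
```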
For more details on exactly how this system works, check out the Unity reference documentation on scripting.
The Unity Simulation Authoring Tools
The core pieces of the engine are always available for you to dig into and extend if you need to, but for most use cases you can use a Unity Simulation authoring tool to get started more quickly. These authoring tools are built on top of the core engine components by teams of engineers who specialize in them.