diff --git a/CHANGELOG.md b/CHANGELOG.md index 030a2569..cb26ee54 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,21 @@ # Change Log +## 2024-04-23 + +* `Build systems`: updated to C++20 standard, CMake 3.29.2, Clang 17, GCC 13, Python 3.12 +* Added support for the latest [Unreal Engine 5.4](https://www.unrealengine.com/en-US/blog/all-the-big-news-from-the-state-of-unreal-at-gdc-24)! + ## 2024-01-25 -* +* The `autonomysim` Python package has undergone a complete overhaul! `AutonomyLib` is next. +* `Windows`: We now provide separate Batch/Command and PowerShell build systems. Both are tested in CI/CD. +* `Documentation`: A new system has been rolled out that also generates Python and C++ API docs. +* Support for `Unity Engine`, `Gazebo`, and `ROS1` has been deprecated to focus on `Unreal Engine`, `ROS2`, `ArduPilot/PX4`, `qGroundControl`, `PyTorch`, and real-time applications of `AutonomyLib` via software- and hardware-in-the-loop. +* `Linux`: added `ROS2` support for `Ubuntu 22.04 LTS` (Jammy Jellyfish). + +## Antiquity + +* `Unreal Engine` version 5.0 brought powerful new features including [Nanite](https://www.unrealengine.com/en-US/blog/understanding-nanite---unreal-engine-5-s-new-virtualized-geometry-system) and [Lumen](https://www.unrealengine.com/en-US/tech-blog/unreal-engine-5-goes-all-in-on-dynamic-global-illumination-with-lumen), while deprecating support for the [PhysX](https://developer.nvidia.com/physx-sdk) backend. +* `macOS`: `Unreal Engine` version 5.2 brought native support for Apple/ARM M-series silicon. +* The `master` branch supports `Unreal Engine` version 5.03 and above. For version 4.27, you can use the `ue4.27` branch. +* The `Omniverse Unreal Engine Connector` makes it possible to sync `Unreal Engine` data with an `Omniverse Nucleus` server, which can then sync with any `Omniverse Connect` application including `IsaacSim`. diff --git a/README.md b/README.md index bbe4e5e5..ce6c602c 100644 --- a/README.md +++ b/README.md @@ -45,12 +45,12 @@
@@ -60,39 +60,46 @@
+ ## Announcements -* We are currently adding support for [Unreal Engine 5.4](https://www.unrealengine.com/en-US/blog/all-the-big-news-from-the-state-of-unreal-at-gdc-24)! -* The `autonomysim` Python package has undergone a complete overhaul! `AutonomyLib` is next. -* `Windows`: We now provide separate Batch/Command and PowerShell build systems. Both are tested in CI/CD. -* A new documentation system has been rolled out that covers the Python and C++ APIs. -* `Unreal Engine` version 5.0 brought powerful new features including [Nanite](https://www.unrealengine.com/en-US/blog/understanding-nanite---unreal-engine-5-s-new-virtualized-geometry-system) and [Lumen](https://www.unrealengine.com/en-US/tech-blog/unreal-engine-5-goes-all-in-on-dynamic-global-illumination-with-lumen), while deprecating support for the [PhysX](https://developer.nvidia.com/physx-sdk) backend. -* `macOS`: `Unreal Engine` version 5.2 brought native support for Apple/ARM M-series silicon. -* The `master` branch supports `Unreal Engine` version 5.03 and above. For version 4.27, you can use the `ue4.27` branch. -* Support for `Unity Engine`, `Gazebo`, and `ROS1` has been deprecated to focus on `Unreal Engine`, `ROS2`, `ArduPilot/PX4`, `qGroundControl`, `PyTorch`, and real-time applications of `AutonomyLib` via software- and hardware-in-the-loop. -* `Linux`: added `ROS2` support for `Ubuntu 22.04 LTS` (Jammy Jellyfish). -* The `Omniverse Unreal Engine Connector` makes it possible to sync `Unreal Engine` data with an `Omniverse Nucleus` server, which can then sync with any `Omniverse Connect` application including `IsaacSim`. +### AutonomySim + +- `Build systems`: updated to C++20 standard, `CMake` 3.29.2, `Clang` 17, `GCC` 13, `Python` 3.12 +- The `autonomysim` Python package has undergone a complete overhaul! `AutonomyLib` is next. +- `Windows`: we now provide separate Batch/Command and PowerShell build systems. Both are tested in CI/CD. +- `Documentation`: a new system has been rolled out that also generates Python and C++ API docs. +- Support for `Unity Engine`, `Gazebo`, and `ROS1` has been deprecated to focus on `Unreal Engine`, `ROS2`, `ArduPilot/PX4`, `qGroundControl`, `PyTorch`, and real-time applications of `AutonomyLib` via software- and hardware-in-the-loop. +- The `master` branch supports `Unreal Engine` version 5.03 and above. For version 4.27, you can use the `ue4.27` branch. + +### Unreal Engine and Omniverse + +- `Unreal Engine` version 5.4 brought [new features](https://www.unrealengine.com/en-US/blog/all-the-big-news-from-the-state-of-unreal-at-gdc-24) including animation and sequencing. +- `Unreal Engine` version 5.2 brought native support for Apple/ARM M-series silicon. +- `Unreal Engine` version 5.0 brought powerful new features including [Nanite](https://www.unrealengine.com/en-US/blog/understanding-nanite---unreal-engine-5-s-new-virtualized-geometry-system) and [Lumen](https://www.unrealengine.com/en-US/tech-blog/unreal-engine-5-goes-all-in-on-dynamic-global-illumination-with-lumen), while deprecating support for the [PhysX](https://developer.nvidia.com/physx-sdk) backend. +- The `Omniverse Unreal Engine Connector` enables you to sync `Unreal Engine` data with an `Omniverse Nucleus` server, which can then sync with any `Omniverse Connect` application including `IsaacSim`. For a complete list of changes, view the [change log](./docs/CHANGELOG.md). 
-## Vision +## Preface: Toward Robotic General Intelligence (RGI) -> "A central challenge in machine learning (ML), a branch of artificial intelligence (AI), is the massive amount of high-fidelity data needed to train models. Datasets for real-world systems are either hand-crafted or automatically labeled using other models, introducing biases and errors into data and downstream models while limiting learning to the offline case. Although game engines have long used hardware-accelerated physics engines of Newtonian dynamics for motion, new accelerators for physics-based rendering (PBR) have made real-time ray-tracing a reality, extending physical realism to the visual domain. Realism only continues to improve with the rapid growth of Earth observation data. For the first time in history, the average user can generate high-fidelity labeled datasets with known physics for offline or online learning. This is revolutionizing AI for robotics, where the data and safety requirements are often otherwise intractable. We invite you to join us in our quest to develop physical AI by contributing to AutonomySim." [-Dr. Adam Erickson, 2024](#) +> "A central challenge in the branch of artificial intelligence (AI) known as machine learning (ML) is the massive amount of high-fidelity labeled data needed to train models. Datasets for real-world systems are either hand-crafted or automatically labeled using other models, introducing biases and errors into data and downstream models, and limiting learning to the offline case. While game engines have long used hardware-accelerated physics engines of Newtonian dynamics, accelerators for physics-based rendering (PBR) have recently made real-time ray-tracing a reality, extending physical realism to the visual domain. In parallel, physical fidelity with the real world has skyrocketed with the rapid growth and falling cost of Earth observation data. For the first time in history, the average user can generate high-fidelity robotic system models and real-world labeled datasets with known physics for offline or online learning of intelligent agents. This will revolutionize AI for robotics, where the data and safety requirements are otherwise intractable, while enabling low-cost hardware prototyping _in silico_." [-Dr. Adam Erickson, 2024](#) ## Introduction -`AutonomySim` is a high-fidelity, photorealistic simulator for *multi-agent and -domain autonomous systems*, *intelligent robotic systems*, or *embodiment* as it is known in the AI community. `AutonomySim` is built on [`Unreal Engine`](https://www.unrealengine.com/) and based on Microsoft [`AirSim`](https://github.com/microsoft/AirSim/). It is an open-source, cross-platform, modular simulator for AI in robotics that supports software-in-the-loop (SITL) and hardware-in-the-loop (HITL) operational modes for popular flight controllers (e.g., `Pixhawk/PX4`, `APM/ArduPilot`). Future support is planned for ground control software (GCS) including `qGroundControl`. `AutonomySim` is developed as an `Unreal Engine` plugin that can be dropped into any Unreal environment or downloaded from the Epic Marketplace. The goal of `AutonomySim` is to provide physically realistic multi-modal simulations with popular built-in libraries and application programming interfaces (APIs) for the development of new sensing, communication, actuation, and AI systems in physically realistic environments. 
+`AutonomySim` is a high-fidelity, photorealistic simulator for _multi-agent and -domain autonomous systems_, _intelligent robotic systems_, or _embodiment_ as it is known in the AI research community. `AutonomySim` is based on [`Unreal Engine`](https://www.unrealengine.com/) and Microsoft's former [`AirSim`](https://github.com/microsoft/AirSim/). `AutonomySim` is an open-source, cross-platform, modular simulator for robotic intelligence that supports software-in-the-loop (SITL) and hardware-in-the-loop (HITL) operational modes for popular robotics controllers (e.g., `Pixhawk/PX4`, `APM/ArduPilot`). Future support is planned for SITL and HITL ground control software (GCS) such as `qGroundControl`. `AutonomySim` is developed as an `Unreal Engine` plugin that can be dropped into any Unreal environment or downloaded from the [Unreal Engine Marketplace](https://www.unrealengine.com/marketplace/). The aim of `AutonomySim` is to provide physically realistic multi-modal simulations of robotic systems with first-class support for popular AI and control systems libraries in order to develop new perception, actuation, communication, navigation, and coordination AI models for diverse real-world environments. -After an exhaustive analysis of existing solutions, [Nervosys](https://nervosys.ai/) created `AutonomySim` for the development of physical AI models that can be deployed on real-world robotic systems. Like many companies, we chose to build in software first for its increased speed and reduced cost of development. We would love it if you found `AutonomySim` useful for your needs as well. Unlike `AirSim` and related projects, we intend to make public any and all improvements to the software framework. We ask that you share your improvements in return, although you are not obligated in any way to do so. Our goal is simply to provide the most advanced simulator for intelligent robotic systems. +We hope that you find `AutonomySim` enjoyable to use and develop. Unlike other projects, we intend to make public any and all improvements to the software framework. We merely ask that you share your improvements in return, although you are not obligated to do so in any way. Together, we will build a foundation for robotic general intelligence (RGI) by providing the best simulation system for AI in robotics. ## Professional Services -Robotics companies interested in having Nervosys model their hardware and/or software in `AutonomySim` and develop new AI models can contact via e-mail [here](mailto:info@nervosys.ai). We are delighted to offer our services. That way, we can continue to support and improve this wonderful project. +Robotics companies interested in having [Nervosys](https://nervosys.ai/) model their hardware/software and develop related AI models in `AutonomySim` can reach us directly via [e-mail](mailto:info@nervosys.ai). We would be delighted to offer our services, so that we may continue to support and improve this critical open-source robotics project. ## Supported Operating Systems -Below is a list of officially supported operating systems. - ### Windows - Windows 10 @@ -106,46 +113,93 @@ Below is a list of officially supported operating systems. - Ubuntu 22.04 LTS (Jammy Jellyfish) - Ubuntu Server 22.04 LTS (untested) - Ubuntu Core 22 (untested) -- [Botnix 1.0](https://github.com/nervosys/Botnix/) (coming soon!) 
+- Botnix 1.0 (Torbjörn) ([coming soon!](https://github.com/nervosys/Botnix/)) ### macOS > [!NOTE] -> `Unreal Engine` version 5.2 and up provide native support for Apple/ARM M-series silicon +> `Unreal Engine` versions 5.2 and up natively support Apple/ARM M-series silicon. - macOS 11 (Big Sur) - macOS 12 (Monterey) - macOS 13 (Ventura) - macOS 14 (Sonoma) -### Other: (e.g., Unix, BSD, Solaris, GNU Mach/Hurd, L4, unikernels, OS/2 Warp) - -- Untested - ## Getting Started -Coming soon. - -[](https://nervosys.github.io/AutonomySim/apis/) -[](https://nervosys.github.io/AutonomySim/dev_workflow/) +- [Project structure](https://nervosys.github.io/AutonomySim/project_structure.html) +- [Development workflow](https://nervosys.github.io/AutonomySim/development_workflow.html) +- [Settings](https://nervosys.github.io/AutonomySim/settings.html) +- [API examples](https://nervosys.github.io/AutonomySim/apis.html) +- [Image APIs](https://nervosys.github.io/AutonomySim/apis_image.html) +- [C++ API usage](https://nervosys.github.io/AutonomySim/apis_cpp.html) +- [Camera views](https://nervosys.github.io/AutonomySim/camera_views.html) +- [Sensors](https://nervosys.github.io/AutonomySim/sensors.html) +- [Voxel grids](https://nervosys.github.io/AutonomySim/voxel_grid.html) +- [Robot controllers](https://nervosys.github.io/AutonomySim/controller_robot.html) +- [Radio controllers](https://nervosys.github.io/AutonomySim/controller_remote.html) +- [Wired controllers](https://nervosys.github.io/AutonomySim/controller_wired.html) +- [Adding new APIs](https://nervosys.github.io/AutonomySim/apis_new.html) +- [Simple flight controller](https://nervosys.github.io/AutonomySim/simple_flight.html) +- [ROS](https://nervosys.github.io/AutonomySim/ros_pkgs.html) ## Documentation -For details on all aspects of `AutonomySim`, view the [documentation](https://nervosys.github.io/AutonomySim/). +- [Main documentation](https://nervosys.github.io/AutonomySim/) +- [C++ API documentation](https://nervosys.github.io/AutonomySim/api/cpp/html/index.html) +- [Python API documentation](https://nervosys.github.io/AutonomySim/api/python/html/index.html) -For an overview of the simulation architecture, see the below figure. +Figure 1 below provides an overview of the simulation architecture:
- Overview of the simulation architecture from Shah et al. (2017).
+ Figure 1. Overview of the simulation architecture from Shah et al. (2017).
### Machine Operation -`AutonomySim` exposes Application Programming Interfaces (APIs) for programmatic interaction with the simulation vehicles and environment. These APIs can be used to control vehicles and the environment (e.g., weather), generate imagery, audio, or video, record control inputs along with vehicle and environment state, _et cetera_. The APIs are exposed through a remote procedure call (RPC) interface and are accessible through a variety of languages, including C++, Python, C#, and Java. +`AutonomySim` exposes Application Programming Interfaces (APIs) for programmatic interaction with the simulation vehicles and environment. These APIs can be used to control vehicles and the environment (e.g., weather), generate imagery, audio, or video, record control inputs along with vehicle and environment state, _et cetera_. The APIs are exposed through a remote procedure call (RPC) interface and are accessible through a variety of languages, including C++, Python, and Rust. -The APIs are also available as part of a separate, independent, cross-platform library, so that they can be deployed on a real-time embedded system on your vehicle. That way, you can write and test your code in simulation, where mistakes are relatively cheap, before deploying it to real-world systems. Moreover, a core focal area of `AutonomySim` is the development of simulation-to-real (sim2real) domain adaptation AI models, a form of transfer learning. These metamodels map from models of simulations to models of real-world systems, leveraging the universal function approximation abilities of artificial neural networks to _implicitly_ represent real-world processes not _explicitly_ represented in simulations. +The APIs are also available as part of a separate, independent, cross-platform library, so that they can be deployed on embedded systems running on your vehicle. That way, you can write and test your code in simulation, where mistakes are relatively cheap, before deploying it to real-world systems. Moreover, a core focus of `AutonomySim` is the development of simulation-to-real (sim2real) domain adaptation AI models, a form of transfer learning. These metamodels map from models of simulations to models of real-world systems, leveraging the universal function approximation abilities of artificial neural networks (ANNs) to _implicitly_ represent real-world processes not _explicitly_ represented in simulation. -Note that you can use [Sim Mode](https://nervosys.github.io/AutonomySim/settings#simmode) setting to specify the default vehicle or the new [Computer Vision](https://nervosys.github.io/AutonomySim/apis_image#computer-vision-mode-1) mode, so you don't get prompted each time you start `AutonomySim`. See [this](https://nervosys.github.io/AutonomySim/apis) for more details. +> [!NOTE] > The [Sim Mode](https://nervosys.github.io/AutonomySim/settings#simmode) setting or the new [Computer Vision](https://nervosys.github.io/AutonomySim/apis_image#computer-vision-mode-1) mode can be used to specify the default vehicle, so you don't get prompted each time you start `AutonomySim`. See the [API documentation](https://nervosys.github.io/AutonomySim/apis) for more details.
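As a concrete illustration of the RPC-based API workflow described above, the minimal Python sketch below connects a client to a running simulation, takes API control of a multirotor, issues a simple motion command, and reads back the vehicle state. It assumes the `autonomysim` Python package exposes an AirSim-style client; the class and method names shown (`MultirotorClient`, `enableApiControl`, `moveByVelocityAsync`, and so on) are illustrative and may differ from the released package.

```python
import autonomysim  # assumed AirSim-style Python client; names below are illustrative

# Connect to the simulator over its RPC interface (localhost by default).
client = autonomysim.MultirotorClient()
client.confirmConnection()

# Take programmatic control of the vehicle.
client.enableApiControl(True)
client.armDisarm(True)

# Issue a simple motion command and read back the estimated vehicle state.
client.takeoffAsync().join()
client.moveByVelocityAsync(vx=1.0, vy=0.0, vz=-0.5, duration=5.0).join()
state = client.getMultirotorState()
print(state.kinematics_estimated.position)

# Release control when finished.
client.armDisarm(False)
client.enableApiControl(False)
```

Because the same calls are exposed over the same RPC interface in the other supported languages, a script like this can later be ported to the cross-platform client library for embedded use with minimal changes.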
### Hybrid Human-Machine Operation -Using a form of hardware-in-the-loop (HITL), `AutonomySim` is capable of operating in hybrid human-machine mode. The classical example is a semi-autonomous aircraft stabilization program, which maps human control inputs (or lack thereof) into optimal control outputs. +Using a form of hardware-in-the-loop (HITL), `AutonomySim` is capable of operating in hybrid human-machine mode. The classical example is a semi-autonomous aircraft stabilization program, which maps human control inputs (or lack thereof) into optimal control outputs to provide level flight. ## Generating Labeled Data for Offline Machine Learning There are two general approaches to generating labeled data with `AutonomySim`: -1. Using the `record` button manually -2. Using the APIs programmatically +1. Manual: using the `record` button +2. Programmatic: using the APIs -The first method, using the `record` button, is the easiest method. Simply press the big red button in the lower right corner to begin recording. This will record the vehicle pose/state and image for each frame. The data logging code is simple and easy to customize to your application. +The first method, using the `record` button, is the easiest. Simply press _the big red button_ in the lower right corner to begin recording. This will record the vehicle pose/state and image for each frame. The data logging code is simple and easy to customize to your application. @@ -195,40 +250,61 @@ The second method, using the APIs, is a more precise and repeatable method for g ### Computer Vision Mode -It is also possible to use `AutonomySim` with vehicles and physics disabled. This is known as Computer Vision Mode and it supports both human and machine control. In this mode, you can use the keyboard or APIs to position cameras in arbitrary poses and collect imagery including depth, disparity, surface normals, or object segmentation masks. As the name implies, this is useful for generating labeled data for learning computer vision models. See [this](https://nervosys.github.io/AutonomySim/apis_image) for more details. +It is possible to use `AutonomySim` with vehicles and physics disabled. This is known as `Computer Vision Mode` and it supports both human and machine control. In this mode, you can use the keyboard or APIs to position cameras in arbitrary poses and collect imagery including depth, disparity, surface normals, or object segmentation masks. As the name implies, this is useful for generating labeled data for learning computer vision models, as sketched below. See the [image APIs](https://nervosys.github.io/AutonomySim/apis_image) for more details.
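To make the programmatic approach (method 2 above) concrete, here is a minimal sketch of collecting several image modalities for one frame, paired with the corresponding pose, for example while in Computer Vision Mode. It again assumes an AirSim-style `autonomysim` API; the request types, image enums, and helper functions shown are illustrative, not confirmed names.

```python
import autonomysim  # assumed AirSim-style Python client; names below are illustrative

client = autonomysim.VehicleClient()
client.confirmConnection()

# Request several image modalities for a single frame in one RPC call.
requests = [
    autonomysim.ImageRequest("front_center", autonomysim.ImageType.Scene),
    autonomysim.ImageRequest("front_center", autonomysim.ImageType.DepthPerspective, pixels_as_float=True),
    autonomysim.ImageRequest("front_center", autonomysim.ImageType.Segmentation),
]
responses = client.simGetImages(requests)

# Pair the imagery with the vehicle pose to form one labeled sample.
pose = client.simGetVehiclePose()
for i, response in enumerate(responses):
    if response.pixels_as_float:
        autonomysim.write_pfm(f"frame_{i}.pfm", autonomysim.get_pfm_array(response))
    else:
        autonomysim.write_file(f"frame_{i}.png", response.image_data_uint8)
print(pose.position, pose.orientation)
```

Looping over a set of camera or vehicle poses turns this into a repeatable dataset-generation script, which is the main advantage of the API method over the `record` button.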
## Labeled Data Modalities -The following [sensors](https://nervosys.github.io/AutonomySim/sensors/) and data modalities are either available or planned: +We support or plan to support the following [sensors](https://nervosys.github.io/AutonomySim/sensors/) and data modalities: -* RGB imagery -* Depth -* Disparity -* Surface normals -* Object panoptic, semantic, and instance segmentation masks -* Object bounding boxes (coming soon) -* Audio (coming soon) -* Video (coming soon) -* Short- or long-wavelength infrared imagery ([see](https://nervosys.github.io/AutonomySim/InfraredCamera/)) -* Multi- and Hyper-spectral (coming soon) -* LiDAR ([see](https://github.com/nervosys/AutonomySim/blob/master/docs/lidar.md); GPU acceleration coming soon) -* RaDAR (coming soon) -* SoNAR (coming soon) +- RGB imagery +- Depth +- Disparity +- Surface normals +- Object panoptic, semantic, and instance segmentation masks +- Object bounding boxes (coming soon) +- Audio (coming soon) +- Video (coming soon) +- Short- or long-wavelength infrared imagery ([see](https://nervosys.github.io/AutonomySim/InfraredCamera/)) +- Multi- and hyper-spectral (TBD) +- LiDAR ([see](https://github.com/nervosys/AutonomySim/blob/master/docs/lidar.md); GPU acceleration coming soon) +- RaDAR (TBD) +- SoNAR (TBD) -Autolabeling systems may be added in the future. +We also plan on providing autolabeling systems in the future. ## Vehicles ### Ground -* Automobile -* BoxCar (coming soon) -* ClearPath Husky (coming soon) -* Pioneer P3DX (coming soon) +- Automobile +- BoxCar (coming soon) +- ClearPath Husky (coming soon) +- Pioneer P3DX (coming soon) ### Air -* Quadcopter +- Multirotor aircraft: Quadcopter +- Rotary-wing aircraft (TBD) +- Fixed-wing aircraft (TBD) +- Hybrid aircraft (TBD) + +## Environmental Dynamics + +### Weather + +The weather system supports human and machine control. Press the `F10` key to see the available weather effect options. You can also control the weather using the APIs, as shown [here](https://nervosys.github.io/AutonomySim/apis#weather-apis) and sketched below.
+ Weather effects menu.
- Weather effects menu.
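As with the vehicle and image APIs above, weather can also be scripted. The sketch below assumes the weather APIs linked above follow the AirSim-style `simEnableWeather`/`simSetWeatherParameter` pattern; the method and enum names are illustrative rather than confirmed.

```python
import autonomysim  # assumed AirSim-style Python client; names below are illustrative

client = autonomysim.VehicleClient()
client.confirmConnection()

# Weather effects are typically disabled by default; enable them first.
client.simEnableWeather(True)

# Effect intensities are normalized values in [0, 1].
client.simSetWeatherParameter(autonomysim.WeatherParameter.Rain, 0.50)
client.simSetWeatherParameter(autonomysim.WeatherParameter.Fog, 0.25)

# Disable all weather effects when finished.
client.simEnableWeather(False)
```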