Simulation environment for multiagent drone systems built on ArduPilot, ROS 2, and Gazebo. It enables users to spawn and control multiple drones, configure sensors, and test autonomous behaviors in a reproducible and extensible setup.
Maintainer: Gilbert Tanner
- Prerequisites
- Install Ardupilot environment
- Build workspace
- Devcontainer
- Run simulation
- Multiagent simulation
- Drone configuration
- Fly the drone via position control
- Feeding in external odometry
- Limitations
- Troubleshooting
- Contact
## Prerequisites

- ROS Dev tools:

```shell
sudo apt install ros-dev-tools
```
## Build workspace

```shell
source <path-to-ardupilot-workspace>
rosdep install --from-paths src --ignore-src -r -y
colcon build --symlink-install
```
## Devcontainer

Using a devcontainer or Dockerfile provides a consistent development environment, isolates dependencies, and avoids conflicts with other software on the host machine. This setup ensures that the project runs the same way on any device, improving reproducibility and easing collaboration.
- Open the project in VSCode.
- Press `F1` and select `Remote-Containers: Open Folder in Container...`.
- Choose the folder to open in the container.

Alternatively, attach from the command line:

- Navigate to the project directory.
- Run the following command:

```shell
code . --folder-uri vscode-remote://dev-container+<container-id>
```
- Build the Docker image:

```shell
docker build -t multiagent_simulation .devcontainer
```

- Run the Docker container:

```shell
docker run -it --rm -v $(pwd):/workspace multiagent_simulation
```
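A minimal `devcontainer.json` sketch that could pair with the Dockerfile above (the `/workspace` mount target mirrors the `docker run` command; all values are illustrative, not necessarily this repository's actual configuration):

```json
{
    "name": "multiagent_simulation",
    "build": {
        "dockerfile": "Dockerfile"
    },
    "workspaceFolder": "/workspace",
    "workspaceMount": "source=${localWorkspaceFolder},target=/workspace,type=bind"
}
```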
## Run simulation

- Source the workspace:

```shell
source ./install/setup.{bash|zsh}
```

- Launch the simulation:

```shell
ros2 launch multiagent_simulation multiagent_simulation.launch.py
```

To launch the simulation with a specific world file:

```shell
ros2 launch multiagent_simulation multiagent_simulation.launch.py world_file:=rubico.sdf
```
## Multiagent simulation

The `multiagent_simulation.launch.py` launch file allows the user to spawn multiple drones in the same world. Each drone gets its own ROS namespace and MAVLink system ID.
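Conceptually, each spawned drone's namespace and system ID can be derived from its spawn index. A minimal sketch (the `drone_1`, `drone_2` naming scheme and the 1:1 index-to-sysid mapping are assumptions for illustration, not necessarily the package's actual convention):

```python
def drone_config(index: int) -> dict:
    """Per-drone ROS namespace and MAVLink system ID.

    Assumed convention: namespaces drone_1, drone_2, ... and one
    unique system ID per spawned vehicle.
    """
    return {
        "namespace": f"drone_{index}",  # topics appear under /drone_1/...
        "sysid": index,                 # MAVLink system ID for this vehicle
    }

# Spawning three drones yields three distinct namespace/sysid pairs
configs = [drone_config(i) for i in range(1, 4)]
```

Keeping namespaces and system IDs distinct is what lets multiple SITL instances coexist in one world without their topics or MAVLink streams colliding.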
## Drone configuration

The LiDAR(s) and camera(s) are defined in separate xacro files (`lidar`, `camera`, `depth_camera`, and `rgbd_camera`) as macros, allowing sensors to be added and removed quickly for testing.

Example:

```xml
<xacro:lidar_sensor name="lidar_3" pose="0.0 0.02 -0.05 0 1.57 0" horizontal_fov="0.614356" vertical_fov="0.673697" horizontal_samples="100" vertical_samples="50" update_rate="20" />
```

The xacro is then converted to SDF inside `multiagent_simulation.launch.py`.
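Conceptually, that conversion is a two-step pipeline: expand the xacro macros, then convert the result to SDF. A sketch of the equivalent commands (the `xacro` and `gz sdf -p` CLIs are the standard tools; the file names here are placeholders, not this package's actual files):

```python
# Placeholder file names for illustration only
xacro_file = "drone.urdf.xacro"
urdf_file = "drone.urdf"
sdf_file = "drone.sdf"

steps = [
    f"xacro {xacro_file} > {urdf_file}",    # expand xacro macros into plain URDF
    f"gz sdf -p {urdf_file} > {sdf_file}",  # print the URDF converted to SDF
]
```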
## Fly the drone via position control

The `move_drone` node allows the user to set the flight mode, arm the drone, take off, and move using position control.

Note: After starting the simulation it might take a few seconds until the drone can be armed. If arming was successful, "Not ready" should change to "Ready to Fly".

```shell
ros2 run multiagent_simulation move_drone
```
## Feeding in external odometry

The drone can be configured to fuse external odometry by changing the parameters described in Cartographer SLAM with ROS 2 in SITL. These parameters are already available in `gazebo-iris.parm`; to switch from GPS navigation to external-odometry navigation, they only need to be uncommented and the GPS section commented out.

ArduPilot then receives the external odometry via the `/ap/tf` topic. Currently, feeding external odometry into the simulation is only supported for a single drone, as described in the Limitations section below.
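For orientation, ArduPilot's EKF3 source selection for external navigation typically looks like the following parameter sketch (these are real ArduPilot parameter names, but whether this exact set matches the commented block in `gazebo-iris.parm` is an assumption):

```
# Fuse external navigation (source 6 = ExternalNav) instead of GPS
EK3_SRC1_POSXY 6
EK3_SRC1_VELXY 6
EK3_SRC1_POSZ  6
EK3_SRC1_YAW   6
GPS_TYPE       0    # disable the simulated GPS
```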
## Limitations

Currently, the `/ap` namespace isn't correctly namespaced for multiple drones. For more details on this limitation, see the related issue: ArduPilot/ardupilot_gz#74.
## Troubleshooting

If the drone model isn't spawning into the simulation, the `GZ_SIM_RESOURCE_PATH` environment variable might not be set correctly. This variable should include the `models` and `worlds` folders of the `multiagent_simulation` package and the `src` folder of the workspace. You can set it manually by running the following commands:

```shell
WORKSPACE=${PWD}
echo "export GZ_SIM_RESOURCE_PATH=\$GZ_SIM_RESOURCE_PATH:$WORKSPACE/src/multiagent_simulation/models:$WORKSPACE/src/multiagent_simulation/worlds:$WORKSPACE/src" >> ~/.bashrc
```
More information about this is available in the Using SITL with Gazebo article.
## Contact

For any inquiries, please reach out to gilberttanner.contact@gmail.com or open an issue on this repository.