This repository contains simulation models and corresponding motion planning and control demos for the xArm6 robotic manipulator from UFACTORY. There are five modules:
- aruco_calibration
- perception_etflab
- real_bringup
- sim_bringup
- RPMPLv2 (https://github.com/roboticsETF/RPMPLv2.git)
The development and test environment is Ubuntu 22.04 + ROS2 Humble.
Choose a location for the target workspace, e.g.:
```bash
cd ~/
```
and then clone the repository:
```bash
git clone https://github.com/roboticsETF/xarm6-etf-lab.git
```
```bash
cd ~/xarm6-etf-lab  # Not required if you are cloning through an IDE (e.g., VS Code)
make submodules
cd ~/xarm6-etf-lab/src/external_modules/xarm_ros2
git submodule sync
git submodule update --init --remote
cd ~/xarm6-etf-lab/src/external_modules/gazebo_ros2_control
git submodule sync
git submodule update --init --remote
```
Optionally, if you want to change the initial configuration of the robot (all joint angles are set to zero by default), open the file `~/xarm6-etf-lab/src/external_modules/gazebo_ros2_control/gazebo_ros2_control/src/gazebo_system.cpp` and find the function `GazeboSystem::registerJoints`. Then, find the following code line:
```cpp
initial_position = get_initial_value(joint_info.state_interfaces[i]);
```
and replace it with the following code:
```cpp
if (j == 3)
  initial_position = M_PI;
else if (j == 4)
  initial_position = M_PI_2;
else
  initial_position = get_initial_value(joint_info.state_interfaces[i]);
```
Similarly, you can set any desired angle for the `j`-th joint.
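The same idea can be sketched outside the C++ source, e.g. in Python: a lookup table from joint index to initial angle (the indices and angles mirror the snippet above; the zero fallback stands in for `get_initial_value`):

```python
import math

# Sketch (not the upstream code): joint index j -> desired initial angle [rad];
# joints not listed fall back to their default value (zero placeholder here).
INITIAL_ANGLES = {3: math.pi, 4: math.pi / 2}

def initial_position(j, fallback=0.0):
    """Return the initial angle [rad] for joint j; unlisted joints keep
    the default value (stand-in for get_initial_value in gazebo_system.cpp)."""
    return INITIAL_ANGLES.get(j, fallback)

for j in range(6):
    print(f"joint {j}: {initial_position(j):.4f} rad")
```

Extending the mapping with further `j: angle` pairs sets any other joint's initial angle in the same way.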
```bash
cd ~/xarm6-etf-lab
rosdep update
make dependencies
make build
source source_dirs.bash
```
In the `apps/object_segmentation.cpp` file within the `perception_etflab` library, you can set `config_file_path` to one of the following:
- `/perception_etflab/data/sim_perception_etflab_config.yaml` - Obstacles are random and dynamic, and the law of their motion is defined within the `src/environments/Obstacles.cpp` file. A point cloud representing the obstacles is generated within the code, and the corresponding obstacles are then published at a rate of `1/period` [Hz] to the `objects_cloud` topic. All configuration settings (including `period`) can be set in the used yaml file within the `random_obstacles` node.
- `/perception_etflab/data/real_perception_etflab_config.yaml` - Obstacles are static, and they are defined within the `world/etflab.world` file in the `sim_bringup` library. After combining the point clouds from the two cameras (left and right one), the combined point cloud is read from the `pointcloud_combined` topic. The obstacles are then published at a rate of approx. 5 [Hz] to the `objects_cloud` topic after their segmentation is done.
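The combine-then-segment step described above can be sketched in plain Python (hypothetical helper names; the actual pipeline lives in `perception_etflab` and operates on ROS2 point cloud messages):

```python
def combine_clouds(left, right):
    """Merge two point clouds (lists of (x, y, z) tuples), assuming both
    are already expressed in a common world frame."""
    return left + right

def segment_obstacles(cloud, z_min=0.05):
    """Crude stand-in for obstacle segmentation: keep only points above
    the table surface (z > z_min); the real segmentation is more involved."""
    return [p for p in cloud if p[2] > z_min]

left_cloud = [(0.4, 0.1, 0.02), (0.5, 0.0, 0.30)]    # first point lies on the table
right_cloud = [(0.4, -0.1, 0.01), (0.5, 0.1, 0.25)]  # first point lies on the table
obstacles = segment_obstacles(combine_clouds(left_cloud, right_cloud))
print(obstacles)  # only the two points above z = 0.05 remain
```

The segmented result is what ends up on the `objects_cloud` topic at the rate given above.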
First, type:
```bash
cd ~/xarm6-etf-lab
```
Then, launch the robot using one of the following three options:
First option (RViz2 + Gazebo):
```bash
ros2 launch sim_bringup xarm6_etflab.launch.py
# or just type:
make sim
```
Second option (MoveIt2 + RViz2 + Gazebo):
```bash
ros2 launch sim_bringup xarm6_moveit_gazebo.launch.py
```
Third option (MoveIt2 + RViz2):
```bash
ros2 launch xarm_moveit_config xarm6_moveit_realmove.launch.py robot_ip:=192.168.1.236 [add_gripper:=true]
```
Note: For each test file from the `apps` folder, there is a corresponding yaml file within the `data` folder, where all necessary configurations can be set.
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
# In C++
ros2 run sim_bringup test_move_xarm6
# In Python
ros2 run sim_bringup test_move_xarm6.py
```
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_planners
```
Sim_demo2.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_task_planning
```
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_real_time_planning
```
Sim_demo4.mp4
Sim_demo4_v2.mp4
3.5 Test demo 5 (real-time task planning using the DRGBT planner from the RPMPLv2 library and cameras)
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run sim_bringup test_real_time_task_planning
```
First, launch the cameras:
```bash
cd ~/xarm6-etf-lab
ros2 launch real_bringup realsense_etflab.launch.py
# or just type:
make cameras
```
In the `apps/object_segmentation.cpp` file within the `perception_etflab` library, `config_file_path` must be set to `/perception_etflab/data/real_perception_etflab_config.yaml`. The two cameras scan the environment, and after combining the point clouds from the cameras (left and right one), the combined point cloud is read from the `pointcloud_combined` topic. The obstacles are then published at a rate of approx. 25 [Hz] to the `objects_cloud` topic after their segmentation is done.
Note: For each test file from the `apps` folder, there is a corresponding yaml file within the `data` folder, where all necessary configurations can be set.
First, launch the robot:
```bash
cd ~/xarm6-etf-lab
ros2 launch real_bringup real_xarm6_etflab.launch.py
# or just type:
make real
```
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
# In C++
ros2 run real_bringup test_move_xarm6
# In Python
ros2 run real_bringup test_move_xarm6.py
```
Real_demo1.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_bottle_and_glass
```
Real_demo2.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_pick_and_place
```
Real_demo3.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_pick_and_place_using_cameras
```
Real_demo4.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_planners
```
Real_demo5.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_task_planning
```
Real_demo6.mp4
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_real_time_planning
```
Scenario1.mp4
Scenario2.mp4
5.8 Test demo 8 (real-time task planning using the DRGBT planner from the RPMPLv2 library and cameras)
```bash
# In the new tab type:
cd ~/xarm6-etf-lab
ros2 run real_bringup test_real_time_task_planning
```