Environment setup for rovers using PX4, ROS 2 Humble, and VICON MoCap
- Boot the NVIDIA Orin board (if it boots, go ahead).
- Install the NVIDIA SDK Manager app on a host computer of your choice.
- Boot the Orin into recovery mode; you need the thin connector cable.
- Through NVIDIA SDK Manager, select the correct device, the JetPack version (6), and the rest. I suggest checking everything they offer for installation (including CUDA): Install Jetson Software with SDK Manager — SDK Manager 2.1.0 documentation
  - It seems installation is also possible via the command line, but the BSP (the NVIDIA driver) has to be updated in advance, so this is not recommended: How to Install and Configure JetPack SDK — JetPack 6.0 documentation
- After flashing completes, NVIDIA SDK Manager will try to install the selected SDK packages. Here, you need to connect the Orin and your host PC via either USB or Ethernet. In my case USB didn't work, so I connected them via Ethernet, manually set the IP address to 192.168.56.x (any subnet other than 55, which is the default), and proceeded.
  - The Orin board needs to be connected to the internet.
  - Install the Wi-Fi/Bluetooth module. In my case it was an Intel 8xxxx module, but it was not detecting Wi-Fi.
  - So connect to the internet through any other source and install the driver:
    sudo apt install iwlwifi-modules
- Follow the basic installations.
  - (DASC Lab members can fully ignore this part.)
  - (Assume the SSD is already set up and you booted from it, so skip the SSD part.) Therefore, /mnt/nova_ssd/ should be ignored; instead just use the default home directory, e.g. /home/ubuntu/.
- Check the JetPack version. The version should be R36 (June 27th, 2024):
    cat /etc/nv_tegra_release
  If the output is not R36, you need to upgrade JetPack.
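If you want to script this check, a minimal sketch; the release line below is a sample in the R36 format, not output copied from a live board:

```shell
# Sample line in the format printed by `cat /etc/nv_tegra_release`
# (illustrative values, not captured from a real board)
release_line='# R36 (release), REVISION: 3.0, BOARD: generic, EABI: aarch64, DATE: Thu Jun 27 2024'
# Pull out the major release tag, e.g. R36
release=$(printf '%s\n' "$release_line" | grep -o 'R[0-9][0-9]*' | head -n 1)
if [ "$release" = "R36" ]; then
  echo "JetPack release OK: $release"
else
  echo "Unexpected release '$release', upgrade JetPack"
fi
```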
- Clone isaac_ros_common under ${ISAAC_ROS_WS}/src:
    cd ${ISAAC_ROS_WS}/src && \
      git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git
- (Deprecated) JetPack 6 automatically installs Docker (nvidia-docker via the NVIDIA container toolkit), so you can skip this part.
  - You might already have it, but there has been a major update to NVIDIA Docker: nvidia-docker2 is now deprecated, and you need to install nvidia-container-toolkit instead (which we installed in step 3). With this update you don't need to install CUDA, but you do need to install the new NVIDIA driver.
- Follow the steps in 3.9 Ubuntu installation (network repo) of: 1. Introduction — Installation Guide for Linux 12.5 documentation
  - For instance, for the Jetson Xavier we are using, set $distro to ubuntu2004 and $arch to arm64.
- Then go to this page and follow the instructions to install the RealSense-related packages (to make the RealSense camera work inside the Docker image): Isaac ROS RealSense Setup — isaac_ros_docs documentation
  - Add your user to the docker group (you will see a red message telling you to do so), so you can use Docker without sudo:
    sudo usermod -aG docker $USER
  - run_dev.sh takes a long time.
  - Then check whether the Isaac ROS Docker container can use the RealSense packages (see the instructions).
  - You need to connect the camera with a valid USB data cable (not a power-only charging cable).
- If the above step succeeds, go to this page to finish the NvBlox setup:
  - Download the assets.
  - run_dev.sh will now run the same Docker container.
  - Then install NvBlox inside your Docker container (install from source).
- If colcon build complains about --allow-overriding (something), it means you installed the package via Debian and are now trying to build it from source; add that flag to proceed.
- Try a demo example.
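As a concrete form of the flag above (the package name is illustrative; use whatever the colcon error message reports):

```shell
colcon build --symlink-install \
  --allow-overriding <package_reported_in_the_error>
```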
- If the above step succeeds, go to this page to try a real camera demo: RealSense Camera Examples — isaac_ros_docs documentation
  - Note that the installation (in the from-source case, installing packages via rosdep) happens inside the Docker container, so if you start the container again you need to install them again.
  - You need to unplug/replug the RealSense camera whenever you re-run the Docker container.
- Let's add our own Docker setup to the Isaac ROS Docker container. The commands that set up and install the Docker environment are all organized in run_dev.sh, which calls build_image_layers.sh to build the Dockerfiles one by one.
  - So we will change a few things to insert our own Dockerfile, and run_dev.sh will automatically build it as part of its own procedure.
  - First, configure .isaac_ros_common-config. We already created and configured this file during the RealSense setup; you can find it in the isaac_ros_common/scripts directory. It previously contained ros2_humble.realsense. Now change it to:
    CONFIG_IMAGE_KEY=ros2_humble.realsense.dasc_isaac
  - This means build_image_layers.sh will run in the order: 1. Dockerfile.aarch64, 2. Dockerfile.ros2_humble, 3. Dockerfile.realsense, 4. Dockerfile.dasc_isaac, 5. Dockerfile.user. Except for 4, these are all provided by NVIDIA.
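The mapping from the dotted image key to the build order can be sketched as follows (a simplification of what build_image_layers.sh actually does):

```shell
# Simplified illustration of how the dotted CONFIG_IMAGE_KEY expands into an
# ordered list of Dockerfile layers (build_image_layers.sh adds more logic).
CONFIG_IMAGE_KEY="ros2_humble.realsense.dasc_isaac"
layers="Dockerfile.aarch64"                 # platform base layer comes first
for part in $(printf '%s\n' "$CONFIG_IMAGE_KEY" | tr '.' ' '); do
  layers="$layers Dockerfile.$part"         # one layer per dotted component
done
layers="$layers Dockerfile.user"            # per-user layer is appended last
printf '%s\n' "$layers"
```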
- Second, place Dockerfile.dasc_isaac in the isaac_ros_common/docker directory. It builds the PX4- and VICON-related libraries.
- Third, replace workspace-entrypoint.sh in the isaac_ros_common/docker/scripts directory. It runs rosdep install and sources the ROS setup.bash files, and it keeps the container from reinstalling the rosdep dependencies for NvBlox every time.
- Lastly, remove the --rm argument from the docker run arguments on run_dev.sh's last line; otherwise the container is removed whenever you exit it. Then add the following extra arguments to that last command in run_dev.sh, and change the workdir for convenience:
    docker run -it \
      (...)
      -v /home/ubuntu/workspaces/px4_ugv_exp/colcon_ws:/workspaces/colcon_ws \
      (...)
      --workdir /workspaces \
- Before running run_dev.sh again, we need to clone our PX4-related ROS 2 source packages from git.
  - In the workspaces/ directory:
    git clone https://github.com/tkkim-robot/px4_ugv_exp
    cd px4_ugv_exp
    git submodule update --init --recursive
- Run run_dev.sh, and build the ROS 2 packages:
    colcon build --symlink-install --cmake-args -DCMAKE_BUILD_TYPE=Release
- Try running the NvBlox demo again:
    ROS_DOMAIN_ID={YOUR_ID} ros2 launch nvblox_examples_bringup realsense_example.launch.py
- If RViz doesn't pop up (or you hit an 'xcb'-related issue), then:
    export DISPLAY=":1"
  It might also be ":0", but in our setup it was ":1" (another article suggests this depends on auto-login).
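Rather than guessing the display number, note that each running X server owns a socket under /tmp/.X11-unix; a sketch of deriving DISPLAY from the socket name (the X1 value below is a stand-in for whatever ls shows on your machine):

```shell
# Each X server listens on /tmp/.X11-unix/X<n>; DISPLAY is then ":<n>".
# On the rover, get the real socket name with:  ls /tmp/.X11-unix
socket="X1"                  # stand-in value for this sketch
display=":${socket#X}"       # strip the leading 'X' -> ":1"
echo "export DISPLAY=\"$display\""
```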
- In the Docker container, give read/write access to /dev/ttyUSB1 (which is the PX4 board):
    sudo chmod 777 /dev/ttyUSB1
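A more persistent alternative to re-running chmod after every boot or replug is a udev rule on the host. The idVendor/idProduct values below are for a common FTDI USB-serial adapter and are an assumption; check your board's actual IDs with lsusb:

```
# /etc/udev/rules.d/99-px4-serial.rules
# IDs below are assumptions (typical FTDI adapter); verify with `lsusb`
SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", MODE="0666"
```

After adding the rule, reload with sudo udevadm control --reload-rules and replug the device.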
- Turn on the VICON.
  - Turn on the two VICON computers and the switch.
  - On the System tab, reboot the red cameras.
  - On the Objects tab, de-select "auto enable" and click "track".
  - Press 'alt', drag over the markers of interest, type the name (default: px4_1), and you're done.
  - Right-click 'px4_1' on the Objects tab, click "save object", and make it "shared".
- Turn on the rover with the Orin and a charged battery, and put markers on the plate (arranged asymmetrically).
- Attach the battery indicator to the battery (on the left side) and change the beep threshold to 3.70 per cell. (11.1 V is the lowest pack voltage before stopping; it charges up to 12.x V.)
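The per-cell threshold and the pack numbers above are consistent for a 3-cell (3S) pack; a quick arithmetic check:

```shell
# 3S pack: 3 cells in series. 3.70 V/cell cutoff and 4.20 V/cell full charge
# are the standard LiPo figures; a cell count of 3 is implied by 11.1 V.
cutoff=$(awk 'BEGIN { printf "%.1f", 3 * 3.70 }')   # pack cutoff voltage
full=$(awk 'BEGIN { printf "%.1f", 3 * 4.20 }')     # pack full-charge voltage
echo "pack cutoff=${cutoff}V, full charge=${full}V"
```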
- SSH into the Orin with VSCode.
- Set up the ground station on any laptop:
    cd hardik/rover_groundstation_ros2_jumpstart
    xhost +
    docker compose up -d
    docker exec -it rover_groundstation_ros2_jumpstart-gs-1 bash
    (in the docker) ROS_DOMAIN_ID=4 ros2 launch ground_station_launch gs.launch.py
- Set up the Orin over SSH in VSCode:
    docker start isaac-(...)
    docker exec -it isaac-(...) bash
    (in the docker) cd colcon_ws && source install/setup.bash
    sudo chmod 777 /dev/ttyUSB1
    ROS_DOMAIN_ID=4 ros2 launch all_launch px4.launch.py
  - Then the RViz in the ground station turns green and shows "VALID".
- Now, run the scripts.
  - The extra repos and folders are placed in 'px4_ugv_exp/colcon_ws/src/dasc_ros/dasc_ros_utils/scripts/', which is mounted into the Docker container.
  - Running your own script that publishes PX4 topics will drive the rover:
    ROS_DOMAIN_ID=4 ros2 run dasc_lab_utils publish_u.py
  - This converts the ROS 2 ctrl_vel into left/right wheel velocities for PX4.
    ROS_DOMAIN_ID=4 ros2 run dasc_lab_utils publish_tracking_node.py
  - Customize your file and run it like this.
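The ctrl_vel-to-wheel-velocity conversion mentioned above is standard differential-drive kinematics; a sketch with awk, where the 0.30 m track width is a made-up placeholder rather than the rover's real parameter:

```shell
# Differential drive: forward velocity v (m/s) and yaw rate omega (rad/s)
# map to wheel speeds as left = v - omega*L/2, right = v + omega*L/2,
# where L is the track width. L=0.30 is a placeholder, not a measured value.
v=0.5
omega=1.0
L=0.30
wheels=$(awk -v v="$v" -v w="$omega" -v L="$L" \
  'BEGIN { printf "left=%.3f right=%.3f", v - w*L/2, v + w*L/2 }')
echo "$wheels"
```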
- To set (0,0) velocity on both wheels:
  - (At the very beginning) In RViz, click "raw mode", make sure both motors are 0.0, click "publish", then click "arm".
  - After you "disarm": click "publish", then "arm", deselect "publish", then run your program again.