LeSTA

Learning Self-supervised Traversability with Navigation Experiences of Mobile Robots


🛠️ Installation | 🎥 Video | 📖 Paper | 📁 Dataset



LeSTA directly learns robot-specific traversability in a self-supervised manner by using a short period of manual driving experience.
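Roughly speaking, manual driving becomes supervision: terrain the robot actually traversed is labeled positive, and the network then pseudo-labels the remaining terrain, keeping only predictions it is confident about. The sketch below illustrates one such self-training round in plain PyTorch; the threshold and the confidence-based selection are simplifications for illustration, not the paper's exact risk-aware formulation.

import torch

# One illustrative round of self-training: expand the labeled set with
# confident pseudo-labels, then retrain and repeat. tau is a hypothetical
# confidence threshold, not a value from the paper.
def self_training_round(model, labeled_x, labeled_y, unlabeled_x, tau=0.9):
    model.eval()
    with torch.no_grad():
        p = torch.sigmoid(model(unlabeled_x)).squeeze(-1)  # traversability in [0, 1]
    confident = (p > tau) | (p < 1 - tau)  # keep only low-risk predictions
    pseudo_y = (p > 0.5).float()
    x = torch.cat([labeled_x, unlabeled_x[confident]])
    y = torch.cat([labeled_y, pseudo_y[confident]])
    return x, y  # retrain on the expanded set, then run the next round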

📒 News & Updates

  • 2024.07.30: Our paper was accepted for presentation at IEEE ICRA@40 in Rotterdam, Netherlands.
  • 2024.02.29: Our paper was accepted by IEEE Robotics and Automation Letters (IEEE RA-L).
  • 2024.02.19: We released the urban-traversability-dataset for learning terrain traversability in urban environments.

🚀 What's in this repo

  • C++ package for LeSTA with ROS interface (lesta_ros)

    • Traversability label generation from LiDAR-reconstructed height map
    • Traversability inference/mapping using a learned network
  • PyTorch scripts for training LeSTA model (pylesta)

🛠️ Installation

Our project is built on ROS and has been successfully tested with the following setup:

  • Ubuntu 20.04 / ROS Noetic
  • PyTorch 2.2.2 / LibTorch 2.6.0

lesta_ros

  1. Install Grid Map library for height mapping:

    sudo apt install ros-noetic-grid-map -y
  2. Install LibTorch (choose one option):

    CPU-only version (Recommended for easier setup)
    wget https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-2.6.0%2Bcpu.zip -P ~/Downloads
    sudo unzip ~/Downloads/libtorch-cxx11-abi-shared-with-deps-2.6.0+cpu.zip -d /opt
    rm ~/Downloads/libtorch-cxx11-abi-shared-with-deps-2.6.0+cpu.zip
    GPU-supported version (e.g. CUDA 11.8)
    # To be updated...
  3. Build lesta_ros package:

    cd ~/ros_ws/src
    git clone https://github.com/Ikhyeon-Cho/LeSTA.git
    cd ..
    catkin build lesta
    source devel/setup.bash

💡 Notes:

  • We recommend starting without GPU processing; the network runs efficiently even on a single CPU core (see the inference sketch below).
  • If you are interested in height map reconstruction, see height_mapping for more details.
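For a quick sanity check of CPU-only inference, a minimal sketch along these lines works with any TorchScript-exported checkpoint. The filename and the four-feature input layout (step, slope, roughness, curvature) are assumptions based on this README, not the package's exact interface:

import torch

# Minimal CPU inference sketch for a TorchScript-exported model.
torch.set_num_threads(1)  # the network runs fine on a single CPU core
model = torch.jit.load("lesta_traced.pt", map_location="cpu")  # hypothetical file
model.eval()
features = torch.rand(1024, 4)  # one row per height-map cell (assumed layout)
with torch.no_grad():
    scores = model(features)  # per-cell traversability scores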

pylesta

  1. Install PyTorch (choose one option):

    CPU-only setup

    We recommend using a virtual environment for PyTorch installation.

    Conda

    conda create -n lesta python=3.8 -y
    conda activate lesta
    conda install pytorch=2.2 torchvision cpuonly tensorboard -c pytorch -y

    Virtualenv

     virtualenv -p python3.8 lesta-env
     source lesta-env/bin/activate
     pip install torch==2.2 torchvision tensorboard --index-url https://download.pytorch.org/whl/cpu
    CUDA setup

    We recommend using a virtual environment for PyTorch installation.

    Conda

    conda create -n lesta python=3.8 -y
    conda activate lesta
    conda install pytorch=2.2 torchvision tensorboard cudatoolkit=11.8 -c pytorch -c conda-forge -y

    Virtualenv

    virtualenv -p python3.8 lesta-env
    source lesta-env/bin/activate
    pip install torch==2.2 torchvision tensorboard --index-url https://download.pytorch.org/whl/cu118
  2. Install pylesta package:

    # Make sure your virtual environment is activated
    cd LeSTA
    pip install -e pylesta

🐳 If you are familiar with Docker, see here for easier CUDA environment setup.

🚀 Run the package

You have two options:

  1. Train the traversability model with your own robot from scratch
  2. Use a pre-trained model to predict traversability

⚠️ Note: For optimal performance, we highly recommend training the model with your own robot's data. Your robot's unique sensor setup and motion dynamics are crucial for accurate traversability predictions, and our robot's configuration might differ from yours. For details on our settings, visit the urban-traversability-dataset repo.


The entire training-to-deployment pipeline consists of three steps:

  1. Label Generation: Generate the traversability label from the dataset.
  2. Model Training: Train the traversability model with the generated labels.
  3. Traversability Estimation: Predict or map terrain traversability with your own robot.

For rapid testing of the project, you can use the pre-trained checkpoints in #model-zoo and go directly to #traversability-estimation.


1. Label Generation

Launch ROS node

roslaunch lesta label_generation.launch

Generate labels with rosbag

Note: See #sample datasets for example rosbag files.

rosbag play {your-rosbag}.bag --clock -r 3

Save traversability labels

rosservice call /lesta/save_label_map "training_set" ""  # {filename} {directory}

The labeled height map will be saved as a single training_set.pcd file in the root directory of the package.
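If you want to inspect what was written, the PCD header lists the per-point fields. A quick peek needs no extra dependencies and makes no assumption about the exact field names:

# Print the header of the saved label map to see its per-point fields.
with open("training_set.pcd", "rb") as f:
    for raw in f:
        line = raw.decode("ascii", errors="replace").strip()
        print(line)
        if line.startswith("DATA"):  # the header ends at the DATA line
            break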


2. Model Training

Launch training script with parameters

Note: See pylesta/configs/lesta.yaml for more training details.

# Make sure your virtual environment is activated
cd LeSTA
python pylesta/tools/train.py --dataset "training_set.pcd"
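Note that the lesta_ros nodes load the model through LibTorch, which expects a TorchScript file. If your training output is a plain PyTorch checkpoint, an export step along these lines may be needed; the stand-in MLP below is only for illustration (the real architecture lives in pylesta, and the training script may already handle the export itself):

import torch
import torch.nn as nn

# Stand-in network; replace with the trained pylesta model.
model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))
example = torch.rand(1, 4)  # e.g. step, slope, roughness, curvature
torch.jit.trace(model, example).save("lesta_traced.pt")  # loadable from C++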

3. Traversability Estimation

Prerequisites

Configure the model_path variable in lesta_ros/config/*_node.yaml with your model checkpoint:

  • trav_prediction_node.yaml
  • trav_mapping_node.yaml

Note: See #model-zoo for our pre-trained checkpoints.
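If you prefer scripting the change, a rough sketch follows. It assumes model_path is a top-level key in the YAML, which may not match the actual file layout, and note that rewriting the file this way discards comments:

import yaml  # pip install pyyaml

cfg_path = "lesta_ros/config/trav_prediction_node.yaml"
with open(cfg_path) as f:
    cfg = yaml.safe_load(f)
cfg["model_path"] = "/path/to/your_checkpoint.pt"  # hypothetical path
with open(cfg_path, "w") as f:
    yaml.safe_dump(cfg, f)  # caution: safe_dump drops YAML comments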

Launch ROS node

We provide two options for traversability estimation:

Left: Robot-centric traversability prediction. Right: Real-time traversability mapping.

  1. Traversability Prediction

    • Robot-centric local traversability
    • Suitable for local motion planning
    • Scores traversability from model inference
  2. Traversability Mapping

    • Global traversability mapping
    • Suitable for global path planning
    • Updates traversability scores over time (a toy fusion rule is sketched below)
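The mapping node's exact score-update rule is not spelled out in this README; as a toy illustration of fusing per-scan predictions into a persistent cell score over time, an incremental mean looks like this:

# Toy per-cell fusion: fold each new prediction into a running mean.
def update_cell(mean: float, count: int, score: float):
    count += 1
    mean += (score - mean) / count
    return mean, count

# e.g. three scans observing the same cell
m, c = 0.0, 0
for s in (0.8, 0.6, 0.9):
    m, c = update_cell(m, c, s)  # m ends near 0.77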

How to run:

  1. For traversability prediction:

    roslaunch lesta traversability_prediction.launch
  2. For traversability mapping:

    roslaunch lesta traversability_mapping.launch

Test the node with rosbag

rosbag play {your-rosbag}.bag --clock -r 2

Sample datasets

  • Download rosbag files to test the package; the sample datasets are configured to run with the package's default settings.

See the urban-traversability-dataset repository for the rosbag files and more data samples.

Model Zoo

| Model | Description | Environment | Features | Download |
| --- | --- | --- | --- | --- |
| LeSTA-parking-lot | Model trained on the parking lot dataset | Urban (parking lot with low-height curbs) | Step, Slope, Roughness, Curvature | |
| LeSTA-campus-road | Model trained on the campus road dataset | Urban (campus roads with flat terrain and hills) | Step, Slope, Roughness, Curvature | |
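The four features are standard height-map statistics. As an illustration of common definitions (not necessarily the exact formulas used by LeSTA), they can be computed on a small patch around each cell:

import numpy as np

# Illustrative terrain features on a (k, k) patch of cell heights [m].
def terrain_features(patch: np.ndarray, resolution: float):
    step = float(patch.max() - patch.min())       # max height difference
    gy, gx = np.gradient(patch, resolution)       # finite differences
    slope = float(np.degrees(np.arctan(np.hypot(gx, gy))).mean())
    roughness = float(patch.std())                # height deviation
    lap = (np.roll(patch, 1, 0) + np.roll(patch, -1, 0) +
           np.roll(patch, 1, 1) + np.roll(patch, -1, 1) - 4 * patch)
    curvature = float(np.abs(lap[1:-1, 1:-1]).mean() / resolution**2)
    return step, slope, roughness, curvature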

Using Docker

To be updated...

Issues

  • Artifacts from dynamic objects:

    • We currently implement a raycasting-based approach to remove artifacts left by dynamic objects.
    • This is crucial for an accurate static terrain representation, which directly impacts prediction quality.
    • However, it is not yet enough to handle all artifacts (a simplified sketch of the raycasting idea follows this list).
    • We are working on more robust methods to detect and filter dynamic objects in real-time.
  • Performance degradation due to noisy height mapping:

    • Traversability is learned and predicted from a dense height map.
    • The dense height map is built by accumulating many sparse LiDAR scans.
    • Good SLAM / 3D pose estimation is therefore required to obtain a good height map.
    • In typical settings, FAST-LIO2, LIO-SAM, or CT-ICP are good starting points.
    • We are working on improving the height mapping accuracy.
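As a rough illustration of the raycasting idea mentioned above (the package's actual implementation is in C++ and certainly differs), clearing the grid cells a LiDAR beam passed through looks like:

import numpy as np

# Clear cells along a ray from the sensor cell to the measured hit cell;
# cells the beam passed through cannot contain static obstacles.
def clear_ray(grid: np.ndarray, start, end):
    (r0, c0), (r1, c1) = start, end
    n = max(abs(r1 - r0), abs(c1 - c0))
    for i in range(n):  # stop before the endpoint: the hit cell stays occupied
        t = i / n
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        grid[r, c] = 0  # free stale (possibly dynamic) observations
    return grid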

πŸ“ Citation

If this project helps your research, please consider citing our paper:

Ikhyeon Cho and Woojin Chung, 'Learning Self-Supervised Traversability With Navigation Experiences of Mobile Robots: A Risk-Aware Self-Training Approach', IEEE Robotics and Automation Letters, vol. 9, no. 5, pp. 4122-4129, 2024.

@article{cho2024learning,
  title={Learning Self-Supervised Traversability With Navigation Experiences of Mobile Robots: A Risk-Aware Self-Training Approach}, 
  author={Cho, Ikhyeon and Chung, Woojin},
  journal={IEEE Robotics and Automation Letters}, 
  year={2024},
  volume={9},
  number={5},
  pages={4122-4129},
  doi={10.1109/LRA.2024.3376148}
}

You can also check out the paper of our baseline:

Hyunsuk Lee and Woojin Chung, 'A Self-Training Approach-Based Traversability Analysis for Mobile Robots in Urban Environments', IEEE International Conference on Robotics and Automation (ICRA), pp. 3389-3394, 2021.

@inproceedings{lee2021self,
  title={A self-training approach-based traversability analysis for mobile robots in urban environments},
  author={Lee, Hyunsuk and Chung, Woojin},
  booktitle={2021 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={3389--3394},
  year={2021},
  organization={IEEE}
}

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Contact

For any questions or feedback, feel free to contact us or open an issue on GitHub!