continual-learning-regions

Repository for continual learning of regions to enhance loop closure detection and relocalization in the context of robotic SLAM.

Set up the environment

  1. Download the repository
  2. Create a conda environment called clr (short for continual learning regions) from the provided environment.yaml file with the command conda env create -f environment.yaml
  3. Activate the environment with the command conda activate clr
  4. Inside the avalanche folder, run the command pip install -e . to install the version of Avalanche adapted for the experiments (the full sequence of commands is sketched below)
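For reference, the whole setup can be condensed into the following shell commands. This is a sketch assuming you clone the MI-BioLab/continual-learning-regions repository from GitHub and that environment.yaml sits in the repository root; adjust the paths if your checkout differs.

# 1. download the repository and enter it
git clone https://github.com/MI-BioLab/continual-learning-regions.git
cd continual-learning-regions

# 2-3. create and activate the conda environment
conda env create -f environment.yaml
conda activate clr

# 4. install the adapted Avalanche in editable mode
cd avalanche
pip install -e .
cd ..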

Download the data

USyd

You can download USyd from here and follow the instructions from here to use it. We suggest using Docker; you can find an image inside the docker folder in this repository.

KITTI

You can download KITTI from here. To run KITTI with rtabmap_ros, you can create the rosbags using this package.

OpenLoris-Scene

You can download OpenLoris-Scene from this page. To run the experiments you need both the packages and the rosbags.

St. Lucia Multiple Times of Day

You can download the ten sequences from here.

Run the experiments

Inside the experiments folder you can change the settings by editing the files inside the config folder. To run the experiments, adapt main.py inside the src folder as needed and run it with the command python src/main.py
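A minimal run could then look like this (a sketch assuming the config and src folders both sit inside experiments; adjust the paths to your layout):

conda activate clr
cd experiments
# edit the files inside the config folder to select the desired settings,
# and adapt src/main.py if needed, then launch the experiment
python src/main.py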

Run RTAB-Map

To run RTAB-Map, we used Docker. Inside the docker folder there is an rtabmap folder containing a Dockerfile you can use to build a Docker image. See docker/README.md for a step-by-step guide to building and running the image. Inside the container, the rtabmap folder is under /root/SLAM/programs (i.e. ~/SLAM/programs), while rtabmap_ros is already inside ~/catkin_ws/src.
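As a rough sketch (the authoritative steps are in docker/README.md; the image and container names used here are placeholders), building and starting the container from the repository root looks like this:

# build the image from the Dockerfile in docker/rtabmap
docker build -t rtabmap-clr docker/rtabmap

# start an interactive container; add --gpus all only if you plan to use CUDA
docker run -it --name rtabmap-clr-container rtabmap-clr bash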

After running the container, you have to install the libtorch C++ library (we used libtorch 1.13.1 with CUDA 11.6). In our experiments we never used the GPU, so if you only want to reproduce the experiments, we suggest installing the CPU-only build of libtorch.

You need to check whether your GPU and NVIDIA drivers are compatible with CUDA 11.6.
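A quick check is nvidia-smi, whose header reports the installed driver and the highest CUDA version that driver supports:

nvidia-smi
# the "CUDA Version" field in the header must be 11.6 or higher for the CUDA 11.6 build of libtorch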

  • You can install CUDA 11.6 and cuDNN 8.9.7 for CUDA 11.x.
  • You can download libtorch-cpu here.
  • You can download libtorch with CUDA 11.6 support from here.

Install everything inside the docker container.
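For the CPU-only build, for example, the installation boils down to downloading and unpacking the archive inside the container. This is a sketch: the URL follows PyTorch's usual naming scheme for libtorch 1.13.1 and the target directory is only a suggestion, so adjust both as needed.

cd /root/SLAM/programs
# download the CPU-only libtorch 1.13.1 archive and unpack it
wget -O libtorch-1.13.1-cpu.zip https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-1.13.1%2Bcpu.zip
unzip libtorch-1.13.1-cpu.zip
# the libraries now live in /root/SLAM/programs/libtorch/lib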

After installing libtorch, you need to add the following line to your .bashrc file (/root/.bashrc if you used Docker):
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/libtorch/lib/

If you also installed CUDA, add these lines to your .bashrc file as well:

export CUDA_HOME=/usr/local/cuda
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64
export PATH=$PATH:$CUDA_HOME/bin

In your shell, run the command
source /path/to/.bashrc

Now, in the same shell, run the command
ldconfig
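As a quick sanity check that the dynamic linker will find the libraries (using the same paths as above):

echo $LD_LIBRARY_PATH                  # should contain /path/to/libtorch/lib/
ls /path/to/libtorch/lib/libtorch*.so  # the shared libraries that will be loaded at runtime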

Now, you can install rtabmap using the following commands:

cd path/to/rtabmap
mkdir build
cd build
cmake -DOpenCV_DIR=/usr/local/lib/cmake/opencv4 .. 
make -j4 
make install
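If CMake does not locate libtorch on its own, the standard way to point it at the installation is CMAKE_PREFIX_PATH (a generic CMake mechanism, not something specific to this repository), for example:

cmake -DCMAKE_PREFIX_PATH=/path/to/libtorch -DOpenCV_DIR=/usr/local/lib/cmake/opencv4 ..
make -j4
make install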

Then you need to build catkin_ws with rtabmap_ros using the following commands:

cd ~/catkin_ws 
catkin_make -DOpenCV_DIR=/usr/local/lib/cmake/opencv4 -DRTABMAP_SYNC_MULTI_RGBD=ON -DRTABMAP_SYNC_USER_DATA=ON -j4

If you have problems building catkin_ws, try this:

cd ~/catkin_ws/src/rtabmap_ros 
. /opt/ros/melodic/setup.bash
cd ~/catkin_ws 
catkin_make -DOpenCV_DIR=/usr/local/lib/cmake/opencv4 -DRTABMAP_SYNC_MULTI_RGBD=ON -DRTABMAP_SYNC_USER_DATA=ON -j4
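Once the build succeeds, source the workspace in every shell where you want to use rtabmap_ros; a typical launch then looks like the following sketch (the exact launch file and arguments depend on the dataset you run, see docker/README.md):

source ~/catkin_ws/devel/setup.bash
# example: start rtabmap_ros with its default launch file (ROS Melodic)
roslaunch rtabmap_ros rtabmap.launch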
