MPNet algorithm implemented and tested for use with the Baxter Research Robot in a set of realistic obstacle scenes for motion planning experiments.
Install ROS and the necessary ROS packages below in a catkin workspace.
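As a rough sketch of the system setup (assuming Ubuntu 16.04 with ROS Kinetic, the distribution commonly paired with Baxter; substitute your ROS distribution and the exact package list for this repo), the core ROS and MoveIt! dependencies can be installed with apt:
sudo apt-get install ros-kinetic-desktop-full ros-kinetic-moveit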
Navigate to wherever you keep your Python virtual environments and create a new one for PyTorch with Python 2.7 and the other Python packages required to run (replace $PYTHON2_PATH with the absolute path to your system's Python 2.7 executable -- for example, /usr/bin/python). See this link for more details on Python virtual environments.
virtualenv pytorch-python2 -p $PYTHON2_PATH
After activating the virtual environment, install the dependencies from the requirements.txt file found in the root of this repository
source /path/to/pytorch-python2/bin/activate
pip install -r requirements.txt
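To quickly verify the install, PyTorch should be importable inside the activated environment (a generic check, not specific to this repo):
python -c "import torch; print(torch.__version__)"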
In a catkin workspace, clone the repo within the source folder and build the workspace
cd /path/to/catkin_workspace/src/
git clone https://github.com/anthonysimeonov/baxter_mpnet_experiments.git
catkin build
Navigate to the data folder, then download and unzip this file to obtain a sample dataset of paths, collision-free targets, and point clouds. Navigate to the models folder, then download and unzip this file to obtain a set of trained neural network models that produce results similar to those described in the paper.
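As a sketch of where the archives should end up (the archive names here are placeholders for the files linked above):
cd /path/to/catkin_workspace/src/baxter_mpnet_experiments
unzip /path/to/path_data_archive.zip -d data/
unzip /path/to/models_archive.zip -d models/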
Configure whatever expert planner you want to use by modifying the ompl_planning.yaml file in the baxter_moveit_config package (found under the config directory; RRTStarkConfig works well).
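For reference, a minimal way to locate and open the file for editing (assuming the workspace has been sourced; $EDITOR stands in for your editor of choice):
roscd baxter_moveit_config
$EDITOR config/ompl_planning.yaml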
Rebuild the workspace if any changes here have been made, then launch the environment.
roslaunch baxter_mpnet_experiments baxter-mpnet.launch
and then run the training data generation script
python path_data_generation.py
Make sure the virtual environment is sourced
source /path/to/environments/pytorch-python2/bin/activate
and then run training, making sure run_training.sh has been made executable first.
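If needed, it can be marked executable with the standard chmod command:
chmod +x run_training.sh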
./run_training.sh
Make sure that the Baxter planning scene is launched and the PyTorch Python 2 virtual environment is sourced, and then run the test script
./run_testing.sh
During testing, the MPNet algorithm will plan paths and a collision checker will verify whether they are feasible. The paths are then stored in the local path_samples directory and can be played back on the simulated Baxter with the path_playback_smooth.py script. This is a very minimal working example of playing a saved path from start to goal, so USE WITH CAUTION --- ESPECIALLY IF RUNNING ON THE REAL ROBOT.
python path_playback_smooth.py
If the local system is incompatible with any of the supporting packages/libraries, the experiments can alternatively be run in a container that has all the system dependencies set up. The Docker image requires the local system to have a GPU and Nvidia drivers compatible with CUDA 9.0 (it can easily be adapted to work with CPU-only systems). To build the container, first download and unzip this folder in the docker directory, which contains some of the resources necessary to use CUDA in the container, then navigate to the docker folder and execute the command
docker build -t baxter-moveit-docker .
Once the container has been built, navigate to the root directory and run the run_image.bash executable. This will run the Docker image and open a new terminal inside the container, with this repository and all of its source code for running the experiments mounted inside the image. All the scripts and environments can then be run inside the container (see below for details).
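From the repository root:
./run_image.bash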
In the terminal opened after launching the image, follow the steps below to set up the MoveIt! planning environment.
catkin build
source devel/setup.bash
roslaunch baxter_mpnet_experiments baxter_mpnet.launch
Then, in a new terminal, enter the container with the following command (replace $CONTAINER_NAME with whatever name was assigned to the container that was just started)
docker exec -it $CONTAINER_NAME bash
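If the assigned name is not known, the running containers and their names can be listed with
docker ps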
Once inside the container,
source devel/setup.bash
roscd baxter_mpnet_experiments
All of the MPNet scripts can then be run as described in the section above.