News: EPH has been accepted at IROS 2024!
[Optional] Create a virtual environment:

```bash
conda create -n eph python=3.11
conda activate eph
```
Install the repo locally (with the requirements listed in `pyproject.toml`):

```bash
pip install -e '.[all]'
```

Note: remove `[all]` if you don't want to install the optional dependencies.
Training and testing both load a configuration file. Under `configs/` you can find the default configuration file `eph.py`. To switch to a different configuration or create a new one, export the `CONFIG` environment variable set to the desired configuration name without the `.py` extension:

```bash
export CONFIG=eph
```
To train the model, you can use the following command:

```bash
python train.py
```

To test the model, you can use the following command:

```bash
python test.py
```
We made the configuration loading dynamic, so multiple configurations for different experiments can coexist under `configs/`. Before running any script, you can change which configuration is loaded by changing the `CONFIG_NAME` variable in the `config.py` file:

```python
CONFIG_NAME = 'eph'
```

For example, the above will load the default configuration file `configs/eph.py`.
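For reference, here is a minimal sketch of how such dynamic loading can be implemented; the repo's actual `config.py` may differ in details, and the `CONFIG` environment-variable override and `importlib` usage below are assumptions for illustration:

```python
import importlib
import os

# Default configuration name; edit this line to switch experiments.
CONFIG_NAME = 'eph'

# Assumption: allow the CONFIG environment variable to override CONFIG_NAME.
CONFIG_NAME = os.environ.get('CONFIG', CONFIG_NAME)

# Dynamically import configs/<CONFIG_NAME>.py as a module.
_config_module = importlib.import_module(f'configs.{CONFIG_NAME}')

# Expose the configuration module's public attributes here, so other scripts
# can simply `import config` and read the selected experiment's settings.
globals().update(
    {k: v for k, v in vars(_config_module).items() if not k.startswith('_')}
)
```

With a pattern like this, `train.py` and `test.py` only need to import `config` and automatically pick up whichever experiment configuration is currently selected.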
To change the model, we made sure that the model path is also read from the configuration file. You can change the target by setting:

```python
model_target = "model.Network"
```

This will load the `Network` class from the `model.py` module.
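A dotted path like this is typically resolved with `importlib` plus `getattr`. Below is a minimal sketch, assuming the target string always has the form `"<module>.<ClassName>"`; the repo's actual resolver may differ:

```python
import importlib

def load_target(model_target: str):
    """Resolve a dotted path such as "model.Network" to the object it names."""
    module_name, class_name = model_target.rsplit('.', 1)
    module = importlib.import_module(module_name)  # e.g. imports model.py
    return getattr(module, class_name)             # e.g. fetches the Network class

# Hypothetical usage with the default target from the configuration:
# Network = load_target("model.Network")
# net = Network()
```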
Go to `src/data/` and follow the instructions in its `README.md` to generate the MovingAI test set.
Our codebase is heavily based on DHC (https://github.com/ZiyuanMa/DHC) and DCC (https://github.com/ZiyuanMa/DCC). We drew inspiration from SCRIMP for our communication block (https://github.com/marmotlab/SCRIMP) and reimplemented the structured-map experiments on MovingAI datasets from SACHA (https://github.com/Qiushi-Lin/SACHA).
We are also looking into implementing MAPF on modern platforms (e.g., TorchRL environments and integration with RL4CO) once we have some bandwidth to do so!
(Demo video: eph-video.mp4)
If you find our code or work (or hopefully both!) helpful, please consider citing us:
```bibtex
@inproceedings{tang2024eph,
  title={Ensembling Prioritized Hybrid Policies for Multi-agent Pathfinding},
  author={Tang, Huijie and Berto, Federico and Park, Jinkyoo},
  booktitle={2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  organization={IEEE},
  year={2024},
  note={\url{https://github.com/ai4co/eph-mapf}}
}
```