This project demonstrates an interactive character path-following system using long-horizon motion matching with revised future queries. The goal is real-time character navigation with natural, smooth movement.
This project is based on the research paper titled "Interactive Character Path-Following Using Long-Horizon Motion Matching With Revised Future Queries" by Jeongmin Lee, Taesoo Kwon, and Yoonsang Lee, published in IEEE Access in 2023.
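To give a rough intuition for the core idea, here is a minimal sketch of a motion-matching query: at each decision point, the system searches a database of motion-feature vectors for the frame closest to the current query features. This is an illustrative nearest-neighbor toy, not the paper's long-horizon formulation with revised future queries; the function and feature layout are hypothetical.

```python
import numpy as np

def motion_matching_query(database, query):
    """Return the index of the database frame whose feature vector
    best matches the query (smallest Euclidean distance)."""
    diffs = database - query              # (N, D) broadcast against (D,)
    costs = np.sum(diffs ** 2, axis=1)    # squared distance per frame
    return int(np.argmin(costs))

# Toy example: 4 frames with 3-D features
db = np.array([[0., 0., 0.],
               [1., 0., 0.],
               [0., 1., 0.],
               [1., 1., 1.]])
best = motion_matching_query(db, np.array([0.9, 0.1, 0.0]))
# best == 1 (the frame [1, 0, 0] is closest)
```

In the actual system, the feature vectors also encode future trajectory information over a long horizon, which is what makes the revised future queries matter.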
This project requires the following prerequisites:

```
pip3 install -U -f https://extras.wxpython.org/wxPython4/extras/linux/gtk3/your_linux wxPython
pip3 install pybind11
sudo apt-get install libeigen3-dev
```
This project uses the dataset of Phase-Functioned Neural Networks for Character Control, which was developed by Holden, D., Komura, T., & Saito, J. (2017).
You can download the PFNN dataset directly from this link. The necessary motion files are under `pfnn/data/animations`.
Our demo excludes jumping, T-poses, and walking on uneven terrain. The included files are:
LocomotionFlat01_000.bvh
LocomotionFlat01_000_mirror.bvh
LocomotionFlat02_000.bvh
LocomotionFlat02_000_mirror.bvh
LocomotionFlat02_001.bvh
LocomotionFlat02_001_mirror.bvh
LocomotionFlat03_000.bvh
LocomotionFlat03_000_mirror.bvh
LocomotionFlat05_000.bvh
LocomotionFlat05_000_mirror.bvh
LocomotionFlat06_000.bvh
LocomotionFlat06_000_mirror.bvh
LocomotionFlat06_001.bvh
LocomotionFlat06_001_mirror.bvh
LocomotionFlat07_000.bvh
LocomotionFlat07_000_mirror.bvh
LocomotionFlat08_000.bvh
LocomotionFlat08_000_mirror.bvh
LocomotionFlat08_001.bvh
LocomotionFlat08_001_mirror.bvh
LocomotionFlat10_000.bvh
LocomotionFlat10_000_mirror.bvh
Copy those BVH files into a new directory somewhere under this repository (for example, `/path/to/repo/BvhData/`).
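If you prefer to script the copy step, the following sketch collects each clip and its mirrored version. The source and destination paths are assumptions; adjust them to your PFNN download and repository layout.

```python
import shutil
from pathlib import Path

# Flat-ground locomotion clips used by the demo; each also has a
# "_mirror" variant in the PFNN dataset.
CLIPS = ["LocomotionFlat01_000", "LocomotionFlat02_000",
         "LocomotionFlat02_001", "LocomotionFlat03_000",
         "LocomotionFlat05_000", "LocomotionFlat06_000",
         "LocomotionFlat06_001", "LocomotionFlat07_000",
         "LocomotionFlat08_000", "LocomotionFlat08_001",
         "LocomotionFlat10_000"]

def copy_demo_clips(src, dst):
    """Copy each clip and its mirrored version from src to dst."""
    src, dst = Path(src), Path(dst)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for clip in CLIPS:
        for suffix in ("", "_mirror"):
            name = f"{clip}{suffix}.bvh"
            shutil.copy(src / name, dst / name)
            copied.append(name)
    return copied

# Example (paths are hypothetical):
# copy_demo_clips("pfnn/data/animations", "BvhData")
```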
First, clone this repository to your local machine and check out the `tracking` branch:

```
git clone https://github.com/jmmhappy/2dt_match.git
git checkout tracking
```
Make sure your BVH folder includes foot contacts. If it does not, generate them by running `python3 parseFootContact.py BvhData/` in a terminal.
Next, generate a data binary from the dataset by running the following two lines in a Python3 shell:

```
from util.bvhMenu import generate
generate("/path/to/repo/BvhData/", "output.bin", True)
```

Note that the `True` option generates a motion matching database.
Lastly, you may optionally train a future direction network (RNN). The command below will generate a binary:

```
python3 trainNetwork.py -d <data binary> -o <output rnn binary> -w <window size>
```
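For intuition, a future direction network maps a window of recent root motion to a predicted future facing direction. The sketch below is a minimal untrained Elman-style RNN forward pass in NumPy; the architecture, dimensions, and class name are illustrative assumptions, not the network `trainNetwork.py` actually builds.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyRNN:
    """Minimal Elman RNN sketch: given a window of past 2-D root
    directions, predict the next facing direction as a unit vector."""
    def __init__(self, in_dim=2, hidden=16):
        # Randomly initialized (untrained) weights, for illustration only.
        self.Wx = rng.normal(0, 0.1, (hidden, in_dim))
        self.Wh = rng.normal(0, 0.1, (hidden, hidden))
        self.Wo = rng.normal(0, 0.1, (2, hidden))

    def predict(self, window):
        h = np.zeros(self.Wh.shape[0])
        for x in window:                       # unroll over the window
            h = np.tanh(self.Wx @ x + self.Wh @ h)
        out = self.Wo @ h
        return out / (np.linalg.norm(out) + 1e-8)  # normalize to unit length

# Example: feed a window of 8 identical "facing +x" directions
direction = TinyRNN().predict(np.tile(np.array([1., 0.]), (8, 1)))
```

In practice the trained network's window size corresponds to the `-w` option above.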
The C++ extension is written with pybind11, so make sure it is installed before proceeding. To compile it, run the following commands:

```
cd util
mkdir build
cd build
cmake ..
make check -j 4
mv <output> ..
```
Notes:

- If CMake cannot `find_package(pybind11)`, try `pip3 install "pybind11[global]"`, or manually pass the package path as a CMake option, for example `cmake .. -Dpybind11_DIR=/path/to/pybind11/share/cmake/pybind11`.
- If the compiler cannot find `Eigen/Core`, try making a symlink: `sudo ln -s /usr/include/eigen3/Eigen /usr/include/Eigen`.
Now that you have the binary file and the necessary dependencies set up, you can run the main program, `application.py`, by executing the following command in a terminal:

```
python3 application.py --data <data binary> --weights <rnn binary>
```

Note that the RNN weights are optional.
Notes:

- If a "No context" error appears, prefix the command with `PYOPENGL_PLATFORM=egl`.
To contribute to this project, first create a fork on GitHub and then submit changes as a pull request to the original repository.