Interaction ProMPs generate a collaborative robot motion based on a prediction from a set of partial human motion observations.
The approach also works in multi-task scenarios. This package uses EMG signals to enhance task recognition.
We are not yet sure whether the EMG signals are correlated with the robot motion; we will confirm this later.
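As a rough sketch of the underlying idea (not this package's actual API), a ProMP models a trajectory as a weighted sum of basis functions and conditions the weight distribution on the partial observations to predict the rest of the motion. All names and values below (`Phi`, `mean_w`, `cov_w`, `sigma_y`, the toy observations) are illustrative assumptions:

```python
import numpy as np

# Hypothetical ProMP quantities: trajectory y(t) = Phi(t) . w, with
# weights w ~ N(mean_w, cov_w) learned from demonstrations.
T, K = 100, 10
t = np.linspace(0.0, 1.0, T)
centers = np.linspace(0.0, 1.0, K)
Phi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / 0.1) ** 2)
Phi /= Phi.sum(axis=1, keepdims=True)       # (T, K) normalized Gaussian bases

mean_w = np.zeros(K)                        # would come from training
cov_w = np.eye(K)                           # would come from training
sigma_y = 0.01                              # observation noise variance

# Condition the weight distribution on the first 20 observed samples.
obs_idx = np.arange(20)
y_obs = np.sin(2 * np.pi * t[obs_idx])      # toy partial human motion
for i, y in zip(obs_idx, y_obs):
    phi = Phi[i]                                        # basis row at step i
    s = phi.dot(cov_w).dot(phi) + sigma_y               # innovation variance
    gain = cov_w.dot(phi) / s                           # Kalman-style gain
    mean_w = mean_w + gain * (y - phi.dot(mean_w))      # posterior mean
    cov_w = cov_w - np.outer(gain, phi.dot(cov_w))      # posterior covariance

# Predicted full trajectory; in Interaction ProMPs the robot dimensions are
# predicted the same way through the correlated human-robot weight model.
y_pred = Phi.dot(mean_w)
```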
Several dependencies for this package:
- Python >=2.6
- NumPy
- sklearn
- SciPy >= 0.19.1
- pandas
- openni_launch
- aruco_hand_eye
- openni_tracker
- myo_driver
- states_manager
SciPy needs to be upgraded, in particular to use the probability Python module.
- Install gfortran (gfortran-5 may also be needed as a dependency).
- Upgrade SciPy with `sudo easy_install --upgrade scipy` (it is unclear whether NumPy also needs upgrading via `sudo easy_install --upgrade numpy`).
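To verify the upgrade, a quick check (assuming the probability module in question is `scipy.stats`):

```python
import scipy
import scipy.stats

# SciPy >= 0.19.1 is required; scipy.stats provides the probability
# distributions (e.g. multivariate_normal) this kind of model relies on.
print(scipy.__version__)
mvn = scipy.stats.multivariate_normal(mean=[0.0, 0.0],
                                      cov=[[1.0, 0.0], [0.0, 1.0]])
print(mvn.pdf([0.0, 0.0]))   # should print ~0.1592, i.e. 1 / (2 * pi)
```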
The reference tutorial is ref.
├── cfg
│ └── params.cfg
├── datasets
│ └── dataset_name
│ ├── info
│ ├── pkl
│ └── raw
├── scripts
│ ├── bag_to_csv.sh
│ ├── batch_bag2csv.py
│ ├── data_visualization.py
│ ├── ipromps_lib.py
│ ├── load_data.py
│ ├── noise_cov_cal.py
│ ├── test_online.py
│ ├── train_models.py
│ └── train_offline.py
├── README.md
├── CMakeLists.txt
└── package.xml
`params.cfg`: the configuration file containing all parameters.
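The exact contents of `cfg/params.cfg` are package-specific; assuming a standard INI-style layout, it could be read like this (the section and option names here are made up for illustration):

```python
# Python 2 style (the package targets Python >= 2.6); on Python 3 the
# module is named configparser. Section/option names are hypothetical.
import ConfigParser

cfg = ConfigParser.ConfigParser()
cfg.read('cfg/params.cfg')

num_basis = cfg.getint('model', 'num_basis')            # hypothetical option
noise_var = cfg.getfloat('model', 'observation_noise')  # hypothetical option
```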
The datasets path contains the raw demonstration data in the `raw` path, the intermediate data in the `pkl` path, and notes in the `info` path.
The scripts load the data, train the models, and test them online.
- `bag_to_csv.sh`: a script to convert a single rosbag to csv; called by `batch_bag2csv.py`
- `batch_bag2csv.py`: a batch Python script to convert rosbags to csv; run it in a terminal since it uses some ROS shell scripts
- `ipromps_lib.py`: the library for Interaction ProMPs, including the unit ProMP, ProMPs, and Interaction ProMPs
- `load_data.py`: loads the data from csv files, filters it, and resamples all demonstrations to the same duration (see the resampling sketch after this list)
- `train_models.py`: trains the models from the loaded data
- `train_offline.py`: trains the Interaction ProMPs (loads data and trains models) by calling `load_data.py` and `train_models.py`
- `data_visualization.py`: data visualization
- `noise_cov_cal.py`: estimates the observation noise covariance matrix
- `test_online.py`: tests the trained models online
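Resampling all demonstrations to a common duration, as `load_data.py` does, can be sketched with linear interpolation; this is an illustrative snippet, not the package's exact implementation:

```python
import numpy as np

def resample_demo(demo, num_samples=101):
    """Resample one demonstration (a T x D array) onto num_samples steps."""
    T, D = demo.shape
    old_t = np.linspace(0.0, 1.0, T)            # normalized original time
    new_t = np.linspace(0.0, 1.0, num_samples)  # common time grid
    # Interpolate each dimension independently onto the common grid.
    return np.column_stack([np.interp(new_t, old_t, demo[:, d])
                            for d in range(D)])

# Demonstrations of different lengths mapped to the same duration.
demos = [np.random.randn(80, 3), np.random.randn(120, 3)]
aligned = [resample_demo(d) for d in demos]     # each becomes (101, 3)
```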
All commands are run in the baxter.sh environment.
- `roslaunch openni_launch openni.launch`: open the Xtion camera
- `roslaunch aruco_hand_eye baxter_xtion_cal_pub.launch`: load the Baxter-Xtion calibration result
- `rosrun rviz rviz`: open the ROS visualization window
- `rosrun openni_tracker openni_tracker`: start the human skeleton tracking node
- `roslaunch myo_driver myo_raw_pub.launch`: start the Myo armband node
- `rosrun states_manager states_pub.py`: start the states manager node
Please read these notes every time before collecting demonstration data.
- Check the csv files derived from the rosbags
- Start recording the dataset and demonstrating the task at the same time
- Demonstrate the task with spatio-temporal variance; limit the demonstration motion space and use simple motions
- Avoid overfitting:
  - cleaning/pruning
  - decrease unexpected noise: no visual occlusion of the human motion tracking
  - keep consistent: not easy in practice. One option is to demonstrate a simple trajectory, e.g. moving the hand toward the destination along a straight line as closely as possible; the trajectories should not be too complex
  - validation: a guideline for choosing a suitable training set
  - increase the number of demonstrations: 10-15 demonstrations works better
Take note of the following for use during testing:
- Armband wearing position
- Object grasp position
- Video of the collaborative task process
- We need enough observations for conditioning: phase estimation fails (all observations stack at the beginning) when there are too few observations (see the sketch below)
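To illustrate the last point, here is a toy phase estimator that picks the temporal scale whose rescaled mean trajectory best matches the observed prefix (not the package's actual method; all names and data are illustrative). With only one or two observed samples, many scales fit equally well, so the estimate degenerates:

```python
import numpy as np

def estimate_phase(y_obs, mean_traj, candidate_scales):
    """Return the temporal scale whose rescaled mean trajectory best
    matches the observed prefix (sum of squared errors)."""
    T = len(mean_traj)
    n = len(y_obs)
    best_scale, best_err = None, np.inf
    for s in candidate_scales:
        # Indices on the mean trajectory that the n observed samples
        # would hit if the execution ran at scale s.
        idx = np.minimum((np.arange(n) * s).astype(int), T - 1)
        err = np.sum((mean_traj[idx] - y_obs) ** 2)
        if err < best_err:
            best_scale, best_err = s, err
    return best_scale

mean_traj = np.sin(np.linspace(0.0, np.pi, 100))   # learned mean motion
y_obs = np.sin(np.linspace(0.0, np.pi, 200))[:30]  # slower run, 30 samples seen
print(estimate_phase(y_obs, mean_traj, [0.25, 0.5, 1.0, 2.0]))  # -> 0.5
```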