Cheng Guo, Lorenzo Rapetti, Kourosh Darvish, Riccardo Grieco, Francesco Draicchio and Daniele Pucci
📹 Supplementary video: Humanoid_2023_video_for_paper.mp4
2023 IEEE-RAS International Conference on Humanoid Robots (Humanoids)
🔓 The labeled dataset (as txt files), the raw wearables dataset, and the models can be downloaded here.
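The labeled txt files can be inspected with standard Python tooling. Below is a minimal sketch, assuming each row holds whitespace-separated feature values with an integer action label in the last column; the file name and column layout are illustrative, not the dataset's documented format:

```python
# Minimal sketch for loading one labeled txt file.
# Assumption: whitespace-separated values, last column = integer action label.
import numpy as np

data = np.loadtxt("labeled_sequence_01.txt")   # hypothetical file name
features = data[:, :-1]                        # per-frame sensor/joint features
labels = data[:, -1].astype(int)               # per-frame action indices

print(f"{data.shape[0]} frames, {features.shape[1]} features")
print("label counts:", dict(zip(*np.unique(labels, return_counts=True))))
```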
- YARP: a library and toolkit for communication and device interfaces.
- YCM: a set of CMake files that support the creation and maintenance of repositories and software packages.
- CMake: an open-source, cross-platform family of tools designed to build, test and package software.
- HDE: a collection of YARP devices for the online estimation of the kinematics and dynamics of a human subject.
- iDynTree: a library of robot dynamics algorithms for control, estimation and simulation.
- Wearables: a library for communication and interfaces with wearable sensors.
- iFeel: a wearable perception system providing kinematic (positions and velocities) and dynamic human information.
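These components exchange data over YARP ports. As a quick sanity check that human data is actually being streamed, one can read a port from Python with the YARP bindings; the sketch below is generic, and both port names are only examples:

```python
# Generic YARP port reader (sketch); port names are illustrative.
import yarp

yarp.Network.init()

port = yarp.BufferedPortBottle()
port.open("/reader/human_data:i")                       # local port name (example)
yarp.Network.connect("/HDE/HumanStateWrapper/state:o",  # source port (example)
                     "/reader/human_data:i")

bottle = port.read(True)   # blocking read of one message
if bottle is not None:
    print(bottle.toString())

port.close()
yarp.Network.fini()
```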
🔥 Ubuntu 20.04.5 LTS (Focal Fossa) is used in this project.
First, clone this repository:
git clone https://github.com/ami-iit/paper_Guo_2023_Humanoid_Action_Recognition_For_Risk_Prediction.git
To annotate the data, one may follow the instructions below:
- Launch the yarpserver:
yarpserver --write
- Run yarpdataplayer with:
yarpdataplayer --withExtraTimeCol 2
- Go to ~/robotology-superbuild/src/HumanDynamicsEstimation/conf/xml and run the configuration file (useHuman.xml for the full joints list, useHumanStateProvider_ifeel_0.xml for the reduced joints list):
yarprobotinterface --config proper-configuration-file.xml
- Before going to ~/element_human-action-intention-recognition/build/install/bin, make sure the previously installed virtual environment is activated; then run (check that all parameters in humanDataAcquisition.ini are set properly):
./humanDataAcquisitionModule --from humanDataAcquisition.ini
- To start the annotation, you may need to visualize the human model by running (also make sure the parameters in HumanPredictionVisualizer.ini are set correctly):
./HumanPredictionVisualizer --from HumanPredictionVisualizer.ini
Recalling the index of each action defined here, one can annotate the data manually.
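As a quick check on the annotation output, one can count how many frames were assigned to each action index. This is a sketch under the same assumption as above (the label sits in the last column of the saved txt file), and the file name is again illustrative:

```python
# Count frames per action index in an annotated txt file (sketch).
# Assumption: the annotation label is stored in the last column.
from collections import Counter

counts = Counter()
with open("annotated_data.txt") as f:          # hypothetical file name
    for line in f:
        fields = line.split()
        if fields:                             # skip empty lines
            counts[int(float(fields[-1]))] += 1

for action_index, n_frames in sorted(counts.items()):
    print(f"action {action_index}: {n_frames} frames")
```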
- First of all, make sure yarpserver is running.
- Open yarpdataplayer to replay data.
- Go to ~/robotology-superbuild/src/HumanDynamicsEstimation/conf/xml and run the configuration file (for the reduced joints list with 31 DoFs) with:
yarprobotinterface --config configuration_file_name.xml
- Then go to ~/element_human-action-intention-recognition/build/install/bin and run:
./humanDataAcquisitionModule --from humanDataStreamingOnlineTest.ini
- (Remember to activate the virtual environment.) Go to ~/element_human-action-intention-recognition and run (a generic mixture-of-experts sketch is given after this list):
python3 ./scripts/MoE/main_test_moe.py
- (Remember to activate the virtual environment.) Additional: to display the action recognition/motion prediction results, go to ~/element_human-action-intention-recognition_modified/scripts/MoE and run:
bash ./runAnimators.sh
- Additional: for visualizing the simulated human models, go to ~/element_human-action-intention-recognition_modified/build/install/bin and run:
./HumanPredictionVisualizer --from HumanPredictionVisualizer.ini
- Additional: in case the simulated model needs to be calibrated, download the file here and run it while the human model is in T-pose (you can stop the yarpdataplayer first when calibrating, then replay it afterwards):
bash ./TPoseCalibration.sh zero
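The script main_test_moe.py above refers to a mixture-of-experts (MoE) model. The snippet below is only a generic illustration of how a gating network combines per-expert class probabilities; it is not the repository's implementation, and all shapes and names are made up for the example:

```python
# Generic mixture-of-experts combination (illustration only).
import numpy as np

rng = np.random.default_rng(0)
n_experts, n_classes = 3, 5                      # made-up sizes

# Per-expert class probabilities for one input window (rows sum to 1).
expert_logits = rng.normal(size=(n_experts, n_classes))
expert_probs = np.exp(expert_logits)
expert_probs /= expert_probs.sum(axis=1, keepdims=True)

# Gating network output: one weight per expert (softmax over experts).
gate_logits = rng.normal(size=n_experts)
gate_weights = np.exp(gate_logits) / np.exp(gate_logits).sum()

# Mixture prediction: gate-weighted average of the expert distributions.
mixture_probs = gate_weights @ expert_probs
predicted_action = int(np.argmax(mixture_probs))
print("mixture probabilities:", np.round(mixture_probs, 3))
print("predicted action index:", predicted_action)
```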
- Go to ~/element_risk-prediction and run:
python3 ./src/main_model_based_risk_evaluation.py
- To start the NIOSH-based ergonomics evaluation module, run (a sketch of the NIOSH lifting equation is given after this list):
python3 ./src/niosh_method/nioshOnlineEasyUse.py
- To display the ergonomics evaluation results, go to ~/element_risk-prediction/src/niosh_method and run:
bash ./runAnimators.sh
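For reference, the NIOSH-based evaluation presumably builds on the revised NIOSH lifting equation, which scales a load constant by several multipliers to obtain the Recommended Weight Limit (RWL) and the Lifting Index (LI). The sketch below uses the standard metric formulation with example values; the frequency and coupling multipliers are passed in directly since they come from lookup tables, and none of this mirrors the module's actual code:

```python
# Revised NIOSH lifting equation (metric form) -- illustrative only.
def recommended_weight_limit(h_cm, v_cm, d_cm, a_deg, fm, cm, lc_kg=23.0):
    """RWL = LC * HM * VM * DM * AM * FM * CM (all multipliers in [0, 1])."""
    hm = min(1.0, 25.0 / h_cm)                   # horizontal multiplier
    vm = max(0.0, 1.0 - 0.003 * abs(v_cm - 75))  # vertical multiplier
    dm = min(1.0, 0.82 + 4.5 / d_cm)             # distance multiplier
    am = max(0.0, 1.0 - 0.0032 * a_deg)          # asymmetry multiplier
    return lc_kg * hm * vm * dm * am * fm * cm   # fm, cm from NIOSH tables

# Example values for a hypothetical lifting task:
rwl = recommended_weight_limit(h_cm=40, v_cm=50, d_cm=60, a_deg=30, fm=0.94, cm=0.95)
lifting_index = 10.0 / rwl                       # LI = load weight / RWL
print(f"RWL = {rwl:.2f} kg, LI = {lifting_index:.2f}")
```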
Under construction; for the moment, one may follow the instructions here.
👤 This repository is maintained by:
@Zweisteine96