Real-Time Body Reconstruction and Recognition in Virtual Reality using HTC Vive Trackers and Controllers
This project implements an Inverse Kinematics solver to animate the avatar's motions as smoothly, quickly, and accurately as possible. Using an HTC Vive headset, two Vive Controllers held in the hands, and five Vive Trackers attached to the feet, waist, and forearms, it is possible to create immersive VR experiences in which users perceive the avatar as their own body. Users can see their avatar from the first-person perspective and in a virtual mirror.
Furthermore, the branches `yoga` and `motion_recognition` implement full-body motion recognition using Hidden Markov Models and other machine learning algorithms.
- Node.js
- Visual Studio 2019 or Xcode
- git clone --recursive git@github.com:CatCuddler/BodyTracking.git
- cd BodyTracking/BodyModel
- node Kore/make --vr steamvr
or, without SteamVR support:
- node Kore/make
Currently, Metal does not work. Use OpenGL instead:
- node Kore/make -g opengl
If you don't use Visual Studio 2019, you can specify your version:
- node Kore/make -v vs20xx
- Open the Visual Studio or Xcode project in BodyModel/build.
- Switch to "Develop x86" mode in Visual Studio (Release doesn't work).
- Change the working directory in Xcode: Edit Scheme -> Use custom working directory -> choose the Deployment directory.
- Strap one Vive Tracker on your left foot and another one on your right foot (above the ankles).
- Strap the third Vive Tracker on your waist.
- Strap the fourth and fifth Vive Trackers on the left and right forearms.
- Hold a Vive Controller in each hand ;)
- Start the project. You will see an avatar standing in a T-pose.
- Press the grip button to set the size of the avatar (look straight ahead while doing so).
- Walk to where the avatar stands and put your feet and hands in the same positions.
- Press the "menu button" to calibrate the avatar.