This program allows the user to control trained student models with their facial movement, which is captured by the iFacialMocap software. You can purchase the software from the App Store for 980 Japanese Yen.
Make sure you have (1) created a Python environment and (2) downloaded the model files as instructed in the main README file.
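Before launching, it can save time to confirm that the prerequisites are actually in place. The sketch below is a hypothetical sanity check, not part of the project: the model directory name `data/character_models` is an assumption for illustration, so consult the main README for the real location of the downloaded files.

```python
import sys
from pathlib import Path

def check_prerequisites(repo_root: str,
                        model_dir: str = "data/character_models") -> list:
    """Hypothetical pre-flight check: report missing prerequisites.

    The model_dir default is an illustrative assumption; the actual
    location of the model files is given in the main README.
    """
    problems = []
    # A reasonably recent Python is assumed; adjust to the README's requirement.
    if sys.version_info < (3, 8):
        problems.append("Python 3.8+ expected, found "
                        f"{sys.version_info.major}.{sys.version_info.minor}")
    models = Path(repo_root) / model_dir
    if not models.is_dir() or not any(models.iterdir()):
        problems.append(f"no model files found under {models}")
    return problems
```

Running the check with an empty repository directory reports the missing models; once the files are in place it returns an empty list.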
Under Linux/macOS:

- Open a shell.
- `cd` to the repository's directory.

  ```
  cd SOMEWHERE/talking-head-anime-4-demo
  ```

- Run the program.

  ```
  bin/run src/tha4/app/character_model_ifacialmocap_puppeteer.py
  ```
Under Windows:

- Open a shell.
- `cd` to the repository's directory.

  ```
  cd SOMEWHERE\talking-head-anime-4-demo
  ```

- Run the program.

  ```
  bin\run.bat src\tha4\app\character_model_ifacialmocap_puppeteer.py
  ```
- Run iFacialMocap on your iOS device. It should show you the device's IP address. Jot it down, and keep the app open.
- Invoke the `character_model_ifacialmocap_puppeteer` application.
- You will see a text box labeled "Capture Device IP." Enter the iOS device's IP address that you jotted down.
- Click the "START CAPTURE!" button to the right. If the programs are connected properly, you should see the numbers in the bottom part of the window change when you move your head.
- Now you can load a student model, and the character should follow your facial movements.
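For a rough picture of what "START CAPTURE!" sets up: the phone streams facial tracking data to the PC over the local network, and the program decodes each packet into named blendshape values. The sketch below is an illustration only, not the project's actual implementation; the UDP port number and the `name-value|name-value` text format are assumptions made for this example.

```python
import socket

def parse_blendshapes(message: str) -> dict:
    """Parse a hypothetical 'name-value|name-value' tracking message
    into a dict of blendshape names to float values. The format is an
    assumption for illustration, with non-negative values as in typical
    0-100 blendshape scales."""
    result = {}
    for field in message.split("|"):
        if "-" in field:
            name, _, value = field.rpartition("-")
            try:
                result[name] = float(value)
            except ValueError:
                pass  # skip malformed fields
    return result

def receive_one_packet(port: int = 49983, timeout: float = 5.0) -> dict:
    """Sketch: receive a single UDP datagram and parse it.
    The port number is an assumption for illustration."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.bind(("", port))
        data, _addr = sock.recvfrom(8192)
        return parse_blendshapes(data.decode("utf-8", errors="ignore"))
```

The real demo program manages this connection for you; the point of the sketch is only that the changing numbers at the bottom of the window are parsed tracking values arriving from the phone.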