Our objective is to develop a 3D Virtual Human (3DVH) prototype that can mirror a user's emotional reactions. The user's emotional reaction is captured with a camera system and a computer vision-based technique, and the 3DVH mimics the detected emotion in real time. The following emotions are recognized (a sketch of such a detection loop follows the list below):
- Anger
- Fear
- Happy
- Neutral
- Sad
- Surprised
(Showcasing Happy & Surprised)
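For context, here is a minimal sketch of what such a webcam detection loop can look like, assuming OpenCV's bundled Haar cascade for face detection and a Keras model trained on the six emotions above. The model file name, 48x48 input size, and label order are illustrative assumptions, not the repository's actual values:

```python
# Hypothetical sketch of a webcam emotion-detection loop; not the repository's
# actual code. Model file, input size, and label order are assumptions.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["Anger", "Fear", "Happy", "Neutral", "Sad", "Surprised"]  # assumed order

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = load_model("emotion_model.h5")  # placeholder model/weights file

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3,
                                                      minNeighbors=5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # crop and scale the face
        probs = model.predict(face.reshape(1, 48, 48, 1) / 255.0, verbose=0)
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)  # box the face
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    cv2.imshow("Emotion Detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```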
Run locally:
1. Clone this repository, cd into it, and install dependencies:
   `git clone https://github.com/Prem-ium/Metahuman-Emotion-Recognition.git`
   `cd Metahuman-Emotion-Recognition`
   `pip install -r requirements.txt`
2. Configure your `.env` file (see below for options and an example).
3. Run the main script:
   `python emotional-detection-main.py`
4. Open the Unreal Engine project and run the Blueprint.
5. Click the button to trigger the text reader, which processes the most common emotion recorded (see the sketch after this list).
6. The Metahuman mimics the user's most common emotion.
7. Repeat steps 5-6 until desired termination.
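Steps 5-6 hinge on the Python side tallying detected emotions so the Blueprint's text reader can pick up the winner. A minimal sketch of that tallying step, assuming the emotion is handed off through a text file (the `emotion.txt` name is a placeholder, not the repository's actual path):

```python
# Sketch of tallying per-frame detections and writing the most common emotion
# for the Unreal Engine Blueprint to read. "emotion.txt" is a placeholder name.
from collections import Counter

detected = ["Happy", "Happy", "Surprised", "Happy", "Neutral"]  # e.g. one label per frame

most_common, count = Counter(detected).most_common(1)[0]
with open("emotion.txt", "w") as f:
    f.write(most_common)  # the Blueprint's text reader picks this up

print(f"Wrote '{most_common}' ({count} of {len(detected)} frames)")
```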
Configure your variables in a `.env` file within the same directory.
- `HEADLESS` = True or False. Whether to open a GUI for testing webcam accuracy. Defaults to True.
- `PRODUCTION` = True or False. Whether the program is running in Unreal Engine or not.
- `DELAY` = Integer number of seconds the program waits before starting the next iteration.
- `FILE_PATH` = Path of the directory containing the model and weights.
- `WEIGHTS` = Name of the model weights file being used.
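For illustration, a `.env` using these variables might look like the following (all values are example assumptions, not defaults from the repository):

```
HEADLESS=True
PRODUCTION=False
DELAY=5
FILE_PATH=./model/
WEIGHTS=model_weights.h5
```

And a minimal sketch of how such a file could be read with `python-dotenv` (the fallback defaults shown are assumptions, not the script's documented behavior):

```python
# Sketch of reading the .env configuration with python-dotenv
# (pip install python-dotenv). Fallback defaults are illustrative assumptions.
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from the .env file in the current directory

HEADLESS = os.getenv("HEADLESS", "True").lower() == "true"       # GUI for webcam testing
PRODUCTION = os.getenv("PRODUCTION", "False").lower() == "true"  # running inside Unreal Engine?
DELAY = int(os.getenv("DELAY", "5"))                # seconds to wait between iterations
FILE_PATH = os.getenv("FILE_PATH", ".")             # directory containing model and weights
WEIGHTS = os.getenv("WEIGHTS", "model_weights.h5")  # weights file name

print(HEADLESS, PRODUCTION, DELAY, FILE_PATH, WEIGHTS)
```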
- Prem Patel (@Prem-ium)
- Gabe Vindas (@GabeV95)
- Matthew Goetz
- Dustin Lynn (@Onemorehell)