
Emotion Recognition Capable Metahumans

Our objective is to develop a 3D Virtual Human (3DVH) prototype that mirrors a user's emotional reactions. The user's emotional state is captured with a computer vision-based technique and a camera system, and the 3DVH mimics the detected emotion in real time.

Supported Emotions:

  • Anger
  • Fear
  • Happy
  • Neutral
  • Sad
  • Surprised
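
As a rough illustration of the capture side, the sketch below pairs OpenCV's Haar cascade face detector with a Keras CNN classifier to map webcam frames to the six emotion labels above. The model file name, the 48x48 input size, and the label order are assumptions for illustration only; the actual values come from the repository's weights and .env configuration.

   import cv2
   import numpy as np
   from tensorflow.keras.models import load_model

   # Assumed label order and model path -- adjust to match your weights.
   EMOTIONS = ["Anger", "Fear", "Happy", "Neutral", "Sad", "Surprised"]
   model = load_model("model/emotion_weights.h5")  # hypothetical path
   face_cascade = cv2.CascadeClassifier(
       cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

   cap = cv2.VideoCapture(0)  # default webcam
   while True:
       ok, frame = cap.read()
       if not ok:
           break
       gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
       for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
           # Crop the face, resize to the model's assumed 48x48 input, normalize.
           face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
           scores = model.predict(face.reshape(1, 48, 48, 1), verbose=0)
           print(EMOTIONS[int(np.argmax(scores))])
       if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
           break
   cap.release()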

Our Virtual Humans


Example Results

(Image: showcasing Happy & Surprised)

Installation & Setup

Run locally:

  1. Clone this repository, cd into it, and install dependencies:
   git clone https://github.com/Prem-ium/Metahuman-Emotion-Recognition.git
   cd Metahuman-Emotion-Recognition
   pip install -r requirements.txt
  2. Configure your .env file (see the Environment Variables section below for options)
  3. Run the main script:
   python emotional-detection-main.py
  4. Open the Unreal Engine project and run the Blueprint
  5. Click the button to trigger the text reader to process the most common emotion recorded (a sketch of this hand-off follows the list)
  6. The MetaHuman mimics the user's most common emotion
  7. Repeat steps 5-6 until you wish to stop
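
The Blueprint's text reader picks up the detected emotion from the Python side via a text file. The file name and format below are assumptions, but a minimal sketch of the hand-off, tallying detections and persisting the most common emotion for Unreal Engine to read, might look like this:

   import time
   from collections import Counter

   # Hypothetical file name -- the Blueprint's text reader must point at the same path.
   OUTPUT_FILE = "emotion.txt"
   DELAY = 1  # seconds between iterations, normally taken from the .env file

   detections = Counter()

   def record(emotion):
       """Tally one detection and persist the current most common emotion."""
       detections[emotion] += 1
       most_common, _ = detections.most_common(1)[0]
       with open(OUTPUT_FILE, "w") as f:
           f.write(most_common)

   # Example usage: feed in labels as the classifier produces them.
   for label in ["Happy", "Happy", "Surprised"]:
       record(label)
       time.sleep(DELAY)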

Environment Variables

Configure your variables in a .env file within the same directory.

HEADLESS=True or False. Whether to open a GUI window for checking webcam accuracy. Defaults to True.

PRODUCTION=True or False. Whether the program is running alongside Unreal Engine.

DELAY=Integer number of seconds the program waits before starting the next iteration.

FILE_PATH=Path of the directory containing the model and weights.

WEIGHTS=Name of the weights file being used.
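
For reference, a sample .env with illustrative values (the directory and file names here are placeholders, not the repository's actual defaults):

   HEADLESS=False
   PRODUCTION=True
   DELAY=5
   FILE_PATH=./model
   WEIGHTS=emotion_weights.h5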

Group Members
