af·fec·tive /əˈfektiv/
Relating to moods, feelings or their expression
Human communication goes far beyond words. We often convey more with our facial expressions, body language, and tone than with the actual words we say.
Many disorders make it almost impossible to decipher the meaning and emotion behind non-verbal communication, especially facial expressions. In frontotemporal disorders (a subset of dementias), strokes, schizoid disorder, traumatic brain injuries (TBIs), and other conditions, an agnosia can develop in which a person can no longer produce or recognize facial expressions. Mistaking someone's sadness for anger, or their shock for happiness, can cause daily turmoil, and a lack of facial expressions, or inappropriate use of them, also makes it harder to communicate with loved ones.
In similar disorders that cause a loss of recognition, training is often used to help improve and monitor the patient's skills. We see this in relearning to read or write after a stroke or with aphasia. Our group thought the same principles of relearning and monitoring could be applied to facial expressions!
We set out to design an initial web application for learning and creating facial expressions based on emotions. With gamification and metric tracking, we hope to create tools that let people practice non-verbal skills lost to illness.
Affective is a web application with two games for practicing the recognition and expression of emotions through facial expressions.
In Game 1: Emotion Recognition, users are presented with an image of someone making a facial expression. The user then selects the emotion they think the image portrays from a list of emotions (angry, happy, sad, surprise, and neutral). If they correctly identify the emotion, they receive a point.
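Under the hood, each round of Game 1 only needs an image, its validated emotion label, and a comparison against the player's choice. Here is a minimal sketch of that scoring check; the `Emotion` type, `RecognitionRound` shape, and `submitGuess` helper are illustrative assumptions, not the app's actual code:

```typescript
// Illustrative sketch of Game 1's round scoring (names are hypothetical).
type Emotion = "angry" | "happy" | "sad" | "surprise" | "neutral";

interface RecognitionRound {
  imageUrl: string; // e.g. a photo from the Warsaw set
  answer: Emotion;  // the validated emotion label for that photo
}

let score = 0;

// Compare the player's guess to the round's label and award a point if correct.
function submitGuess(round: RecognitionRound, guess: Emotion): boolean {
  const correct = guess === round.answer;
  if (correct) {
    score += 1; // one point per correctly identified emotion
  }
  return correct;
}
```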
In Game 2: Emotion Imitation, users are presented with an emotion and prompted to take and upload a picture of themselves showing that emotion with their face. The web application then uses an API to check whether the image matches the prompted emotion. If it does, they receive a point.
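The Game 2 check boils down to sending the uploaded photo to the emotion-analysis API and comparing the detected emotion to the prompt. The sketch below assumes a RapidAPI-style request; the host name, endpoint path, and response shape are guesses for illustration only, so the real Face Detection and Analysis API contract may differ:

```typescript
// Hedged sketch of the Game 2 check. The endpoint path, host, and response
// shape are assumptions; consult the API's documentation for the real contract.
const RAPIDAPI_KEY = "YOUR_RAPIDAPI_KEY"; // placeholder, supplied at deploy time

async function matchesPromptedEmotion(
  photo: Blob,
  prompted: string // e.g. "happy"
): Promise<boolean> {
  const form = new FormData();
  form.append("image", photo);

  const res = await fetch(
    // Assumed endpoint; the actual path may differ.
    "https://face-detection-and-analysis.p.rapidapi.com/analyze",
    {
      method: "POST",
      headers: {
        "x-rapidapi-key": RAPIDAPI_KEY,
        "x-rapidapi-host": "face-detection-and-analysis.p.rapidapi.com",
      },
      body: form,
    }
  );

  // Assumed response shape: { faces: [{ emotion: "happy", ... }] }
  const data = await res.json();
  const detected: string | undefined = data?.faces?.[0]?.emotion;
  return detected?.toLowerCase() === prompted.toLowerCase();
}
```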
Arsa Technology. Face Detection and Analysis API. Retrieved January 14, 2023, from https://rapidapi.com/arsa-technology-arsa-technology-default/api/face-detection-and-analysis/details.
Olszanowski, M., Pochwatko, G., Kuklinski, K., Scibor-Rylski, M., Lewinski, P., & Ohme, R. K. (2015). Warsaw set of emotional facial expression pictures: a validation study of facial display photographs. Frontiers in Psychology, 5, 1516. https://doi.org/10.3389/fpsyg.2014.01516