Started expressing emotion blog post
TeoLj committed Dec 24, 2023
1 parent 9bb2638 commit c06955e
Showing 2 changed files with 31 additions and 4 deletions.
8 changes: 4 additions & 4 deletions docs/_posts/2023-12-12-Interaction-Component.md
@@ -6,9 +6,9 @@ Botender's behavior is adapted in real-time to respond appropriately to the use

## Components and their functionality

- The **InteractionManager** Thread starts and manages the interactions. It waits for a face to be detected by the *Perception Manager* and initiates an interaction sequence. It also manages the current interaction state and delegates tasks to the *Interaction Coordinator*.
- The **InteractionManager** Thread starts and manages the interactions. It waits for a face to be detected by the *Perception Manager*. When a face is detected and present for a specified duration (more than 150 consecutive frames), the thread starts an interaction by creating an instance of the InteractionCoordinator (a minimal sketch of this check follows the component list below).

- The **GazeCoordinator** Thread manages Botender's gaze, ensuring it follows the user or adopts an idle state when no interaction is taking place. By utilizing frame dimensions and perception data, it calculates where Botender should look. That is how a more engaging interaction is created.
- The **GazeCoordinator** Thread manages Botender's gaze, ensuring it follows the customer or adopts an idle state when no interaction is taking place. By utilizing frame dimensions and perception data, it determines where Botender should look. This makes the interaction more engaging.

- The **InteractionCoordinator** follows the state design pattern. It manages the different interaction states, each of which encapsulates specific behaviors and responses. Transitions between states are based on user input, perceived emotions, and interaction progress.
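
As a rough illustration of the face-presence check described above, the loop below counts consecutive frames in which a face is visible and only then creates the coordinator. The 150-frame threshold comes from the post; the class layout, the `face_detected()` call, and the frame rate are assumptions for the sketch, not the project's actual code.

```python
import time

FACE_FRAME_THRESHOLD = 150  # frames a face must be continuously present (from the post)


class InteractionManager:
    """Minimal sketch of the face-presence check; names are illustrative."""

    def __init__(self, perception_manager, create_coordinator):
        self.perception_manager = perception_manager  # assumed to expose face_detected()
        self.create_coordinator = create_coordinator  # factory for an InteractionCoordinator
        self.consecutive_face_frames = 0

    def run(self):
        while True:
            if self.perception_manager.face_detected():
                self.consecutive_face_frames += 1
            else:
                self.consecutive_face_frames = 0  # presence must be uninterrupted

            # Start an interaction once the face has been present long enough.
            if self.consecutive_face_frames > FACE_FRAME_THRESHOLD:
                coordinator = self.create_coordinator()
                coordinator.run()                 # blocks until the interaction ends
                self.consecutive_face_frames = 0

            time.sleep(1 / 30)  # assumed camera frame rate, purely illustrative
```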

@@ -17,11 +17,11 @@ The *Interaction Manager* utilizes an FSM to manage the flow of interaction betwe

The use of an FSM allows for a rule-based conversation flow with Botender. The FSM provides a predictable flow of interaction in which each state can be developed and tested independently, which enhances maintainability.

The states in Botender's Interaction Manager are listed below:
The states are listed below:
- The `Greeting State`: It initiates the interaction with a welcoming message. Botender greets the user using pre-defined phrases and gestures. After the greeting is completed, the FSM transitions to the Introduction State.
- The `Introduction State`: This state involves listening to the user's response and extracting the user's name using natural language processing techniques. It handles the introduction of the user to the robot.
- The `Acknowledge Emotion State`: This state acknowledges the user's current emotional state as determined by the User Perception subsystem. It responds with appropriate gestures and comments based on the perceived emotion. To conclude the interaction, it leads to the Farewell State.
- The `Farewell State`: The final state which concludes the interaction. Botender bids farewell to the user with a friendly and positive gesture.
- The `Search State`:
- The `Search State`: Represents a scenario where Botender actively looks for the customer or awaits a new customer's arrival. The robot exhibits gestures indicating it's searching for interaction.

From any state, the system may return to the Greeting State if the user disappears.
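
To make the state pattern concrete, here is a minimal sketch of how such an FSM could be laid out. The state names follow the post; the `coordinator` helpers (`say`, `listen_for_name`, `perceived_emotion`, `gesture`) and the Farewell → Search and Search → Greeting transitions are assumptions for illustration only.

```python
from abc import ABC, abstractmethod


class InteractionState(ABC):
    """Base class for the state design pattern; method names are illustrative."""

    @abstractmethod
    def handle(self, coordinator) -> "InteractionState":
        """Run this state's behavior and return the next state."""


class GreetingState(InteractionState):
    def handle(self, coordinator):
        coordinator.say("Welcome, I'm Botender!")   # pre-defined phrase and gesture
        return IntroductionState()


class IntroductionState(InteractionState):
    def handle(self, coordinator):
        name = coordinator.listen_for_name()        # NLP name extraction (assumed helper)
        coordinator.say(f"Nice to meet you, {name}!")
        return AcknowledgeEmotionState()


class AcknowledgeEmotionState(InteractionState):
    def handle(self, coordinator):
        emotion = coordinator.perceived_emotion()   # from the User Perception subsystem
        coordinator.gesture(emotion)                # respond with a matching gesture
        return FarewellState()


class FarewellState(InteractionState):
    def handle(self, coordinator):
        coordinator.say("Goodbye, see you soon!")   # friendly, positive farewell
        return SearchState()


class SearchState(InteractionState):
    def handle(self, coordinator):
        coordinator.gesture("searching")            # look around for a new customer
        return GreetingState()
```

In this sketch the coordinator would simply loop with `state = state.handle(self)`, falling back to a fresh `GreetingState` whenever the user disappears.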
27 changes: 27 additions & 0 deletions docs/_posts/2023-12-24-EmotionLearning.md
@@ -0,0 +1,27 @@
---
layout: post
title: Expressing emotions
---
Our robotic bartender has now mastered the art of expressing emotions. Botender can convey feelings ranging from happiness to concern.


Botender selects a gesture based on the perceived emotion of the customer, using a randomization function to keep interactions unpredictable and varied.


## Technical Background
Each gesture type, like 'happy', includes several JSON files, each representing a unique expression within that emotion. These files detail precise facial movements recorded through an iPhone.

The randomization function first retrieves a list of possible gestures for the specified type. If gestures are available, the function uses a random index within the range of the available gestures to select one. The selected gesture's details are then loaded from the corresponding JSON file (a minimal sketch of this selection follows the list below). The gestures are used when:

- **Responding to emotional cues**: Botender utilizes the gesture system to respond to the perceived emotional state of customers. For instance, if the Perception Manager identifies a customer as happy, Botender may use a gesture from the 'happy' category.

- **Enhancing Conversational Context**: Gestures are also used to complement Botender's verbal responses, adding depth and context to conversations. For example, while listening to a customer, Botender might use a gesture from the 'listening' category to show attentiveness.
- **Dynamic Interaction Flow**: The gesture system is integrated into Botender's interaction flow, allowing it to seamlessly switch between different gestures based on the conversation's context and the customer's emotional state.
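
A minimal sketch of such a selection function is shown below. It assumes gestures are stored as `gestures/<type>/<name>.json`; the directory layout and the function name are illustrative, not the project's actual code.

```python
import json
import random
from pathlib import Path

GESTURE_DIR = Path("gestures")  # assumed layout: gestures/<type>/<name>.json


def select_gesture(gesture_type: str) -> dict | None:
    """Pick a random gesture of the given type and load its frame data."""
    candidates = sorted((GESTURE_DIR / gesture_type).glob("*.json"))
    if not candidates:                     # no gesture of this type available
        return None
    choice = random.choice(candidates)     # random index over the available gestures
    with open(choice) as f:
        return json.load(f)                # frame-by-frame facial parameters


# Example: respond to a customer who was perceived as happy.
gesture = select_gesture("happy")
```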



## Example: The Happy Gesture

---- Include Picture -----

The 'Happy' gesture JSON files reveal a sequence of frames, each dictating specific facial movements. Parameters like `EYE_BLINK_LEFT`, `MOUTH_SMILE_LEFT`, and `NECK_TILT` are fine-tuned to mimic human expressions, ensuring that Botender's gestures are as lifelike and relatable as possible.
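
The post does not show the exact schema, but based on the parameters it names, a single gesture file might look roughly like the following (written here as a Python literal; every field other than the three named parameters is an assumption).

```python
# Assumed shape of one 'happy' gesture; the real schema may differ.
happy_gesture = {
    "name": "happy_01",
    "frames": [
        {
            "time": 0.0,  # seconds from gesture start (assumed)
            "parameters": {
                "EYE_BLINK_LEFT": 0.0,
                "MOUTH_SMILE_LEFT": 0.8,
                "NECK_TILT": 0.1,
            },
        },
        {
            "time": 0.4,
            "parameters": {
                "EYE_BLINK_LEFT": 0.2,
                "MOUTH_SMILE_LEFT": 1.0,
                "NECK_TILT": 0.0,
            },
        },
    ],
}

# Playback would step through the frames and apply each parameter set to the face.
```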
