OC1 sub-project: ALIVE

Summary of the project

ALIVE aims to develop Empathic Virtual Assistants (EVAs) for customer support use cases. EVAs are capable of recognising, interpreting and responding to human emotions by applying state-of-the-art Deep Learning (DL) algorithms for vision-, text- and voice-based emotion recognition. The outputs of these models are fused into a single emotional state that is fed in real time both into the dialogue state of a Large Language Model (LLM) for empathic text generation and into the state machine of the avatar, which adapts its state accordingly, adjusting each answer and interaction (e.g., facial expressions) for maximum personalisation to the user's needs.
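
To illustrate the idea, here is a minimal sketch of how per-modality emotion scores could be late-fused into a single state and then passed both to an LLM dialogue context and to an avatar state machine. All names, weights, emotion labels and mappings below are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical sketch of the fusion step described above: per-modality
# emotion scores (vision, voice, text) are merged into one emotional
# state that conditions the LLM's dialogue state and the avatar's
# expression. Weights and label set are assumptions for illustration.

EMOTIONS = ["joy", "sadness", "anger", "fear", "neutral"]

# Assumed per-modality reliability weights (not from the project).
MODALITY_WEIGHTS = {"vision": 0.4, "voice": 0.35, "text": 0.25}


def fuse_emotions(scores_by_modality: dict[str, dict[str, float]]) -> dict[str, float]:
    """Late-fuse per-modality emotion distributions into one state
    via a weighted average, renormalised to sum to 1."""
    fused = {e: 0.0 for e in EMOTIONS}
    total_weight = 0.0
    for modality, scores in scores_by_modality.items():
        w = MODALITY_WEIGHTS.get(modality, 0.0)
        total_weight += w
        for emotion in EMOTIONS:
            fused[emotion] += w * scores.get(emotion, 0.0)
    if total_weight > 0:
        fused = {e: v / total_weight for e, v in fused.items()}
    return fused


def to_llm_context(fused: dict[str, float]) -> str:
    """Render the fused state as a system-prompt fragment that the
    LLM can condition its next empathic reply on."""
    dominant = max(fused, key=fused.get)
    return (f"The user currently appears {dominant} "
            f"(confidence {fused[dominant]:.2f}). "
            f"Adapt tone and wording accordingly.")


def to_avatar_state(fused: dict[str, float]) -> str:
    """Map the fused state onto a facial-expression state for the
    avatar's state machine (illustrative mapping)."""
    dominant = max(fused, key=fused.get)
    return {"joy": "smile", "sadness": "concern", "anger": "calm",
            "fear": "reassure", "neutral": "attentive"}[dominant]


if __name__ == "__main__":
    frame = {
        "vision": {"joy": 0.1, "sadness": 0.6, "neutral": 0.3},
        "voice": {"sadness": 0.7, "neutral": 0.3},
        "text": {"sadness": 0.5, "fear": 0.2, "neutral": 0.3},
    }
    state = fuse_emotions(frame)
    print(to_llm_context(state))   # feeds the LLM dialogue state
    print(to_avatar_state(state))  # feeds the avatar state machine
```

In this sketch the fusion happens once per interaction frame, so the LLM prompt and the avatar expression always reflect the same emotional state; a weighted late fusion is one common choice, and other schemes (e.g., learned fusion) would slot into the same interface.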

How to learn more

ALIVE was developed by Thingenious and Igodi.

To learn more, read our interview with the team here.