Browsing by Author "Pereira, Carolina da Silva Costa"
- Multimodal Interaction System Supported by Digital Humans
Pereira, Carolina da Silva Costa; Gonçalves, Alexandrino José Marques; Rodrigues, Nuno Carlos Sousa; Ribeiro, Roberto Aguiar; Marto, Anabela Gonçalves Rodrigues

As digital services continue to grow in popularity, there is an increasing need for systems that replicate human interaction while prioritizing user satisfaction and engagement. Early implementations offered interactions that felt unnatural, mainly due to the lack of realistic facial expressions and movement in character animations. Over time, however, significant advancements, particularly from the video game industry, have driven major improvements in this field. Titles such as Senua’s Saga: Hellblade II and Black Myth: Wukong illustrate how these developments have enabled the creation of immersive characters with highly realistic facial animation. The Metaverse has emerged as a key area of interest, offering immersive virtual environments where users interact within a shared digital space. This evolution has increased the demand for personalized Digital Humans: high-fidelity, computer-generated avatars capable of expressing empathy in real time. This study examines how Digital Humans can enhance human–computer interaction by reducing the emotional disconnect commonly associated with automated systems. Such avatars show potential across remote meetings, customer service, online education, and Metaverse platforms, fostering more natural and engaging interactions. Emotionally expressive Digital Humans were created using the MetaHuman framework and Unreal Engine, incorporating multimodal motion capture (MoCap) based on computer vision and RGB camera input. Two user tests were conducted: one focused on facial expressions and another combining facial expressions with full-body movement, involving a total of 40 participants. Empathy levels were assessed using the Toronto Empathy Questionnaire (TEQ), administered before and after interaction.
One-Way ANOVA analyses showed no statistically significant differences in elicited empathy between default and custom animations. In the facial-animation test, participants’ average TEQ scores were 47.4 for default animations and 45.0 for custom ones, both within or above the general population average (40–45). In the combined full-body test, mean scores were 47.6 for human body movement and 45.5 for MetaHuman body motion. Although personalization did not significantly outperform default animations, the results highlight the essential role of realistic body movement in shaping emotional perception and interaction quality. The findings confirm that emotionally expressive Digital Humans can be effectively integrated into digital platforms, while showing that facial personalization alone must be complemented by contextual and narrative elements to maximize empathetic impact. This work provides a solid foundation for future research on deploying Digital Humans in real-world interactive systems.
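The statistical comparison described above can be sketched as a minimal one-way ANOVA, the test the study used to compare TEQ scores across animation conditions. The scores and group sizes below are hypothetical illustrations, not the study's data.

```python
# Minimal one-way ANOVA F-statistic, implemented from the definition:
# F = (between-group mean square) / (within-group mean square).
# Hypothetical TEQ scores are used for illustration only.

def one_way_anova_f(*groups):
    """Return the F statistic for a one-way ANOVA across the given groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares: group size times squared offset
    # of each group mean from the grand mean.
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    # Within-group sum of squares: squared deviations from each group's own mean.
    ss_within = sum(
        (x - sum(g) / len(g)) ** 2 for g in groups for x in g
    )
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical TEQ scores for two conditions (default vs custom animation).
default_anim = [48, 46, 47]
custom_anim = [44, 46, 45]
f_stat = one_way_anova_f(default_anim, custom_anim)
print(f_stat)  # compare against the critical F value to judge significance
```

In practice `scipy.stats.f_oneway` performs the same computation and also returns the p-value; the hand-rolled version above only shows where the F statistic comes from.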
