Human communication goes far beyond just words. Non-verbal behaviours, like eye contact, play a key role in helping us understand each other’s intentions and level of engagement. Recognising these cues is essential when designing robots that aim to interact naturally with people. By incorporating such behaviours, we can make human-robot interactions feel more intuitive, increase user acceptance, and create more positive experiences.
The Linguistics and Computational Linguistics Research Lab at the University of Gothenburg explores how language works in social interaction, including human-machine communication.
The Furhat robot played a central role in this research, acting as an expressive physical platform for exploring how laughter, speech, and eye movements come together in human-robot communication. Thanks to Furhat’s lifelike face and advanced gaze system, researchers were able to test how synchronising facial expressions (like laughter) with natural eye movements could enhance social interaction.
A key part of the study involved developing a neural network that learns from real human eye-movement data. This enabled the robot to predict where people are likely to look and to reproduce these natural gaze patterns dynamically. As a result, Furhat could perform fluid, human-like eye movements during live interactions, making it feel more responsive and lifelike.
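As a rough illustration of how such a gaze predictor might be set up, here is a minimal sketch assuming the task is framed as classifying the next gaze target from a short window of interaction features. The feature set, label set, and architecture below are illustrative assumptions, not the published model.

```python
# Minimal sketch of gaze-target prediction as sequence classification.
# Features, labels, and architecture are assumptions for illustration.
import torch
import torch.nn as nn

GAZE_TARGETS = ["partner_face", "object", "aversion"]  # assumed label set

class GazeTargetPredictor(nn.Module):
    def __init__(self, n_features: int = 16, hidden: int = 64):
        super().__init__()
        # A GRU summarises a short window of interaction features
        # (e.g., head pose, speech activity, laughter annotations).
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, len(GAZE_TARGETS))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features) -> logits over gaze targets
        _, h = self.encoder(x)
        return self.head(h[-1])

# Example: predict the next gaze target from a 50-frame feature window.
model = GazeTargetPredictor()
window = torch.randn(1, 50, 16)
probs = model(window).softmax(dim=-1)
print(dict(zip(GAZE_TARGETS, probs[0].tolist())))
```

In a live system, the predicted target would then be translated into head and eye motion commands, which is what allows the gaze to adapt dynamically rather than follow a fixed script.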
Experiments showed that when Furhat adapted its gaze behaviour to mirror human interaction patterns, people perceived the robot as more human-like and engaging. This adaptability not only improved user perception but also strengthened Furhat’s role as an effective, socially aware communication partner.
Beyond enhancing interaction quality, Furhat served as a valuable tool for testing how different gaze strategies affect human perceptions. The research revealed that subtle choices, like making eye contact or looking away during moments of laughter, significantly shape how users feel during conversations with the robot.
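To make those strategies concrete, here is a minimal sketch of how eye contact versus gaze aversion during laughter might be scripted. The `RobotClient` class and its methods are hypothetical stand-ins for illustration and are not taken from the Furhat SDK.

```python
# Illustrative sketch of contrasting gaze strategies during laughter.
# `RobotClient` is a hypothetical interface, not the Furhat SDK.
import time

class RobotClient:
    """Hypothetical robot control interface used for illustration."""
    def gesture(self, name: str) -> None:
        print(f"[robot] gesture: {name}")

    def look_at(self, x: float, y: float, z: float) -> None:
        print(f"[robot] gaze -> ({x:.1f}, {y:.1f}, {z:.1f})")

def laugh(robot: RobotClient, avert_gaze: bool) -> None:
    """Laugh while either holding eye contact or briefly looking away."""
    robot.gesture("Laugh")
    if avert_gaze:
        robot.look_at(0.4, -0.3, 1.0)  # glance down and to the side
        time.sleep(0.8)                # hold the aversion for a beat
    robot.look_at(0.0, 0.0, 1.0)       # (re)establish eye contact

robot = RobotClient()
laugh(robot, avert_gaze=True)   # gaze-aversion condition
laugh(robot, avert_gaze=False)  # eye-contact condition
```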
Finally, the research confirmed the effectiveness of the overall approach for building robots capable of multimodal, human-like communication. The success of the gaze-prediction system in Furhat highlights its potential for advancing future generations of socially intelligent robots capable of building more natural and meaningful connections with humans.
Publications
Somashekarappa, Vidya, Asad Sayeed, and Christine Howes. "Neural Network Implementation of Gaze-Target Prediction for Human-Robot Interaction." 2023.
https://gup.ub.gu.se/publication/337432
Salamat Ravandi, Banafsheh. "Personalized Human-Robot Interaction in Companion Social Robots." CEUR Workshop Proceedings, CEUR-WS.org, 2023.
https://gup.ub.gu.se/publication/334007
Somashekarappa, Vidya, Christine Howes, and Asad Sayeed. "Good Looking: How Gaze Patterns Affect Users’ Perceptions of an Interactive Social Robot." Proceedings of the IEEE Workshop on Advanced Robotics and Its Social Impacts (ARSO), 2024.
https://gup.ub.gu.se/publication/341099
Giannitzi, Eleni, Vadim Maraev, and Christine Howes. "Reacting to the Last Laugh: Probing Non-Humorous Laughter with Furhat Robot." Laughter and Other Non-Verbal Vocalisations Workshop 2024, 16–17 July 2024, Belfast, United Kingdom.
https://gup.ub.gu.se/publication/346207