Robots are increasingly becoming part of our society: you can already see them in areas such as healthcare, retail, and education. That makes it all the more important that we can communicate with them easily. Social robots – unlike industrial robots – are specifically meant to interact with people. “So this is not a vacuum cleaner robot, but a robot with whom we can actually communicate, such as a personal assistant,” explains robotics researcher Chinmaya Mishra. “We want them to behave as we expect in our society. To make our lives easier, robots should be made to fit our way of communicating.”


Human-like

A robot’s face plays a big role in this. “This has been ignored by many developers because it is super difficult to make a robot’s face do the same as a human’s,” says Mishra. “There are robots that are getting close, but they are extremely expensive.” In particular, eye contact, gaze direction and facial expressions are crucial in human communication. “A social robot that has to receive people in a hospital, for example, could smile when referring someone to the right room, or look away for a moment when it needs to think,” says Mishra. “This would create a more personal and natural interaction.”


For his research, the robotics researcher used a Furhat robot, a social robot with a back-projected animated face that can move and express emotions in a human-like manner. He developed an algorithm to automate the robot’s gaze behaviour during human-robot interactions, and the system was then evaluated with test subjects. “Especially averting the gaze turned out to be very important,” explains Mishra. “If we made the robot stare at the participant, the participant started feeling uncomfortable and avoiding the robot’s gaze. So if a robot exhibits non-human gaze behaviour, interacting with it becomes more difficult.”
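In outline, an automated gaze policy of this kind alternates between holding eye contact and briefly looking away. The sketch below illustrates that idea only; the RobotFace class and the timing values are hypothetical placeholders, not the Furhat interface or the parameters from Mishra’s system, which reacts to the conversation in real time.

```python
import random
import time

# Hypothetical robot interface for illustration; the real Furhat API differs.
class RobotFace:
    def look_at_user(self):
        print("[robot] looking at user")

    def look_away(self):
        # Avert the gaze slightly to the side, as people do when thinking.
        print("[robot] averting gaze")


def gaze_loop(robot: RobotFace, duration_s: float = 30.0):
    """Alternate between mutual gaze and short gaze aversions.

    The idea: hold eye contact for a few seconds, then briefly look away,
    so the interaction never feels like staring. Timings are illustrative.
    """
    t_end = time.time() + duration_s
    while time.time() < t_end:
        robot.look_at_user()
        time.sleep(random.uniform(3.0, 5.0))   # hold mutual gaze
        robot.look_away()
        time.sleep(random.uniform(0.5, 1.5))   # brief gaze aversion


if __name__ == "__main__":
    gaze_loop(RobotFace(), duration_s=10.0)
```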


Emotions

To get the robot to express the right emotions, Mishra used the precursor of ChatGPT (GPT-3.5), which ‘listened in’ on the conversation and, based on that, predicted the emotion the robot should show – such as happy, sad, angry, disgusted, afraid or surprised – which was then displayed on the Furhat robot’s face. Results from a user study showed that this approach worked well: participants scored higher in a collaborative task with the robot when it expressed appropriate emotions. Participants also felt more positive about their interaction with a robot if it displayed the correct emotions. Mishra: “A robot that provides emotionally appropriate responses makes for a more effective collaboration between humans and robots.”
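The general pattern of having a language model ‘listen in’ and pick an emotion label can be sketched roughly as follows. The prompt wording, the model name, the added “neutral” fallback and the predict_emotion helper are assumptions made for this sketch, not the study’s actual implementation.

```python
from openai import OpenAI  # assumes the `openai` Python package; the study's exact setup may differ

# Emotion labels from the article, plus a "neutral" fallback added for this sketch.
EMOTIONS = ["happy", "sad", "angry", "disgusted", "afraid", "surprised", "neutral"]

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def predict_emotion(dialogue: list[str]) -> str:
    """Ask the language model which emotion the robot's face should show next.

    `dialogue` holds the recent conversation, one utterance per entry.
    Returns one label from EMOTIONS.
    """
    prompt = (
        "You are choosing a facial expression for a social robot.\n"
        "Given the conversation below, answer with exactly one word from: "
        + ", ".join(EMOTIONS) + ".\n\nConversation:\n" + "\n".join(dialogue)
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        max_tokens=3,
        temperature=0,
    )
    label = response.choices[0].message.content.strip().lower()
    return label if label in EMOTIONS else "neutral"


if __name__ == "__main__":
    turns = ["User: I just passed my driving exam!", "Robot: That is wonderful news."]
    # The predicted label (e.g. "happy") would then drive the expression shown on the robot's face.
    print(predict_emotion(turns))
```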


Tool

Mishra’s research shows that appropriate non-verbal behaviour facilitates our interaction with robots, but that does not mean that lifelike, humanoid robots will soon be walking the streets. “Robots are tools,” argues the researcher. “They don’t have to be able to do everything we can; that’s over-engineering. But if they can communicate with us in a familiar way, we don’t have to teach ourselves new communication behaviour. Eye gaze could also be indicated with a pointer, and an emotion could be represented with a word or an LED. But that is not natural for us. Why should we have to adapt? We would be better off developing robots that adapt to what we know.”