
U of T researchers explore what makes robots ‘persuasive’ to humans

The jelly bean experiment: PhD candidate Shane Saunderson conducted an exploratory study of persuasion strategies to observe which method would most influence a human’s decision (photo courtesy of Shane Saunderson)

A new study by researchers at the University of Toronto's Faculty of Applied Science & Engineering explores how robots persuade and build trust with humans. The research could guide the development of artificial intelligence (AI) in the next generation of socially assistive robots to aid in health care and other fields.

“My research aims to uncover behaviours and approaches that cause robots to be more persuasive to people,” says Shane Saunderson, a PhD candidate in the department of mechanical and industrial engineering. “That can manifest as body language, or verbal strategies to approach people.”  

Under the supervision of Associate Professor Goldie Nejat, the Canada Research Chair in Robots for Society, Saunderson conducted an exploratory study of persuasion strategies to observe which methods would most influence a human’s decision.

The top two strategies to emerge were an “emotional” approach and a “logical” one, says Saunderson. He and his team recently published their findings and will present the paper at the IEEE International Conference on Robotics and Automation in May.

Their experimental design featured two competing commercial robots named Leia and Luke, along with a jar of jelly beans. Two hundred human participants were asked to write down their best estimate of how many jelly beans were in the jar, taking into consideration the two differing suggestions provided by the robots, each of which attempted to influence the participant’s guess using one of 10 randomly selected persuasion strategies.
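The protocol can be pictured as a simple randomized trial. The sketch below is illustrative only and is not the authors’ code: the data layout, the log_trial function and any strategy names beyond those quoted in this article are assumptions made for the example.

import random

# Persuasion strategies named in the article; the study used 10 in total,
# so the list here is deliberately incomplete.
STRATEGIES = [
    "logical", "emotional", "critical", "friendly",
    # ...the remaining strategies from the study's full set of 10
]

def log_trial(participant_id: int, estimate: int, true_count: int) -> dict:
    """Randomly assign a persuasion strategy to each robot and record one
    participant's completed guess, along with its error."""
    return {
        "participant": participant_id,
        "leia_strategy": random.choice(STRATEGIES),
        "luke_strategy": random.choice(STRATEGIES),
        "estimate": estimate,
        "error": abs(estimate - true_count),
    }

# Example: record one hypothetical participant's estimate for a jar of 500 beans.
print(log_trial(participant_id=1, estimate=420, true_count=500))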

These strategies ranged from the critical (“You would be an idiot if you didn’t take my guess of [this many] jelly beans”) to the friendly (“Please, will you use my guess of [this many] jelly beans? Thank you.”).

Although Saunderson assumed the logical strategy would win – “My computer vision system can detect [this many] jelly beans” with repetitive pointing motions at the jar, indicating it was counting – it was edged out by the emotional strategy. This approach involved the robot expressing, “It would make me happy if you used my guess of [this many] jelly beans,” with its hands clutched toward its chest.
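As an illustration of how each strategy pairs a verbal template with a gesture, the snippet below encodes the phrasings quoted in this article. The PERSUASION_BEHAVIOURS structure and the gesture labels are hypothetical placeholders for the example, not the robots’ actual programming or API.

# Verbal templates follow the article's quotes; gesture names are invented labels.
PERSUASION_BEHAVIOURS = {
    "logical": {
        "utterance": "My computer vision system can detect {n} jelly beans.",
        "gesture": "point_at_jar_repeatedly",
    },
    "emotional": {
        "utterance": "It would make me happy if you used my guess of {n} jelly beans.",
        "gesture": "clutch_hands_to_chest",
    },
    "critical": {
        "utterance": "You would be an idiot if you didn't take my guess of {n} jelly beans.",
        "gesture": None,
    },
    "friendly": {
        "utterance": "Please, will you use my guess of {n} jelly beans? Thank you.",
        "gesture": None,
    },
}

def render(strategy: str, guess: int) -> str:
    """Fill a strategy's utterance template with the robot's guess."""
    return PERSUASION_BEHAVIOURS[strategy]["utterance"].format(n=guess)

print(render("emotional", 480))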

“With the emotional strategy, I started noticing participants reacting with ‘oohs’ and ‘awws,’ and ‘Isn’t it so cute!’” says Saunderson. “You see people increase their level of social connection with the robot. It indicates that we imbue robots with social agency when they interact with us using human-like social characteristics.”

Saunderson acknowledges that many may see the concept of AI-enabled robots with the ability to influence people as ominous. However, “our lab is focusing on a lot of the good things that can be done – social support in elder care, for example.”

Nejat and her team have been at the forefront of developing socially assistive robots for long-term care facilities and private homes, with the aim of providing cognitively stimulating interventions and improving the quality of life of seniors by assisting with everyday activities.

“I’ve been lucky to have spent a lot of time with doctors and nurses through my past experiences, and what I notice is that they’re not simply doing transactional, health-care tasks – they’re often acting as social companions or authority figures,” explains Saunderson.

“When a doctor recommends exercising more or prescribes a new treatment, the nurses often have to be the ones who reinforce this and ensure adherence. They have to make sure the patients follow through,” he adds.

According to demographic projections, the proportion of Canadian seniors is expected to grow rapidly until 2031 and to represent upwards of 25 per cent of the country’s population by 2036. Socially assistive robots could help relieve an overburdened health-care system, especially if they are effective in encouraging seniors and patients to take their medicine or be more physically active.

“A robot without persuasive abilities means that the robot would say, ‘Take your pills,’ and a patient might say ‘No,’ and that’s the end of the conversation. Then you’d need to call in a nurse and then there’s really no reason to have a robot there in the first place,” says Saunderson.

Future studies will further explore the top two strategies for their persuasive potential in different participant demographics.

“It’s an exciting time for robotics and AI,” says Saunderson. “We have the potential to create systems that can socially support people in more natural and intuitive ways, encouraging them to live healthier and better lives.”
