Research Highlights, No.3, Nov 2015

Robots that speak disfluently just to get your attention

Dynamically adapting a robot's utterances and body language based on subtle human cues

By Michio Okada

Michio Okada and his colleagues have developed Talking-Ally, a novel robot that dynamically generates appropriate utterances and gestures based on a person's attention as indicated by his or her actions. Experiments show that this new communicative approach significantly enhances the engagement of interactive users.

Communication between humans is based on one another’s words and body language. We can sense whether the other person is distracted, and we change the course of our conversation and our actions to regain their attention.

A user interacting with the Talking-Ally robot

Most existing robots, however, still use monologue mechanisms, even when engaging in dialogue with a person. For example, they continue speaking in the same way, even if the person is not paying attention.

Researchers at the Interactions and Communication Design (ICD) Lab at Toyohashi University of Technology have devised a novel robotic communication approach that takes the listener's attention into account. The robot follows a person's gaze and determines whether that person is distracted by, for instance, a sports event in the background or something else in their surroundings. For example, the robot bends forward and nods if the person it is communicating with is watching television; similarly, it turns its head and looks around if the person is looking elsewhere. These behaviors are accompanied by an appropriate utterance intended to regain the person's attention. Experiments have confirmed that these adaptive interactions considerably increase the attention the person directs toward the robot, compared with when the robot's gestures and speech are generated without considering the person's gaze.

“We have set up an environment to manipulate the person’s attention with an engaging sports program broadcast simultaneously with the human-robot interaction. This allowed us to validate a suite of conversation situations and utterance-generation patterns,” said Hitomi Matsushita, first author of the conference paper on the robot.

Movie: A user interacting with the Talking-Ally robot

“Talking-Ally dynamically determines and synchronizes its body language, turn initials, and entrust behaviors of its speech, according to the person’s attention coordinates,” Professor Michio Okada, head of the ICD Lab, explained. “Our analysis shows that this is significantly more persuasive than generating these behaviors randomly.”

The experimental results contribute significantly to the HRI community by confirming that adaptive communication is essential for acquiring and maintaining attention during conversation. Moreover, Talking-Ally demonstrates a specific communication protocol that is shown to successfully re-engage a distracted person, a key step toward persuasive and convincing human-robot interaction. Such a platform can ultimately be tailored for use in any HRI application.

Talking-Ally currently chooses its responsive gestures at random from a set that suitably corresponds to the person's level of attention. Future work on the project will include further customizing the robot's interaction to individuals by choosing specific body language for each situation based on subtle cues from the other party.
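The selection scheme described above can be sketched in a few lines: group candidate behaviors by the listener's attention state, then draw one at random from the matching group. This is a minimal illustrative sketch only; the state names and behavior sets are assumptions for this example, not Talking-Ally's actual implementation.

```python
import random

# Hypothetical behavior repertoire, grouped by the listener's attention
# state. The states and gestures below are illustrative assumptions.
BEHAVIORS = {
    "attentive": ["nod", "continue speaking"],
    "watching_tv": ["bend forward and nod", "restart the utterance"],
    "looking_elsewhere": ["turn head and look around", "pause, then call out"],
}

def choose_behavior(attention_state: str) -> str:
    """Pick a responsive gesture at random from the set that suits
    the listener's current attention state."""
    candidates = BEHAVIORS.get(attention_state, BEHAVIORS["attentive"])
    return random.choice(candidates)

# Example: the listener is distracted by a television program.
print(choose_behavior("watching_tv"))
```

The future work mentioned above would replace the `random.choice` step with a policy that picks a specific gesture based on finer-grained cues from the listener.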

This research was supported by JSPS Grants-in-Aid for Scientific Research (B) (KIBAN-B, 26280102) and for Challenging Exploratory Research (HOUGA, 24650053) from the Japan Society for the Promotion of Science (JSPS).


Hitomi Matsushita, Yohei Kurata, P. Ravindra S. De Silva, and Michio Okada (2015). Talking-Ally: What is the Future of Robot’s Utterance Generation? Proceedings of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2015), Kobe, Japan. Best Paper Award Finalist.

Naoki Ohshima, Yusuke Ohyama, Yuki Odahara, P. Ravindra S. De Silva, and Michio Okada (2015). Talking-Ally: The Influence of Robot Utterance Generation Mechanism on Hearer Behaviors, International Journal of Social Robotics, 7(1), 51-62.


Researcher Profile

Name: Michio Okada
Affiliation: Department of Computer Science and Engineering
Title: Professor
Fields of Research: Cognitive Science / Social Robotics / Interaction