Eye Tracking Non-Verbal Communication Cues for Advanced Robotics Pt.2

In our first article about Jokinen’s research on non-verbal signals for turn-taking and feedback using eye tracking, we discussed the planned objectives of her research and what she sought to uncover about the way we communicate. This article is a continuation that describes the procedures of her research and the fascinating conclusions she and her team uncovered.

Jokinen set up her research using groups of three at Doshisha University in Japan. One person was designated the eye-tracked person (ES), and the other two were designated the left- and right-hand speakers (LS and RS, respectively). There were a total of 28 multiparty conversations, each lasting about 10 minutes. Four conversations involved an all-female group with English-speaking participants. The ES was monitored by the eye tracker, while the RS and LS were videotaped to provide a reference for where the ES’s gaze was focused. The eye tracker, while not named, was mounted on the table in front of the ES.

The collected data was then analyzed on both the dialogue and signal levels. On the dialogue level, annotations were made manually on key aspects of the observed dialogue actions. The annotations mostly concerned dialogue acts and observed gaze and gestures in relation to their communicative functions, looking specifically at their roles in turn-management and feedback. These annotations were based on the Augmented Multiparty Interaction (AMI) project and a modified multimodal annotation scheme.
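To make the idea of a dialogue-level annotation more concrete, here is a minimal sketch of what one annotated event might look like as a data structure. The field names and example values are illustrative assumptions on our part, not the actual AMI annotation schema.

```python
from dataclasses import dataclass

# Hypothetical dialogue-level annotation record; field names are
# illustrative, not the actual AMI schema.
@dataclass
class DialogueAnnotation:
    start: float        # start time of the event, in seconds
    end: float          # end time of the event, in seconds
    speaker: str        # "ES", "LS", or "RS"
    dialogue_act: str   # e.g. "statement", "question", "backchannel"
    gaze_target: str    # where the speaker is looking, e.g. "LS"
    function: str       # communicative function, e.g. "turn-management"

# Example: the eye-tracked speaker looks at the right-hand speaker
# while making a statement that manages the turn.
example = DialogueAnnotation(12.4, 13.1, "ES", "statement", "RS", "turn-management")
```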

The signal-level data focused on the gaze path, which was coded with the GazeObject feature, referring to the ES’s focus of attention as measured by the eye tracker. The eye tracker they used had some trouble tracking the eyes when the ES blinked, laughed, or moved their head or body around. A NoGaze label was used to categorize moments when tracking was lost due to blinks or other causes.
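As an illustration of this signal-level coding, the sketch below shows how raw gaze samples might be mapped to GazeObject labels, with NoGaze assigned when tracking is lost. The screen regions, coordinates, and sample format are assumptions made for the example; the study does not describe its actual implementation.

```python
from typing import Optional

# Hypothetical regions of interest on the video frame, mapping pixel
# areas to the conversation partners; the coordinates are made up.
REGIONS = {
    "LS": (0, 0, 640, 720),      # left half of the frame -> left speaker
    "RS": (640, 0, 1280, 720),   # right half of the frame -> right speaker
}

def code_gaze_sample(x: Optional[float], y: Optional[float]) -> str:
    """Assign a GazeObject label to one gaze sample, or NoGaze if the
    tracker lost the eyes (e.g. during a blink or head movement)."""
    if x is None or y is None:          # tracker reported no valid gaze
        return "NoGaze"
    for label, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return label
    return "Other"                      # gaze landed outside both speakers

# Example: a blink yields NoGaze; a fixation on the right side yields RS.
print(code_gaze_sample(None, None))    # NoGaze
print(code_gaze_sample(900.0, 360.0))  # RS
```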

When analyzing non-verbal cues, the labels Turn Give, Turn Take, Turn Hold, and Turn None referred to what the speakers were doing: providing a cue to hand over the speaking role (Turn Give), actively taking a turn as the speaker (Turn Take), holding on to the current turn (Turn Hold), or listening without a turn (Turn None). Depending on how the participants were interacting, they could attempt to interject for a turn by showing non-verbal communication cues, or interject outright. These labels provided a method for categorizing what a participant was doing based on how they interacted with the other participants.
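This label set lends itself to a simple encoding. The sketch below captures the four categories described above; the enum and the timeline example are our own illustration, not part of the study's tooling.

```python
from enum import Enum

# Turn-management labels as described above; a minimal sketch of how
# the four categories could be encoded for analysis.
class Turn(Enum):
    GIVE = "Turn Give"   # signaling that the speaking role is being offered
    TAKE = "Turn Take"   # actively taking a turn as the speaker
    HOLD = "Turn Hold"   # keeping the turn while continuing to speak
    NONE = "Turn None"   # listening, with no turn activity

# Example: tagging a short stretch of interaction for one participant.
timeline = [Turn.NONE, Turn.TAKE, Turn.HOLD, Turn.GIVE]
```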

Ultimately, Jokinen’s results support the view that eye-gaze interaction is an important signal for coordinating turn-taking and feedback. Future studies will involve a larger pool of test subjects, larger speaking groups, and variations in the way different ethnic groups communicate with each other, as well as how they communicate with other ethnic groups. The results indicate that non-verbal communication is made up of indexical signs, as opposed to meaning-carrying symbols.

Non-verbal signals for turn-taking and feedback

Related articles:

  1. Eye Tracking Non-Verbal Communication Cues for Advanced Robotics
  2. Eye Tracking to Research Nonverbal Turn Taking Signals