Using Eye Tracking to Study Linguistic Annotation
A recently published article by researchers at Universität Duisburg-Essen and Universität Jena, both in Germany, featured the use of eye tracking. The researchers focused on annotation and the behavior of the people who produce annotations for linguistic studies.
In linguistics and many other fields, supervised approaches to machine learning are quite popular, especially in the natural language processing community. Linguistic regularities are, for the most part, no longer hand-crafted by human experts, but human experts are still needed to produce the immense amounts of reliably annotated training material this research requires.
In linguistics, the assignment of linguistic metadata, which documents parts of speech, syntactic parses, or semantic interpretations, can be a complex and cognitively demanding task. Annotating reliably requires sound competence in the natural language at hand as well as a good level of domain and text-genre expertise. According to the paper, the complexity of linguistic utterances can be judged by structural or behavioral criteria. Structural complexity results from the static topology of phrase structure trees and graphs, but it does not translate easily into “empirically justified cost measures.”
That’s where the behavioral side of things comes into play: it addresses this problem by monitoring the annotators’ eye movements. Eye trackers are typically used to record these movements, revealing ambiguities and other sources of difficulty. When used to observe a subject’s annotation behavior, eye tracking devices monitor gaze duration and behavioral patterns. Together, these indicate how hard the linguistic analysis is, and gaze duration and search time are considered empirical correlates of processing complexity. They are therefore thought to reveal more than structural criteria alone. For this reason, the researchers saw eye tracking as a good way to better understand the linguistic annotation process.
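To make the idea of a behavioral cost measure concrete, here is a minimal sketch, not taken from the paper, of how total fixation duration might be aggregated per on-screen phrase as a rough proxy for annotation difficulty. The coordinate format, the region names, and the `fixation_cost` helper are illustrative assumptions rather than the researchers’ actual pipeline.

```python
from collections import defaultdict

def fixation_cost(fixations, regions):
    """Aggregate total gaze duration (ms) per annotated region.

    fixations: list of (x, y, duration_ms) tuples from the eye tracker.
    regions:   dict mapping region name -> (x_min, y_min, x_max, y_max)
               bounding box of the phrase on screen.
    """
    totals = defaultdict(float)
    for x, y, duration in fixations:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += duration
                break
    return dict(totals)

# Hypothetical example: two phrases on screen; a longer total gaze duration
# on a phrase is read as a behavioral sign of higher annotation difficulty.
regions = {
    "NP: 'the report'": (100, 200, 260, 230),
    "PP: 'on eye tracking'": (270, 200, 480, 230),
}
fixations = [(120, 210, 180.0), (300, 215, 240.0), (310, 212, 260.0)]
print(fixation_cost(fixations, regions))
# -> {"NP: 'the report'": 180.0, "PP: 'on eye tracking'": 500.0}
```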
Using a Tobii eye tracking device embedded in a 17” monitor, the researchers set out to record participants’ eye movements as they completed an annotation task over a set amount of time. With eye tracking technology, the team was able to investigate human annotators’ behavior during various assignments, testing a pair of hypotheses: one concerning the amount of contextual information used for annotation decisions, the other concerning the degrees of syntactic and semantic complexity of the expressions to be annotated.