L2 speakers’ sensitivity to argument structure inferred from participants’ eye gaze: an eye-tracking experimental study with adult Norwegian learners of English
Eye tracking lets us record participants’ eye movements as they listen to spoken sentences and search a visual display for pictures matching the words they hear. Typically, fixation patterns are closely time-locked to the ongoing verbal input, providing a continuous real-time measure of comprehension that is independent of any overt spoken or manual response (Cooper, 1974; Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995). To test L2 speakers’ sensitivity to argument structure, we used Tanenhaus’s well-tested Visual World paradigm, together with the modification introduced by Brock, Norbury, Einav, and Nation (2008), who added a second condition in which the phonological competitor was present on the screen while the target, although mentioned in the utterance, was absent from the display. In addition, the stimulus design was improved relative to Brock and colleagues to avoid verbs with highly predictable argument structure. Consistent with previous studies (Allopenna, Magnuson, & Tanenhaus, 1998; Brock et al., 2008; Cooper, 1974; Tanenhaus et al., 1995), the results of the current study, conducted with adult university students, show that eye movements were affected by the semantic association between the sentence verb and the target object. Moreover, in the target-present condition, the restriction effect was most evident in the most restrictive sentences. Participants also looked less at the phonological competitor in the target-absent trials, although this effect was not as robust as in the target-present condition.