Adults and preschoolers seek visual information to support language comprehension in noisy environments

Abstract

Language comprehension in grounded, social contexts involves integrating information from both the visual and the linguistic signals. But how should listeners prioritize these different information sources? Here, we test the hypothesis that even young listeners flexibly adapt the dynamics of their gaze to seek higher-value visual information when the auditory signal is less reliable. We measured the timing and accuracy of adults' (n=31) and 3-5-year-old children's (n=39) eye movements during a real-time language comprehension task. Both age groups delayed the timing of gaze shifts away from a speaker's face when processing speech in a noisy environment. This delay resulted in listeners gathering more information from the visual signal, making more accurate gaze shifts, and producing fewer random eye movements to the rest of the visual world. These results provide evidence that even young listeners adjust to the demands of different processing contexts by seeking out visual information that supports language comprehension.