We report two visual-world eye-tracking experiments that investigated how, and with what time course, emotional information from a speaker's face affects younger (N = 32, mean age = 23) and older (N = 32, mean age = 64) listeners' visual attention and language comprehension as they processed emotional sentences in a visual context. The age manipulation tested predictions from socioemotional selectivity theory of a positivity effect in older adults. After viewing the emotional face of a speaker (happy or sad) on a computer display, participants were simultaneously presented with two pictures depicting opposite-valence events (positive and negative; IAPS database) while they listened to a sentence referring to one of the events. Participants' fixations on the pictures while processing the sentence were enhanced when the speaker's face was emotionally congruent with the sentence/picture compared to when it was incongruent. The enhancement emerged in the early stages of sentence-reference disambiguation; importantly, it was modulated by age: for the older adults it was more pronounced with positive faces, and for the younger adults with negative faces. These findings demonstrate for the first time that emotional facial expressions, like previously studied speaker cues such as eye gaze and gestures, are rapidly integrated into sentence processing. They also provide new evidence for positivity effects in older adults during online incremental situated sentence processing.