Audio-Visual Integration: Point Light Gestures Influence Listeners’ Behavior

Abstract

Listeners are influenced by speakers' hand gestures. However, it is not clear what processes support gesture processing. We investigated listeners' behavior after observing speech accompanied either by videotaped gestures or by point light gesture trajectories in the Tower of Hanoi task. Listeners were influenced by the synchrony of the visual and auditory information but not by the nature of the visual information – both videotaped and point light gestures reliably influenced behavior. Thus, visual information that is not perceived as produced by the speaker nonetheless reliably influences listeners' behavior, so long as the information is synchronized across modalities. Observers therefore do not appear to rely on functional or biological links between speech and hand gesture, but rather on more general processes of multimodal integration. The principles underlying the integration of auditory language with visual information from hand gestures appear to differ from those underlying the integration of auditory language and visual speech.
