Sign language experience affects comprehension and attention to gesture

Abstract

Different language experiences may shape how one looks for information in communication, particularly gesture. In a within-subjects design, deaf signing (n = 12) and hearing (n = 30) participants watched narratives in four conditions: Gesture+Speech without Sound, Gesture+Speech with Sound, No Gesture+Speech without Sound, and No Gesture+Speech with Sound. Participants then completed a forced-choice task, selecting which of two cartoon vignettes best matched the narrative. Trials were either Easy or Hard. Across conditions, speakers spent less time looking at the Face than signers (β = -0.17, p < .001) but more time looking at Gesture (β = 0.18, p < .001). For comprehension, we focused our analyses on the Gesture+Speech without Sound condition, where we predicted the two groups would differ. On Hard trials, signers performed marginally better than speakers (p = .09). Future work will explore how these different attention patterns emerge in development.
