Infant Action Prediction in the Wild

Abstract

The ability to predict others’ actions is fundamental to successful joint action, communication, and theory of mind. Research has shown that infants predict other people’s actions across a variety of laboratory tasks. However, it is unknown whether the action prediction skills that infants demonstrate during screen-based eye-tracking tasks scale up to real-life action contexts, and whether they relate to general learning abilities. To address these questions, we used head-mounted eye-tracking to investigate action prediction and visual sequence learning during live parent-child interactions. Findings reveal that 18-month-old infants predict reaching actions during the majority of trials, and that their gaze latencies become faster as they learn 3-step action sequences. These findings demonstrate that infants can learn sequence regularities and anticipate the actions of other people in live, naturalistic contexts, as they have been shown to do in traditional laboratory contexts. This research contributes new insight into early cognitive and social development.