Music, language, and gesture: Neural oscillations and relational cognition

Abstract

Music, language, and action involve the ability to combine and flexibly recombine sequences of discrete elements into hierarchical structures. Can structures in one domain influence those in another? Does this sequential structure-building process rely on shared neural resources or shared types of computation? We first tracked a neural correlate of this sequential structure-building process in each domain individually using steady-state evoked potentials (SSEPs). We then explored the behavioral effect on sentence comprehension of mismatching linguistic phrase structures with musical metrical structures. We interpret our findings in terms of the Shared Syntactic Integration Resource Hypothesis and extend the purview of this theory beyond harmonic syntax in music to consider how the mental organisation of musical elements in time (meter) can itself be regarded as syntactic. Our findings suggest fresh parallels between language and music and indicate that certain processes may be shared with more domain-general aspects of our cognitive architecture.
