Bifurcation analysis of a Gradient Symbolic Computation model of incremental processing

Abstract

Language is ordered in time, and an incremental processing system encounters temporary ambiguity in the middle of sentence comprehension. An optimal incremental processing system must solve two computational problems: on the one hand, it has to keep multiple possible interpretations alive without prematurely choosing one over the others; on the other hand, it must reject interpretations inconsistent with the context. We propose a recurrent neural network model of incremental processing that performs stochastic optimization of a set of soft, local constraints to successfully build a globally coherent structure. Bifurcation analysis of the model makes clear when and why the model parses a sentence successfully and when and why it does not---the garden path and local coherence effects are discussed. Our model provides neurally plausible solutions to the computational problems arising in incremental processing.
