A Computational Model of Auditory Perceptual Learning: Predicting Learning Interference Across Multiple Tasks

Abstract

In this work, we build a computational model of several auditory perceptual learning experiments. The modeled experiments exhibit a pattern of learning interference that may shed light on the structure of both short- and long-term stores of perceptual memory. We hypothesize that the observed interference patterns can be explained by the relationships among stimuli across tasks and by how those relationships interact with the limits of human memory. Our model accounts for the sharing of information across tasks by drawing on transfer learning methodology from the machine learning community. When we introduce a set of plausible memory limits, the model reproduces the same pattern of learning interference observed in the human experiments.
