Modeling Perspective-Taking by Correlating Visual and Proprioceptive Dynamics

Fabian Schrodt, Cognitive Modeling, Computer Science, Tuebingen, Germany
Georg Layher, Institute of Neural Information Processing, Ulm University, Ulm, Baden-Wuerttemberg, Germany
Heiko Neumann, Institute of Neural Information Processing, Ulm University, Ulm, Baden-Wuerttemberg, Germany
Martin Butz, Cognitive Modeling, Computer Science, Tuebingen, Germany

Abstract

How do we manage to step into another person's shoes and eventually derive the intention behind observed behavior? We propose a connectionist neural network (NN) model that learns, in a self-supervised manner, a prerequisite of this social capability: it adapts its internal perspective in accordance with observed biological motion. The model first learns predictive correlations between proprioceptive motion and a corresponding visual motion perspective. When a novel view of a biological motion is presented, the model transforms this view into the closest perspective seen during training. In effect, the model realizes translation-, scale-, and rotation-invariant recognition of biological motion. The NN is an extended adaptive resonance model that incorporates self-supervised error backpropagation and parameter bootstrapping by neural noise. It segments and correlates relative visual and proprioceptive velocity kinematics, gradually refining the emerging representations from scratch. As a result, it can adjust its internal perspective to novel views of trained biological motion patterns. Thus, we show that it is possible to take the perspective of another person by correlating proprioceptive motion with relative visual motion and then allowing the visual frame of reference to be adjusted to other views of similar motion patterns.
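The paper's model learns this perspective adjustment with an adaptive resonance network trained by self-supervised backpropagation. As a much simpler illustrative sketch (not the authors' method), the core idea of aligning an observed view with an internally predicted one can be posed as a least-squares similarity fit over relative 2D velocities: because velocities discard absolute position, the fit is automatically translation-invariant, and rotation and scale are recovered jointly. All names and the complex-number formulation below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Relative visual velocities (as complex numbers x + iy) that the model
# would predict from proprioception for a known, trained perspective.
predicted = rng.standard_normal(20) + 1j * rng.standard_normal(20)

# The same biological motion observed from a novel viewpoint:
# rotated and scaled relative to the trained perspective.
true_angle, true_scale = 0.7, 1.8
observed = true_scale * np.exp(1j * true_angle) * predicted

# Least-squares similarity fit: the complex factor z = s * e^{i*theta}
# minimizing sum |z * predicted - observed|^2 has the closed form below
# (np.vdot conjugates its first argument).
z = np.vdot(predicted, observed) / np.vdot(predicted, predicted)
est_scale, est_angle = abs(z), np.angle(z)
print(est_scale, est_angle)  # recovers 1.8 and 0.7
```

In the paper, by contrast, the transformation parameters are not solved in closed form but adjusted gradually by backpropagating the prediction error, with neural noise bootstrapping the parameters from scratch.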

Files

Modeling Perspective-Taking by Correlating Visual and Proprioceptive Dynamics (473 KB)
