Representing and Learning a Large System of Number Concepts with Latent Predicate Networks

Abstract

Conventional models of concept learning typically acquire one concept at a time, underemphasizing the fact that many concepts are learned as parts of systems rather than in isolation. Natural number is one of the first such abstract conceptual systems children learn. However, models of number learning that focus on single-concept acquisition have largely ignored two challenges posed by number viewed as a conceptual system: 1) there are infinitely many semantically distinct number concepts; and 2) people can reason flexibly about any number concept. To succeed, models must learn the structure of the infinite set itself, focusing on how relationships between numbers support reference and generalization. Here, we suggest that latent predicate networks (LPNs) – a probabilistic grammar formalism – facilitate tractable learning and reasoning for number concepts. We show how to express several key numerical relationships in this framework, and how Bayesian learning for LPNs models key phenomena observed in children learning to count.
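To make the central idea concrete, here is a minimal sketch (not the paper's actual LPN formalism) of how a probabilistic grammar can assign nonzero probability to each of infinitely many number concepts: each number n is represented as a derivation that applies a latent successor predicate n−1 times to a base concept, with a geometric stopping probability. The name `P_STOP` and the geometric parameterization are illustrative assumptions, not definitions from the paper.

```python
import math
import random

# Hypothetical toy sketch, NOT the paper's LPN definition: a probabilistic
# grammar in which the concept for the number n is a derivation applying a
# latent successor predicate (n - 1) times to a base concept ONE. A geometric
# stopping rule gives every one of the infinitely many number concepts
# nonzero probability mass.

P_STOP = 0.5  # assumed halting probability, chosen purely for illustration


def sample_number_concept() -> int:
    """Sample a number concept; derivation depth is Geometric(P_STOP)."""
    n = 1
    while random.random() > P_STOP:
        n += 1  # one more application of the successor predicate
    return n


def log_prob(n: int) -> float:
    """Log-probability the grammar assigns to the concept for n (n >= 1)."""
    return (n - 1) * math.log(1.0 - P_STOP) + math.log(P_STOP)


if __name__ == "__main__":
    print("sampled concepts:", [sample_number_concept() for _ in range(10)])
    print("log P(concept for 3) =", log_prob(3))
```

Under a prior like this, shorter derivations (smaller numbers) receive more probability mass, which loosely echoes the staged acquisition of number words that the abstract says Bayesian learning for LPNs models.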

