Barsalou’s (1999, 2003) perceptual symbol systems hypothesis describes how semantic knowledge is grounded in sensorimotor experience: according to the theory, knowledge is acquired through sensorimotor simulations. This challenges the classical, disembodied view of cognition, which favours an abstract, symbolic system. We propose a unified perspective in which the embodied cognition hypothesis, with a particular focus on the semantic domain, is given a mechanistically tractable computational framework based on the parallel distributed processing (PDP) paradigm. A critical difference between the current approach and previous mechanistic accounts of embodied cognition is that the current approach avoids hand-coded representations. Instead, it relies on an agent-based simulation in which environmental interaction generates situated inputs and outputs, supplemented with supervised and unsupervised deep learning mechanisms, from which semantic cognition emerges.
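
To make the contrast with hand-coded representations concrete, the general idea can be sketched in miniature. The following is an illustrative toy, not the framework described here: the environment model, feature layout, and learning rules (a Hebbian layer standing in for unsupervised learning, a delta-rule readout standing in for supervised learning) are all simplified assumptions chosen to show how category structure can emerge from sampled "experiences" rather than from prespecified symbols.

```python
import random

random.seed(0)

# Hypothetical toy environment: the "agent" samples noisy sensorimotor
# feature vectors rather than receiving hand-coded symbols. The two
# categories differ only in which features tend to be active.
N = 6  # number of sensorimotor features

def sample_experience(category):
    probs = [0.9] * 3 + [0.1] * 3 if category == 0 else [0.1] * 3 + [0.9] * 3
    return [1.0 if random.random() < p else 0.0 for p in probs]

# Unsupervised component: a Hebbian weight matrix that accumulates
# feature co-occurrence statistics from experience.
hebb = [[0.0] * N for _ in range(N)]

# Supervised component: a delta-rule readout mapping features to a label.
w = [0.0] * N
b = 0.0
lr = 0.1

for step in range(2000):
    cat = random.randrange(2)
    x = sample_experience(cat)
    # Hebbian update: strengthen connections between co-active features.
    for i in range(N):
        for j in range(N):
            hebb[i][j] += 0.01 * x[i] * x[j]
    # Delta-rule update: nudge the readout toward the correct category.
    y = 1.0 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0.0
    err = cat - y
    w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    b += lr * err

# After training, the readout classifies fresh simulated experiences,
# and the Hebbian matrix reflects within-category feature correlations.
acc = sum(
    ((1.0 if sum(wi * xi for wi, xi in zip(w, sample_experience(c))) + b > 0
      else 0.0) == c)
    for c in [0, 1] * 50
) / 100
```

The point of the sketch is only that both learned structures (the co-occurrence matrix and the readout weights) are by-products of interaction with a simulated environment; nothing category-specific is written into them by hand.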