Semantic Space Models (SSMs) have recently begun to integrate perceptual information with linguistic statistics into a unified mental space, offering an answer to the criticism that SSMs are disembodied. However, these new models introduce the problem of illusory feature migration. When the word dog is perceived, its perceptual features should migrate to hyena, so that the system can infer the perceptual features of a word that has not been directly perceived (hyenas have fur). In doing so, however, the models cannot avoid migrating the features of dog to syntagmatically related words, such as bone; as a result, they incorrectly infer that bones have fur. We argue that the problems of perceptual grounding and word order are not independent: a model of word order information is needed to correctly infer how features should migrate in mental space. We introduce a multiplicative binding framework that allows all information sources to be stored in a composite mental space, while features migrate only to words that share sufficient order information with directly perceived words.
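The gating behavior described above can be illustrated with a minimal sketch. The following is not the paper's actual model; it assumes a circular-convolution style of multiplicative binding (as in holographic reduced representations), with hypothetical order vectors constructed so that hyena shares most of its order information with dog while bone does not. Perceptual features are stored bound to order vectors, so a probe word retrieves dog's features only to the extent that its order vector resembles dog's:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 2048  # dimensionality of the composite mental space

def rand_vec():
    v = rng.standard_normal(D)
    return v / np.linalg.norm(v)

def bind(a, b):
    # circular convolution: multiplicative binding of two vectors
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

def unbind(trace, cue):
    # circular correlation: approximate inverse of binding
    return np.fft.ifft(np.fft.fft(trace) * np.conj(np.fft.fft(cue))).real

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

fur = rand_vec()          # a perceptual feature vector (hypothetical)
dog_order = rand_vec()    # order-information vector for "dog"

# assumption: "hyena" shares most of its order information with "dog"
hyena_order = dog_order + 0.3 * rand_vec()
hyena_order /= np.linalg.norm(hyena_order)

# "bone" is a syntagmatic associate with unrelated order information
bone_order = rand_vec()

# composite memory: dog's perceptual features stored bound to its order vector
memory = bind(dog_order, fur)

# probing with each word's order vector retrieves a candidate feature pattern
sim_hyena = cos(unbind(memory, hyena_order), fur)
sim_bone = cos(unbind(memory, bone_order), fur)
print(f"hyena inherits fur: {sim_hyena:.2f}, bone inherits fur: {sim_bone:.2f}")
```

Because retrieval is mediated by the order vector, fur migrates strongly to hyena but only negligibly to bone, even though both are stored in the same composite space.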