To what extent are schematic representations neurally distinguished from language on the one hand, and from rich perceptual representations on the other? In a group lesion study, matching tasks using depictions of categorical spatial relations probed the comprehension of basic spatial concepts across distinct representational formats (words, pictures, schemas). Focused residual analyses using voxel-based lesion-symptom mapping (VLSM) suggest that left-hemisphere deficits in categorical spatial representation are difficult to distinguish from deficits in naming such relations, and that the right hemisphere plays a special role in extracting schematic representations from richly textured pictures. EE555, a patient with simultanagnosia, performed six similar matching tasks. On the only two tasks that did not involve matching to, or from, schemas, EE555 performed at chance levels; performance was significantly better on the schema tasks, indicating that abstract analog representations make spatial relations visible in a manner that symbols and complex images do not.