People routinely reason about possibilities, and they can draw "modal" conclusions, i.e., conclusions about what is possible or necessary, from premises that do not mention modality. For instance, given that Cullen was born in New York or in Kentucky, it is intuitive to infer that it is possible that Cullen was born in New York. Conventional logic does not apply to modal reasoning, and so logicians invented systems to capture valid modal inferences; but none of those systems can explain the inference above. We posit a novel theory based on the idea that reasoners build mental models, i.e., sets of conjunctive possibilities, when they reason about sentential connectives such as "and", "if", and "or". The theory is implemented in a new computational process model of sentential reasoning. We show that the model's performance matches reasoners' inferences in studies by Hinterecker et al. (2016). We conclude by discussing the theory in light of alternative accounts of reasoning.