A controversial claim in linguistics, often used to motivate the need for Universal Grammar, is that children face an induction problem during language acquisition. English anaphoric one has been argued to present exactly this kind of induction problem. While the original solution was to posit innate, domain-specific knowledge about the structure of language, more recent studies have proposed alternative solutions that couple domain-specific input restrictions with domain-general learning abilities. We consider whether indirect evidence from a broader input set could obviate the need for such input restrictions. We present an online Bayesian learner that uses this broader input set and find that, given child-directed speech, it reproduces the correct learning behavior for anaphoric one. We discuss what is required for acquisition success, and how this bears on the larger debate about Universal Grammar.
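To illustrate the general idea behind an online Bayesian learner of this kind, the sketch below shows incremental belief updating over competing hypotheses. This is a minimal toy, not the paper's model: the hypothesis labels, likelihood values, and number of updates are all illustrative assumptions, standing in for the structural hypotheses about what anaphoric one can refer to.

```python
# Toy sketch of online Bayesian updating (illustrative only, not the
# paper's actual model): after each input example, the probability of
# each hypothesis is rescaled by how well it predicted that example.

def update(beliefs, likelihoods):
    """One online Bayesian step: P(h | data) is proportional to P(h) * P(data | h)."""
    unnorm = {h: beliefs[h] * likelihoods[h] for h in beliefs}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two hypothetical hypotheses about the antecedent of anaphoric "one":
# a larger constituent ("N'") vs. a bare noun ("N0"). Uniform prior.
beliefs = {"N'": 0.5, "N0": 0.5}

# Each utterance contributes a likelihood under each hypothesis; here we
# assume evidence that mildly favors "N'" on every example.
for _ in range(10):
    beliefs = update(beliefs, {"N'": 0.7, "N0": 0.3})

print(beliefs["N'"])  # belief in "N'" grows toward 1 as evidence accumulates
```

Even mildly informative evidence, applied incrementally, drives the learner toward one hypothesis; the paper's contribution concerns what counts as relevant evidence (a broader input set) rather than the update rule itself.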