Bayesian Logic and Trial-by-Trial Learning


Standard logic and standard probability theory both face fundamental problems when used as adequacy criteria for relating logical propositions to learning data. We discuss the problems of exception, of sample size, and of inclusion. Bayesian pattern logic (‘Bayesian logic’, or BL for short) has been proposed as a possible rational resolution of these problems. BL can also be taken as a psychological theory that predicts frequency-based conjunction fallacies (CFs) and a generalization of CFs to other logical inclusion fallacies. In this paper, this generalization is elaborated using trial-by-trial learning scenarios without memory load, in which participants provide a probability judgment on each trial. Beyond investigating logical probability judgments in this trial-by-trial context, we explore whether, in the absence of memory load, the propositional assessment of previous evidence influences subsequent probability judgments. The results generally support BL and cannot easily be explained by other theories of CFs.
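As a brief illustration (a hedged sketch, not the paper's model): a conjunction fallacy is a violation of the conjunction rule, P(A ∧ B) ≤ min(P(A), P(B)), which always holds for probabilities computed directly from observed trial frequencies, since the joint outcomes are a subset of each marginal's outcomes. The trial data below are hypothetical.

```python
def joint_and_marginals(trials):
    """trials: list of (a, b) truth-value pairs observed trial by trial.
    Returns the frequency-based estimates P(A), P(B), and P(A and B)."""
    n = len(trials)
    p_a = sum(a for a, _ in trials) / n
    p_b = sum(b for _, b in trials) / n
    p_ab = sum(a and b for a, b in trials) / n
    return p_a, p_b, p_ab

# Hypothetical trial-by-trial observations of two events A and B.
trials = [(True, True), (True, False), (False, True),
          (True, True), (False, False)]
p_a, p_b, p_ab = joint_and_marginals(trials)

# The conjunction rule necessarily holds for observed frequencies;
# a judged P(A and B) exceeding min(P(A), P(B)) would be a CF.
assert p_ab <= min(p_a, p_b)
print(p_a, p_b, p_ab)  # → 0.6 0.6 0.4
```

A CF in this frequency setting would thus mean the judged conjunction probability exceeds a bound that the data themselves can never violate.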
