Consider this exchange from the 1994 American film Dumb and Dumber:
Lloyd: What do you think the chances are of a guy like you and a girl like me ending up together? …
Mary: Not good.
Lloyd: You mean, not good like one out of a hundred?
Mary: I’d say more like one out of a million.
Lloyd: So, you’re telling me there’s a chance. Yeah!
When Mary says “one in a million,” what does she mean? Mary’s claim about chance might be about a mathematical or logical calculation. It might be about the properties of objects in the world. And it might be about people’s mental states.
We call these alternatives ‘interpretations’ of probability. Let’s explore each in turn.
1. Logical Conceptions: Classical and Logical Probability
A popular rule for assigning probabilities is the Principle of Indifference. One version says that if an event has some number n of possible outcomes, and we have insufficient reason to expect any one outcome over any other, then each distinct outcome has a probability of 1/n: each is equally likely to occur. The Principle of Indifference is controversial, but versions of it are widely employed.
Classical Probability is the version of probability we get by using the Principle of Indifference: divide the number of outcomes we are interested in by the total number of possible outcomes. If four of the fifty-two cards in a deck are aces (and each card is equally likely to be drawn), then the classical probability of drawing an ace in one draw is 4/52, i.e., 1/13.
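As a quick arithmetic check, the classical calculation can be sketched in a few lines of Python (the function name here is just illustrative):

```python
from fractions import Fraction

# Classical probability: favorable outcomes divided by
# the total number of equally likely possible outcomes.
def classical_probability(favorable: int, possible: int) -> Fraction:
    return Fraction(favorable, possible)

# Drawing an ace from a standard deck: 4 favorable outcomes out of 52.
p_ace = classical_probability(4, 52)
print(p_ace)  # 1/13
```

Using exact fractions rather than floating-point numbers keeps the reduction from 4/52 to 1/13 visible.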
A relative of classical probability is ‘Logical’ Probability, which attempts to refine Classical Probability in a way that provides a general logic for learning from experience. The basic idea is that we can describe outcomes in more ways than just listing all the possibilities. For example, if I toss two coins, we can list the four possible outcomes: HH, HT, TH, and TT. But we can also describe three possible outcomes: ‘both Heads,’ ‘both Tails,’ and ‘half and half.’ In brief, the ability to describe outcomes in this latter way – as “structures” rather than merely as “states” – may allow us to explain logically how learning from experience works.
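The difference between counting “states” and counting “structures” can also be sketched in Python (the coin labels and grouping scheme here are merely illustrative):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# The four equally likely "states" of two coin tosses.
states = [''.join(t) for t in product('HT', repeat=2)]  # ['HH', 'HT', 'TH', 'TT']

# Group the states into "structures" by how many Heads they contain:
# 'both Heads' (2), 'half and half' (1), 'both Tails' (0).
structures = Counter(s.count('H') for s in states)

# Indifference over the four states gives 'half and half' 2/4 = 1/2 ...
p_half_by_states = Fraction(structures[1], len(states))

# ... but indifference over the three structures would give it only 1/3.
p_half_by_structures = Fraction(1, len(structures))
```

The two ways of applying the Principle of Indifference disagree (1/2 versus 1/3), which is part of why the choice between state- and structure-descriptions matters for a logic of learning from experience.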
In the Lloyd-Mary exchange, if she’s referring to Classical Probability, she might be reasoning like this:
‘There are only about one million people I might realistically end up with, and Lloyd is only one of those one million people; but there’s no reason to believe he’s more likely than any other. Therefore, the chance is only about one in a million.’
2. Physical Conceptions: Frequency and Propensity Probability
We calculate Frequency Probability by considering a set of trials or events and counting how many of those events feature the outcome we care about. If I watch cars go by for a few hours, and I find that 5% of the cars I’ve seen are Toyotas, then there’s a 5% frequency-probability that a car in the set I observed is a Toyota.
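The frequency calculation is just a ratio of counts; a minimal sketch, with a hypothetical observation log standing in for the cars I watched:

```python
from fractions import Fraction

# Frequency probability: the share of observed trials
# that feature the outcome of interest.
def frequency_probability(trials, outcome) -> Fraction:
    hits = sum(1 for t in trials if t == outcome)
    return Fraction(hits, len(trials))

# Hypothetical observation log: 100 cars, 5 of them Toyotas.
cars = ['Toyota'] * 5 + ['Other'] * 95
print(frequency_probability(cars, 'Toyota'))  # 1/20, i.e., 5%
```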
‘Propensity’ Probability is a related conception. Who would win in a 100-meter sprint: Napoleon Bonaparte on his twenty-fifth birthday, or Usain Bolt on his fiftieth birthday? It may seem as if there are facts now about the probability of Bonaparte’s winning, as well as the probability of Bolt’s winning. But frequency probability can’t tell us, because the two have never raced, and they never will. If there’s a fact about how likely it is that Bonaparte would beat Bolt, then, this must be because of the inherent features of Bolt and Bonaparte themselves: their strengths, weights, levels of cardiovascular fitness, stride-lengths, and so on. So there is some inherent propensity-probability that Bolt will win, but there is also some (far lesser) inherent propensity-probability that Bonaparte will win.
In the Lloyd-Mary exchange, if Mary is talking about Frequency Probability, then she means something like this:
‘In general, people like you end up with people like me about one time out of a million.’
If she’s talking about Propensity Probability, then she means that at the current moment, the qualities of Lloyd and the qualities of Mary make it such that they are inherently one-in-a-million likely to end up together.
On the assumption that induction is reliable, frequency probability is good evidence of a propensity, and the propensity, in turn, tells you how justified you are in expecting a certain event.
3. Cognitive Conceptions: Subjective Probability and Objective, Epistemic Probability
We can describe our personal, subjective confidence in something (e.g., that a belief is true, that something will happen, etc.) in terms of percentages. For example, if I’m completely certain that something will occur, I am 100% confident that it will occur. If I’m only 20% certain that something will occur (and 80% certain that it won’t occur), then I might say that it ‘probably’ won’t occur. So, I might say, ‘You’ll probably never understand Michael Bay’s genius,’ meaning that I’m confident that you won’t. If I’m just offering a description of my own mental states, I’m talking about my Subjective Probability: my confidence in something.
Subjective probability alone doesn’t say whether a person’s confidence is justified: whether that confidence is reasonable, given the person’s evidence. That notion may be called ‘Objective, Epistemic Probability.’ In the Lloyd-Mary exchange, if Lloyd were happy about Mary’s reply because he confidently concluded that he would end up with her, then that would be objectively unreasonable. Indeed, we normally think that if you learn that the objective (say, propensity) probability that something will occur is x, then you should set your confidence that it will occur to x.
There are debates about each of these conceptions of probability: whether they give rise to paradoxes, whether they match intuitive conceptions of probability or everyday-language uses of ‘probability,’ and so on. But because knock-down proofs are uncommon in philosophy – conclusions are rarely reached with certainty – the language of probability is very common there, as it is in daily life.
 Keynes 1921: chapter IV. See e.g. Jaynes 2003: chs. 7 and 12 for discussion of the principle and alternatives.
 Laplace 1902: ch. 1.
 Carnap 1950.
 We can call these “physical” conceptions because they tend to deal with observed, physical phenomena and objects (cf. Venn 2006: ch. IV). But strictly speaking, we may be able to imagine abstract objects’ having propensities.
 Venn, op. cit.
 See Peirce 1910: 237 and Popper 1957: 67 for discussions of propensity probability. I say it’s “related” because it is also about objective features of the objects in the trials or events, and frequency is commonly taken to be good evidence of propensity. See n. 8 below as well.
 Arguably, however, if determinism is true, then the only propensity probabilities that exist are 0% and 100%. See, e.g., Loewer 2004.
 Whether there are such things as objective facts about which beliefs are reasonable or unreasonable for a person, given their evidence, is a distinct, but related, philosophical debate. But we can understand the experience of gaining evidence for a belief in terms of probabilities. Perhaps ˹x is evidence of y˺ means that the (epistemic) probability of y, after encountering x, is higher than the probability of y before having encountered x: x makes y more (epistemically) likely or probable, and you would be justified in raising your confidence that y.
 Lewis (1980) calls this the “principal principle” (p. 266). Compare Pettigrew 2016: ch. 8.
 For a much deeper introduction, see Hájek 2018.
 See Swinburne 2004 for an example of probabilistic arguments used to justify belief in (but not conclusively prove) the existence of God, and Rowe 1979 for an example of probabilistic arguments used to justify belief in (but not conclusively prove) the nonexistence of God.
 Thanks to Alan Hájek for helpful comments on an earlier version of this essay.
Carnap, Rudolf. Logical Foundations of Probability. Chicago, IL: University of Chicago Press, 1950.
Hájek, Alan. “Interpretations of Probability,” in Zalta, E. N. (ed.), The Stanford Encyclopedia of Philosophy, Spring 2018 Edition, URL = <https://plato.stanford.edu/archives/spr2018/entries/probability-interpret/>.
Hoefer, Carl. “Causal Determinism,” in Zalta, E. N. (ed.), The Stanford Encyclopedia of Philosophy, Spring 2018 Edition, URL = <https://plato.stanford.edu/archives/spr2018/entries/determinism-causal/>.
Jaynes, E. T. Probability Theory: The Logic of Science. New York, NY: Cambridge University Press, 2003.
Keynes, John Maynard. A Treatise on Probability. London, UK: Macmillan and Co., 1921.
Laplace, Pierre-Simon. A Philosophical Essay on Probabilities, tr. Truscott, Frederick Wilson and Emory, Frederick Lincoln. London, UK: John Wiley & Sons, 1902.
Lewis, David. “A Subjectivist’s Guide to Objective Chance.” In Jeffrey, Richard C. (ed.), Studies in Inductive Logic and Probability, Volume II (Berkeley, CA: University of California Press, 1980), pp. 263-93.
Loewer, Barry. “David Lewis’s Humean Theory of Objective Chance.” Philosophy of Science 71 (5) (2004): 1115-1125.
Peirce, C. S. “Notes on The Doctrine of Chances.” Popular Science Monthly 44 (1910): 237-45.
Pettigrew, Richard. Accuracy and the Laws of Credence. Oxford, UK: Oxford University Press, 2016.
Popper, Karl R. “The Propensity Interpretation of Probability.” British Journal for the Philosophy of Science 10 (1959): 25-42.
Rowe, William. “The Problem of Evil and Some Varieties of Atheism.” American Philosophical Quarterly 16 (4) (1979): 335-41.
Swinburne, Richard. The Existence of God, 2nd ed. New York, NY: Oxford University Press, 2004.
Venn, John. The Logic of Chance, 3rd ed. Mineola, NY: Dover Publications, 2006.
White, Alan. “The Propensity Theory of Probability.” British Journal for the Philosophy of Science 23 (1) (1972): 35-43.
Related Essay: The Problem of Induction by Kenneth Blake Vernon
About the Author
Tom is an assistant professor at Spring Hill College in Mobile, AL. He received his PhD in philosophy from the University of Colorado, Boulder. He specializes in ethics, metaethics, epistemology, and the philosophy of religion. Tom has two cats whose names are Hesperus and Phosphorus. http://shc.academia.edu/ThomasMetcalf