Author: Jonathan Matheson
Word Count: 998
People tend to care deeply about their political, religious, ethical, and philosophical beliefs. For many of these beliefs, however, you know of people who disagree with you and are roughly as intelligent, informed, and open-minded as you are.
How should you respond when you recognize this fact? Should you lose confidence in your own beliefs, gain confidence, reject your own beliefs, adopt others’ beliefs, suspend judgment on the issue, or some other response? This is the epistemological problem of disagreement.
1. Peer Disagreement
Philosophers try to understand the epistemic significance of disagreement initially by examining idealized cases of “peer disagreement.” People are “epistemic peers” about some matter when they are equally likely to be correct about that matter: they are roughly equal in terms of information, intelligence, and intellectual virtues, the factors that make them likely to have true beliefs. (Actual disagreements often don’t involve genuine epistemic peers in this sense; this is discussed below.)
The problem is that when peers disagree about some matter of fact, at least one of them must be incorrect: at least one of them has a false belief about the issue. Peer disagreement is troubling because, since the peers are equally likely to be right, it is just as likely that one peer made a mistake as the other.
So, what is rational to do upon discovering that a peer disagrees? While you should seek out further evidence and double-check your reasoning, the central epistemological question concerns what you should believe.
There are two main positions on the issue. Conciliatory views of disagreement claim that upon discovering a peer disagreement you are rationally required to decrease your confidence in your belief, if not give up your belief altogether. Steadfast views of disagreement maintain that it can be rational to continue believing just as you did before.
2. Conciliatory Views
Conciliatory views claim that, upon discovering that a peer disagrees with you about some claim, you should change your mind. There is a wide spectrum of conciliatory views depending upon how much change they claim is rationally required.
The most discussed conciliatory view is the Equal Weight view. According to it, your belief and your peer’s belief should be given equal evidential weight: each should be seen as equally reasonable. This equal weighing is thought to require both peers to ‘split the difference’ and meet in the middle. So, if one peer believed that God exists and the other peer disbelieved that God exists, the Equal Weight view requires that both peers should now suspend judgment as to whether God exists: any other response would be irrational.
Two cases, among other considerations, support the Equal Weight view. First, the Thermometer case:
You and I are in the same room. We each have what we reasonably believe are equally reliable thermometers. We see that my thermometer reads ‘72’ and yours reads ‘74’.
What should we believe about the room’s temperature? To believe that the temperature is 72, simply because that is what my thermometer reads, would be arbitrarily biased. To disbelieve that it is 72, since your thermometer ‘disagrees’, would be overly deferential. The rational response, it seems, is to suspend judgment as to whether it is 72 and as to whether it is 74: we should have no belief about the exact temperature.
Second, the Restaurant Check case:
Five people go out to dinner. They all agree to split the check evenly, not worrying about who ordered what. Eve does the math in her head and becomes highly confident that the shares are $43 each. Meanwhile, her peer Ava does the math in her head and becomes highly confident that the shares are $45 each.
Intuitively, Eve should give up her belief that the shares are $43 upon learning that Ava disagrees. Again, given the disagreement, she should have no particular belief about each share of the bill, and neither should anyone else at the table.
3. Steadfast Views
Not everyone accepts these verdicts, or at least not that the intuitive results of these examples generalize to other cases of peer disagreement. According to Steadfast views, it can be rational to keep believing just as you did before you found out your peer disagreed.
Several motivations have been given for steadfast views. Some think that it makes a difference who reasoned correctly. Going back to the check case, if Eve, in fact, did the math correctly, then this is an important difference. If she correctly evaluated her evidence, then she should keep believing as she did. Learning that someone else made a mistake with the evidence should not make her give up her belief. Additional support for such a position can come from thinking about more extreme disagreements. For instance, what if Ava believed each share of the bill was $4,500? Should Eve still reduce her confidence or simply dismiss that calculation outright?
A second source of motivation for steadfast views comes from thinking about self-trust. The idea is that it makes a difference that you are one of the disagreeing parties. The trust you place in yourself and your own faculties must come first, in a way that your trust in a thermometer does not, so this trust can provide a symmetry breaker in the disagreement.
Debates about idealized peer disagreements are difficult. While thinking about peer disagreement should help us figure out how to deal with everyday disagreements, things get even more complicated in the real world. When it comes to political, religious, ethical, and philosophical beliefs, you are aware of many people who disagree (and agree) with you. Most, if not all, of those people are not your epistemic peers, sometimes because they are your epistemic superiors: they are more likely than you to be right on the matter. So, how are you to weigh all these opinions? And where does this leave your beliefs at the end of the day? Navigating actual disagreements is even more complicated than navigating idealized ones, but, given the importance of our beliefs, it is important work we must do.
Notes
This case follows Christensen (2007), p. 193.
See Kelly (2005) for a more detailed defense of this view.
See Lackey (2010) and Christensen (2009) for more on extreme disagreements.
See King (2011).
Further, these opinions are not always independently held, so determining how the numbers matter is no easy thing. See Lackey (2013).
References
Frances, Bryan and Jonathan Matheson (2018). “Disagreement.” Stanford Encyclopedia of Philosophy (Spring 2018 Edition), Edward N. Zalta (ed.), URL = <https://plato.stanford.edu/archives/spr2018/entries/disagreement/>.
Lackey, Jennifer (2013). “Disagreement and Belief Dependence: Why the Numbers Matter.” In David Christensen and Jennifer Lackey (eds.), The Epistemology of Disagreement. Oxford: Oxford University Press.
About the Author
Jonathan Matheson is an Associate Professor of Philosophy at the University of North Florida. He is the author of The Epistemology of Disagreement (Palgrave) and co-editor of The Ethics of Belief: Individual and Social (Oxford University Press). He has also written numerous articles in epistemology and philosophy of religion. https://jonathandmatheson.wordpress.com