Framing how we think about disagreement

Abstract

Disagreement is a hot topic right now in epistemology, where there is spirited debate between epistemologists who argue that we should be moved by the fact that we disagree and those who argue that we need not. Both sides of this debate often use what is commonly called "the method of cases": designing hypothetical cases involving peer disagreement and using what we think about those cases as evidence that specific normative theories are true or false, and as reasons for believing accordingly. With so much weight being given in the epistemology of disagreement to what people think about cases of peer disagreement, our goal in this paper is to examine what kinds of things might shape how people think about these kinds of cases. We show that two different kinds of framing effect shape how people think about cases of peer disagreement, and we examine both what this means for how the method of cases is used in the epistemology of disagreement and what it might tell us about the role that motivated cognition plays in debates about which normative positions on peer disagreement are right and wrong.

Acknowledgements

We would like to thank the following for valuable feedback on earlier versions of this paper: Joshua Knobe, Shaun Nichols, Jonathan Weinberg, members of the Arizona Experimental Philosophy Lab, members of the Department of History and Philosophy of Science and the Center for Philosophy of Science at the University of Pittsburgh, participants at the 2014 Buffalo X-Phi Conference, at Daniel Howard-Snyder's three-day conference Intellectual Humility: Its Nature, Value, and Implications, at the capstone conference for The Science of Intellectual Humility, and at the 2016 Meeting of the Society for Philosophy and Psychology, and an anonymous reviewer for this journal. Support was provided by the Fuller Theological Seminary/Thrive Center in concert with the John Templeton Foundation and by the Center for Philosophy of Science at the University of Pittsburgh.

Author information

Corresponding author

Correspondence to Joshua Alexander.

Appendix

You and your friend are doctors in the same practice. One of your patients has a very serious condition. Both of you examine the patient, study his medical records, read the relevant literature, and come to conflicting conclusions. It seems that there are two theories that might explain the patient’s symptoms. You and your friend have a long history of working together, and know that over time you’ve been equally successful at making the right diagnosis.

Suppose that you find out that your friend disagrees with you about some claim. Your friend has moderately high confidence that it’s true, and you have moderately high confidence that it’s false. But to the best of your knowledge, your friend is just as well informed as you are—in fact, suppose that you and your friend have had long discussions in which you share every bit of evidence that you can think of that is relevant to the claim in question, that you have good reasons to believe that you are both equally intelligent and rational, and that you have no general reason to think that either one of you is especially likely to be particularly good or bad at reacting to evidence about this particular claim—no reason, that is, aside from the fact that you disagree about whether the claim is true or false. In other words, your friend seems to be what some have called an “epistemic peer”.

Suppose you and your friend go out to dinner. When it is time to pay the check, you agree to split the check evenly and to give a 20% tip. You do the math in your head and become highly confident that your shares are $43 each. Meanwhile, your friend does the math in her head and becomes highly confident that your shares are $45 each. You and your friend have a long history of eating out together and dividing the check in your heads, and know that you’ve been equally successful at making these kinds of calculations: usually you agree; but when you disagree, you know that your friend is right as often as you are. Moreover, you are both feeling sharp tonight and thought that the calculation was pretty straightforward before learning that you disagreed about the shares.

Suppose you and your friend go out to dinner. When it is time to pay the check, you agree to split the check evenly and to give a 20% tip. You do the math carefully on pencil and paper, checking your results with a calculator, and become highly confident that your shares are $43 each. But then your friend, who was also writing down numbers and pushing calculator buttons, announces that your shares are $45 each. You and your friend have a long history of eating out together and dividing the check in this way, and know that you’ve been equally successful at making these kinds of calculations: usually you agree; but when you disagree, you know that your friend is right as often as you are. Moreover, you are both feeling sharp tonight and thought that the calculation was pretty straightforward before learning that you disagreed about the shares.

Suppose that you are a meteorologist who has access to current weather data provided by the National Oceanic and Atmospheric Administration, the National Weather Service, etc., and that you have learned to apply various models to this data in making predictions. Applying the models is not just a matter of clear-cut calculation; it involves judgment and experience. After thoroughly studying the data and applying the various models you know, you come to believe that there is a 55% chance that it will rain tomorrow. But then you learn that your classmate from meteorology school—who has thoroughly studied the same data, knows the same models, etc.—believes that there is only a 45% chance that it will rain tomorrow. You and your former classmate have extensive track records of past predictions, and have done equally well in the past.

Suppose that you and your friend independently evaluate the same factual claim—for example, the claim that the death penalty significantly deterred crime in Texas in the 1980s. Each of you has read the same crime statistics, sociological reports, and so on, and has no other relevant evidence. Furthermore, you count your friend as an epistemic peer—as being as good as you at evaluating such claims. You perform your evaluation, and come to a conclusion about the claim. But then you find out that your friend has come to the opposite conclusion.

Suppose that you and a friend are standing by the window looking out on the quad. Your friend sees what looks like a person in a blue coat in the middle of the quad, but you see nothing of the kind there. Although you disagree, you know each other to be honest and to have comparable vision. You both know that something weird is going on, but you have no idea which of you has the problem; either your friend is “seeing things” or you are missing something.

Jack takes great pleasure in slowly killing young children by torturing them while forcing their parents and siblings to watch. He has devised a way to do this often, without fear of getting caught. You think this behavior of Jack’s is morally wrong—extremely so. But you have two friends—both of whom seem to you to be about as intellectually virtuous as you—who disagree. One is an ethical egoist who thinks this behavior of Jack’s is morally right since it maximizes Jack’s self-interest; the other is a moral nihilist who thinks it’s not the case that Jack’s behavior is morally wrong since there are no moral facts and nothing is either morally wrong or morally right. All three of you feel disgusted by Jack’s behavior and very strongly wish that Jack wouldn’t engage in it. But only you think it is morally wrong. Now, each of you lays before the others all of the relevant considerations you can think of for your respective views on Jack’s behavior, including arguments for and against moral nihilism and ethical egoism. And each of you has a theory of error explaining why the other two mistakenly think as they do about the morality of Jack’s behavior. Moreover, each of you believes that the other two have strong apparent insights in support of their own views, including the theories of error they have.

Suppose two paleontologists, Jack and Jill, are epistemic peers who disagree about the fate of the Neanderthals. Jack believes that Neanderthals were an evolutionary dead end. Unable to compete, they simply died out. Jill believes that Neanderthals evolved into later hominids whose descendants are alive today. Because the issue is complex and the evidence is equivocal, they come to different conclusions about it.

Suppose that two epistemic peers—let’s call them ‘you’ and ‘I’—are each deliberating about what attitude to take towards a given hypothesis H in the light of the available evidence. Suppose further that, as a result of my assessment of the evidence, I come to believe H, while as a result of your assessment of the evidence, you come to believe not-H. We become aware of our disagreement.

Suppose that you and I have been exposed to the same evidence and arguments that bear on some proposition: there is no relevant consideration that is available to you but not to me, or vice versa. Suppose further that neither of us has any particular reason to think that he or she enjoys some advantage over the other when it comes to assessing considerations of the relevant kind, or that he or she is more or less reliable about the relevant domain. Indeed, let us suppose that, to the extent that we do possess evidence about who is more reliable—evidence afforded, perhaps, by a comparison of our past track records—such evidence suggests that we are more or less equally reliable when it comes to making judgments about the domain in question. Nevertheless, despite being peers in these respects, you and I arrive at different views about the question on the basis of our common evidence.

Suppose that you and I have been exposed to the same evidence and arguments that bear on some proposition: there is no relevant consideration that is available to you but not to me, or vice versa. For the sake of concreteness, we might picture that you and I are attentive members of a jury charged with determining whether the accused is guilty. The prosecution, following the defense, has just rested its case. Suppose further that neither of us has any particular reason to think that he or she enjoys some advantage over the other when it comes to assessing considerations of the relevant kind, or that he or she is more or less reliable about the relevant domain. Indeed, let us suppose that, to the extent that we do possess evidence about who is more reliable—evidence afforded, perhaps, by a comparison of our past track records—such evidence suggests that we are more or less equally reliable when it comes to making judgments about the domain in question. Nevertheless, despite being peers in these respects, you and I arrive at different views about the question on the basis of our common evidence. Perhaps I find myself quite confident that the accused is guilty while you find yourself equally confident of the opposite.

You and I, two equally attentive and well-sighted individuals, stand side by side at the finish line of a horse race. The race is extremely close. Just as the first horses cross the finish line, it looks to me as though Horse A has won the race in virtue of finishing slightly ahead of Horse B; on the other hand, it looks to you as though Horse B has won in virtue of finishing slightly ahead of Horse A. We discover that we disagree about which horse has won the race.

You and I are each attempting to determine the current temperature by consulting our own personal thermometers. In the past, the two thermometers have been equally reliable. I consult my thermometer, find that it reads ‘68 degrees’, and so immediately take up the corresponding belief. Meanwhile, you consult your thermometer, find that it reads ‘72 degrees’, and so immediately take up that belief. We compare notes and discover that our thermometers have disagreed.

Despite having access to the same substantial body of evidence E, you and I arrive at very different opinions about some hypothesis H: while I am quite confident that H is true, you are quite confident that it is false. As it turns out, hypothesis H is quite unlikely given evidence E. Your confidence that H is false is a reasonable response to the evidence, and you respond in this way precisely because you recognize that H is quite unlikely given E. On the other hand, my confidence that H is true is an unreasonable response to the evidence and reflects the fact that I have significantly overestimated the probative force of E with respect to H.

You are a professional mathematician. Within the mathematics community, there is substantial and longstanding interest in a certain mathematical conjecture. (Call it The Conjecture.) If forced to guess, some members of the community would guess that The Conjecture is true, others that it is false; all agree that there is no basis that would justify a firm opinion one way or the other. Then, one day, the unexpected happens: alone in your study, you succeed in proving The Conjecture. On the basis of your proof, you become extremely confident, indeed practically certain, that The Conjecture is true. Because your high degree of confidence is based on a genuine proof that you correctly recognize as such, it is fully justified. Later, you show the proof to a colleague whose judgment you respect. Much to your surprise, the colleague, after examining the proof with great care, declares that it is unsound. Subsequently, you show the proof to another colleague, and then to a third, and then to a fourth. You approach the colleagues independently and take pains to ensure that they are not influenced by one another in arriving at their judgments about the status of your proof. In each case, however, the judgment is the same: the proof is unsound. Ultimately, your proof convinces no one: the entire mathematical community is united in its conviction that it is unsound, and thus, that the status of The Conjecture remains very much an open question.

Estelle, Edwin, and I, who have been roommates for the past 8 years, were eating lunch together at the dining room table in our apartment. When I asked Edwin to pass the wine to Estelle, he replied, ‘Estelle isn’t here today’. Prior to this disagreement, neither Edwin nor I had any reason to think that the other is evidentially or cognitively deficient in any way, and we both sincerely avowed our respective conflicting beliefs.

Harry and I, who have been colleagues for the past 6 years, were drinking coffee at Starbucks and trying to determine how many people from our department will be attending the upcoming APA. I, reasoning aloud, say, ‘Well, Mark and Mary are going on Wednesday, and Sam and Stacey are going on Thursday, and, since 2 + 2 = 4, there will be four other members of our department at that conference’. In response, Harry asserts, ‘But 2 + 2 does not equal 4’. Prior to this disagreement, neither Harry nor I had any reason to think that the other is evidentially or cognitively deficient in any way, and we both sincerely avowed our respective conflicting beliefs.

I have lived in Chicago for the past 15 years and during this time I have become quite familiar with the downtown area. Of the many restaurants that I enjoy frequently dining at, My Thai on Michigan Avenue is among my favorites. Jack, my neighbor, moved into the same apartment building the very weekend that I did 15 years ago and he, too, has become quite competent in his acquaintance with the city. Indeed, it is not uncommon for us to bump into each other at various places, My Thai being one of them. Today, when I saw Jack coming out of his apartment, I told him that I was on my way to My Thai on Michigan Avenue, after which he responded, ‘My Thai is not on Michigan Avenue—it is on State Street’. Prior to this disagreement, neither Jack nor I had any reason to suspect that the other’s memory is deficient in any way, and we both rightly regarded one another as peers as far as knowledge of Chicago is concerned.

While reading in the library with my friend Eva, I glance out the window, catch a glimpse of a bird flying by, and on this basis hastily form the belief that a magpie just flew by. After saying to Eva, who was just looking out the window at the same time, that I enjoyed seeing the magpie that just flew by, she responded, “Nothing flew by the window.” Prior to this disagreement, neither Eva nor I had any reason to think that the other is evidentially or cognitively deficient in any way, and we both sincerely avowed our respective conflicting beliefs.

Cite this article

Alexander, J., Betz, D., Gonnerman, C. et al. Framing how we think about disagreement. Philos Stud 175, 2539–2566 (2018). https://doi.org/10.1007/s11098-017-0971-9
