Getting It Together: Psychological Unity and Deflationary Accounts of Animal Metacognition

Abstract

Experimenters claim that some nonhuman mammals have metacognition. If correct, the results indicate that some animal minds are more complex than ordinarily presumed. However, some philosophers argue for a deflationary reading of the metacognition experiments, suggesting that the results can be explained in first-order terms. We agree with the deflationary interpretation of the data, but we argue that the metacognition research forces us to recognize a heretofore underappreciated feature in the theory of animal minds, which we call Unity. The disparate mental states of an animal must be unified if deflationary accounts of metacognition are to hold and untoward implications are to be avoided. Furthermore, once Unity is acknowledged, the deflationary interpretation of the experiments reveals an elevated moral standing for the nonhumans in question.


Fig. 1
Fig. 2

Notes

  1.

    Armstrong (1993: p. 93) also discusses this type of case. Discussing a case that parallels the driving case, Tye (2003: p. 2) imagines a distracted philosopher walking home thinking about her latest theory. She later realizes that she was not aware of any of her perceptions on the walk home. She sees (in some sense) the sidewalk and the trees (otherwise she would trip and bump into things), but lacks what Tye (2003: p. 5) calls “introspective consciousness” which seems to require a metacognitive ability. The distracted philosopher and the daydreaming driver are not introspectively aware of their perceptions (Tye 2003: p. 5). Yet they are conscious in some sense, as we discuss below.

  2.

    Thanks to an anonymous reviewer for suggesting that we discuss in greater depth the notion of consciousness.

  3.

    We think it is better to categorize consciousness into types, as with Tye (2003), rather than levels. Despite the utility of the concept of levels of consciousness in cognitive science, it faces conceptual difficulties and problems to do with properly ordering different types of global states (see Bayne et al. 2016).

  4.

    Another test, for instance, is the so-called “false belief” test, designed to ascertain when a child first understands that her beliefs are her own and may differ from others’ beliefs. We will focus exclusively on the uncertainty test.

  5.

    In the experiments, the relevant key is marked “?” but we change the designation here only to aid our reader in interpreting Figs. 1 and 2.

  6.

    References to Carruthers (2008) made by philosophers include Jacobson (2010) and Proust (2009, 2010), and references made by psychologists include Beran and Smith (2011), Couchman et al. (2009), and Smith (2009).

  7.

    Folk psychology consists of the pre-theoretical assumptions people make about their own and others’ minds (Goldman 1993). Scientific progress is possible using folk psychology. The folk understand their own uncertainty in terms of conflicts between and among beliefs and desires. The subjects in the experiments, for example, desire to answer all questions in the way that brings reward but sometimes they do not know the right answer. When confronted with an ambiguous figure, the subject is unable to react quickly because of a paralyzing mismatch between their beliefs and desires. In folk psychology, therefore, the typical explanation of uncertainty is to say that the subject does not know on which belief they ought to act.

  8.

    Morgan’s canon is a methodological principle used to guide the study of comparative psychology: “In no case may we interpret an action as the outcome of the exercise of a higher psychical faculty, if it can be interpreted as the outcome of the exercise of one which stands lower in the psychological scale” (Morgan 1894: p. 53). This requires that “the most general cognitive mechanism” (Karin-D’Arcy 2005: p. 182)—presumably not metacognition in the experiments discussed in this paper—be used to explain animal behavior. However, in a revised statement of the principle, Morgan cautions that the canon should not prohibit one from attributing more complex psychological processes if independent evidence suggests animals do undergo the more advanced process in question (for discussion, see Karin-D’Arcy 2005: p. 182). Grounding his view in Morgan’s canon, Carruthers (2008) holds that attributing more complex psychological processes is not necessary to explain animal behavior in the uncertainty tests since his first-order explanation is sufficient. We agree, subject to the qualifications we advance in this paper. For cautions about the proper interpretation of Morgan’s canon, see Sober (2009), Fitzpatrick (2008), Andrews and Huss (2014), and Andrews (2012).

  9.

    Kant (1998: p. A103) discusses the need for the unity of consciousness in the process of counting.

  10.

    Insofar as an animal’s beliefs and desires are unified in the relevant situations, this fact, if it is a fact, at least suggests that basic cognition feels to an animal no differently than basic cognition feels to us, when we are not metacognizing.

  11.

    Whether Unity means that all contents of the metacognitive state must be globally broadcast in the brain or whether it applies only to the contents of specific modules is a matter we do not take up here.

  12.

    Thanks to an anonymous reviewer for encouraging us to compare our notion of unity to others.

  13.

    Not only must the two or more states be disposed to be accessed at time t1 (or between t1 and tn, for diachronic cases), but the subject must be disposed to access them at t1 (or during t1 through tn), in order for access to be possible.

  14.

    The taxonomy of unity becomes more complicated if we combine the broader concepts of access unity and phenomenal unity with the different kinds of unity discussed above (object unity, spatial unity, etc.). These variations are not central to our argument.

  15.

    Bayne and Chalmers (2003: p. 33, 46) are primarily interested in subsumptive phenomenal unity, in which a set of phenomenally conscious states are subsumed under a single phenomenal state.

  16.

    If we can do this with monkeys, we can do it with humans. Hirstein (2012) argues for the possibility of mind-melding between humans, resisting the claim that the mind is necessarily private. The question arises whether such mind-melding creates an additional mind, a conclusion that seems virtually impossible to reach once Unity is firmly in place.

  17.

    Although there are reasons to doubt the existence of levels of consciousness as a conceptual necessity or neuroscientific reality (Bayne et al. 2016), as mentioned in footnote 3, this need not contradict talk of “degrees” or “amounts” of consciousness (or conscious contents) for a specific type of consciousness (e.g., phenomenal, discriminatory, responsive), or of the total information processing occurring in the mind.

References

  1. Andrews, K. (2012). Do apes read minds? Toward a new folk psychology. Cambridge: MIT Press.

  2. Andrews, K., & Huss, B. (2013). Assumptions in animal cognition research. Proceedings of the CAPE International Workshops, Part II presented at the CAPE philosophy of animal minds workshop, Kyoto University, February 12. https://repository.kulib.kyoto-u.ac.jp/dspace/bitstream/2433/203241/1/capes_1_152.pdf.

  3. Andrews, K., & Huss, B. (2014). Anthropomorphism, anthropectomy, and the null hypothesis. Biology and Philosophy, 29(5), 711–729. https://doi.org/10.1007/s10539-014-9442-2

  4. Armstrong, D. (1993). A materialist theory of the mind. New York: Routledge. (Originally published in 1968).

  5. Baars, B. J. (1988). A cognitive theory of consciousness. Cambridge: Cambridge University Press.

  6. Baars, B. J. (2005). Subjective experience is probably not limited to humans: the evidence from neurobiology and behavior. Consciousness and Cognition, 14(1), 7–21. https://doi.org/10.1016/j.concog.2004.11.002

  7. Bayne, T. (2010). The unity of consciousness. New York: Oxford University Press.

  8. Bayne, T., & Chalmers, D. J. (2003). What is the unity of consciousness? In A. Cleeremans (Ed.), The unity of consciousness: binding, integration, dissociation (pp. 23–58). New York: Oxford University Press.

  9. Bayne, T., Hohwy, J., & Owen, A. M. (2016). Are there levels of consciousness? Trends in Cognitive Sciences, 20(6), 405–413.

  10. Beran, M. J., Couchman, J. J., Coutinho, M. V. C., Boomer, J., & David Smith, J. (2010). Metacognition in nonhumans: methodological and theoretical issues in uncertainty monitoring. In A. Efklides & P. Misailidi (Eds.), Trends and prospects in metacognition research (pp. 21–35). New York: Springer. http://www.springerlink.com/content/u74731956121t3l2/

  11. Beran, M. J., & Smith, J. D. (2011). Information seeking by rhesus monkeys (Macaca mulatta) and capuchin monkeys (Cebus apella). Cognition, 120(1), 90–105. https://doi.org/10.1016/j.cognition.2011.02.016

  12. Block, N. (1995). On a confusion about the function of consciousness. Behavioral and Brain Science, 18, 227–247.

  13. Carruthers, P. (1989). Brute experience. The Journal of Philosophy, 86(5), 258–269. https://doi.org/10.2307/2027110

  14. Carruthers, P. (1992). The animals issue: moral theory in practice. Cambridge: Cambridge University Press.

  15. Carruthers, P. (2008). Metacognition in animals: a skeptical look. Mind & Language, 23(1), 58–89. https://doi.org/10.1111/j.1468-0017.2007.00329.x

  16. Carruthers, P., & Ritchie, J. B. (2013). The emergence of metacognition: affect and uncertainty in animals. In M. J. Beran, J. Brandl, J. Perner, & J. Proust (Eds.), Foundations of metacognition (pp. 76–93). Oxford: Oxford University Press. http://www.oxfordscholarship.com/view/10.1093/acprof:oso/9780199646739.001.0001/acprof-9780199646739-chapter-005.

  17. Cerullo, M. (2015). The problem with phi: a critique of integrated information theory. PLoS Computational Biology, 11(9), 1–12. https://doi.org/10.1371/journal.pcbi.1004286

  18. Couchman, J. J., Coutinho, M. V. C., Beran, M. J., & Smith, J. D. (2009). Metacognition is prior. Behavioral and Brain Sciences, 32(02), 142–142. https://doi.org/10.1017/S0140525X09000594

  19. Couchman, J. J., Coutinho, M. V. C., Beran, M. J., & Smith, J. D. (2010). Beyond stimulus cues and reinforcement signals: a new approach to animal metacognition. Journal of Comparative Psychology, 124(4), 356–368. https://doi.org/10.1037/a0020129

  20. Del Cul, A., Baillet, S., & Dehaene, S. (2007). Brain dynamics underlying the nonlinear threshold for access to consciousness. PLoS Biology, 5(10), e260. https://doi.org/10.1371/journal.pbio.0050260

  21. DeGrazia, D. (2009). Self-awareness in animals. In R. W. Lurz (Ed.), The philosophy of animal minds. Cambridge: Cambridge University Press.

  22. Fitzpatrick, S. (2008). Doing away with Morgan’s canon. Mind & Language, 23(2), 224–246. https://doi.org/10.1111/j.1468-0017.2007.00338.x

  23. Foote, A. L., & Crystal, J. D. (2007). Metacognition in the rat. Current Biology, 17(6), 551–555. https://doi.org/10.1016/j.cub.2007.01.061

  24. Gennaro, R. J. (2009). Animals, consciousness, and I-thoughts. In R. W. Lurz (Ed.), The philosophy of animal minds (pp. 184–200). Cambridge: Cambridge University Press.

  25. Goldman, A. I. (1993). The psychology of folk psychology. Behavioral and Brain Sciences, 16(01), 15–28. https://doi.org/10.1017/S0140525X00028648

  26. Hampton, R. (2001). Rhesus monkeys know when they remember. PNAS, 98(9), 5359–5362.

  27. Hampton, R. R. (2009). Multiple demonstrations of metacognition in nonhumans: converging evidence or multiple mechanisms? Comparative Cognition & Behavior Reviews, 4(January), 17–28.

  28. Hirstein, W. (2012). Mindmelding: consciousness, neuroscience, and the mind’s privacy. New York: Oxford University Press.

  29. Jacobson, H. (2010). Normativity without reflectivity: on the beliefs and desires of non-reflective creatures. Philosophical Psychology, 23(1), 75–93. https://doi.org/10.1080/09515080903532282

  30. Jozefowiez, J., Staddon, J. E. R., & Cerutti, D. T. (2009). Metacognition in animals: how do we know that they know? Comparative Cognition & Behavior Reviews, 4. https://doi.org/10.3819/ccbr.2009.40003.

  31. Kant, I. (1998). Critique of pure reason (P. Guyer & A. W. Wood, Trans.). Cambridge: Cambridge University Press. (Original work published 1781/1787).

  32. Karin-D’Arcy, M. R. (2005). The modern role of Morgan’s canon in comparative psychology. International Journal of Comparative Psychology, 18(3). http://escholarship.ucop.edu/uc/item/3vx8250v#page-2

  33. Kornell, N., Son, L. K., & Terrace, H. S. (2007). Transfer of metacognitive skills and hint seeking in monkeys. Psychological Science, 18(1), 64–71. https://doi.org/10.2307/40064579

  34. Le Pelley, M. E. (2012). Metacognitive monkeys or associative animals? Simple reinforcement learning explains uncertainty in nonhuman animals. Journal of Experimental Psychology: Learning, Memory, and Cognition, 38(3), 686–708. https://doi.org/10.1037/a0026478

  35. Malassis, R., Gheusi, G., & Fagot, J. (2015). Assessment of metacognitive monitoring and control in baboons (Papio papio). Animal Cognition, 18(6), 1347–1362. https://doi.org/10.1007/s10071-015-0907-8

  36. Morgan, C. L. (1894). An introduction to comparative psychology. London: Walter Scott.

  37. Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435–450.

  38. Pais-Vieira, M., Chiuffa, G., Lebedev, M., Yadav, A., & Nicolelis, M. A. L. (2015). Building an organic computing device with multiple interconnected brains. Scientific Reports, 5, 11869. https://doi.org/10.1038/srep11869

  39. Potter, M. C., Wyble, B., Hagmann, C. E., & McCourt, E. S. (2014). Detecting meaning in RSVP at 13 ms per picture. Attention, Perception, & Psychophysics, 76(2), 270–279. https://doi.org/10.3758/s13414-013-0605-z

  40. Proust, J. (2009). Overlooking metacognitive experience. Behavioral and Brain Sciences, 32(2), 158–159.

  41. Proust, J. (2010). Metacognition. Philosophy Compass, 5(11), 989–998. https://doi.org/10.1111/j.1747-9991.2010.00340.x

  42. Ramakrishnan, A., Ifft, P. J., Pais-Vieira, M., Byun, Y. W., Zhuang, K. Z., Lebedev, M. A., & Nicolelis, M. A. L. (2015). Computing arm movements with a monkey Brainet. Scientific Reports, 5, 10767. https://doi.org/10.1038/srep10767

  43. Rosati, A. G., & Santos, L. R. (2016). Spontaneous metacognition in rhesus monkeys. Psychological Science, 27(9), 1181–1191. https://doi.org/10.1177/0956797616653737

  44. Smith, J. D. (2009). The study of animal metacognition. Trends in Cognitive Sciences, 13(9), 389–396. https://doi.org/10.1016/j.tics.2009.06.009

  45. Smith, J. D., Schull, J., Strote, J., McGee, K., Egnor, R., & Erb, L. (1995). The uncertain response in the Bottlenosed dolphin (Tursiops truncatus). Journal of Experimental Psychology: General, 124(4), 391–408. https://doi.org/10.1037/0096-3445.124.4.391

  46. Smith, J. D., Shields, W. E., Schull, J., & Washburn, D. A. (1997). The uncertain response in humans and animals. Cognition, 62(1), 75–97. https://doi.org/10.1016/S0010-0277(96)00726-3

  47. Smith, J. D., Shields, W. E., & Washburn, D. A. (2003). The comparative psychology of uncertainty monitoring and metacognition. Behavioral and Brain Sciences, 26, 317–373.

  48. Sober, E. (2005). Comparative psychology meets evolutionary biology: Morgan’s canon and cladistic parsimony. In L. Daston & G. Mitman (Eds.), Thinking with animals: new perspectives on anthropomorphism (pp. 85–99). New York: Columbia University Press.

  49. Sober, E. (2009). Parsimony and models of animal minds. In R. W. Lurz (Ed.), The philosophy of animal minds. Cambridge: Cambridge University Press.

  50. Sperling, G. (1960). The information available in brief visual presentations. Psychological Monographs: General and Applied, 74(11), 1–29.

  51. Tononi, G. (2008). Consciousness as integrated information: a provisional manifesto. Biological Bulletin, 215, 216–242.

  52. Tye, M. (2003). Consciousness and persons: unity and identity. Cambridge, MA: MIT Press.

  53. Washburn, D. A., Smith, J. D., & Shields, W. E. (2006). Rhesus monkeys (Macaca mulatta) immediately generalize the uncertain response. Journal of Experimental Psychology: Animal Behavior Processes, 32(2), 185–189. https://doi.org/10.1037/0097-7403.32.2.185

  54. Zeki, S. (2015). A massively asynchronous, parallel brain. Philosophical Transactions of the Royal Society B, 370(1668), 20140174. https://doi.org/10.1098/rstb.2014.0174


Acknowledgements

We thank Dorit Bar-On for getting us started on this topic; Peter Carruthers for helpful criticisms when we presented at the 2017 University of Connecticut ECOM conference on “Human and Nonhuman Animals: Minds and Morals;” Irina Mikhalevich for comments at the 2016 Central APA meeting, as well as participants there; participants at the 2015 North Carolina Philosophical Society meeting, the 2014 Towards a Science of Consciousness Conference at the University of Arizona, and the 2013 Pacific University Northwest Philosophy Conference; and several anonymous reviewers.

Author information

Correspondence to William A. Bauer.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.


About this article


Cite this article

Comstock, G., Bauer, W.A. Getting It Together: Psychological Unity and Deflationary Accounts of Animal Metacognition. Acta Anal 33, 431–451 (2018). https://doi.org/10.1007/s12136-018-0340-0


Keywords

  • Metacognition
  • Psychological unity
  • Animal minds
  • Brainets
  • Moral standing of animals
  • Uncertainty test