Our particular focus in this chapter is on the intentional development of oneself as a person, and how this relates to both brain and education. In this we aim to promote a person-centered developmental science of the kind that Bronfenbrenner and Morris (2006), Magnusson and Stattin (2006), and Rathunde and Csikszentmihalyi (2006) have recently called for, one that has long-standing roots in psychology (Rogers, 1961). Developing individual persons are often trichotomized into the following relations: (1) a biological body (including especially its brain) and its physical environment; (2) a biological body and an experiencing self (conscious and unconscious mind); and (3) an experiencing self and other selves (a sociocultural or symbolic environment) with its own institutions and practices (including educational institutions and practices). These divisions are conceptually rich, but we argue they are best understood in a way that is emergent and relational. In other words, because persons are at once biological, social, and cultural actors, considering the development of “personal selves” allows us to span the Cartesian divide between mind, body, self, and environment, inherited from the 19th century, that still haunts developmental psychology. Furthermore, it allows us to address an issue often sidelined: how people make deliberate efforts to develop themselves in ways that are personally meaningful to them (see also Blasi, 2004; Foucault, 1994, 2004).

The developmental relations between mind, brain, and education provide a particularly fruitful framework for understanding personal development; however, this ambitious task is fraught with problems. Let us lay out the problems before us:

  1. What it means to be and become a person, and how this relates to notions of self-identity.

  2. The place of autonomy and intentionality in personal development.

  3. The relation of mind, brain, and education to intentional personal development.

One might despair of making any headway with such a list; this chapter merely sketches a few promising directions that integrate the work of Francisco Varela, Antonio Damasio, Paul Ricoeur, and Jochen Brandtstädter.

Being and Becoming a Person

Before we begin, it is important to consider what we mean by a person and how this relates to the notion of self. These concepts have a rich and linked history in philosophy, linguistics, and the law, as one can see by consulting the Oxford English Dictionary entry for person. Among the principal meanings of concern to us in this context are the following:

  1. In grammar, person is a category used to classify pronouns, possessive determiners, and verb forms. More specifically, person is used to denote whether one indicates the speaker (first person), the addressee (second person), or someone or something spoken of (third person). As we will see, this is important to the work of theorists like Rom Harré and Paul Ricoeur, in particular to the relation between the self and the other in personal development.

  2. In a general philosophical sense, a person is a conscious or rational being; that is, an individual human being, as distinguished from another kind of animal or thing.Footnote 1 In this sense, a human person is characterized in part by their neurobiology. More specifically, the term person refers to the self, or individual personality, as distinct from their occupation or created works. In this chapter, self and person will be considered synonyms—although self is perhaps more closely identified with the psychological experience of being an individual actor, while person often refers to the sociocultural aspect of identity.

  3. Person also refers to individuals considered with regard to their outward appearance, or to a social or cultural role or character assumed in real life or in a literary narrative. Carl Jung (1953) contrasted persona (the social masks we don to hide our true self) with anima or soul (the authentic desires and possibilities we find socially difficult to express). Erving Goffman (1959) developed an entire dramaturgical theory of self, in which people compete to assume roles that they believe have the greatest advantage. This notion of character or role relates to the legal sense of the term person, in which a person—whether an individual (i.e., a natural person) or corporate body (artificial person)—is recognized as having certain rights and duties; artificial persons are beyond the scope of our chapter.

For Mihaly Csikszentmihalyi and Kevin Rathunde (1998), persons are not born but made through their cultural participation. They propose six features of personal development that are particularly important:

  1. being healthy and fit of body;

  2. sustaining an alert, active, and open mind, and an interest in life;

  3. having and sustaining a vocation or other meaningful social roles;

  4. sustaining and nurturing relationships with family and friends;

  5. being involved in one’s community; and

  6. gaining wisdom, defined as holistic thinking that discerns the essence of problems; an equanimity that accepts oneself and the world; and an ability to gladly do one’s part for the common good, or, as Baltes and colleagues say, an expertise in the “fundamental pragmatics of life” (Ardelt, 2004; Baltes & Smith, 2008; Baltes & Staudinger, 2000).

While these six points are important, we believe this list is incomplete because it fails to include one’s efforts to intentionally develop oneself into a particular kind of person, especially a good person. As Taylor (1989) proposes:

We are selves only in that certain issues matter for us. What I am as a self, my identity, is essentially defined by the way things have significance for me. […] To ask what a person is, in abstraction from his or her self-interpretations, is to ask a fundamentally misguided question, one to which there couldn’t in principle be an answer. […] We are not selves in the way that we are organisms, or we don’t have selves in the way that we have hearts and livers. We are living beings with these organs quite independently of our self-understandings or -interpretations, or the meaning things have for us. But we are only selves insofar as we move in a certain space of questions, as we seek and find an orientation to the good. (p. 34)

What is appealing about the concept of person, then, is that it spans biological, experiential, and cultural senses of human self and identity and promises to allow us to bridge very different discourses about these various aspects of human life. This, in turn, will allow a way to overcome a long-standing problem—some have called it a crisis—that dates to the very origins of psychology as a science.

An Enduring Crisis in the Scientific Study of Psychology

When Wundt (1863) founded psychology as a discipline in the late 19th century, he immediately identified a problem for the emerging field—namely, that two quite separate aspects of psychology needed to be addressed by this science. On the one hand, psychology is an extension of biology and physiology, and the task of a science of psychology is to explain how the human body can have various psychological functions, such as perception, rationality, consciousness, and other capacities that connect us to and distinguish us from other living things. On the other hand, psychology must also explain how people come to have a personal identity within the cultural, social, and historical context into which they are born as infants and to which they contribute as adults; this second sort of psychology requires an understanding of cognitive science as it relates to language and education—both informal education and formal education as mandated within cultural institutions designed to educate children or adults in special skills. Wundt called the first kind of science “physiological psychology” (Wundt, 1911) and the second “cultural psychology” (Völkerpsychologie) (Wundt, 1900–1920). He believed that each required its own distinct methods of investigation, but he did not build a clear bridge between them. The only proposal he advanced was to consider spoken language as a personal expression of historically developing cultural laws, myths, and customs, but the specific link between them (e.g., through autobiographical narratives) was never formally studied.

Over the years, others have continued to point to this crisis in the science of psychology (Snow, 1959, 1964; Vygotsky, 1927). Most notably, Vygotsky (1927) considered the split between physiological and cultural psychology to be the defining crisis in psychology. His efforts to resolve this crisis led him to propose that personal development requires mastery of cultural symbol systems (e.g., language) that generate higher mental functions (such as higher memory or higher attention) by building on and transforming our basic biological capacities. For Vygotsky, education in the use of such cultural tools was essential to becoming fully human. Piaget’s (1968/1995) sociological writings and those on education (Piaget, 1998) make a very similar point, adding that such social interactions must allow individuals to develop increasingly complex and coherent cognitive structures that make sense of the world.

These points are not merely of historical interest. Neo-Piagetians like Robbie Case and Kurt Fischer (whose work in many ways bridges the ideas of Vygotsky and Piaget) have refined earlier insights, noting that Central Conceptual Structures (Case, 1992, 1998) or Skills (Fischer & Bidell, 2006) emerge out of personal efforts to master knowledge domains developed within a particular subculture, with more or less support from significant social others. But neo-Piagetian theories do not build a bridge from neuroscience all the way to the humanities as Wundt and Vygotsky had hoped, and in considering developmental relations between brain, mind, and education today, we find that the crisis in psychology still endures—as seen in the debate between Jean-Pierre Changeux and Paul Ricoeur (1998/2000), which could as easily be held today. As Ricoeur points out in this debate, the difficult task is to create a “third discourse,” in which biological, phenomenological, and cultural aspects of human experience are integrated (and not merely correlated). Ricoeur doubted such a discourse was even possible, but we believe that the most promising example of a third discourse is the enactive approach championed by Maturana and Varela and their colleagues.

Experienced Bodies and Their Physical and Symbolic Environment

The enactive approach is a contemporary effort to bridge this divide between biology and culture in a way that also integrates the phenomenology of personal experience, as articulated in the work of Maturana and Varela (1980, 1992; Varela, 1987, 1992/1999) and Thompson (2007; see also Varela, Thompson, & Rosch, 1991). The enactive approach acknowledges Piagetian genetic epistemology as a direct predecessor (Varela et al., 1991, Chapter 1), and so it is easy to imagine it would have been of interest to Robbie Case, who was a master at synthesizing various competing approaches to developmental psychology (see Case, 1998, for example).

According to the enactive approach, autonomy is integral to life from its earliest and most basic forms. In their remarkable book, The Tree of Knowledge, Maturana and Varela (1992) illustrate the importance of considering the relation between brain and mind—and more generally between biology and knowing—as a circle rather than as a line (an approach also adopted by Piaget). In this way, Maturana and Varela propose to resolve the crisis in psychology that dates back to Wundt’s original division of psychology into physiological psychology and Völkerpsychologie. (How the circle between the individual knower and his symbolic environment involves education is shown later in this chapter.)

The idea that there is a circle between biology and knowing is motivated by certain experiences that are clearly explained by our biological history. For example, Maturana and Varela (1992) begin their book by allowing the reader to personally experience colored shadows and the filling in of the visual blind spot, of which we have no perception without the kind of special effort they propose. The point of staging these first-person conscious experiences is to show unequivocally that our experience is bound to our biological structure, such that “we live our field of vision [and] our chromatic space” (p. 23). This leads to the further point that we cannot separate the history of our biological and social actions from how the world appears to us (a point also made by Lewin, 1939). Facts are not independently “out there” in the world and then represented internally “in our brain”; rather, they are meanings that emerge from our human interaction with the world, an interaction that has both an evolutionary and a personal history. They make the strong claim that “every act of knowing brings forth a world” (p. 26), summed up by the following aphorisms: (1) “All doing is knowing, and all knowing is doing” (p. 26) and (2) “Everything said is said by someone” (p. 26).

The validity of this is shown by considering the very origins of life on earth, as it emerged from the world before life existed. Most particularly, we recognize living things by a particular kind of organization they exhibit. Maturana and Varela (1992) propose that what characterizes living things is that they are literally “self-producing,” or as they put it, “the organization that defines them [is] an autopoietic organization” (p. 43), something that is true of all living things from single-celled organisms to multicellular organisms like human beings. The defining feature of an autopoietic organization is that the system is involved in “a network of dynamic transformations that produces its own components and that is essential for a boundary [itself] essential for the operation of the network of transformations that produce it as a[n autonomous] unity” (p. 46). In this sense, organisms specify their own laws proper to their functioning (and in so doing call forth an environment).Footnote 2

For Maturana and Varela, the history of life on earth is the history of autopoietic forms that have descended throughout their histories with modification that allowed structural change without loss of unity, but also provided new possibilities of coupling with their environment—a structural drift that resembles natural selection, but is better thought of as co-creation of self and environment. Individuals are distinct from their environment, but the very distinction defines both the organism and its environment.

What this framework highlights is that, for all biological systems with an autonomous identity, interactions with the environment are perceived as perturbations for which the system must compensate by changing its structure to preserve its identity (if severe enough, perturbations can destroy it). The entire developmental history of the organism is the history of these environmental couplings—not merely its own, but those of its historical lineage. Each such history is unique and not predetermined; rather, it is the active expression of individual behavior that acts to preserve and enhance its own identity. For Varela (1987), the history of these environmental couplings is an internal project that the system manifests in light of environmental perturbations it encounters—perturbations it also provokes. In this light, it is important to distinguish between determinism and predictability. Living beings always function in their structural present because of their structural coupling, itself a result of their biological and individual history; in the present moment they have a range of options allowed by this history. Thus, representation is the wrong metaphorical model; it is better to think of skilled behavior. The nervous system expands the range of interactions of an organism by coupling its sensory and motor surfaces through a network of neurons—most particularly, in humans, the brain.Footnote 3 “The nervous system, therefore, by its very architecture does not violate but enriches the operational closure that defines the autonomous nature of the living being” (p. 166). By its very plasticity, the nervous system (through its sensory and motor organs) participates in the structural drift of the organism in ways that help conserve its structure. At the most fundamental level, an organism’s range of possible behaviors is determined by its structure, both innate and acquired through learning.

Social phenomena arise spontaneously in what Maturana and Varela (1992) call “third order couplings” that create social systems.Footnote 4 In this view, communication most fundamentally allows coordinated behavior that is mutually triggered in members of any such social system—whether insect or human—such that it acts like a new unity.Footnote 5

Human linguistic activity in particular is central to the operation of the human social system. Language appears when operations appear in the linguistic domain that result in the coordination of actions that pertain to the linguistic domain itself; thus, as languages arise, so do objects that reflect “linguistic distinctions of linguistic distinctions” (Maturana and Varela, 1992, p. 210).Footnote 6

In this sense, language is reflexive, allowing the emergence of new phenomena such as psychological reflection and consciousness. In other words, language enables those who speak it to describe themselves and their circumstances through the “linguistic distinction of linguistic distinctions” (Maturana & Varela, 1992). According to Maturana and Varela, with the emergence of linguistic distinctions there also arises the ability to distinguish oneself and others linguistically as languaging entities, by virtue of participating in the linguistic domain. Furthermore, recurrent individual interactions that linguistically personalize other individuals (e.g., by identifying them by name) may allow the emergence of the notion of self—a point further articulated by Paul Ricoeur (1992). All of this suggests to Maturana and Varela (1992) that

It is in language that the self, the I, arises as the social singularity defined by the operational intersection in the human body of the recursive linguistic distinction in which it is distinguished. This tells us that in the network of linguistic interactions in which we move, we maintain an ongoing descriptive recursion which we call the “I”. It enables us to conserve our linguistic operational coherence and our adaptation in the domain of language. (p. 231; italics in the original)

Thus, the conscious mind and the self are better thought of as belonging to a realm of social coupling than as residing within an individual brain: “it is by languaging that the act of knowing […] brings forth a world […] because we are constituted in language in a continuous becoming that we bring forth with others” (Maturana and Varela, 1992, pp. 234–235). For Maturana and Varela, social coupling supports (even compels) an ethics of mutual love and charity, in the hope of creating a world that we can all share and enjoy together, rather than fight over whose ideology or worldview is true.

Thus, in the more complex case of human experience, autopoiesis involves both a physical and a personal (symbolic) environment. Personal autopoiesis involves determining how best to live a good life, in which one can flourish both biologically and personally as a member of one’s cultural community (Brandtstädter, 2006). Indeed, as the original Greek term implied, personal autopoiesis involves intentionally crafting oneself to meet challenges and opportunities that arise in our life as part of a historical community. But this capacity to intentionally develop ourselves is itself a developmental accomplishment.

The Biological Bases of Self and Personal Identity

The ideas of Maturana and Varela (1992; see also Thompson, 2007) are supported, but also challenged, by the work of Damasio (1999), who proposes three layers of the self, as understood in contemporary neuroscience: the proto-self, the core self, and the autobiographical self. According to Damasio, these three levels support an ever more articulate experience of personal consciousness.

The proto-self is a “preconscious biological precedent” (Damasio, 1999, p. 153) to the sense of self that consists of “a coherent collection of neural patterns which map, moment by moment, the state of the physical structure of the organism in its many dimensions” (p. 154). A number of brain structures are required to implement the proto-self, including several brain-stem nuclei, the hypothalamus, the insular cortex, the cortices known as S2, and the medial parietal cortices. According to Damasio, the proto-self supports the emergence of the core self and core consciousness.

The core self is generated for any object that incites core consciousness, and is considered “an imaged, nonverbal account of how the organism’s own state is affected by the organism’s processing of an object, and when this process enhances the image of the causative object, thus placing it saliently in a spatial and temporal context” (Damasio, 1999, p. 169). The core self generates a transient experience of knowing the object of experience and requires the presence of the proto-self. The neuro-anatomical structures that support core consciousness and the core self include those that support the proto-self mentioned earlier, as well as the cingulate cortices, the thalamus, and the superior colliculi. Core consciousness and the core self, in turn, allow for the emergence of extended consciousness and an autobiographical self (Damasio, 1999). The core self is an agent and object of linguistic designators such as “I,” identified by Maturana and Varela (1992) as essential to a sense of self, but it cannot sustain a personal narrative of events extended in time. The proto-self and core self together seem to capture what Ricoeur (1987) calls “Mimesis1”—or the implicit organization of events that might be rendered as a narrative about a particular person, but has not yet been articulated as such.

The autobiographical self (or, we would say, personal identity) is based on “the permanent but dispositional records of core self-experiences” (Damasio, 1999, p. 175). The sustained display of the autobiographical self requires the presence of a core self and core consciousness and is the key to the experience of extended consciousness (i.e., a sense of self that is connected to the lived past and anticipated future). Extended consciousness is generated by accumulating autobiographical memories and occurs when working memory simultaneously holds in mind the object known and the objects constituting the autobiographical self for that time. The neuro-anatomical structures that support the autobiographical self and extended consciousness include the early sensory cortices (also referred to as the limbic cortices), subcortical nuclei, and higher-order cortices. Ultimately, extended consciousness allows for distinctive human mental abilities (e.g., the ability to value self and life) and supports human conscience (Damasio, 1999). The autobiographical self, to the extent that it is articulated as a self-concept or a personal narrative, captures what Ricoeur (1988) calls Mimesis2—the actual historical or fictional personal narratives that organize events into life stories.Footnote 7

What Damasio’s (1999) work shows is that the self needs to be considered in a more complex way than Maturana and Varela had imagined; we need to consider not only the biological and autobiographical self, but also a core self that precedes our ability to describe ourselves linguistically. However, as the work of Robbie Case and others has shown, the ability to sustain an autobiographical self, and its associated personal narratives, is itself a developmental accomplishment.

Development of the Autobiographical Self

The work of Damasio (1999) and that of Maturana and Varela (1992) both suggest that an autobiographical self should itself emerge through ontogenetic development, specifically, as brain systems that support more complex sociocultural interactions become available. This is in fact the position of neo-Piagetians like Case and Fischer: Case (1991) explored the possibility that Self is a Central Conceptual Structure essential to social life, one that develops in complexity as persons themselves develop. Work by Fischer has also fruitfully explored this idea (see Fischer & Bidell, 2006).

Along these lines, Habermas and Bluck (2000) consider the autobiographical self an increasingly coherent integration of (1) neo-Piagetian structures allowing increasing cognitive competence and (2) sociohistorical norms expressed or institutionalized in different historical times and cultures.Footnote 8 They propose four kinds of coherence by which to judge life narratives, suggesting that “Temporal coherence and the cultural concept of biography are used to form a basic, skeletal life narrative consisting of an ordered sequence of culturally defined, major life events. Causal and thematic coherence express the unique interpretative stance of the individual” (Habermas & Bluck, 2000, p. 750, italics added). Only life narratives that are causally and thematically coherent will be recognized as good life narratives; nevertheless, temporal ordering—which can be linear, circular, or involve multiple timelines, depending on the cultural expectations governing such stories—provides a fundamental coherence in life narratives and appears early in childhood (Brockmeier, 2000; Habermas & Bluck, 2000; Lalonde & Chandler, 2004).

Studies of children’s earliest autobiographical memories in natural contexts show that the earliest verbal remembering starts around 16 to 20 months and extends to distal events around 30 months when a familiar adult helps the child to remember (Fivush, Gray, & Fromhoff, 1987). (At this age, the degree of children’s elaboration is predicted by adult scaffolding.) Katherine Nelson (1988, 1993) found that generalized scripts describing normative action sequences (i.e., specifying actors, locations, and objects) evolve between ages 2 and 3; these become structurally more complex and hierarchical (e.g., with nested and subordinate actions) by age 4–5.

Young children can order some events in time: For example, 3-year-olds can embed events into daily and weekly routines (Nelson, 1988) and by early school age children have a linear transitive sense of time (McCormack & Hoerl, 1999; Piaget, 1946); however, studies suggest that the ability to order remote events into a series develops later (Habermas & Bluck, 2000; also see Friedman, 1986, 1992, 1993).

According to McKeough and Griffiths (Chapter “Adolescent Narrative Thought: Developmental and Neurological Evidence in Support of a Central Social Structure”), the ability to temporally sequence distant experiences (what Ricoeur (1988) calls “emplotment,” which is essential to all forms of mimesis) begins to develop around age 6: The ability to tell stories with subplots seems to emerge around age 8, but Habermas and Bluck (2000) claim that a cultural concept of biography only emerges around age 10. Other researchers have arrived at slightly different ages for these abilities. For example, some claim that at about age 4 or 5, children begin to tell stories in terms of initiating problems and their resolution, reaching almost adult levels of performance around age 9–11, but all agree that these narratives are not integrated into an account of their life as a whole (Applebee, 1978; Botvin & Sutton-Smith, 1977; Case & McKeough, 1990; Habermas & Bluck, 2000; Chapter “Adolescent Narrative Thought: Developmental and Neurological Evidence in Support of a Central Social Structure”).

Habermas and Bluck (2000) detail many studies that suggest that the emergence of an integrated personal identity (whether a self-concept or a personal narrative able to integrate diverse episodes in ways that generate causal coherence) does not occur before early adolescence, when children become able to integrate characters’ actions and their mental states into a single story: By age 12, early adolescents begin to interpret the intentions of characters in terms of lasting mental states and character traits that transcend the particular situation recounted (Chapter “Adolescent Narrative Thought: Developmental and Neurological Evidence in Support of a Central Social Structure”). Only in mid- to late adolescence does evidence support the emergence of biographical conceptions of a self capable of change through life experiences. Likewise, the ability to thematically interpret complex texts and an awareness of the need to interpret past personal events in light of the present develop simultaneously (as predicted by Ricoeur, 1992); this occurs in mid- to late adolescence—an ability that may continue to develop into adulthood.

For example, in a study comparing 12-, 15-, and 18-year-olds, Habermas and Paha (1999; cited in Habermas & Bluck, 2000) note that it is more difficult to establish global thematic coherence across stories in an individual’s life than it is to find local coherence within a story. Supporting this intuitive claim, they found that efforts to explain action according to general personality and developmental status were present only in 15- and 18-year-olds, while explanations of personal discontinuity as due to particular biographical experiences were found only among 18-year-olds. Likewise, McKeough and Griffiths (Chapter “Adolescent Narrative Thought: Developmental and Neurological Evidence in Support of a Central Social Structure”) found that by age 18, adolescent narratives can contain a coherent and coordinated dialectic tension between a character’s internal and external struggles. These findings accord well with findings by Harter (1999) that late adolescence is associated with a more integrated self-conception used to explain a variety of events and actions more coherently. As McKeough’s work shows (see Chapter “Adolescent Narrative Thought: Developmental and Neurological Evidence in Support of a Central Social Structure”), these changes in narrative capability are well explained by Robbie Case’s (1985, 1992) theory of cognitive and self-development. Marini and Case (1994) conducted a study in which they gave five tasks of increasing difficulty to adolescents between the ages of 11 and 19. Participants read vignettes containing several situations and then inferred the protagonist’s traits and predicted their behavior in a new situation. Most 13-year-olds gleaned one personal trait from a vignette when predicting the protagonist’s behavior in the new situation. Most 16-year-olds gleaned two possibly contradictory traits and used them both to predict new behavior. Most 19-year-olds also considered mood as a situational modifier.

Causal coherence does more than link life episodes; it also explains how one’s values or personality change over time. Thematic coherence requires establishing thematic identity among disparate life events; sometimes this is implicit in the stories told, but sometimes links are made explicit by narrators who describe abiding metaphors of their life experience, or describe evaluative trajectories often associated with declared turning points in people’s lives (Bruner, 1990, 2001, 2002). Organizing life narratives is not merely a matter of cognitive skill in coordinating information about one’s life. Understanding of a normatively typical life varies according to culture and, within cultural groups, according to class and gender; awareness of these norms can provide an “advance organizer” of personal autobiography (Botvin & Sutton-Smith, 1977; Peterson & McCabe, 1983, 1994; Schutze, 1984).

Development of Self-Concept

Selman (1980), Damon and Hart (1988), and McKeough and Griffiths have all documented the development of self-concept. Habermas and Bluck (2000) characterized the development of children’s conceptions of self and other as falling into the following five levelsFootnote 9:

  • Level 0: preschool children describe themselves and other people in terms of their physical appearance, and make general evaluations about them such as being “good” or “nice.”

  • Level 1: young schoolchildren describe self and other people in terms of simple feelings and preferences, or in terms of specific abilities, like “She’s good at swimming.”

  • Level 2: older schoolchildren compare their own skills to a reference group, describing their personality as a set of unrelated but fairly stable habits and attitudes that explain a variety of actions and feelings.

  • Level 3: early to mid-adolescents develop psychological concepts of personality able to integrate and coordinate a wide range of emotions, motivations, and actions in relationships. Thus, the notion of a consistent personality or self emerges that makes sense of a disparate set of characteristics in light of underlying personality traits—sometimes over-generalized into thinking of others stereotypically as a “jock” or some other summative label.

  • Level 4: mid- to late adolescents develop self-concepts that begin to acknowledge potential conflict between aspects of one’s personality and realize that some aspects may not be fully articulated or even accessible to personal awareness; evaluations of people are sometimes made in light of particular belief systems, including religious or philosophical systems.

Once one reaches level 4, one is capable of engaging in intentional efforts to craft one’s life to conform to particular ideals, or to make plans that allow one to tell certain kinds of stories characteristic of adult conceptions of self (Brandtstädter, 2006; Ferrari & Mahalingam, 1998). The importance of intentionally crafting life stories relates to what we deeply care about in our lives—how we make sense of our lives and shape them so that we can talk about ourselves or live as a certain sort of person (Blasi, 1983, 2004; Frankfurt, 1988, 2004; Taylor, 1977, 1989)—a long-standing concern that can be traced to autopoietic practices of the Ancient Greeks (Foucault, 1978, 1994, 2004). It is in this sense that we can understand well-documented abilities for life plans, life guides, and life review (Staudinger, 2005; Staudinger, Dörner, & Mickler, 2005) in light of possible and feared selves (Markus & Nurius, 1986; Oyserman, Bybee, Terry & Hart-Johnson, 2004; Oyserman, Bybee, & Terry, 2006).

As Brandtstädter (2006) notes in his writings about intentional self-development, adults intentionally plan and manage their self-concept throughout life—and particularly in old age, as physical resources begin to fail. Doing so requires constantly balancing between assimilative and accommodative modes of self-development. Assimilative processes are involved in preventative or corrective action (problem solving), compensating actions (especially for highly valued aspects of one’s personal narrative), and self-verification (when one selects social contexts and information to validate personally meaningful self-conceptions). When the “production possibility frontier” diminishes, people eventually accommodate (that is, change their story) through devaluing, re-interpreting, or redeploying their attention:

The distinction between assimilation and accommodation processes … may recall traditional distinctions between active and passive concepts of happiness; philosophical notions of wisdom have emphasized the importance of finding the right balance between these stances. … Intentional self-development across the life span is based on this interplay between engagement and disengagement; between tenacious goal pursuit and flexible goal adjustment. (Brandtstädter, 2006, p. 555)

Thus, empirical evidence suggests that our sense of self gains in complexity from childhood to adolescence, and that full-fledged autobiographical reasoning that relates life events into a globally coherent “autobiographical self,” such as Damasio describes, does not appear before adolescence. However, it is important to remember that our sense of self is a product of our cultural participation, not an inherent essence (Varela et al., 1991). Indeed, Michael Chandler and his associates (1987) note that any conception of self must also be able to deal with change in the self and self-understanding over time. One is always poised between personal sameness and personal change—a tension Ricoeur (1992) has captured wonderfully in his discussion of “oneself as another.”

Oneself as Another

Ricoeur (1987, 1992) sets out to explain how the self, as both what is identical and what is constantly changing over time, develops out of our linguistic resources. Ricoeur makes the important point that neuroscience is concerned with questions of “how mind relates to brain,” or “what part of the brain relates to what function of the mind,” but narrative is essentially concerned with the question “who?” (who speaks? who acts? who narrates? who is responsible?). Ricoeur explores two contrasting notions of self and personal identity:

  • a. Self-identity as an identical core that remains unchanged, or whose elements can be substituted one for the other (from the Latin idem, which means “same”); psychological studies of self-concept also often emphasize sameness.

  • b. Self-identity of the same object that has undergone a series of transformations (from the Latin ipse); thus an oak tree is the same (ipse) as the acorn from which it developed.

These two senses of self-identity are not opposed, but rather are dialectically engaged with each other in any discussion of self or personal identity. While Ricoeur (1987, 1992) agrees with Varela and colleagues (1991) and Dennett (1991) that there are no core or essential elements of self that remain identical over the lifespan, he echoes Maturana and Varela (1992) in defending an “ipsitive” sense of identity, constructed and maintained through our use of language to express and talk about our individuality. Ricoeur presents his views in three steps (individualization, identification, and imputation) that he claims allow us to go systematically from the idem to the ipse notions of personal identity by means of two transitions (pragmatics and narrativity).

The first step concerns the epistemological problem of how an individual is individuated as distinct within a species by an identifying phrase (e.g., the discoverer of neurons), a proper name (e.g., Broca), or an indicator (e.g., “he,” “you,” “I,” “then,” “there”). The pragmatics of language takes up the same problem, but in such a way that speaking distinguishes the individual human being who is speaking from people in general; any statement one makes implies the phrase “I say [that statement].” In this initial step, we are concerned with the issue of determining of whom one speaks in designating a person, or who speaks when addressing someone else (all such speech acts are subsets of a broader consideration of identifying actors through their actions).

Ricoeur constructs his theory around language use and builds a strong separation between the discourses associated with language and those of biology and the natural sciences. However, when considered in light of Maturana and Varela’s (1992) point about the emergence of personal experience out of our primate heritage, our ability to communicate through language seems a clear case of a natural drift of primate communication, as Maturana and Varela would say, which explains the relation between Ricoeur’s (1988) spoken narrative and Mimesis1—something that Tomasello, Gust, and Frost (1989) have shown distinguishes even very young children from chimpanzees and other primates of comparable linguistic ability using sign systems.Footnote 10

The second step (identification) involves separating the “I” from “I say”: In this way, someone is self-identified by saying “I,” which generates a second transition to narrative identity: Narratives operate to configure the pragmatic engagement of someone in space and time. For Ricoeur, narrative identity is not distinct from experience and need not be accompanied by an explicit narrated story about an “I.” Rather, the history of a life is the identity of a character with that life. What matters is that life events are (or could be) emploted into a narrative, an idea taken up by many psychologists (Gergen, 1991; McAdams, 1990, 1993; McLean, 2005). Variations between sameness and self are attested to by imaginative narrative variations; narrative seeks out and engenders such variations, and fiction is a vast laboratory for thought experiments—or, as Oatley (1999) says, “simulations”—designed to test them and determine their merits. Narratives of personal identity thus oscillate between two extremes: At one extreme are characters who have a set role in a plot (the wise man, or the villain); at the other are characters portrayed through their stream of consciousness, in which plot is put at the service of character. In limit cases, such as Musil’s (1953) The Man Without Qualities, we seem to reach the point at which there is no longer a character. But even here, Ricoeur (1987) reminds us, we can always ask “Who is without qualities?” And that “who” remains a character within the narrative.Footnote 11

On this view, narratives are essentially a mimesis of action and life, not of persons. Here Ricoeur (1992) joins Fischer and Bidell (2006) in claiming that the development of a sense of self depends on skilled practices that are developed, coordinated, and subordinated within the context of constitutive rules established by others within a culture. Thus, we need to consider how specific practices (skills)—and the self-regulation needed to employ those skills—are related to general life projects (involving, e.g., family, profession, community, and political action). Ricoeur (1992) proposes that skills and projects are coordinated through life plans. Charles Taylor (1989) echoes this important connection between narrative, personal meaning making, and human agency. Our sense of our lives as expressed in a narrative, or a life story, is bound up with our notion of what it means to be a person. For Taylor (1985), as for Ricoeur, “a person is a being with a certain moral status, or a bearer of rights”—a moral status that depends on one’s having certain capabilities:

A person is a being who has a sense of self, has a notion of the future and the past, can hold values, make choices; in short, can adopt life plans. […] A person must be a being with his own point of view on things. The life plan, the choices, the sense of self must be attributable to him as in some sense their point of origin. A person is a being who can be addressed, who can reply. (p. 97)

Indeed, evidence shows that future orientation, in the form of life plans (Lachman & Burack, 1993), ideal selves (Higgins, 1987, 1991), and possible selves (King & Raspin, 2004; Markus & Nurius, 1986; Oyserman, Bybee, Terry, & Hart-Johnson, 2004), is crucial to understanding personal narrative and personal development. One of the most significant achievements of adolescence is the acquisition of formal operations, which allow one to go beyond the world of concrete experience and mentally travel through the world of ideas and ideals—the world of possibilities and necessities (Case, 1998; Fischer & Bidell, 2006; Chapter “Mental Attention, Multiplicative Structures, and the Causal Problems of Cognitive Development”). This achievement, together with a general increase in working memory capacity, allows new kinds of self-examination; more specifically, it allows the simultaneous consideration of the remembered past, known current events (actual self), and imagined future events (possible and ideal selves), and their integration into a coherent view of oneself as one would want to be (Allport, 1937; Damon & Hart, 1988; Higgins, 1987; Inhelder & Piaget, 1958; McLean, 2005; Sullivan, 1953; Thorne, McLean, & Lawrence, 2004).

Ricoeur’s third step (imputation) in the development of a personal identity is ethical and moral: promises, for example, by committing someone to act toward another in the future, make individuals “self-constant.” Self-constancy means that one can be counted on and is accountable for one’s actions, that one is responsible. This sets up a fruitful dialectic between constancy and change. One can ask oneself, “Who am I, so inconstant, who nevertheless remains someone you can still count on?” Here we see that, for Ricoeur, possession of unchanging experiences is not essential to self; what is essential is the ethical primacy of the other, by which the self makes itself available to others (and which makes us co-authors, not sole authors, of the stories of our lives). The ethical and moral determinants of action become important (i.e., what is considered good and obligatory, respectively): Here the dialectic of self and other is developed further, and the autonomy of the self is shown to be tightly bound to solicitude for one’s friends and neighbors. Ricoeur ultimately proposes another, cosmopolitical transition—in which one acknowledges that obligations are always set within a sociohistorical framework that implies some idea of justice; we must keep our promises not only to one another, but also to sustain a society in which promises matter and we do our fair share for ourselves and for others. Thus, justice becomes central both for individuals to whom one is personally committed and for others in one’s society and culture (justice as a personal obligation, and justice as fair institutions and laws).

Self-Fashioning and Intentional Personal Development

Promises are made not only to others, but also to ourselves. These promises are integral to our assessment of our lives, and to our efforts to intentionally author, fashion, or develop ourselves (Baxter Magolda, Creamer, & Meszaros, 2010; Blasi, 2004; Stanovich, 2004). This ability to intentionally develop ourselves into persons able to live a good life has long been a hallmark of wisdom (Foucault, 2004; Plotzek, Winneskes, Kraus, & Surmann, 2002; Ricoeur, 1992).

What Is Intentional Personal Development?

Intentional personal development highlights the openness and plasticity of human development. The importance of self-development is a very old idea.Footnote 12 In discussions of the Renaissance, intentional self-development has been called “self-fashioning” and includes both instrumental and aesthetic considerations (Greenblatt, 1980). Contemporary philosophers refer to processes of self-development as “taking ourselves seriously” (Frankfurt, 1971, 1988, 2004), “care for the self” (Foucault, 1978, 2004), or “self-improvement” in which one “work[s] on oneself with regard to the possibilities of […] becoming something different from what we have been made” (Hacking, 1986, p. 233). This notion continues to find voice in contemporary psychology in recent discussions of intentional self-development (Brandtstädter, 2006) and of student “self-authorship” (Baxter Magolda et al., 2010; King et al., 2009).

For Frankfurt (1988, 2004), “caring for oneself” and “taking oneself seriously” highlight the importance of adopting a caring relationship toward oneself, and require taking a stand on one’s life and oneself as a person; such a stand necessarily incorporates values and is closely tied to one’s understanding of the meaning of one’s life: not only of what is possible, but of what one desires to be, which necessarily includes one’s understanding of what constitutes a good life (King & Napa, 1998; Plotzek, Winneskes, Kraus, & Surmann, 2002; Seligman & Csikszentmihalyi, 2000; Taylor, 1989; Varela, 1992/1999). Deep insight into what it means for oneself to live a good life has been called personal wisdom—wisdom not in the form of general understanding or advice about human life, but wisdom about one’s own life and how to live it well, which studies show is harder to come by (Staudinger et al., 2005).

Intentional personal development and associated concepts (e.g., free will, agency, and self-actualization) remain central notions of existential philosophy (e.g., Sartre, 1956) and psychology (e.g., Maslow, 1943, 1954, 1962; Rogers, 1961). Goldstein’s (1939/1995) and Maslow’s (1954, 1962) concept of self-actualization helps articulate the aim of intentional personal development as personal flourishing. Following a very long philosophical tradition, these and other authors argue that all beings, including human beings, have a natural tendency to actualize their potential. However, potential includes not only existing biological capacities but also possibilities that are dynamically shaped by the choices people make and the education they receive (Egan, 1997). That is, people not only have personal narratives, but they also have second-order evaluations and desires about how to advance or change those stories; these are the source of their willed desires to fashion their lives to the extent they can, in conjunction with others to whose lives they are linked (Blasi, 2004; Frankfurt, 1988; Taylor, 1989).

Following Erikson (1959), Case (1998), and others, we believe that the cognitive achievements of adolescence, together with accompanying physical and social changes, play an important role in how adolescents understand their lives, as they review their past and plan for the future (Elder, 1999; Hitlin, Brown, & Elder, 2007). As they enter adulthood, adolescents become increasingly able to orchestrate and structure their values and ideals about what a person and what a good life should be. They also engage in new kinds of self-evaluation that lead them to position themselves in the world around them (Harré & Moghaddam, 2003; Harré et al., 2009). They also begin to consider the meaning and purpose of their lives (Damon, Menon, & Bronk, 2003; McAdams, 1990, 1993) and to orient toward the future with an “‘I do care’ attitude” (Allport, 1937, p. 223). All these changes lead to the formulation of “self-guides” (Higgins, 1991) and life plans that guide efforts at intentional personal development. Likewise, Sternberg and Spear-Swerling’s (1998) concept of personal navigation connects self-understanding and self-efficacy as “the means by which self-awareness is translated into a plan of action for one’s life” (p. 222).

Similarly, “life programs” incorporate future goals for oneself and provide motivation for self-regulation (Steinberg et al., 2006) and self-change (Higgins, 1991; Inhelder & Piaget, 1958). Naturally, self-guides are not produced in a social vacuum. Mentors and models of particular kinds of self-practice—including, e.g., meditative self-practices and other ways of forging a certain kind of self, including schooling—are all critical to the choices people make and to their efforts to develop both general and personal wisdom. All this contributes to people’s ability to intentionally develop themselves. Recognizing that merely having a plan and direction is not enough in the face of unforeseen obstacles, Ryan and Deci (2000) propose that a desire for self-determination is another critical aspect of intentional personal development.

Changes in capacity that occur in adolescence create both opportunities and challenges for a developing individual and can lead to significant alterations of developmental trajectories (Cicchetti & Toth, 1996; Compas, Hinden, & Gerhardt, 1995; Rutter & Rutter, 1993; Steinberg et al., 2006)—for better or for worse. Thus, although adolescence is usually conceptualized as a time of increased vulnerability to the emergence of psychopathology or the worsening thereof (Cicchetti & Rogosch, 2002; Steinberg et al., 2006), it is also recognized as a time of increased opportunity for overcoming, lessening, or controlling an already existing psychopathology (e.g., stuttering; Finn, 2004). Research shows that whether children with an existing psychopathology will get better in adolescence depends on both external and internal resources: For example, the availability and quality of the social support network (Thompson, Flood, & Goodvin, 2006) and the meaning they assign to their experience and their reaction to it (Nolen-Hoeksema, 1991; Rutter, Kim-Cohen, & Maughan, 2006, respectively).

Thus, intentional personal development should be studied by assessing people’s self-insight in ways that are both retrospective (life review) and prospective (life planning). Such studies might use methods that resemble those of cultural anthropology; in other words, methods that study people as they live their lives, and explore how they plan for and cope with the events that occur: sometimes tragic events, naturally accompanied by grief, that nevertheless allow an opportunity for personal growth and insight into the deepest values of life. This is where we believe current models of identity development (e.g., Case, 1996) need some adjustments, to allow for a person’s own role in developing their identity.

Intentional Personal Development and Education (Autopoiesis Through Emulation and Bildung)

As mentioned earlier, our assessment of what it means to develop into a good person necessarily expresses ideals of human development that integrate both biology and culture across the lifespan. One key form of cultural participation is formal education, not only for children and adolescents, but also for adults who continue to educate and develop themselves in pursuit of a better life, a sense of education that is closer to the German notion of Bildung Footnote 13 than to rote learning.

For Egan (1997), public schools have three competing agendas: job preparation, teaching truths about the world (e.g., scientific and historical truths), and promoting personal flourishing. Public schools are well equipped to prepare students for jobs, for example by teaching particular skills, such as reading, math, or computer programming. Schools also devote a lot of effort to teaching scientific truths that are not readily apparent; for example, how to overcome the deep misconceptions that even advanced students fall into when studying advanced physics or Darwin’s theory of evolution by natural selection has been the focus of a huge effort in cognitive science and education. What about personal flourishing?

Certainly there are specialized schools, often religious schools, with entire programs for improving one’s sense of well-being. These schools can be said to focus on different aspects identified by Damasio as components of the self. Various forms of yoga, for example, aim to train the body and, through it, a sense of well-being that encompasses the entire person. Mindfulness meditation and other spiritual exercises (what is sometimes called “contemplative science”) are gaining advocates for their incorporation within the public school system (Rosch, 2008; Siegel, 2007; Wallace, 2007; Wallace & Shapiro, 2006), promoting curricula that encourage students to focus on the experience of the present moment, heighten the attention and clarity of their experience, or develop greater compassion.

We also find efforts to promote intentional personal development as part of the standard liberal arts curriculum. King and her colleagues (2009) describe efforts to document the development of self-authorship among college students, and provide recommendations for curriculum design to help promote its development. Building on the work of Kegan (1982), they propose four broad themes of developing self-authorship that are manifested differently by students who define themselves by external criteria (e.g., ethnic or religious traditions), internally (e.g., by personal ideals and choices), or a mixture of the two: (1) increasing awareness, understanding, and openness to diversity; (2) exploring and establishing the basis for one’s beliefs, choices, and actions; (3) developing a sense of personal identity that guides one’s choices; and (4) increasing awareness of one’s responsibility for one’s own learning. They propose that educational strategies be tailored to students’ particular style of meaning making (i.e., external, internal, or mixed): In standard classrooms, this implies that educators should use a variety of approaches, since their classrooms will have students with a range of meaning-making orientations. Generally, they suggest that educators provide students with the chance to experience and reflect on the diversity and complexity of the world and invite them to make sense of that complexity. Educators should also find ways to help students make the best sense of different perspectives, in light of their own background and upbringing, and to apply this deeper appreciation of diversity to the choices they make in their own lives. These calls for colleges to provide “developmentally effective experiences” are very much a plea for education as Bildung.

The aims toward which one should aspire through intentional personal development, or self-authorship, are often expressed through models to be emulated. This is certainly the approach of character education programs that claim to teach for wisdom, such as Project Wisdom and Wise Skills. A more fully articulated program was developed by the Renaissance humanists, whose education was devoted not only to learning skills but also to the emulation of models from antiquity, such as Ovid. Yet Quintilian (95/2009) had warned long ago that models should not be followed slavishly, but adapted to the rhetorical needs of particular times and circumstances. A recent paper by Dixon (2009) suggests that the point of Shakespeare’s Titus Andronicus was to show what happens when people draw immoral lessons from historical examples that are meant as cautionary tales. Of course, flourishing also depends on different ideals of a good life, and these can be intimately tied to one’s temperament and to other aspects of one’s neurobiology (Rothbart & Bates, 2006; Siegel, 2007).

Unfortunately, most public education, at least in North America, does not concern itself with intentional personal development (Bildung), something increasingly important given the overabundance of negative role models presented through various media.

Alternative Pathways of Intentional Personal Development

All of these points suggest that we need a wide-ranging approach to consider the developmental relations between mind, brain, and education. In particular, any consideration of their developmental relations is most likely to be successful when engaged at the level of persons’ own efforts at self-fashioning and intentional development, as opposed to the subpersonal level characteristic of classical cognitive science.

What about atypical development? Here we agree with Fischer and Bidell (2006) that one must be alert to potential alternative pathways to learning and to personal well-being. Consider the case of Asperger syndrome, which characterizes individuals with relatively high intellectual functioning who are nevertheless placed on the autism spectrum due to deficits in social sensitivity. We suggest that such individuals represent an alternative (non-normative) developmental pathway, with a unique route to personal flourishing, in school and out. Presumably, individuals diagnosed with Asperger syndrome, even if less social than the average person, will need to develop the expertise in the fundamental pragmatics of their own lives that is characteristic of personal wisdom (Ardelt, 2003; Staudinger et al., 2005) if they are to navigate all of the obstacles they encounter to living a good life on their own terms. And they will need to develop themselves in ways that promote their own personal flourishing.

In fact, several studies and clinical reports indicate that intentional personal development might be a crucial factor in the social adjustment of individuals with conduct, autistic, and other disorders. For example, Rogers, Kell, and McNeil (1948) found that the most important predictor of adjustment among adolescents with conduct disorders was self-insight, which included self-understanding, self-acceptance, and taking responsibility for oneself (a notion very similar to intentional personal development). Similarly, based on his follow-up study of the first group of children ever diagnosed as autistic, Leo Kanner (1971) concluded that self-awareness, and acting on it, could crucially contribute to good life outcomes for autistic individuals. More recently, Hauser, Allen, and Golden (2006) confirmed the importance of these factors in their study of the resilience of adolescents with troubled early lives who, against all odds, became well adjusted (according to their own and others’ criteria). More specifically, they showed how self-reflection and agency (along with relatedness) appear to be crucial for overcoming adversity and for positive growth. Our own in-depth case studies (Vuletic, Ferrari, & Mihail, 2005; Vuletic, 2010) have also shown the importance of intentional personal development to quality of life in adolescents and adults with Asperger syndrome.

It seems plausible that an important aspect of education for individuals with Asperger syndrome will include efforts to emulate others with Asperger syndrome who have been successful in crafting a good life, as expressed in their autobiographies (e.g., Temple Grandin [Grandin, 1995; Grandin & Scariano, 1986], Liane Holliday Willey [1999], Stephen Shore [2001], John Elder Robison [2008], Daniel Tammet [2006], and Donna Williams [1992, 1994, 1996, 2004]). Such emulation would allow them to intentionally develop themselves in ways that knowledge of their cognitive and genetic differences from typical individuals will not.

Self and Will as Illusions?

Intentional personal development is sometimes documented at the biological level in terms of the neuroplasticity of the brain (Doidge, 2007), but it is equally well documented at the personal level. The claim is that, at least within the limits of our human potential, our flourishing or suffering, our Bildung, depends on our own efforts as much as it depends on our environment or our genes (King et al., 2009; Stanovich, 2004). This is a very old idea, best expressed by Pico della Mirandola (1486/1956), who famously wrote in his Oratio:

[God] made man a creature of indeterminate and indifferent nature, and, placing him in the middle of the world, said to him ‘Adam, we give you no fixed place to live, no form that is peculiar to you, nor any function that is yours alone. According to your desires and judgment, you will have and possess whatever place to live, whatever form, and whatever functions you yourself choose. All other things have a limited and fixed nature prescribed and bounded by our laws. You, with no limit or no bound, may choose for yourself the limits and bounds of your nature. We have placed you at the world’s center so that you may survey everything else in the world. We have made you neither of heavenly nor of earthly stuff, neither mortal nor immortal, so that with free choice and dignity, you may fashion yourself into whatever form you choose. To you is granted the power of degrading yourself into the lower forms of life, the beasts, and to you is granted the power, contained in your intellect and judgment, to be reborn into the higher forms, the divine.’ / Imagine! The great generosity of God! The happiness of man! To man it is allowed to be whatever he chooses to be! As soon as an animal is born, it brings out of its mother’s womb all that it will ever possess. Spiritual beings from the beginning become what they are to be for all eternity. To man, when he entered life, the Father gave the seeds of every kind and every way of life possible. Whatever seeds each man sows and cultivates will grow and bear him their proper fruit.

Despite the power of this imagery, some contemporary philosophers claim that the idea of a single or enduring self and the notion of free will are illusions. These ideas, they say, are holdovers from a Judeo-Christian understanding of the soul—well expressed by Pico—that modern science has shown to be false. The notion that will is an illusion gets support from empirical reviews purporting to show that there is no one area of the brain devoted to the self, or to the capacities that we associate with the ability to have and recognize an enduring sense of self (Gillihan & Farah, 2005). Dan Wegner (2002, 2005) and others have conducted several studies purporting to show that our sense of free will is an illusion; that our choices are actually influenced by many subpersonal factors of which we are not aware; and that our declarations of having decided to act in a particular way are pure confabulations.Footnote 14 However, it is somewhat ironic that these studies, deliberately designed to show that will is an illusion, depend on the willful action of a confederate who steers the subject wrong in order to induce an “illusion of will.”

Certainly, we are not always the best judges of even our own physical boundaries: in cases of asomatognosia (associated with damage to the right parietal lobe), people feel that parts of their body are not their own, or claim that they are able to move but simply choose not to, when they are clearly unable to do so (Benson, 2003; Feinberg, 2001). Likewise, damage to the corpus callosum, medial frontal cortex, or some other more posterior parts of the brain (Scepkowski & Cronin-Golomb, 2003) leads people not to feel their movements as their own (the alien hand syndrome). Both of these phenomena show dramatically that even our core sense of our immediate selves depends on the proper functioning of our brain.

Likewise, Luria (1972) famously documented the case of Zazetsky, a Russian soldier whose brain damage left him unable to emplot the fragments of his life into a coherent narrative; more recent studies show that the ability to recall and organize our first-person experiences is severely affected by the brain deterioration associated with Alzheimer’s disease or severe traumatic brain injury (Piolino et al., 2003; Piolino, Desgranges, Manning, North, Jokic, & Eustache, 2007).

These and other findings lead philosophers like Dennett (1991, 2003) to conclude that the self is nothing more than a center of narrative gravity, with no more reality than the earth’s physical center of gravity. Even Varela et al. (1991) and Ricoeur (1992) agree that, ultimately, self-narratives are personal creations that are, in a Buddhist sense, “empty of inherent being”: there is no permanent self-substance, and our sense of ourselves can be mistaken, or naïve.

We certainly acknowledge the truth of these claims, as far as they go, as does Brandtstädter (2007): There is no denying that there are many subpersonal influences on our actions. But one of the best defenses of the notion of free will was given by William James, inspired by Renouvier, who wrote in his April 30, 1870, diary entry that his “first act of free will shall be to believe in free will”—precisely because it has such pragmatic importance for our lives; indeed, this affirmation helped lift James out of a personal depression (Richardson, 2007).

This is precisely what we understand Ricoeur to propose when he claims that the reality of identity is not in a constant core of experience—the idea of which is certainly an illusion—but in our imputation of personal responsibility to ourselves and to others. In this way, we construct a personal identity that persists despite lower-level influences that might have derailed it. For example, promises are kept, job obligations are fulfilled, and many other things happen that show people to be dependable and to fulfill important aspects of their social identity.Footnote 15 Our identity is intimately bound up with our perceived ability to choose among possible courses of action and the resulting possible selves that we will create by our actions.

Not only promises to others, but promises to ourselves to develop ourselves in particular ways: these second-order desires are precisely what guide us and make us alert to subtle influences that weaken or derail our efforts to intentionally develop and fashion ourselves in ways that we value. In other words, our personal identity is not given as our biological or spiritual birthright; rather, it must be constructed and co-constructed through our actions as members of a linguistic community of individuals for whom selves matter. But this selfhood is also an expression of basic biological realities, in which our continued survival as autonomous beings depends on creating the conditions of our own flourishing and on caring for ourselves in the most basic senses of the term. For this reason, we believe it is better to consider the self a personal creation—more akin to a work of art than to an illusion. As such, it is self-authored (or co-authored) and self-fashioned for the parts we wish to play (and play well). It is in this lived narrative that we fully engage life and avoid the nihilism that lies just below the surface of much subpersonal cognitive science (Ricoeur, 1992; Varela et al., 1991).

Thus, the reality of the self is not attested to by some inner certainty of our existence, such as seems to characterize Damasio’s core self. Rather, we propose that the kind of certainty to which the hermeneutic approach to personal identity advocated by Maturana and Varela, and by Ricoeur, can aspire is one of “attestation”—a conjunction of analysis and reflection. For Ricoeur (1992), attestation requires less than the Cartesian exaltation of the ego but more than its Nietzschean annihilation. Attestation is fundamentally opposed to the notion of science as ultimate and self-founding knowledge—whether subjective or objective—and in this sense it joins the Buddhist notion of self as “empty” of inherent being. Attestation is belief—not a doxic “I believe-that …” but, rather, the credence of “I believe-in …” (Ricoeur, 1992) that echoes James’ (1897) “will to believe.” Attestation links to testimony, about which there is no absolute certainty, because the questions “who speaks?” “who acts?” and “who is responsible?” can be answered in many different ways. But the best answer to false testimony is not to declare that all testimony is an illusion; rather, it is to seek better testimony (Ricoeur, 1992). Thus, an essential aspect of credence in self-testimony is trust and, ultimately, it is this trust that allows us all to believe that we are real, that our lives matter, and that we can intentionally make life better for ourselves and our community.