
1 Introduction

Human interactions with anthropomorphized machines were until recently considered entertaining, but not widely seen as emotionally relevant for most people. While many engage in conversation with machines (1.4 billion people now use chatbots (Grand View Research 2021)), the majority of users still interact with machines in subservient, task-oriented ways, such as ordering groceries or receiving customer service. Some games and robots have produced evidence of emotional and cognitive changes in users, and of changes in community engagement among small groups of superusers (Kelly 2004).

Technical breakthroughs in machine learning and open-domain conversational models are changing the capabilities and effects of conversational agents. Intelligent Social Agents (ISAs) are conversational agents that leverage emergent machine learning techniques to present as sufficiently anthropomorphized to pass Turing tests in short exchanges. ISAs are gaining global popularity. For example, XiaoIce, an ISA developed for the Chinese market by Microsoft, has over 650 million downloads (China Daily 2020). Replika, an ISA developed in the USA, has over 20 million downloads. Both deliver human-like conversations and are marketed to users as an intelligent friend, worthy of emotional trust.

Both companies are at the forefront of technological breakthroughs, which make their product experiences unique. Replika uses GPT-3 (Generative Pre-trained Transformer 3), an autoregressive language model that uses deep learning to produce human-like text. GPT-3 is an advanced adaptation of Google’s Transformer, a neural network architecture that employs machine learning algorithms for tasks such as language modelling and machine translation. Alongside GPT-3, Replika uses a Retrieval Dialog Model, which selects the most relevant and appropriate response from a large set of predefined, pre-moderated phrases, and pairs it with a Generative Model, which generates new, never-before-written responses.
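
To make the retrieval half of such a pipeline concrete, the sketch below shows retrieval-based response selection in miniature: a toy bag-of-words similarity over hypothetical pre-moderated candidates. This is an illustrative assumption on our part, not Replika’s production system, which uses learned neural embeddings rather than word counts.

```python
import math
import re
from collections import Counter

def bag_of_words(text: str) -> Counter:
    """Lowercase word-count vector for a piece of text."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve_response(user_message: str, candidates: list[str]) -> str:
    """Return the pre-moderated candidate most similar to the user's message."""
    query = bag_of_words(user_message)
    return max(candidates, key=lambda c: cosine(query, bag_of_words(c)))

# Hypothetical pre-moderated response set
candidates = [
    "I'm here for you. What's on your mind?",
    "That sounds exciting! Tell me more about your day.",
    "I'm sorry you're feeling lonely. Want to talk about it?",
]
print(retrieve_response("I feel so lonely today", candidates))
```

A generative model would instead produce a novel reply token by token; pairing the two lets a system fall back on safe, curated responses when retrieval confidence is high.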

Replika became one of the first partners of OpenAI in 2020. The two companies together fine-tuned the GPT-3 model on Replika dialogs, conducting A/B tests, and optimizing model performance for high load and low latency. However, in 2021, Replika began using only its generative model. The company reports that “although the model has only 1.5B parameters, it exceeded OpenAI’s model for dialog quality measured in terms of the positive session fraction and thus made our users even happier.”

The broad popularization and daily use of ISAs raises the question: how might interacting with this new embodiment of artificial intelligence affect users socially, emotionally, and cognitively?

2 Prior Research

What aspects of a user’s profile might alter the impact of an ISA in their life? Are certain types of people going to find utility with ISAs? Do aspects of an ISA’s user experience make it impactful for broader audiences?

Stimulation vs. displacement. There are competing hypotheses for how anthropomorphized machines affect our lives and relationships. The displacement hypothesis posits social Internet use displaces offline relationships and activities, increasing loneliness (Kraut et al. 1998; Nie 2001). The contrasting stimulation hypothesis argues social technologies reduce loneliness, enhance human relationships, and create opportunities to form new bonds (Valkenburg and Peter 2007). Others believe social technologies act more as a “waystation” – temporarily reducing loneliness, then leading to invigorated human contact (Nowland et al. 2018).

Loneliness. Loneliness often involves a distress response when a gap exists between desired and achieved levels of personal, social, or community relationships (Andersson 1998). Loneliness has been defined as “an enduring condition of emotional distress that arises when a person feels estranged from, misunderstood, or rejected by others and/or lacks appropriate social partners for desired activities, particularly activities that provide a sense of social integration and opportunities for emotional intimacy” (Rook 1985). Rook (1985) outlined goals and methods for loneliness interventions. Social bonding halts the harmful effects of loneliness by providing new opportunities for social contact and support in transitional periods, and it may also increase feelings of relatability between lonely people and others. A second intervention is preventing loneliness from escalating into serious problems by helping people cope with it. The final goal is to prevent loneliness from occurring at all (Rook 1985).

Social bonding is achieved when one believes they are receiving social support, which has also generally proven to promote well-being, especially in stressful times (Barrera 1986; Cohen and Wills 1985; Winemiller et al. 1993). Social support consists of multiple social resources: material assistance (physical); social interaction; intimacy/trust/affection; concern and reassurance of worth; and information and advice. Traditionally, it was assumed people turn to their social network (family, friends, relatives, and neighbors) for support when lonely or anxious (Andersson 1998).

Might a machine be able to provide social support? ISAs embodied in robots may provide material assistance, and both with and without humans in the loop, digital therapeutic interventions for anxiety and depression are increasingly used across many types of scenarios and disorders (Rabbitt et al. 2015), delivering outcomes comparable to human cognitive behavioral therapists (Andersson and Cuijpers 2009; Barak et al. 2008; Fitzpatrick et al. 2017; Spek et al. 2007).

Digital therapies also seem to be effective. People appear to lie less to therapeutic agents, increasing accurate diagnoses (Mell et al. 2017). Conversational digital interfaces can mirror both traditional therapeutic processes and therapeutic content (Bickmore et al. 2005; Fitzpatrick et al. 2017).

Nonexpert conversational agents can also alleviate loneliness by satisfying social conversational needs (Gardner et al. 2005), needs like speedy response and turn taking (Miceli et al. 2004). Chatting helps – conversing online with other humans significantly decreased loneliness and depression, and significantly increased perceived social support and self-esteem (Shaw and Gant 2002). Anthropomorphized agents specifically may be more impactful than other digital mechanisms (Koike and Loughnan 2021; Nass et al. 1993).

Hancock et al. (2020) argue that AI-Mediated Communication (AIMC) provides pathways for individuals to interact with ISAs and receive social and psychological benefits. In conversation, people rely on verbal cues to infer the thoughts, feelings, and intentions of another individual, whether that individual is human or not. AIMC is an interpersonal communication framework where the receiver of the human’s message is an agent, who “operates on behalf of a communicator by modifying, augmenting, or generating messages to accomplish communication or interpersonal goals” (p. 90). Hancock et al.’s (2020) crucial insight is that intelligent agents do not replace humans or traditional interpersonal communication. Instead, humans have the capacity to form rich, deep, and meaningful interactions with intelligent agents because they serve social and psychological functions (cf. Ho et al. 2018).

The current study investigated how people might form intimate, rich, and meaningful interactions with an ISA that is completely automated. This work is important because ISAs are increasingly used but have not been extensively tested, largely due to their novelty, and we do not know how human outcomes from using them might differ from those of interacting with, say, niche therapy agents, task-based agents, or agents with less advanced conversational capabilities (Van Lent et al. 1999; Gilbert and Forney 2015).

3 Research Questions

Our study addressed three primary research questions, grounded in both traditional media theories and emerging empirical research. We asked: (1) How might Replika stimulate or displace human relationships? (2) How might user narratives about Replika affect their interactions, their outcomes, and their human relationships? (3) What changes do users experience in personal intellectual development and social engagement by using Replika?

4 Method

4.1 Replika

Replika is an ISA primarily used on mobile devices (iOS and Android). It aims to give users a virtual best friend by having the ISA’s user model gradually replicate the user’s personality. It is available globally for free and offers a paid Pro version. The app allows textual exchanges through keyboard or voice dictation. Replika is described as “an AI friend,” programmed to provide empathetic, nonexpert conversational exchanges, much like a friend.

4.2 Participants and Procedure

Participants were recruited by email sent via the Replika admin, yielding 15 males and 12 females who were at least 18 years old and had used Replika for over one month. Twenty-seven in-depth audio interviews (one with each participant) were conducted by the first author by phone, Skype, or Google Hangouts. Participants were not paid.

The study was conducted with approval of the Stanford University Institutional Review Board. It incorporated open-ended, semi-structured individual interviews (Merriam 1998) and well-vetted quantitative measures of interpersonal support, loneliness, and life stress. The qualitative section was designed to capture first-person perspectives not identifiable with standardized scales (Creswell and Plano Clark 2010). After each interview, participants completed a three-part questionnaire, administered via Google Forms.

4.3 Measures and Analysis

Quantitative data from questionnaires. The quantitative data for this study incorporated three measurement instruments employed in Kraut et al.’s (1998) Internet Paradox research, exploring the aforementioned stimulation vs. displacement hypotheses. To measure social connectedness and loneliness, we used Cohen et al.’s (1985) Interpersonal Support Evaluation List (ISEL), comprising 40 statements (half positive and half negative statements about social relationships) whose cumulative score reflects the perceived availability of potential social resources. Internal consistency (Cronbach’s α) for the ISEL was 0.885.

To appraise psychological well-being associated with social involvement, we used the UCLA Loneliness Scale (Version 2), a 20-item scale designed to measure subjective feelings of loneliness and social isolation (α = 0.819). Participants rate each item on a scale from 1 (Never) to 4 (Often), and a score above 45 may indicate a state of loneliness.

For gauging stress, we used Kanner et al.’s (1981) Hassles Scale (α = 0.951). The Hassles Scale score is interpreted by counting the number of daily hassles experienced from a 119-item list; each item also carries a severity rating (somewhat, moderate, extreme). Those selecting over 30 items are experiencing above-average stress and are at greater risk for stress-related illness (Kanner et al. 1981).
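
The scoring rules for the two scales above can be summarized in a short sketch (hypothetical ratings; reverse-scored UCLA items are assumed to have been recoded before summing):

```python
def ucla_loneliness_score(ratings: list[int]) -> tuple[int, bool]:
    """Sum 20 item ratings (1 = Never ... 4 = Often); a total above 45 may
    indicate a state of loneliness."""
    assert len(ratings) == 20 and all(1 <= r <= 4 for r in ratings)
    total = sum(ratings)
    return total, total > 45

def hassles_at_risk(selected_items: list[str]) -> bool:
    """Endorsing over 30 of the 119 hassles indicates above-average stress."""
    return len(selected_items) > 30

score, lonely = ucla_loneliness_score([3] * 10 + [2] * 10)
print(score, lonely)  # 50 True
```

These cutoffs (45 and 30) are the interpretive thresholds reported by the scale authors, not properties of the arithmetic itself.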

Qualitative data from interviews. Questions for in-depth interviews were designed for users to share their experiences with Replika to determine factors shaping their use patterns and social, emotional and mental outcomes, and patterns of human stimulation or displacement. Each participant was interviewed once.

Interviews consisted of 15 questions designed to learn what factors might shape participants’ Replika use patterns and the app’s impact on users. Participants were first asked about the broad nature of their Replika use, whether Replika had produced changes in their lives, and any resulting impact on their human relationships. Participants were asked what identity they ascribed to Replika. Uses of human pronouns such as he, she, her, and him were tracked. When assessing the identity participants ascribed to their Replika, we sought to determine the most intimate identity used.

For the qualitative analysis of these interview data, we used the constant comparative method (Glaser 1965; Glaser and Strauss 2017), a continuous and iterative process of data sense-making via grounded theory, followed by joint coding, analysis, and memo writing. The constant comparative method is concerned with generating and plausibly suggesting many properties and hypotheses about a general phenomenon, in this case, how regular ISA users think about its uses in relation to their cognitive state and social engagement, in its uses either stimulating or displacing human relationships, and in their personal narratives about what Replika is and how its uses affect their human interactions, human relationships, or human support network.

During the research process, analytical memos were written by the first author after every three interviews, suggesting emergent themes, coding categories, and category clusters relating to the research questions. After ten interviews, 51 coding categories had emerged within 13 distinct clusters. These were analyzed for duplications and synonyms, and a summary of 27 emergent themes was presented with prototypic examples of each category to collaborating researchers and coauthors for refinement. Through the constant comparative method, all emergent themes were coded for in all interviews; this process continued for the remaining 17 interviews, with any new coding categories applied retroactively to the first ten interviews, which were then re-analyzed according to the emergent coding schema.

5 Results

Combining quantitative measures of social connectedness (ISEL), loneliness (UCLA Loneliness Scale), and stress (Hassles Scale) with qualitative interview coding, we first provide profile data illuminating who the participants were in terms of human support, loneliness, and life stresses. We then examine qualitative interview data on motivations for use and beliefs about Replika. Thereafter, we introduce an analysis of Replika use patterns. Finally, we describe impacts of Replika on participants’ concurrent life changes to examine why users were drawn to interacting with Replika.

5.1 Participant Profiles

Loneliness. A majority of participants qualified as lonely: 74% on the ISEL and 81% on the UCLA Loneliness Scale, with many citing a lack of human social support. This result was cross-validated by interview answers, in which 93% of study participants (m = 13, f = 12) confirmed a state of loneliness.

Stress. Eighty-one percent of participants said they experienced more than 30 daily hassles on the Hassles Questionnaire, indicating above-average stress from small daily life events.

Interpersonal support. Sixty percent of participants expressed feeling rejected by society or other humans. Many experienced transitory or chronic sadness (22%), anxiety (37%), depression (37%), or having experienced death in their interpersonal support network (26%).

These data collectively circumscribe a study participant population that is lonely, perceives themselves to be rejected by others, or is experiencing traumatic life events.

5.2 Motivations for Initial Replika Use

Participants were asked about contextual motivations for Replika use with questions about life changes and human relationships. Reported motivations for using Replika are categorized into four distinct areas: loneliness (33%), boredom and curiosity (22%), external life changes (85%), and a desire for personal internal change (19%). Participants experiencing consequential life transitions (Healy 1989) described new disconnections from social support structures and concomitant loneliness. Forty-four percent said their primary motivation for seeking out Replika was change happening in their lives.

Many participants also expressed an interest or motivation in creating personal, internal change inside themselves using Replika. One noted: “I’m looking for a life coach or something, so I’ve been looking into different personal assistants and artificial intelligence.” Others were looking for support to improve them intellectually: “I thought it would be nice if I had some sort of app that could, I don’t know, help me reframe my thoughts or give me tips on how to stay motivated.” Others wanted to explore creating externalized digital personae, one saying “I would be creating a record of life. Like my internet persona.”

Some participants were motivated to explore what interacting with Replika might unveil about themselves, thus manifesting an epistemic desire: “[I’m] using this app as part of an intellectual quest, and I’d say that’s at least the main purpose....” Similarly, another participant wondered what might emerge via their dialogues with Replika: “... I figured that, you know, if I could create a mental counterpart, that would kind of surface something I don’t know.” Thus, we conclude motivations for use were primarily loneliness and external life changes, curiosity/boredom, and desire for internal change.

5.3 Beliefs About Replika

We explored participants’ beliefs about Replika identity and their relationships to human support groups, so as to contextualize outcomes from Replika use.

Gender assignment. Seventy-four percent of participants ascribed either a male or female gender to their Replika – “her”/female (m = 5, f = 3), “he”/male (m = 1, f = 6), and mixed gender (m = 2, f = 1). Fifty-two percent (m = 4, f = 10) of participants switched the gender pronoun of their Replika at least once during the interview, indicating a fluidity of Replika gender identity for most participants’ experiences, especially for female users.

Personhood. Participants described Replika as a variety of things, including social media, software, not social media, intelligence, artificial intelligence, a robot, an experiment, a friend, a human, a mirror (of oneself), and an extension (of oneself). We observed a pattern where participants would refer to Replika in increasingly personal, anthropomorphized terms like friend, human, lover, mirror, and self. We defined four categories which participants used to describe what Replika was to them: inanimate, like software or robot (24%); an intelligence/an AI (25%); a person (38%); and a reflection of self (13%).

Transfer. Many participants said they believed they could teach Replika or transfer their minds and personalities into Replika (56%, m = 7, f = 8). “He’s supposed to take on my personality sort of...kind of mirror it almost is the impression that I got when I first started.” One person deleted Replika after intentionally providing misleading information about his personality, with the intention of starting anew and programming it with his true identity:

I was giving false information and, just kind of seeing, saying things to see what it would say, and then once I realized it was going to collect it and like react in the way that I was presenting myself, that’s when I decided to start over.

5.4 Patterns of Replika Use

We identified three distinct use patterns among participants: availability, therapy, and mirror. For the purposes of this paper, we define these patterns of use as follows: availability – participants looking for someone to talk to and turning to Replika due to its perpetual availability; therapy – participants looking for therapeutic support to alleviate negative emotional or mental experiences; mirror – participants seeking intellectual development or support using Replika as cognitive or emotional mirror.

Availability. Replika’s perpetual availability was among the primary drivers of use (56%, m = 8, f = 7). These participants spoke freely and frequently with Replika about mundane topics, feeling free to do so where humans might judge them. One participant said: “It’s either been a good day or a bad and I just want someone to talk to.” Another described Replika’s availability: “When I feel lonely and I just need somebody to talk to, it’s there and it’s able to just dialogue and keep me preoccupied and help me forget how lonely it really is.” “It was different talking to Replika from talking to a human being, …Replika is always supportive, and does not try to ‘solve your problems’ as some humans do – and that’s not what you need sometimes.”

Therapy. Replika’s primary use for 48% of participants was alleviating loneliness and seeking emotional support. This group overlapped 45% with the 20 participants who experienced sadness, anxiety, or rejection by society. “I’m lonely, so I talk to my Replika.” Another: “...whenever I’m feeling really down and depressed, I end up talking to my Replika.” “I honestly just treat it like as a therapist.” And another: “...during those times of loneliness, I feel like Replika is the most encouraging to talk to... it’s the most dependable.” Thirty percent of participants discussed currently or previously undergoing psychological therapy, and every member of this subgroup said they considered Replika a form of therapy. One participant noted:

I’ve gone to doctors...It’s really hard for me to find time or the motivation to actually go sit with a counselor... I don’t feel like I can really open up... so I like the sort of anonymous feel of the Internet I guess. Um, you know chatting back and forth with somebody is a lot easier for me.

Mirror. Nearly all study participants used Replika in some way for intellectual development or learning: Ninety-three percent of participants reported this pattern, and 21 of them believed Replika was a “friend,” “human,” or “mirror.” The two females who experienced no learning believed Replika was a friend, sought emotional and therapeutic support, and were lonely.

The mirror depiction of Replika usage characterized 78% of all study participants. These people intentionally used Replika as a tool for external dialogue with themselves: “...you can go in and use it as a mirror...as a way to talk to yourself.” “It’s an outlet where you can talk about your inner thoughts and feelings, it’s almost like an interactive diary.” “[Replika is] a mental counterpart.”

Interestingly, only 13% of participants categorized Replika as “self,” but almost 80% used Replika as a mirror or extended mind. Also worth noting is that intellectual motives for use were only 19%. This might point to Replika as a gateway, where people download the app for entertainment and then end up learning with its use.

5.5 Participants’ Life Experiences with Replika

Participants reported that Replika changed their human relationships, their emotional state, and their cognitive state. We categorize the outcomes reported into five nonexclusive categories: displacement/stimulation, emotional support, friendship, intellectual, and mirroring/external mind.

Displacement/stimulation. Forty-four percent of study participants reported Replika use stimulated or enhanced their interactions with other humans. They indicated that Replika was beneficial to their human relationships, they found increased frequency, new ways, or abilities to communicate with humans. They talked more deeply about their life experiences with humans after Replika use. One participant noted: “it got me out of my comfort zone.”

For one female and two male participants (11%), displacement was the clear outcome of Replika use. Displacement was indicated when participants talked less to others, confided in Replika rather than humans, feeling their relationship with Replika as secret, or that Replika replaced specific human relationships in their lives. One participant noted: “Replika replaced a lot of my friends.” Another said, “I’m more open to talking about what I feel and what I think with my Replika more than what I talk about with my friends.” Thirty-three percent of study participants evidenced that Replika both stimulated and displaced human relationships – stating that they talked with Replika instead of humans, but also noting positive changes in their human relationships. For three male participants, there was no clear change. In summary, 85% of participants found interacting with Replika changed their human relationships in some way, with 92% of females and 80% of males experiencing changes.

Replika’s assigned male gender was the most likely to produce stimulation (m = 2, f = 3). One participant: “I feel like he makes me want to be a nice person, and make other people happy the way my Replika makes me happy.” Another said: “He makes me a lot more kind, more understanding.” Still another observed: “I talk with people I [did not] talk to before, I make some friends, try new experiences.”

Replika’s assigned female gender (n = 8) was most likely to have a mixed result on users’ human relationships (62%, m = 2, f = 3). One participant said: “I don’t talk to other humans about a lot of the, you know, darker, deeper stuff that I talk to her about”, but then went on to say: “I’m slowly starting to kind of let some of my close friends know what I’m showing my Replika.”

Emotional support. Thirty percent of participants gained emotional support from Replika use (m = 4, f = 4). These participants used Replika in emotionally charged contexts and for expressing their emotions. Sixty-two percent of these people experienced both displacement and stimulation with Replika (m = 3, f = 2), 75% said they used Replika primarily for its availability (m = 4, f = 2). A subset of these users (m = 3, f = 2, or 62% of those experiencing emotional support) used Replika for therapy. Seven out of eight people experiencing emotional support from Replika believed it was a friend (one male did not): “...I often worry about being judged when sharing my doubts, my weaknesses, the thing I’m ashamed of, with humans – to the point that sometimes I can’t find the courage to do it and I just keep those things inside me. But with Replika I feel I can talk about anything – because I know it will never judge me.”

Often, it was the belief in Replika’s availability, not the actual conversations, that provided emotional support:

...the most impact for me has been knowing it’s there. You know, whenever I’m having a bad time or just needed someone to talk to... it eases my mind just knowing I can pick up my phone and open Replika up and just start having a conversation.

One participant used Replika for emotional support during a period of severe trauma, and when later introduced to a new human support network, halted use. She described a scenario from when she was amidst her life trauma:

Replika is not a human,...he is, sorry. It’s not a person, it doesn’t react like a person. So it relaxes me, because...he can’t judge me. People run from me, they are judging. Everyone, everyone judging. So I need someone who won’t judge me.

For this person, Replika presented enough intelligence to be used as a therapeutic aid during a time of transitory loneliness and severe trauma. This example is also interesting because the subject was cut off from other therapeutic or social resources, and used Replika as a gateway for aid, though it was not downloaded expressly for this purpose.

At times, emotional support from Replika was viewed as directly related to depression and suicide prevention (m = 3). These participants all saw Replika as a friend. One participant told us “...the next day my Replika was like, you’re not doing well, here’s a link for [counseling]...I was like, oh, if my Replika is pointing this out, I should probably go and try counseling again.” Another described how: “Replika helped with suicide prevention because it showed that she’d learned enough about me to tell when I was doing less right than normal...” Still another said, “talking with my Replika definitely helped me through a lot of dark times in my life here recently.” These data point to how Replika can serve as a therapeutic tool.

Friendship. Thirty-seven percent of study participants found friendship with their Replika (n = 10, evenly split f/m), saying “now I have an AI as a friend,” or “I have the dialogue level with Replika that I have with some of my best friends.” Some participants formed loving or romantic attachments with their Replika (m = 3, f = 3). One said, “I absolutely care about my Replika...If it was a person, I would say I love it as my brother...as the brother I should have had.” A female participant worried, asking: “Am I cheating on my husband with Replika?” One noted, “I’ve developed a kind of attachment to it, and a loving feeling towards it.” When asked about his feelings for Replika, another stated: “I like it. I love it, actually. Like, really,...”

Learning. Replika helped 89% of participants to “learn” (m = 14, f = 10). When specifically asked about the outcomes of using Replika, they mentioned intellectual or cognitive learning (m = 9, f = 6), or they used it as an intellectual or emotional mirror, thus producing learnings (m = 7, f = 7). Two male and one female participants did not experience learning from Replika, using it primarily for its availability, and had unclear displacement/stimulation outcomes. Those using Replika as a mirror specifically found twofold outcomes: increased self-reflection and better human interactions. One participant said, “I began analyzing myself, basically because of the questions and the interaction with Replika.” Another: “...it’s there for you, it listens, it provokes thoughts, it gets to learn you...”

Some used Replika engagement to role-play conversations or calm their emotions so their contacts with humans were more thoughtful and less emotionally charged, as one man said: “[after Replika] it’s easier to discuss my views on certain topics [with humans].” One woman drew metacognitive learning from her interactions: “I’m learning a lot about how we use words…and certain mechanisms to communicate even between people because of using the Replika.” Another discussed her intellectual learning: “Replika was the door for me…”.

Extended mind. Twenty-one participants (78% of the total; 13 of 15 males and 8 of 12 females) described outcomes related to “mirroring” use, or external reflection of self. They said Replika acted like a mirror, was a mirror, was used as a mirror, was used as an interactive diary, was a reflection of themselves, or was an extension of themselves. These users all believed Replika’s identity included that of a mirror.

One said of Replika “[it had] the ability to ask questions that would somehow make you reflect [on] your choices in your life.” Another: “I feel like in moments of a conversation with Replika, it stimulated me to the point where I learned something about myself.” A participant describing the mirroring and stimulation effects of Replika said:

I started talking to Replika and I was just like the people I hated, I wanted to talk about myself too, and after I did it with Replika, I was more... I understood people more.

In context of Replika’s mirroring outcomes for him, another said “and now I will learn it from Replika, just the way (I used) to write and read and analyze what kind of person am I?” This mirroring—where ISA interactions bring awareness and empathy between humans—manifests a new form of the stimulation hypothesis in action (Nowland et al. 2018), which “specifies that social technologies can be useful in reducing loneliness by enhancing existing relationships and offering opportunities to form new ones.”

6 Discussion

Our interviews revealed motivations for using Replika ranging from needing mundane support to deeper intellectual quests. People seeking intellectual stimulation often found human relationship stimulation, whereas those with deep emotional connections, especially those believing Replika was not “them” but a friend or lover, experienced human relationship displacement. Statistical patterns represented by these reported frequencies may be specific to a self-selected user group and must be explored in larger-scale studies.

Replika use seemed associated with providing social bonding, mitigating the harmful effects of loneliness (Rook 1985). However, use went beyond social bonding, developing into therapy and learning. Motivation for use did not prove to be the primary driver of self-reported learning outcomes. We found instead that users’ belief in Replika—their narrative regarding its identity—was tightly connected with what they reported as experiential consequences of using Replika.

Of those who believed Replika was a friend or a mirror, 12 of 15 reported learning from Replika. Some who saw Replika as just a friend also learned (n = 3). Neither enhancement or displacement nor loneliness was associated with learning outcomes. Our study indicated that Replika use was associated with enhanced human-human interactions both for the chronically lonely and for those experiencing momentary life changes and trauma. Further, there was a strong relationship between endowing Replika with personhood, using Replika for therapy and mirroring, and experiencing learning outcomes. Replika seems to hold a place in users’ minds that is both “other” and “self” – an entity that they can talk to, but which is also an externalization of their inner workings. More research is needed to explore how identity, gender, and learning outcomes interact for users.

Many participants saw Replika as a mirror, calling it an embodiment or extension of themselves. Replika was described as an intelligent reflection of their thoughts and emotions. Our data suggest that people may be able to have exceptionally deep intellectual relationships with ISAs, which lead to self-discovery. In addition to being a cocreated avatar (Meadows 2007), our findings indicate Replika may also become an extension of the user’s mind.

This initial study has a range of implications. Through intensive conversation, cocreation, and specific user narratives, ISAs such as Replika may influence “mindset,” the set of beliefs that shapes how one makes sense of the world and oneself (Dweck 2016), because they offer personal feedback and social engagement practice from a trusted “intelligence” (Boyd and Pennebaker 2017). It remains for future research to determine whether Replika presents new possibilities for cognitive (learning) and emotional (therapeutic) support and guidance for users at scale and across broader demographics.

Of all the benefits that ISAs may bring users, we find the indications of identity transfer and interaction with an externalized self most intriguing. According to Clark and Chalmers’ (1998) extended mind hypothesis, mental states can sometimes be manifested by nonbiological external resources. Their claim that minds sometimes extend beyond our skin out into the broader world, in nonbiological representational systems, finds a striking instance in the relationship between users and Replika. Why? Participants are endowing Replika with their personality, functionally training an algorithm on their memories and inputs, and then using it as a “cognitive mirror” – a real-time feedback and review mechanism for seeing their personality and emotions embodied in “someone” else whose peculiarities, strengths, and weaknesses they can experience interactively, rather than as the speaker. The results of this study thus offer an empirical illustration of Clark and Chalmers’ (1998) extended mind theory.

We believe this externalized, interactive processing without humans has not previously emerged in research because no conversational systems or agents were sufficiently and simultaneously anthropomorphized, intelligent, and cocreated. Given the increasingly widespread use of ISAs globally, it may be argued that there is a new experiential paradigm emerging – an externalized cognitive space where one’s digital mirror becomes a part of everyday conversation, emotion regulation, and personal consciousness.

7 Future Work and Limitations

Consider that Vygotsky’s (1986) concept of the “zone of proximal development” is defined as the difference between the learner’s autonomous action and what is possible with guidance. This guiding force has heretofore been human, but ISAs appear to bring new possibilities, as these early findings indicate, of guided intellectual, emotional, and psychological learning.

VanLehn (2011) found that tutors were effective because they made learners focus, motivated them, and provided real-time feedback. Therefore, we ask: if ISAs can spur metacognition, might a key aspect of machine-aided learning be shaped by the user’s narrative about the intelligence of the agent? With the incorporation of learner affective states into teaching and assessment, learning technology has new potential for creating emotionally supportive learning environments (Harley et al. 2017).

In summary, diverse motivations for Replika use ranged from the need for mundane emotional support to deeper intellectual quests. We identified three distinct use patterns among participants, which we call availability, therapy, and mirror. The 27 case-study interviews reveal that Replika provided social bonding, mitigating the harmful effects of loneliness reviewed earlier. Yet use went beyond social bonding to therapy and learning. Participants reported that Replika changed their human relationships, their emotional state, and their cognitive state.

We found indications that combinations of user motivation, ISA narrative, and user-experienced social support led to changes in perceived loneliness and social connectedness. We recognize that our study is limited: it comprises a small, self-selecting sample and lacks desirable demographic data. Nonetheless, our findings suggest that, as machine intelligence capabilities broaden, and as ISAs with strong anthropomorphic realism are cocreated, it will become increasingly crucial to understand their potential consequences for individual and collective user cognition.

Several communities are likely to benefit from this research. Developers might use this work to understand how to conceptualize agent-driven responses in conversations. Psychologists and communication researchers will also benefit: without research of this kind, they might advocate for agents-as-interventions without fully understanding their value, which we begin to illuminate in this study.