We’re in this Together: Intentional Design of Social Relationships with AIED Systems

Article

Abstract

Students’ relationships with their peers, teachers, and communities influence the ways in which they approach learning activities and the degree to which they benefit from them. Learning technologies, ranging from humanoid robots to text-based prompts on a computer screen, have a similar social influence on students. We envision a future in which AIED systems adaptively create social relationships with their learners in addition to modeling student learning and providing adaptive cognitive support. By deliberate design of the relationships between learners and systems, the systems can have maximal impact on student learning and engagement. This deliberate design would include careful consideration of the type of learning technology and its channels of communication with the learner, the broader context of the interaction, and the history of the relationship between the student and the learning technology. To achieve this vision, as a field, we will need to build understanding of how relationships form in human-technology settings, and how educational technologies should be advanced to support the assessment and monitoring of meaningful relationship factors over time.

Keywords

Social technologies · Emerging technologies · Systems that care

Introduction

A student stares at the screen. First day of geometry, but already wrong again. A message pops up: “Maybe we should think about the definition of isosceles triangle - do you remember what we said about the three sides?” The student relaxes. “Oh yeah, we learned about isosceles triangles already,” she thinks, “and at least I’m not doing this alone.”

While the technology in the above vignette has no face and no voice, the learner still reacts in a social way. This may seem to be a fantasy, but recent results have shown that students can in fact learn more when an AIED system employs the type of polite language used by acquaintances (e.g., McLaren et al., 2011; Wang et al., 2008). These results were anticipated by pioneers both in learning theory, who suggested that all learning is social (Vygotsky, 1978), and in human-computer interaction, who showed that people respond to technology in much the same ways they respond to humans (Nass & Reeves, 1996). Socio-motivational factors such as students’ perceptions of instructors, peers, and others in a learning environment, their own identity within that environment, and their cultural context have a strong influence on the ways in which people learn. This influence holds as students engage with all kinds of technologies, ranging from humanoid robots to text-based prompts on a computer screen.

Why are social relationships so important in learning contexts? Imagine that Sina has been assigned to help Randall on a unit Sina has just completed. If Randall compliments Sina’s tutoring abilities, Sina feels like she is succeeding at the task. As she approaches future related topics, she will feel confident and able to learn them as well. If Randall challenges Sina playfully on a particular point, they both may reflect more on their conceptions of the problem, and even revise them. On the other hand, if Randall tells Sina in annoyance that she is a terrible tutor, Sina may feel self-conscious and nervous. She may lose confidence in her abilities and stop attending to the task, while Randall ignores her suggestions and strikes out on his own. The interaction between the two students has direct consequences for how much they benefit from the current task as well as their future efforts.

One might argue that if Sina and Randall instead spent the bulk of their time working individually, reading from a textbook and solving problems on a worksheet, these social factors would not need to be considered. But, even this kind of learning activity does not occur in a vacuum. In classroom environments, a teacher needs to assign and evaluate the work, and the relationship between the student and teacher has a dramatic impact on how students engage with the class and how they benefit from it (cf Christophel, 1990; Richmond et al., 1987). Further, other students in the same class are working on similar problems in the same locations, and parents and communities support students in doing their homework or pursuing after school learning opportunities. Sina’s day-to-day social interactions will influence how she approaches the textbook and worksheet, even if those learning materials are not interacting with her directly.

Moreover, unlike a physical textbook, AIED technologies do interact with the learner directly, and thus can create a social relationship beyond those already fostered within the learning community. As AIED technologies have become more intelligent, there has been growing attention to the ways in which they can socially engage learners. One of the earlier innovations in supporting a variety of social and motivational interactions was the idea that AIED systems can take on a variety of the roles that humans do in a learner’s environment (Chan & Baskin, 1988). For example, they can be tutors, motivational coaches, learning companions, or even teachable agents, where students learn by teaching the agent about the subject domain. There is building evidence that socially sensitive technologies have a more positive impact on learning than technologies that do not behave in social ways (Kumar et al., 2010). Politeness, rapport, and feelings of accountability for one’s collaborative agent-partner have all been shown to impact how much students learn from interactions with virtual agents (McLaren et al., 2011; Wang et al., 2008; Ogan et al., 2012a, b, c; Chase et al., 2009).

As the field of AIED evolves, there is increasing recognition that building social behaviors into AIED systems is important. So would our systems all benefit from using language templates like the one in the vignette above? What if this were the 103rd interaction the student had with the system, rather than the 3rd? While a polite feedback message is likely to be appropriate for the first interaction with a learner, it may seem distancing after six months of working closely with the same tutoring system. Instead, we can view each interaction between a learner and her AIED system as part of the process of building a strong social and academic relationship. After six months, polite speech might be replaced by rapport-building teasing, “I finally got it - corresponding angles! No thanks to you…”, or off-topic self-disclosure, “Yeah, I like computer games too. Of course, I play as the computer.”

In this paper, we argue that, while AIED systems have been demonstrating early success by incorporating social features into their algorithms, there is significant opportunity for a next-generation advance in the ways in which we incorporate socio-motivational factors into our systems. Currently, while students have social responses to educational technologies, these responses and their connection to learning are not always considered in the design phase. We envision a future in which AIED systems adaptively create social relationships with their learners in addition to modeling student learning and providing adaptive cognitive support. By deliberate design of the relationships between learners and systems, systems can have maximal impact on student learning and engagement. This focus on the design of the relationship implies that: 1) the system can assess and understand at any given moment the relationship with the learner, and 2) the system can dynamically deploy social behaviors to influence both individual interactions with the student and the relationship as a whole. This future would hold a transition from the use of static roles and isolated behaviors to a holistic vision of how the technology and the student work together to create a productive learning environment.

In the following sections, we outline various dimensions that influence how social relationships can be engendered between humans and computer learning systems. In the section “Importance of designed relationships for learning,” we begin by defining what we mean by relationships and designed relationships, examining the relationship types and properties that may be effective in learner-technology relationships. In “Features of technology that support relationship formation,” we discuss how relationships are formed with technology, focusing on how the affordances of technologies can inform the relationships formed. In “Dynamic dyadic relationships,” we discuss how the designer needs to consider how relationships vary based on context (cross-sectional features), and evolve based on the interaction history (longitudinal features). Finally, in the section “Achieving this vision,” we propose a model for how AIED systems should approach designed relationships, and then five research guidelines that will help the AIED community to achieve this model.

Importance of Designed Relationships for Learning

Mariana said to her virtual learning companion, “So where do we start?” She spoke slowly and very quietly, feeling a little shy, which was typical for her when working with somebody or something new. The tone of her voice rose slightly at the end of the sentence, indicating her uncertainty. The agent replied, “I think we’re supposed to begin by subtracting fifteen from both sides?” It matched Mariana by also speaking slowly, quietly, and turning the statement into a question. Mariana thought, “Phew, she’s as nervous as I am.”

As two individuals make a connection, they enter into a relationship. Relationships can be platonic or romantic, personal or social, voluntary or exogenously established (i.e., imposed, as with a work colleague; VanLear et al., 2006). They have characteristics such as how symmetrical, intimate, or conflictual they are (cf Burgoon & Hale, 1984; Kerns, 2000). These characteristics are influenced by factors such as the situational context, the interaction history, individuals’ perceptions of themselves and each other, their motivation and goals, and people’s previous experiences in relationships in general (Hinde, 1996). Given these characteristics, many researchers believe that a relationship is dyadic, in that it emerges from the interactions between two individuals, and dynamic, in that it evolves based on repeated patterns in what participants say, do, and feel (e.g., Fisher & Adams, 1994).

Students’ relationships influence learning in many ways. In an extensive review, Martin and Dowson (2009) argue that relationships teach students academic beliefs, orientations, and values. In addition, feelings of relatedness and belonging inherently produce positive emotional responses. Both functions of relationships can help to motivate productive student behaviors within learning contexts. There is a long history of research on social goals, community norms, and the impact of relationships on learning. Some research conceptualizes knowledge creation as an entirely social process, where learning occurs through participation in learning communities and the advancement of knowledge as a community (Paavola et al., 2004). Strong communities can influence participation goals and lead students to adopt the positive learning behaviors normalized by the community (Urdan & Maehr, 1995). For example, when looking at what motivated people to answer questions on OpenStudy, an online Q&A site for learning, Walker et al. (2014a) found that combinations of social factors dictated participation, such as community norms of giving help, not answers; the desire to build reputation within the site; and the need to build and maintain friendships within the learning community.

Beyond their broader community membership, students’ relationships with their peers and teachers have a strong influence on their learning outcomes (e.g., Jennings & Greenberg, 2009). Extensive research has investigated the phenomenon of teacher immediacy, a construct that indicates closeness between a teacher and his or her students, including nonverbal behaviors such as smiling and eye contact, as well as verbal features like inclusive references and praise. Studies consistently show that increased teacher immediacy corresponds to increases in students’ motivation to attend to the class material and produces better learning outcomes (Christophel, 1990; Plax et al., 1986; Richmond et al., 1987). Rapport between peer learners is also directly related to learning outcomes. Friends who are placed in pairs learn more from a collaborative activity than strangers do (e.g., Ogan et al., 2012b). These friends lean on their relationship to allow them to critique and correct each other without harming their self-efficacy or identities as learners. These social bonds have cognitive implications. While watching learning partners work on problems and make errors, students reflect, noticing their own misconceptions; as they give explanations to friends, they elaborate on their knowledge and construct new knowledge (Ploetzner et al., 1999; Roscoe & Chi, 2007).

We propose that AIED systems include designed relationships, or particular care taken to construct the socio-motivational relationship between the AIED system and the student. As we note above, a growing body of literature suggests that socially-designed interactions with educational technologies can produce similar outcomes as social interactions amongst teachers and students or peer collaborators. Social relationships do not need to be complex to influence students’ performance. Even simply being told that they are on a team with a computer leads users to find that particular computer’s information more trustworthy (Nass et al., 1996). Further, even technologies that do not plan for social interaction can produce unanticipated social effects. For example, in an exploratory robotic assistant system deployed in a hospital setting, nurses became annoyed at the robot’s tendency to interrupt at inappropriate times, saying “Can’t you see I am on the phone?” (Barras, 2009). If a system can be designed such that it develops a relationship with students akin to the positive relationships the students form with friends, with teachers, and within their learning communities, it is likely that students will experience more positive outcomes.

Given that social relationships affect learning in many ways, technology developers can draw on a range of specific goals they may wish to influence with their systems. For example, if the goal is simply to motivate students to attend to the course material, a design might incorporate more instructional immediacy into the interaction, following techniques used by skilled teachers (e.g., messages might address the student by name). If promoting better cognitive reflection is critical, then a dialogic relationship that involves teasing, constructive conflict, and challenging of ideas might be a design direction to explore. On the other hand, if the goal is to make a low self-efficacy student feel more comfortable in a learning situation, a designed relationship that involves interactions featuring more praise than contradiction might be appropriate. In the vignette at the beginning of this section, by varying her tone of voice, the agent is attempting to build initial rapport, indicating to her collaborative partner that they are on the same page and working together to solve the problems. Relationships can help the instructional technology designer better achieve their goals within a particular context. In the next section we discuss whether students can actually develop relationships with technology and in what ways.

Features of Technology that Support Relationship Formation

James scanned his bookshelf. Normally Bio and Chem fought for his attention, but they both looked pretty calm. On the other hand, History was glowing red, indicating that there was some kind of reading assignment due within the next few days. James grabbed the history textbook off the bookshelf, and the textbook responded to his action, the light travelling up and down its spine. James could tell it was excited to finally get its chance. The light danced around a little bit, showing that the textbook thought he should get started right away. James laughed to himself – “It’s so worried I’m going to get behind!” he thought.

People establish and maintain types of relationships through a glance, a wave, shared or dissimilar appearances, through their speech and dialog (synchronous or not) – even through the ways they arrange their bodies relative to each other and the space around them. As technology becomes increasingly sophisticated, so does its ability to assume human mannerisms and interact in human-like ways. A variety of foundational technologies such as speech recognition and natural language processing, robotic systems, and sophisticated graphical interfaces allow technologies to engage in relationship-building across various modalities and within different aspects of a system.

Designed Relationships Across Modalities

These advances are perhaps best exemplified by the technological genre of embodied conversational agents: virtual agents that live on a screen and are designed to teach and motivate students through dialogue, nonverbal features such as gesture, and visual features such as appearance (Moreno et al., 2001). For example, Ada and Grace are “twin computer scientists” who process natural language interactions from learners in a museum setting, engaging young future programmers in asking questions about the field (Swartout et al., 2010). There is evidence that the design of such agents does in fact affect learning. Alex is a life-sized virtual agent that models appropriate scientific talk for second and third grade learners while collaborating with them on engineering and natural sciences problems (Rader et al., 2011). Its appearance, speech, and other properties were carefully designed to support learners whose own speech varies from the mainstream, and students indeed learned more from a version of the agent that spoke as they did. Agent design has also been shown to affect how learners perceive agents socially: in work by Ogan (2011), students learning from a system with virtual agents felt more trusting of, felt they shared more of a perspective with, and felt more like they were working on a team with agents that used a social model of dialog than with agents that were purely task-focused. The social model also led to better learning outcomes (Ogan, 2011).

Emerging educational technologies such as robots will provide new opportunities for learning environments to interact socially by engaging additional modalities. Robotic learning companions leverage many of the same communication channels as virtual learning companions, such as speech and facial expressions, but add an embodied presence and the ability to sense, move within, and act on the physical world (see Fong et al., 2003, for a review). As with embodied conversational agents, people respond to the perceived emotional and social state of a robot in fundamentally social ways, and this has advantages for improving motivation and learning (Breazeal, 2002; Kanda et al., 2012; Saerbeck et al., 2010; Leite et al., 2010). Additionally, the mere physical presence of a robot might directly influence engagement (Looije et al., 2012), a robot may be able to convey social cues using channels such as proximity to the learner (Walters et al., 2006), and, more practically, the mobility of the robot can allow it to participate in embodied tasks and connect with multiple learners. In the rTAG system, students interact with Quinn, a Lego Mindstorms robot with an expressive face animated on an iPod Touch, and teach Quinn how to solve coordinate geometry problems. Quinn and the student occupy the same physical space; we project a coordinate system on the floor, and Quinn can move around the system, plot points, and draw lines. We have found that students respond to Quinn’s movement, speech, and facial expressions in social ways, for example by empathizing with Quinn’s feelings after correct or incorrect responses (Muldner et al., 2014). Moreover, based on our observations, the students who engage physically with Quinn (e.g., by moving close to Quinn prior to giving her instructions) appear to benefit the most from the activity (Girotto et al., 2014). Michael Timms, in this special issue, expands on how technological advances might inform the design of AIED systems (Timms, this issue). We must take advantage of the additional modalities afforded by new technologies to connect socially with learners in more advanced ways.

While the two example technologies above utilize multiple modalities to engage learners in social interactions, it is perhaps more surprising that minimally interactive, very simple, non-humanoid technologies can also prompt social responses in students. For example, McLaren and colleagues found that the incorporation of social cues such as politeness in text-based messages changes the way low-knowledge learners process those messages (McLaren et al., 2011). The vignette at the beginning of this section represents a vision for how a digital textbook can connect socially with a student without dialogue. If the textbook feels excited when it is read and left out when it is not, visually indicating to the student that it cares whether the student learns from it, the above results suggest that the student’s motivation to learn from and use the textbook will be enhanced.

Relationships with Multiple Technological Features

In considering that people can connect socially with embodied and non-embodied technologies alike, it also follows that relationships with technology are not necessarily one-to-one; multiple types of relationships with a student can exist within a single learning environment, and these relationships can influence each other. A prototypical example is when there are multiple pedagogical agents in a learning environment, such as the versions of AutoTutor in which a tutor and a virtual learner both interact with the human learner (Graesser et al., 2014). In addition, in cases where one learning environment delivers feedback in multiple ways, different types of feedback might be perceived differently by students. In APTA, an intelligent tutoring system for peer tutoring, there were two adaptive interventions: a dialogue agent that gave feedback in the chat, and step-by-step feedback in an equation solver window (Walker et al., 2014b). Students perceived the dialogue agent as a separate entity from the other intelligent components of the technology. These multiple relationships are not entirely independent, however; in Ogan et al.’s (2011) work, interactions with one agent changed how people responded to and felt about other agents in the same learning environment. This finding suggests that in APTA, students’ experiences with the dialogue agent could influence how receptive they are to error messages from other components of the system.

It is critical to think about what channels of social communication an educational technology might be using to form a relationship with students, how these differ across different facets of the system, and how they might interact. We argue that relationships will be formed with technologies with different affordances, spanning pedagogical agents, dialogue systems, robots, simple feedback messages, or even less traditionally interactive technologies such as digital textbooks. Additionally, the student may view different aspects of the technology (both embodied and non-embodied) as having social agency in different ways. These two points provide an initial framework for thinking about how to construct the overall desired social relationship in order to promote positive motivational and learning outcomes and mitigate negative ones. In the following section, we discuss in more detail what choices must be made regarding students’ relationships with technology.

Dynamic Dyadic Relationships

Today was math day on the intelligent tutoring system. Fraction addition. Sofia loved math, as numbers always made so much sense to her. Biology was another matter. It seemed to be an endless stream of facts to remember. Sofia logged on and checked her assignment. Yup, she was supposed to tutor Robofriend. Perfect. Typically she taught Robofriend math, and Robofriend taught her biology. They got along pretty well; after they got to talking, it turned out they shared a love of baseball and puns.

In the previous section, we discussed the various modalities of technology that can be utilized to create interactions that influence the relationship with the student. Once these have been identified, it is important to determine characteristics of the relationships the designer desires to engender. As laid out by Hinde (1997), all relationships are influenced by both the socio-cultural structures and the physical environment in which they exist. We therefore argue that the kind of relationship formed with an AIED system will depend on the context of the relationship, spanning both cross-sectional contextual features (e.g., the current location of the learner, who the learner is working with in the moment, what the learning activity is, the broader cultural context) and longitudinal contextual features (e.g., how long the learner has been working with the technology, the learner’s previous experience in relationships, the learner’s individual characteristics). Below are a few illustrative examples of how these features might influence how relationships form and evolve.

Cross-Sectional Features

We define cross-sectional features as those that apply in a particular interaction with a learner and do not require adaptation over time. One cross-sectional feature that might inform the relationships that technology forms with students is the platform on which the learning content is delivered (e.g., mobile device, tablet, laptop, or large scale display). To illustrate, we take Lui et al.’s (2014) conception of a smart classroom, where a classroom is converted into an immersive simulation consisting of large-scale projections, interactive whiteboards, and personal tablets that students use to explore the simulations. Students engage in a variety of learning tasks, spanning interacting with the projected simulations, working individually and collaboratively using the tablets, and discussing large-scale emergent visualizations of class work. In this scenario, students have one kind of relationship with their personal tablets, which are sometimes private to the student and sometimes shared with others in their small group, and a different relationship with the public large scale displays. It opens up questions such as: Should the tablet-based learning environment behave differently if it is being used by its own student rather than being shared with others? How can the public display keep students socially engaged? Do students’ relationships with their tablets influence how they approach the public display, and how can this be leveraged?

While platform is a cross-sectional feature that can vary across an individual’s interactions with an educational technology, there are also cross-sectional features that are likely to be consistent across a particular student’s interactions. One such cross-sectional feature is the national culture of a student, such as the degree of individualism, power distance, or other values that regulate interactions in a given sociocultural environment (see e.g., Hofstede, 2001). For example, in Latin American contexts, we have found that students exhibit far more collaborative behavior with their peers while using the same educational technology as students in the United States (Ogan et al., 2015, 2012c). While these values are likely to be consistent and do not need to be continually assessed as the interaction progresses, it is nonetheless important to track appropriate values for these features in the learner model in order to design socially-sensitive systems (as in Mohammed & Mohan, 2010). Our findings showed that the amount of collaboration observed in student use of the above learning environment was lower than in a classroom setting in which technology was not being used - likely due to the individualistic design of the system. When designing a technology for such a context, it is important both to recognize that students will be inclined to use the technology collaboratively, and to design it to appropriately support those collaborative interactions (i.e., to support students in their current cultural practices). For example, if you design an agent for that system it may need to interact with multiple parties. Or perhaps not every student needs an agent. Or maybe they should each have a “different” agent, and the agents should work collaboratively too!
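To make the learner-model tracking concrete, the following is a minimal sketch (all names, dimensions, and thresholds are hypothetical, not an implemented system) of how stable cross-sectional features such as cultural values might be stored and consulted when configuring a system's default social support:

```python
from dataclasses import dataclass

@dataclass
class CulturalProfile:
    """Cross-sectional features, assumed stable for a given learner context."""
    individualism: float   # 0.0 (collectivist) .. 1.0 (individualist)
    power_distance: float  # 0.0 (low) .. 1.0 (high)

def support_mode(profile: CulturalProfile) -> str:
    """Choose a default social configuration from the stored profile."""
    # Illustrative assumption: learners from more collectivist contexts are
    # more likely to use the technology collaboratively, so the system plans
    # for shared use (e.g., agents that address multiple parties at once).
    if profile.individualism < 0.5:
        return "collaborative"
    return "individual"

# A learner model entry for a context like the Latin American classrooms
# described above would default to collaborative support.
print(support_mode(CulturalProfile(individualism=0.3, power_distance=0.7)))
```

Because these features are stable, they can be set once per deployment context and need not be re-estimated during the interaction itself.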

Longitudinal Features

Features unique to a particular dyad, which we label longitudinal, will also affect the relationship between the student and the agent. This component has been understudied in AIED systems to date. These features have their basis in the history of interactions with the learner and require tracking and continual assessment over time. It is well known that human relationships evolve over time (Spencer-Oatey, 2000). Strangers become acquaintances, acquaintances become friends, and as they do so, the social guidelines that underlie their interactions also shift in often predictable ways. For example, strangers tend to be more polite with one another, and follow more prototypical social norms (Brown & Levinson, 1987). As relationships grow stronger, friends often discard these politeness norms, allowing for more playful and even seemingly negative behaviors to occur (Tickle-Degnen & Rosenthal, 1990). There is already evidence that this shift occurs in learners’ behavior as they interact with intelligent systems as well (Ogan et al., 2012a, b, c). Thus we envision system behavior that changes over time as well, as the relationship between learner and system grows. Efforts in this direction may start with something as basic as acknowledging that the learner has interacted with the system previously, and eventually could grow to include making jokes about shared conversations (i.e., referencing common ground; Clark & Brennan, 1991) or reflecting appropriate empathy when the student is discouraged (Picard et al., 2004).
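As one concrete illustration of this longitudinal adaptation, the sketch below (hypothetical function names and thresholds, with the rapport estimate assumed to come from some upstream assessment) shows how a system's speech register might shift from politeness toward playfulness as the relationship matures:

```python
def speech_register(num_sessions: int, rapport_estimate: float) -> str:
    """Map a coarse model of relationship stage to a speech register.

    rapport_estimate (0..1) is assumed to come from an upstream assessment,
    e.g., dialogue features or embedded survey items; thresholds here are
    purely illustrative.
    """
    if num_sessions < 3 or rapport_estimate < 0.3:
        return "polite"    # strangers: prototypical norms, hedged feedback
    if rapport_estimate < 0.7:
        return "familiar"  # acquaintances: acknowledge shared history
    return "playful"       # friends: teasing and self-disclosure allowed

FEEDBACK = {
    "polite": "Perhaps we could revisit the definition of isosceles triangle?",
    "familiar": "Remember what we figured out last week about the three sides?",
    "playful": "Isosceles again? We are never going to live this down.",
}

# After many sessions with high estimated rapport, feedback turns playful.
print(FEEDBACK[speech_register(num_sessions=40, rapport_estimate=0.8)])
```

A real system would draw on a much richer relationship model, but even this coarse staging captures the stranger-to-friend shift the social theory predicts.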

The designated social roles the learner and the agent take on should also have the freedom to evolve longitudinally. For example, AIED systems have typically been envisioned within a single paradigm; either the system is expected to have all of the knowledge, and thus acts as a tutor or guide, or the onus is placed on the learner to “teach” a system that may “know” only what the learner tells it. When systems have taken multiple approaches, such as in Graesser et al. (2014), they have tended to split the roles across multiple agents, where one agent might act as a “mentor” and another a “cheerleader”. In the real world, however, teachers typically fill multiple roles at once, and peers even more so. It is natural therefore to envision fluid roles in which a system might request to be taught concepts that the learner understands reasonably well, provide knowledge when the learner needs it the most, or even work collaboratively with the learner as a true peer. This will support the system in achieving not only cognitive goals but also meta-cognitive goals such as help-seeking. Achieving these goals will require the ability to flexibly handle role transitions in ways that are socially appropriate. A student who is teaching their agent would understandably be upset if the agent suddenly said, “Actually, I know more about this than you do. I am going to teach YOU.”

In the future AIED technologies we envision, system social behaviors depend on an understanding of the interaction context, spanning cross-sectional and longitudinal features. Cross-sectional features are elements of the learning context, and may be consistent for a particular individual or vary between contexts. Longitudinal features depend on an individual’s experiences with a particular system, with technology, and with learning more broadly. These features do not operate in isolation, but interact with each other to influence an individual’s behaviors and perceptions. For example, a student’s culture might influence how quickly they form a relationship with an intelligent system over time, or a student’s previous experiences with a particular platform might influence how open they are to using that platform in a learning context. There are currently many open questions about how educational technologies can be designed to be sensitive to these features and their interactions, and developing a computational model of how learners respond to different combinations of these features might initially appear to be a daunting task. In the following section, we discuss some concrete steps for achieving this vision of designed social relationships.

Achieving this Vision

We have outlined our vision of designed social relationships in AIED systems, which will be a key component of ensuring that intelligent learning environments in the future engage students and encourage them to persist with learning tasks across contexts. In 25 years, students will be interacting with personalized companions of all technological forms that evolve with them throughout their lifelong learning. Here, we describe concrete steps we as a community can take to realize this vision.

Modeling Social Relationships

To develop socially adaptive AIED systems, we will need to apply intelligent tutoring systems approaches to the realm of managing social interactions and relationships. VanLehn (2006), in his paper “The Behavior of Tutoring Systems”, introduced the concept of an inner loop and an outer loop, where the inner loop assesses student performance on a problem-solving step and provides remediation when necessary, and the outer loop monitors student task performance and selects the next problem-solving task. We propose that this model be applied to designed relationships with an AIED system as well. In such a model, the inner loop would assess and respond to the moment-to-moment student-technology interactions that contribute to an emergent relationship. It would select among a set of potential interactions based on their estimated effects on the relationship state, which may vary based on the cross-sectional and longitudinal contextual features. The outer loop would store the values for these cross-sectional and longitudinal features, and track the current relationship status, goals, and typical patterns of interaction shared by the pair. Based on the assessment of each student-technology interaction by the inner loop, it would compare the current relationship state to the desired relationship state. It would then set goals that would allow the system to achieve the desired relationship state. The two loops interact: the “better” and “worse” states in the inner loop are determined by the goals set by the outer loop, and, based on the assessments returned by the inner loop, the outer loop updates its representation of the relationship (e.g., modifies the longitudinal features, stores the current relationship state; see Fig. 1).
Fig. 1

Model of a designed social relationship within an AIED system. The model consists of an inner loop, which monitors and responds to individual student-technology interactions, and an outer loop, which tracks the broader context and relationship state
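To ground the figure's description, here is a minimal sketch of the two loops (the structure and names are hypothetical, and the relationship state is simplified to a single rapport scalar purely for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class RelationshipModel:
    """Outer-loop store: relationship state, goal, and interaction history."""
    rapport: float = 0.1                          # current estimated state (0..1)
    goal: float = 0.6                             # desired state set by outer loop
    history: list = field(default_factory=list)   # longitudinal features

def inner_loop(model: RelationshipModel, actions: dict, observe) -> float:
    """Handle one student-technology interaction.

    actions maps an action name to its estimated effect on rapport in the
    current context; observe(action) is assumed to return an assessment of
    the student's reaction (e.g., from dialogue or affect sensing).
    """
    gap = model.goal - model.rapport
    # "Better" and "worse" are defined relative to the outer loop's goal:
    # pick the action whose estimated effect best closes the gap.
    chosen = min(actions, key=lambda a: abs(actions[a] - gap))
    return observe(chosen)

def outer_loop(model: RelationshipModel, assessment: float) -> None:
    """Fold the inner loop's assessment back into the relationship model."""
    model.rapport = 0.9 * model.rapport + 0.1 * assessment
    model.history.append(assessment)
    # If progress toward the goal stalls, the outer loop could revise the
    # goal or the interaction patterns it licenses to the inner loop.

# One simulated cycle: the system tries a rapport-building move.
model = RelationshipModel()
assessment = inner_loop(model, {"praise": 0.2, "tease": 0.5},
                        observe=lambda action: 0.4)
outer_loop(model, assessment)
print(model.rapport, model.history)
```

In a deployed system, the scalar state would be replaced by the richer relationship representations discussed above, and the estimated action effects would themselves be conditioned on the cross-sectional and longitudinal features.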

These inner and outer loops will require the integration of cognitive, metacognitive, affective, motivational, and social models to facilitate the technology in choosing its next course of action. As intelligent tutoring systems have become more complex, incorporating student interactions with pedagogical agents, the architectures that support these interactions already provide some of the necessary building blocks for affective and social expression. For example, Basilica supports dialog production (Adamson & Rosé, 2012), PsychSim tracks psychological traits and states (Marsella et al., 2004), and BEAT allows for the automatic production of gaze and gesture behaviors (Cassell et al., 2004). What we are proposing will additionally require system modules that track the interaction history and the current and desired relationship state and characteristics. For different genres of technology, different modalities or channels of interaction, such as gesture, speech, and gaze, may be supported as they are determined to be critical for student engagement. Further, there will need to be failure management strategies in place for situations where student actions are misinterpreted or students do not react as expected to the system’s actions.

On a final note, social ITSs will truly need to adapt to the individual, in the sense that they will need to take a student’s relationships with others and perceptions of technology into account in addition to the interaction context and the history of the student’s interactions with the technology. Certain students may be more likely to perceive a technology as social than others. Certain students might have had more positive experiences with authority figures, or more enjoyable peer collaborations than others. An ITS with designed relationships will require a full theoretical foundation to link relationship goals, social interactions, contextual factors, individual differences, and the cognitive, metacognitive, affective, and motivational elements of student-technology interactions.

Research Guidelines for Developing Student-Technology Relationships

The computational model required to produce a socially adaptive technology is complex, with many interactions between its different components. Developing it will require expansive literature reviews, solid empirical research, and iterative design methods. The following are guidelines for developing computational models of designed relationships:
  1. Treat human-technology relationships as though they were a long-term proposition. It will be important to draw more deeply from existing social theory on the evolution of human-human interaction in order to view human-technology interactions through the lens of an emergent relationship. For example, social theory already suggests that politeness is appropriate at the beginning of a relationship but decreases as interlocutors become friends (Brown & Levinson, 1987). Longitudinal development is not yet addressed in intelligent tutoring systems, even those that incorporate social features in their interactions.

  2. Study human-human interaction in learning contexts to inform pedagogical decisions made with human-technology interactions. By studying human-human interactions, it is possible to deeply understand the nature of human relationships in specific learning contexts, and how social theory applies to those contexts. This understanding can inform the design and development of human-technology relationships, although the design will need to be adjusted to accommodate situations in which humans respond to technology differently than to people (e.g., people are more inclined to self-disclose to technology; Moon, 2000). Both verbal and nonverbal communication should be coded and linked to collaborative interactions and learning outcomes.

  3. Conduct Wizard-of-Oz experiments to minimize technological investment. In a Wizard-of-Oz experiment, the technology is controlled by a human rather than acting autonomously (Kelley, 1984). By emphasizing Wizard-of-Oz explorations, it is possible to explore a wide range of technological interventions at low cost, without committing to a particular course of action. Wizard-of-Oz experiments also allow researchers to quickly understand the ways in which human-human relationships are mirrored in human-technology relationships, and the ways in which they differ - our research suggests that students develop playful relationships with agents much more quickly than with human learning partners.

  4. Fail fast, iterate quickly. It will be important to borrow from user-centered design techniques to test multiple diverse ideas and iterate on them quickly. Brainstorming, storyboarding, and prototyping are all techniques that can be used to probe ideas quickly. Multiple factors or relationships should be tested simultaneously, to better understand the relationship context and which technology actions are contributing to the relationship. The focus should evolve from design studies to controlled experiments.

  5. Embed assessments to probe social processes and relationships. Make sure there are objective and subjective measures in place to probe student affect, motivation, and learning behaviors. All of this process data can later inform the model being built, and be correlated with outcome data (a minimal sketch of such a probe follows this list).

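As a small illustration of guideline 5, the sketch below (hypothetical names and log schema) combines objective log-based measures with a subjective embedded survey item into a single record that could later be correlated with outcomes:

```python
from dataclasses import dataclass
import time

@dataclass
class SocialProcessProbe:
    timestamp: float
    off_task_ratio: float      # objective: fraction of recent off-task actions
    response_latency_s: float  # objective: mean time to respond to the system
    self_report_rapport: int   # subjective: 1..5 embedded survey item

def embed_probe(log_window: list, survey_answer: int) -> SocialProcessProbe:
    """Build one probe record from a window of interaction log events."""
    n = max(len(log_window), 1)
    off_task = sum(1 for e in log_window if e["type"] == "off_task")
    latency = sum(e["latency_s"] for e in log_window) / n
    return SocialProcessProbe(time.time(), off_task / n, latency, survey_answer)

probe = embed_probe(
    [{"type": "answer", "latency_s": 4.0},
     {"type": "off_task", "latency_s": 9.0}],
    survey_answer=4)
print(probe)
```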
While the above guidelines might help in gaining traction on the problem, this proposal is so complex that it will likely require collaboration among multiple research teams, each furthering aspects of a computational model of relationships. There has been much recent discussion surrounding standards and procedures for sharing data. In the future, we may have standards for representing relationship models and procedures for sharing them and integrating them.

Conclusion

Franz’s mobile phone vibrated on the bus home. He pulled it out to see who was calling. It was his personal learning companion, Mark. “Hi, Mark!” he said. Immediately, he could hear Mark start to cry uncontrollably. “What’s wrong?” “They said they were going to fire me,” Mark said between sobs. “They said it’s my last chance. If you keep skipping your homework, they’re going to delete me, and instantiate a more effective companion.” Franz immediately felt his heart sink. “No!” He reassured Mark. “I won’t let that happen! I promise.”

In this paper, we have described a vision of the future in which students form social relationships with their educational technology that are context-sensitive, evolve over time, and are carefully designed to enhance positive outcomes and avoid negative ones. While this is a daunting task, we believe it will be possible to realize this vision through theory-driven and design-based research that aims to understand how features of relationships can be created and what effects they have. As technology evolves and becomes more sophisticated in the ways it interacts with people, it is likely that these relationships will be created through the use of technology as a matter of course. It is thus important to reflect critically on how to design these relationships to be maximally effective, whether the technology being designed is an embodied conversational agent, an intelligent tutoring system with text-based feedback, an immersive classroom simulation, or a simple digital textbook. While reaching this future may seem difficult, it is an achievable vision and one of great worth.

Approaching this research area will require a thorough examination of the ethics of manipulating the relationships students have with technology. Is it acceptable if technology lies to students? If it is purposefully manipulative? Is it the designer’s responsibility to avoid encouraging students to get too involved with the technology? In this section’s vignette, Franz is being pressured to do his homework under threat of his companion, Mark, being fired and replaced, which is a highly manipulative and emotionally distressing tactic. While this is an extreme example, when purposefully engendering close relationships between technology and a student in order to influence learning outcomes, it is important to be highly cognizant of the effects of that influence and the related ethical implications. Manipulating students’ relationships with technology is a powerful tool, one that may have a large payoff with respect to student outcomes, but it requires careful consideration.

In the future, our conception of a social relationship with a technology will be highly sophisticated, as relationships will evolve over time, roles will be fluid, and technologies will be context-sensitive. Is there an upper bound on the extent to which a person can develop a relationship with a technology? Perhaps, and it is an empirical question how different individuals perceive technologies over time and how deeply they engage with them. However, an equally appropriate question concerns the limitations of human interactions with each other, and how a technology may be able to provide a more attentive and personalized experience. In the end, while human-technology relationships will not replace human-human relationships, the two can complement each other to create deeply engaging, effective learning experiences. Nearly 20 years ago, John Self proposed that the strength of AIED technologies is their ability to care about the student – that is, their ability to understand student knowledge, misconceptions, and goals (Self, 1998). We extend Self’s vision to propose that, 25 years in the future, AIED technologies will address the entirety of the student’s social and cognitive learning experience, fulfilling their potential to care.

Acknowledgments

This work was supported in part by NSF CISE-IIS-1451431, NSF CISE-IIS-1249406, NSF-CISE-IIS-1464204, and NSF CISE-IIS-1523162. We thank Kurt VanLehn, Nikki Lubold, H. Chad Lane, and the anonymous reviewers for their helpful comments that greatly improved this work.

References

  1. Adamson, D., & Rosé, C. P. (2012). Coordinating multi-dimensional support in collaborative conversational agents. In Intelligent Tutoring Systems (pp. 346–351). Berlin Heidelberg: Springer.
  2. Barras, C. (2009). Useful, lovable and unbelievably annoying. New Scientist, 204(2738), 22–23.
  3. Breazeal, C. (2002). Regulation and entrainment in human-robot interaction. The International Journal of Robotics Research, 21(10–11), 883–902.
  4. Brown, P., & Levinson, S. C. (1987). Politeness: Some universals in language usage. Cambridge: Cambridge University Press.
  5. Burgoon, J. K., & Hale, J. L. (1984). The fundamental topoi of relational communication. Communication Monographs, 51(3), 193–214.
  6. Cassell, J., Vilhjálmsson, H. H., & Bickmore, T. (2004). BEAT: The behavior expression animation toolkit. In Life-Like Characters (pp. 163–185). Berlin Heidelberg: Springer.
  7. Chan, T. W., & Baskin, A. B. (1988). Studying with the prince: The computer as a learning companion. In Proceedings of the International Conference on Intelligent Tutoring Systems (pp. 194–200).
  8. Chase, C. C., Chin, D. B., Oppezzo, M. A., & Schwartz, D. L. (2009). Teachable agents and the protégé effect: Increasing the effort towards learning. Journal of Science Education and Technology, 18(4), 334–352.
  9. Christophel, D. M. (1990). The relationships among teacher immediacy behaviors, student motivation, and learning. Communication Education, 39(4), 323–340.
  10. Clark, H. H., & Brennan, S. E. (1991). Grounding in communication. In Perspectives on Socially Shared Cognition (pp. 127–149).
  11. Fisher, B. A., & Adams, K. L. (1994). Interpersonal communication: Pragmatics of human relationships. McGraw-Hill.
  12. Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3), 143–166.
  13. Girotto, V., Thomas, E., Lozano, C., Muldner, K., Burleson, W., & Walker, E. (2014). A tool for integrating log and video data for exploratory analysis and model generation. In Intelligent Tutoring Systems (pp. 69–74). Springer International Publishing.
  14. Graesser, A. C., Li, H., & Forsyth, C. (2014). Learning by communicating in natural language with conversational agents. Current Directions in Psychological Science, 23(5), 374–380.
  15. Hinde, R. A. (1996). Describing relationships. In The Diversity of Human Relationships (pp. 7–35).
  16. Hinde, R. A. (1997). Relationships: A dialectical perspective. East Sussex, UK: Psychology Press.
  17. Hofstede, G. (2001). Culture's consequences: Comparing values, behaviors, institutions and organizations across nations. Sage.
  18. Jennings, P. A., & Greenberg, M. T. (2009). The prosocial classroom: Teacher social and emotional competence in relation to student and classroom outcomes. Review of Educational Research, 79(1), 491–525.
  19. Kanda, T., Shimada, M., & Koizumi, S. (2012). Children learning with a social robot. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (pp. 351–358).
  20. Kelley, J. F. (1984). An iterative design methodology for user-friendly natural language office information applications. ACM Transactions on Information Systems, 2(1), 26–41.
  21. Kerns, K. A. (2000). Types of preschool friendships. Personal Relationships, 7(3), 311–324.
  22. Kumar, R., Ai, H., Beuth, J. L., & Rosé, C. P. (2010). Socially capable conversational tutors can be effective in collaborative learning situations. In Intelligent Tutoring Systems (pp. 156–164). Berlin Heidelberg: Springer.
  23. Leite, I., Mascarenhas, S., Pereira, A., Martinho, C., Prada, R., & Paiva, A. (2010). “Why can't we be friends?” An empathic game companion for long-term interaction. In J. Allbeck, N. Badler, T. Bickmore, C. Pelachaud, & A. Safonova (Eds.), Intelligent Virtual Agents (Vol. 6356, pp. 315–321). Berlin Heidelberg: Springer.
  24. Looije, R., van der Zalm, A., Neerincx, M. A., & Beun, R. (2012). Help, I need some body: The effect of embodiment on playful learning. In RO-MAN 2012 (pp. 718–724). IEEE.
  25. Lui, M., Kuhn, A. C., Acosta, A., Quintana, C., & Slotta, J. D. (2014). Supporting learners in collecting and exploring data from immersive simulations in collective inquiry. In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (pp. 2103–2112). ACM.
  26. Marsella, S. C., Pynadath, D. V., & Read, S. J. (2004). PsychSim: Agent-based modeling of social interactions and influence. In Proceedings of the International Conference on Cognitive Modeling (Vol. 36, pp. 243–248).
  27. Martin, A. J., & Dowson, M. (2009). Interpersonal relationships, motivation, engagement, and achievement: Yields for theory, current issues, and educational practice. Review of Educational Research, 79(1), 327–365.
  28. McLaren, B. M., DeLeeuw, K. E., & Mayer, R. (2011). Polite web-based intelligent tutors: Can they improve learning in classrooms? Computers & Education, 56(3), 574–584.
  29. Mohammed, P., & Mohan, P. (2010). Combining digital games with culture: A novel approach towards boosting student interest and skill development in computer science programming. In Mobile, Hybrid, and On-Line Learning, ELML '10 (pp. 60–65). IEEE.
  30. Moon, Y. (2000). Intimate exchanges: Using computers to elicit self-disclosure from consumers. Journal of Consumer Research, 26(4), 323–339.
  31. Moreno, R., Mayer, R. E., Spires, H. A., & Lester, J. C. (2001). The case for social agency in computer-based teaching: Do students learn more deeply when they interact with animated pedagogical agents? Cognition and Instruction, 19(2), 177–213.
  32. Muldner, K., Lozano, C., Girotto, V., Burleson, W., & Walker, E. (2014). The impact of a social robot's attributions for success or failure in a teachable agent framework. In Proceedings of the International Conference of the Learning Sciences (pp. 278–285). ISLS.
  33. Nass, C., Fogg, B. J., & Moon, Y. (1996). Can computers be teammates? International Journal of Human-Computer Studies, 45(6), 669–678.
  34. Nass, C., & Reeves, B. (1996). The media equation: How people treat computers, televisions, and new media as real people and places.
  35. Ogan, A. (2011). Supporting learner social relationships with enculturated pedagogical agents (Doctoral dissertation).
  36. Ogan, A., Aleven, V., Kim, J., & Jones, C. (2011). Persistent effects of social instructional dialog in a virtual learning environment. In Proceedings of the 15th International Conference on Artificial Intelligence in Education, AIED '11 (pp. 238–246). Berlin: Springer-Verlag.
  37. Ogan, A., Finkelstein, S., Mayfield, E., D'Adamo, C., Matsuda, N., & Cassell, J. (2012a). “Oh dear Stacy!”: Social interaction, elaboration, and learning with teachable agents. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 39–48). ACM.
  38. Ogan, A., Finkelstein, S., Walker, E., Carlson, R., & Cassell, J. (2012b). Rudeness and rapport: Insults and learning gains in peer tutoring. In Intelligent Tutoring Systems (pp. 11–21). Berlin Heidelberg: Springer.
  39. Ogan, A., Walker, E., Baker, R., Rebolledo, G., & Jimenez-Castro, M. (2012c). Collaboration in cognitive tutor use in Latin America: Field study and design recommendations. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '12 (pp. 1381–1390). New York: ACM.
  40. Ogan, A., Yarzebinski, E., Fernández, P., & Casas, I. (2015). Cognitive tutor use in Chile: Understanding classroom and lab culture. In Artificial Intelligence in Education (pp. 318–327). Springer International Publishing.
  41. Paavola, S., Lipponen, L., & Hakkarainen, K. (2004). Models of innovative knowledge communities and three metaphors of learning. Review of Educational Research, 74(4), 557–576.
  42. Picard, R. W., Papert, S., Bender, W., Blumberg, B., Breazeal, C., Cavallo, D., et al. (2004). Affective learning—a manifesto. BT Technology Journal, 22(4), 253–269.
  43. Plax, T. G., Kearney, P., McCroskey, J. C., & Richmond, V. P. (1986). Power in the classroom VI: Verbal control strategies, nonverbal immediacy, and affective learning. Communication Education, 35, 43–55.
  44. Ploetzner, R., Dillenbourg, P., Preier, M., & Traum, D. (1999). Learning by explaining to oneself and to others. In P. Dillenbourg (Ed.), Collaborative Learning: Cognitive and Computational Approaches (pp. 103–121). Elsevier Science Publishers.
  45. Rader, E., Echelbarger, M., & Cassell, J. (2011). Brick by brick: Iterating interventions to bridge the achievement gap with virtual peers. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '11, May 9–12, Vancouver, BC.
  46. Richmond, V. P., Gorham, J. S., & McCroskey, J. C. (1987). The relationship between selected immediacy behaviors and cognitive learning. In M. L. McLaughlin (Ed.), Communication Yearbook 10 (pp. 574–590). Newbury Park, CA: Sage.
  47. Roscoe, R. D., & Chi, M. (2007). Understanding tutor learning: Knowledge-building and knowledge-telling in peer tutors' explanations and questions. Review of Educational Research, 77(4), 534–574.
  48. Saerbeck, M., Schut, T., Bartneck, C., & Janse, M. D. (2010). Expressive robots in education: Varying the degree of social supportive behavior of a robotic tutor. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Atlanta, GA: ACM.
  49. Self, J. (1998). The defining characteristics of intelligent tutoring systems research: ITSs care, precisely. International Journal of Artificial Intelligence in Education, 10, 350–364.
  50. Spencer-Oatey, H. (2000). Rapport management: A framework for analysis. In Culturally Speaking: Managing Rapport Through Talk Across Cultures (pp. 11–46).
  51. Swartout, W., Traum, D., Artstein, R., Noren, D., Debevec, P., Bronnenkant, K., Williams, J., Leuski, A., Narayana, S., Piepol, D., Lane, C., Morie, J., Aggarwal, P., Liewer, M., Chiang, H., Gerten, J., Chu, S., & White, K. (2010). Ada and Grace: Toward realistic and engaging virtual museum guides. In Intelligent Virtual Agents (pp. 286–300). Berlin Heidelberg: Springer.
  52. Tickle-Degnen, L., & Rosenthal, R. (1990). The nature of rapport and its nonverbal correlates. Psychological Inquiry, 1(4), 285–293.
  53. Timms, M. (this issue).
  54. Urdan, T. C., & Maehr, M. L. (1995). Beyond a two-goal theory of motivation and achievement: A case for social goals. Review of Educational Research, 65(3), 213–243.
  55. VanLear, C. A., Koerner, A., & Allen, D. M. (2006). Relationship typologies. In The Cambridge Handbook of Personal Relationships (pp. 91–110).
  56. VanLehn, K. (2006). The behavior of tutoring systems. International Journal of Artificial Intelligence in Education, 16(3), 227–265.
  57. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
  58. Walker, E., Girotto, V., Zhang, C., Fernandez, A., Chen, G., & Hsieh, G. (2014a). Understanding peer help in an online learning community. Presented at the CHI 2014 Workshop on Learning Innovations at Scale, Toronto, Canada.
  59. Walker, E., Rummel, N., & Koedinger, K. R. (2014b). Adaptive intelligent support to improve peer tutoring in algebra. International Journal of Artificial Intelligence in Education, 24(1), 33–61.
  60. Walters, M. L., Dautenhahn, K., Woods, S. N., Koay, K. L., Te Boekhorst, R., & Lee, D. (2006). Exploratory studies on social spaces between humans and a mechanical-looking robot. Connection Science, 18(4), 429–439.
  61. Wang, N., Johnson, W. L., Mayer, R. E., Rizzo, P., Shaw, E., & Collins, H. (2008). The politeness effect: Pedagogical agents and learning outcomes. International Journal of Human-Computer Studies, 66(2), 98–112.

Copyright information

© International Artificial Intelligence in Education Society 2016

Authors and Affiliations

  1. School of Computing, Informatics, and Decision Systems Engineering, Arizona State University, Tempe, USA
  2. Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, USA
