Self-regulated learning: importance of goal setting
During online learning activities, students are often given more autonomy over their learning process. The “when” and “where” of working on online content are often left to students themselves to decide, which significantly increases the level of student self-regulation (Hartley & Bendixen, 2001). According to Zimmerman (2013), self-regulation theory comprises three phases (forethought, performance, self-reflection), each representing a process that occurs before, during, or after the learning effort. The forethought phase refers to the stage where students prepare for learning. Goal setting, a key element in this phase, refers to students deciding on the intended outcomes of a learning effort (Schwarzenberg & Navon, 2020). The performance phase refers to the use of strategies such as self-monitoring to keep track of one’s performance during learning. The self-reflection phase refers to processes that occur after learning, where students reflect on their learning experience and outcomes. These self-reflections can influence subsequent forethought processes (Zimmerman, 2013).
Self-regulated students are more likely to perform better in their learning (Lachmann & Kiefel, 2012). However, many online students lack self-regulation skills and are not ready for autonomous learning (Wong et al., 2021). This can cause students to feel disengaged from online activities. Students who cannot self-regulate their own learning tend to be unsuccessful in online learning (Yilmaz & Keser, 2017).
Recent investigations of self-regulation strategies in online learning settings stressed the key role of goal-setting (Pedrotti & Nistor, 2019). As Latham and Locke (1991) argued in a seminal paper, “goal setting facilitates self-regulation in that the goal defines for the person what constitutes an acceptable level of performance” (p. 234). A self-regulated learner will first set specific learning goals at the beginning of their learning experience. In the goal setting process, learners will define the outcomes for their learning and propose how they plan to reach their goals (Nussbaumer et al., 2011). Actions that fall short of a described goal level will result in a negative performance evaluation while actions that attain or exceed the desired goal will lead to a positive performance evaluation (Latham & Locke, 1991).
In this study we focused on the goal-setting process since clear goals are essential for student self-regulation and successful completion of courses (Handoko et al., 2019). Many students with lower self-regulation skills tend to have ambitious and unrealistic goals, which may leave them disappointed after learning (Shih et al., 2010). To help students set goals, previous studies have employed a variety of means, such as prompts embedded in lecture videos (Moos & Bonde, 2016; van Alten et al., 2020; Wong et al., 2021). Students were required to think about the prompts (e.g., What are your goals when learning from this homework video?) and answer the question in order to continue the video (e.g., van Alten et al., 2020).
Although the use of prompts displayed as questions can help students formulate their learning goals, some of these questions can be quite vague in nature. For example, questions such as “what are your goals when learning from this video” may not help students set effective goals that are measurable and achievable. In this study, we chose the SMART goal framework (Doran, 1981), which contains five key elements of writing goals: specific, measurable, achievable (or assignable), realistic, and time-related. The SMART method is commonly considered the standard for developing effective, measurable goals and is widely used for developing program goals and objectives (Bjerke & Renger, 2017). In this study, the goal setting chatbot would engage participants at the start of the course with five questions developed based on the SMART method (see the Study 1 section for more detail).
Sense of isolation: facilitating social presence
Besides the lack of self-regulation skills, online students also feel a sense of isolation (Bączek et al., 2021; Chametzky, 2021) due to the lack of interactions among the participants in the online space (Wut & Xu, 2021). For example, a student may pose a question online, or ask for help regarding a lesson activity but fail to get a response from other people. Even though recent video-conferencing technologies such as Zoom can enable online conversation that mimics real-time face-to-face conversation, 33% of 400 respondents stated that they were less willing to respond to questions during Zoom sessions compared with traditional in-person lessons (Cavinato et al., 2021).
One way to alleviate isolation during online activities is to facilitate a sense of social presence. A heightened level of social presence can increase the frequency of online interaction (Tu & McIsaac, 2002). Social presence can be defined as “the ability of participants to project themselves socially and emotionally, as real people” (Garrison et al., 1999, p. 94). Previous studies have suggested that social presence can reduce stress and the sense of loneliness (Whiteside et al., 2014), and enhance student satisfaction with the course (Richardson et al., 2017). It is important to note that social presence does not mean supporting engagement simply for social purposes (Garrison, 2011). Instead, social presence in an educational setting means creating a climate that supports open and cohesive communication, so that participants feel comfortable asking questions without the fear of hurting somebody’s feelings or damaging a relationship (Garrison, 2011).
In the present study, we explored the use of a chatbot to facilitate a sense of social presence. According to social response theory (Nass & Moon, 2000), human–computer interactions are fundamentally social: humans have an innate tendency to perceive computers as social beings, even when they know that machines have no feelings or intentions. By interacting with an anthropomorphized machine, a user may perceive a sense of social presence (Adam et al., 2021).
Unlike a human being, a chatbot is available online 24/7 for participants who wish to converse with it. To guide the chatbot design, we referred to the specific categories of social presence described by Garrison (2011): interpersonal communication, open communication, and cohesive communication. Interpersonal communication refers to messages that create a sense of belonging with other people, such as affective expressions (e.g., emoticons, emojis). Open communication refers to messages that explicitly recognize other people, such as quoting other participants’ messages, expressing agreement, and expressing appreciation. Cohesive communication aims to build a sense of community through phatic messages (commonly known as small talk, e.g., “nice morning today!”), vocatives (e.g., addressing participants by name), salutations, and inclusive pronouns such as “we”. In this way, this study examined and tested the usefulness of the interpersonal, open, and cohesive strategies in the context of chatbot communication to induce social presence.
Overview of chatbots
A chatbot is a software tool that can interact with users by means of text or voice using natural language (Smutny & Schreiberova, 2020). The first chatbot, ELIZA, developed in 1966, simulated a psychotherapist’s conversation with humans, where users could communicate with ELIZA by entering text inputs, and the chatbot would respond in kind (Weizenbaum, 1966). Since then, voice inputs and responses have become possible modes of user-chatbot interaction (e.g., Apple Siri), although the majority of chatbots today still utilize text-based communication without physical or dynamic representations (Adam et al., 2021).
In recent years, the use of chatbots has become widespread in many sectors such as retail customer service and internet banking (Følstad & Brandtzæg, 2017). Many organizations, for example, have employed chatbots on social media platforms such as Facebook Messenger, WhatsApp, and WeChat to answer customers’ questions at any time of the day (Insider Intelligence, 2021). Insider Intelligence forecasts that consumer retail spending via chatbots will reach $142 billion by 2024, compared to just $2.8 billion in 2019. Chatbots have also made inroads into the educational field, where they have been used for student skill improvement, particularly in language learning, such as vocabulary and grammar development (Wollny et al., 2021). For example, the Wordsworth chatbot provides users with quiz questions to test their vocabulary skills (Smutny & Schreiberova, 2020).
Figure 1 shows an instructional design framework of chatbot activity. A chatbot typically runs four working procedures during a human–chatbot conversation: (a) Question Analysis, where the chatbot uses natural language processing techniques to classify a user’s input by breaking a sentence into several parts, labelling each part, and then interpreting the sentence; (b) Hypothesis Generation, where the chatbot searches for possible response content in knowledge bases that can be designed in advance for different topics; (c) Hypothesis Scoring, where the chatbot, using ranking algorithms and reasoning capabilities, scores the consistency between each hypothesis and the input; and (d) Ranking and Confidence Estimation, in which the chatbot ranks the best-matched hypotheses as candidate answers and applies machine learning to train classifiers with known accurate responses, so that the chatbot can process similar inputs from more users in the future.
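The four procedures above can be illustrated with a minimal, self-contained sketch. This is purely illustrative: production chatbots use trained NLP models and ranking algorithms, whereas here question analysis is simple tokenization, hypothesis scoring is word overlap (Jaccard similarity), and the knowledge base, threshold, and fallback message are all hypothetical.

```python
# Minimal sketch of the four chatbot working procedures (illustrative only;
# real systems use trained NLP models rather than word overlap).

def analyze(question):
    # (a) Question Analysis: break the sentence into parts (here, lowercase tokens)
    return set(question.lower().rstrip("?!.").split())

def generate_hypotheses(knowledge_base):
    # (b) Hypothesis Generation: retrieve candidate answers from the knowledge base
    return list(knowledge_base)

def score(tokens, hypothesis_tokens):
    # (c) Hypothesis Scoring: consistency between hypothesis and input (Jaccard overlap)
    return len(tokens & hypothesis_tokens) / max(len(tokens | hypothesis_tokens), 1)

def respond(question, knowledge_base, threshold=0.2):
    # (d) Ranking and Confidence Estimation: rank hypotheses, answer only if confident
    tokens = analyze(question)
    ranked = sorted(
        ((score(tokens, h), a) for h, a in generate_hypotheses(knowledge_base)),
        reverse=True,
    )
    best_score, best_answer = ranked[0]
    return best_answer if best_score >= threshold else "Sorry, I did not understand."

kb = [({"experiential", "learning"},
       "Experiential learning is learning through reflection on doing.")]
print(respond("What is experiential learning?", kb))
```

In a real system, step (d) would also feed corrected classifications back into training, which is the manual retraining process described later for Dialogflow.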
In general, chatbots can be developed either using programming languages such as Python or a chatbot development platform (Nayyar, 2019). In the case of the latter, a developer can use a chatbot visual flow builder to develop a chatbot without any coding. Examples of common chatbot visual flow builders include the IBM Watson Assistant system and Google Dialogflow. Both IBM Watson and Dialogflow are AI-powered systems that use machine learning and natural language processing to facilitate the development of chatbots (Zuckerman, 2020).
In this study, we used the Google Dialogflow platform to build our chatbot activities. A typical chatbot dialog in Dialogflow consists of three components: intents, entities, and responses. Intents refer to users’ possible questions or responses. Entities are keywords or synonyms that help the chatbot recognize variations in a user’s wording. For example, when a user asks “What is experiential learning?” or “Can you explain the definition of experiential learning?”, both sentences are recognized as the same question asking for the definition of experiential learning, by matching the intent definition and the entity experiential_learning. A response is the answer provided by the chatbot.
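The intent–entity–response structure can be sketched as follows. This is not the actual Dialogflow API (which is configured through its console or client libraries); the dictionaries, phrase lists, and the `match` function are hypothetical stand-ins showing how two differently worded questions resolve to the same intent/entity pair.

```python
# Illustrative sketch of Dialogflow-style intent/entity matching
# (hypothetical structures, not the real Dialogflow API).
ENTITIES = {
    "experiential_learning": ["experiential learning"],  # synonyms would be added here
}
INTENTS = {
    "definition": ["what is", "can you explain the definition of"],
}
RESPONSES = {
    ("definition", "experiential_learning"):
        "Experiential learning is the process of learning by doing.",
}

def match(user_input):
    text = user_input.lower()
    # Find the first intent whose training phrase appears in the input
    intent = next((name for name, phrases in INTENTS.items()
                   if any(p in text for p in phrases)), None)
    # Find the first entity whose synonym appears in the input
    entity = next((name for name, synonyms in ENTITIES.items()
                   if any(s in text for s in synonyms)), None)
    return RESPONSES.get((intent, entity), "Sorry, could you rephrase that?")

# Both phrasings resolve to the same (intent, entity) pair and the same response:
print(match("What is experiential learning?"))
print(match("Can you explain the definition of experiential learning?"))
```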
A developer can test the accuracy of a chatbot by inputting as many questions as possible, and the chatbot can be further trained whenever its output is wrong. Developers can manually reclassify the misrecognized inputs into the correct intents, and the chatbot knowledge base is updated and enriched accordingly. As more and more user data are stored in the knowledge base, the chatbot becomes increasingly accurate.
The design and development of two chatbots
Study 1: the goal-setting chatbot
To recall, we focused on the goal-setting process in this study because clear goals are essential for student self-regulation and successful completion of courses (Handoko et al., 2019). The goal-setting chatbot helped students set personal learning goals for the course they attended. Students were required to interact with the goal-setting chatbot on the Moodle webpage before coming to the first class. The goal-setting chatbot engaged students with five goal-setting questions based on the SMART framework (Table 1). We used the Google Dialogflow platform to build the chatbot. According to Azran (2019), Dialogflow provides the easiest and quickest way to create a custom conversational bot.
For each question, potential options were offered to inspire the directions of students’ self-regulated learning goals and expectations of the course. Options can boost a chatbot’s engagement with a user because they reduce the potential for misunderstandings between the chatbot and the user (LiveChat, 2022). Additionally, providing students with response options generally yields a higher response rate than open-ended questions (Reja et al., 2003). The options listed by the chatbot were developed in consultation with the course teacher, who had in-depth knowledge of the general expectations and concerns of students from his previous experience teaching the course over many years.
For example, before students attended the first lesson, the chatbot asked, “Could you tell me what you want to gain most from this course?”, followed by three options (Fig. 2). The three options represented three possible learning goals, and all students were given the opportunity to express their expectations by choosing one of the options as their learning goal. The student in Fig. 2 replied to the chatbot with “I may choose 2nd one”, meaning this student chose option B. Based on the student’s answer to each question, the goal-setting chatbot would provide relevant recommendations suited to each student’s preference. Hence, in Fig. 2, the goal-setting chatbot responded to the student with “I see … I believe you can do better than this!” According to students’ responses in the interviews and open-ended surveys, none of the students mentioned feeling lost or overwhelmed. Instead, many students indicated that this activity helped them clarify their learning goals, and they hoped to get more recommendations during the process. With the recommendations given after each choice, students could form a clearer idea of how to achieve the particular goals they set for the course.
The instructor of this course participated in the goal-setting chatbot design and development. Aligned with the course learning outcomes, the instructor worked out the goal-setting questions and relevant self-regulated learning recommendations based on the SMART framework. For example, in terms of “Measurable”, students were expected to set learning goals that could be measured by evidence. The instructor drafted the question “May I know what grade level you want to get in this course?” and possible answers, such as “Grade A”, “Grade B”, or “Grade C”. The chatbot developer then categorized the data into intents, entities, and responses according to the Dialogflow system. The intents were pre-defined categories of students’ possible inputs. For example, an intent named “Grade A” was created; the data in this intent comprised all possible answers related to the keyword “Grade A”, such as “I would like to gain Grade A” and “I want to have an A”. The entities were the synonyms and misspellings of keywords for different intents. For example, we added synonyms for the intent “Grade A” as entities, including “level A”, “A−”, and “A+”. Recommendations for self-regulated learning strategies were fed into the responses database.
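The “Grade A” example above might be organized along the following lines. The data structure and the recommendation text are hypothetical illustrations of how training phrases, entity synonyms, and a self-regulated learning recommendation could hang together (in practice, Dialogflow stores these in its own console rather than in code):

```python
# Hedged sketch of the "Grade A" intent data described above
# (hypothetical structure and recommendation text).
grade_a_intent = {
    "intent": "Grade A",
    "training_phrases": ["I would like to gain Grade A", "I want to have an A"],
    "entities": {"grade_a": ["Grade A", "level A", "A-", "A+"]},
    "response": ("Aiming for an A is a measurable goal. Try planning weekly "
                 "study sessions and checking each assignment rubric."),
}

def classify(user_input, intents):
    # Match the input against each intent's entity synonyms to pick a response
    text = user_input.lower()
    for intent in intents:
        synonyms = [s.lower() for group in intent["entities"].values() for s in group]
        if any(s in text for s in synonyms):
            return intent["response"]
    return "Could you tell me which grade you are aiming for?"

print(classify("I want to have an A+", [grade_a_intent]))
```

Misspellings would be handled the same way, by adding them to the synonym list of the relevant entity.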
To minimize students’ off-topic replies, we labelled three potential options (A, B, or C) for each question posed. Figure 3 shows the flowchart for designing the goal-setting chatbot. The goal-setting chatbot was tested by the instructor and the developer before it was launched on Moodle (Fig. 4).
Study 2: the learning buddy chatbot
Out-of-class practice is essential to students’ improvement in foreign language learning because classroom time is insufficient (García Botero et al., 2019; Kennedy & Levy, 2009). However, students’ engagement in out-of-class practice is relatively low; they need more opportunities to interact with the instructor and peers to alleviate the sense of isolation while learning (Jia & Hew, 2022). Social presence can be a solution, because it has been shown to reduce stress and the sense of loneliness (Whiteside et al., 2014).
In out-of-class online learning, students usually complete online learning activities (e.g., watching videos and reading materials) by themselves before having a synchronous online meeting with teachers and classmates (Jia et al., 2022). To facilitate a sense of social presence in out-of-class online learning, a social presence chatbot was created to act as a learning buddy that guided students through daily EFL listening practice. Students were provided with daily listening practices in the form of dictation exercises. A total of 10 sessions were administered, and each daily task was estimated to take 10 min. While students interacted with the chatbot on the Moodle course webpage, the chatbot guided them to complete the listening tasks, provided immediate feedback, and employed specific communication strategies to project social presence into its interaction with participants. We used Google Dialogflow to build the chatbot conversation. The chatbot was integrated into Moodle via embedded code and presented as an activity page. Once students clicked the activity page, the learning buddy chatbot would pop up and start a conversation with them (Fig. 5).
The design of the learning buddy chatbot was underpinned by the specific communication strategies of social presence described by Garrison (2011) (Table 2). Interpersonal communication was presented with the help of emoticons, which can express respect and welcome to participants when physical and vocal cues are absent (Garrison, 2011). In terms of open communication, the learning buddy chatbot was able to reply to students’ inputs and guide students step by step through the tasks (continuing a thread).
Common mistakes reported by previous students were listed after students finished the tasks (quoting from others’ messages). Listening has long been recognized as a challenging skill for EFL learners to acquire (Nushi & Orouji, 2020). Worse still, learners tend to blame themselves for their lack of listening ability (Cauldwell, 2018). Cauldwell (2018) pointed out that it is horrible for students to feel that “only me” has this problem while everyone else is doing well. Therefore, to reduce students’ self-blame and feelings of inadequacy as listeners, it is important to share with them the mistakes and difficulties encountered by other students when practicing listening. The learning buddy chatbot also expressed appreciation throughout the learning process. For example, when students tried several times but still could not complete the task, the chatbot acknowledged their hard work to protect their self-esteem. Once students got correct answers, the chatbot promptly expressed agreement. To project cohesive communication, the learning buddy chatbot greeted students every day when they entered the system, addressed students by name during the conversation, and used inclusive pronouns (i.e., we, us, and our) to create a collaborative learning environment (Fig. 5).
Participants and context
Study 1 involved 29 postgraduate students (26 females, 3 males) enrolled in a fully online course at a large public university in Asia. Study 2 involved 38 second-year undergraduate students (29 female, 9 male) of a fully online EFL listening course, also at a public university in Asia. Ethical approval to conduct the two studies was granted by the university’s Institutional Review Board. We used purposeful sampling to select the courses in Study 1 and Study 2. Purposeful sampling is widely used in qualitative research to identify information-rich cases related to the phenomenon of interest (Palinkas et al., 2015); here, the “phenomenon of interest” is the use of chatbots in fully online classes. We chose these fully online classes because the instructors of the two courses were interested in using chatbots in their lessons. The researchers worked closely with the instructors of both courses to develop the chatbots.
Data collection and analysis
To address the first research question, “what are the effects of the goal-setting chatbot and the learning buddy chatbot on students’ behavioral engagement in online learning”, we evaluated students’ behavioral engagement by analyzing their conversation records with the two chatbots (utterance turns, session length, goal completion rate). Utterance turns refer to the number of back-and-forth exchanges between a chatbot and a user (Yao, 2016). Session length is the amount of time that elapses between the moment a user starts to converse with a chatbot and the moment they end the conversation (Mead, 2019). The goal completion rate in this study was the number of times the chatbot successfully helped students complete the learning tasks.
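The three engagement measures can be computed directly from session-level conversation logs. The sketch below assumes a hypothetical log format (the field names and timestamps are illustrative, not the study’s actual data):

```python
# Illustrative computation of the three behavioral engagement measures
# from a hypothetical conversation log (field names are assumptions).
from datetime import datetime

log = [
    {"user": "s1", "start": "2022-09-01 10:00", "end": "2022-09-01 10:08",
     "turns": 12, "goal_completed": True},
    {"user": "s2", "start": "2022-09-01 21:30", "end": "2022-09-01 21:35",
     "turns": 7, "goal_completed": False},
]

fmt = "%Y-%m-%d %H:%M"
utterance_turns = [s["turns"] for s in log]
session_lengths = [  # minutes from first to last message of each session
    (datetime.strptime(s["end"], fmt) - datetime.strptime(s["start"], fmt)).seconds / 60
    for s in log
]
completion_rate = sum(s["goal_completed"] for s in log) / len(log)

print(sum(utterance_turns) / len(log))  # mean utterance turns per session: 9.5
print(session_lengths)                  # session lengths in minutes: [8.0, 5.0]
print(completion_rate)                  # proportion of completed sessions: 0.5
```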
To evaluate students’ perceived usefulness and ease of use of the chatbots, a five-point scale questionnaire, ranging from 1 (strongly disagree) to 5 (strongly agree), was used (adapted from Davis, 1989, p. 340). Perceived usefulness is “the degree to which a person believes that using a particular system would enhance his or her job performance” (Davis, 1989, p. 320). There were four items for perceived usefulness; sample items were “Using the chatbot made it easier to complete my goal setting process” (Study 1) and “Using the chatbot made it easier to complete my daily listening practices” (Study 2). Perceived ease of use can be defined as “the degree to which a person believes that using a particular system would be free of effort” (Davis, 1989, p. 320). The perceived ease of use scale also included four items; a sample item was “I found it easy to use the chatbot to communicate.”
To answer the third and fourth research questions regarding students’ perceptions and suggestions for improving the chatbots, individual interviews and open-ended surveys were used, respectively. We conducted individual interviews to examine the role students perceived the chatbot to play in helping them set learning goals (Study 1) and complete the EFL listening practices (Study 2). Examples of the interview questions were “How did the chatbot help you to set your learning goals for this course?” (Study 1) and “In what aspects did the chatbot help to conduct social interaction with you?” (Study 2). Open-ended surveys were conducted to obtain students’ suggestions for improving the chatbots. A sample question was “Do you have any suggestions for the future design of the chatbot activity?”.