In parallel with the advances that have been made in the last 25 years in AIED, the field of robotics has also made considerable advances. Researchers in robotics are currently exploring how socially assistive robots can help with everyday tasks such as housework and guiding shoppers at the mall or train station, as well as more specialised tasks like rehabilitation after injury. There has also been work on robots in the classroom. Over the next 25 years robots will appear in many aspects of our lives, and it seems inevitable that education is a sector where they could be very helpful.
I believe that over the next quarter century we will develop what I would like to call Educational Cobots. A cobot is a robot co-worker that works alongside humans to help them perform their work, so an educational cobot is a robot designed to support human teachers. This idea of humans and robots working collaboratively is an area of interest in robotics (for example, Green et al., 2008). As discussed later in this paper, robots have the potential to capture and retain the attention of learners in the classroom, and this, on its own, would be useful to a teacher, as attention is a precursor to learning. Teachers typically have class sizes of 25 students or more, and even the most experienced teacher finds it challenging to differentiate their instruction and have students working on different tasks at different paces. Having educational cobots in the classroom would aid the teacher in differentiating instruction so that learners receive more tailored teaching. Achieving this vision is where the work already done in AIED will become essential. If educational cobots had AIED capabilities they could do things like monitor learners as they engage with Intelligent Learning Environments (the ILEs could liaise directly with the cobot); flag and attend to learners who need extra help that the ILE cannot provide; keep learners engaged and interested; and answer questions that a learner might have.
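To make the ILE-to-cobot liaison concrete, the sketch below shows one way an ILE might push "needs help" alerts to a cobot, which then attends to the most urgent flagged learner first. Everything here is an illustrative assumption, not an existing API: the class names, the priority scheme, and the alert fields are all hypothetical, since real ILEs and robots would expose their own interfaces.

```python
# Hypothetical sketch only: names and interfaces are assumptions,
# not a real ILE or robot API.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Alert:
    priority: int                          # lower number = more urgent
    learner_id: str = field(compare=False)
    reason: str = field(compare=False)

class EducationalCobot:
    """Collects alerts pushed by Intelligent Learning Environments and
    decides which flagged learner the robot should attend to next."""
    def __init__(self):
        self._queue = []  # min-heap ordered by Alert.priority

    def receive_alert(self, alert: Alert) -> None:
        # An ILE calls this when a learner needs help it cannot provide.
        heapq.heappush(self._queue, alert)

    def next_learner(self):
        # Attend to the most urgent flagged learner first.
        return heapq.heappop(self._queue) if self._queue else None

cobot = EducationalCobot()
cobot.receive_alert(Alert(2, "learner-07", "disengaged for 5 minutes"))
cobot.receive_alert(Alert(1, "learner-03", "stuck on fractions task"))
first = cobot.next_learner()
print(first.learner_id)  # → learner-03 (the more urgent alert)
```

The point of the sketch is the division of labour: the ILE does the fine-grained monitoring, while the cobot's role is to triage and physically attend to learners the ILE cannot help.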
We already know a considerable amount about how to structure learning tasks and assist students through them in ways that produce lasting learning. We also have begun to work on detecting the affective state of learners to better understand their progress and to intervene when needed. At the same time, researchers in robotics are investigating the best ways for robots to be designed in terms of appearance and actions to interact effectively with people and there is research on how children interact with robots. Combining the fields of robotics and AIED has the potential to create true educational technologies that are designed for the specific purpose of assisting in the teaching and learning enterprise.
Why Physical Embodiment is Important
There are good reasons to port our AIED systems into robotic bodies. Some cognitive researchers argue that the computational components of intelligence, such as those on which AIED has mainly focused to date, are insufficient on their own to be regarded as the whole of human intelligence. They argue that human intelligence sits within a physical body, whose sensorimotor systems play a strong part in how we make sense of the world around us and act in it. This embodied cognition approach proposes that our physical bodies influence our brains in the same way as the mind influences bodily actions. Human beings interact with one another through our visual, auditory and other sensory systems, using skills that we have evolved over millions of years, such as recognizing objects, moving around physically, judging people’s motivations, recognizing a voice, setting appropriate goals, and paying attention to things that are interesting. These are also the kinds of skills that teachers use.
The Australian roboticist Rodney Allen Brooks (2002) argues that if we want robots to perform everyday tasks alongside humans, then their higher cognitive abilities should be based on sensorimotor actions in their surrounding environment and on their proprioceptive sense (awareness of their positioning and movement in their surroundings), supported by coordination of their visual, auditory and touch sensors. If a robot is to navigate the dynamic space of a modern classroom, in which a variety of furniture moves position and many human bodies race around, then it will have to have the kind of intelligence that enables this; but that intelligence will have to work in conjunction with its AIED intelligence too, so that it can use standard classroom equipment like whiteboards (or smartboards), paper (yes, that will probably still be around in 25 years) or specialist equipment like chemistry glassware.
Embedding AIED into robots would bring advantages over being constrained to an immobile box. Being able to move around the classroom as students are working on projects, recognizing the faces and voices of individual students, being able to point or gesture, and being capable of facial expressions would all enhance the interaction with students. Also, robots that could use equipment in particular subject areas such as physics or chemistry would be able to demonstrate phenomena in a physical way and not just via video or simulations.
Would robots need to be humanoid in appearance to be useful in the classroom? At present, robots in use in our society are not humanoid in appearance. Robots on production lines look like machines, and the robot vacuum cleaners in our homes do not even look like conventional vacuum cleaners, let alone like human cleaners. For the most part, robots do not need to be humanoid, but would this be true of the classroom? There is evidence from neuroscience that our acceptance of intelligent technology is influenced by how human-like it appears. Krach et al. (2008) used functional magnetic resonance imaging (fMRI) to study the brain activity of study participants who played a game against four different kinds of opponent: a computer, a functional robot, a human-like robot and a human. Unbeknown to the participants, all their opponents were just playing randomly. However, participants showed activation in the areas of the brain associated with ‘theory of mind’ (i.e., attribution of human intention) in an order of increasing human-like features (computer < functional robot < human-like robot < human). This suggests that embedding AIED in human-like robots could enhance their acceptance by learners.
The strongest general argument for humanoid robots is that the designed world is made for humans. For example, cars have steering wheels that a human can grip and foot pedals that we can press. Doors are wide enough for humans to get through and can be opened by using our hands on the doorknobs. If a robot is made in a human-like form and size, with our limbs and basic physical capabilities, then it can navigate the existing designed world as we do, without the need to re-engineer everything from doorknobs to car controls. Of course, in the next 25 years cars will be driving themselves anyway!
However, some of the things that humans find simple, like recognizing objects in the surroundings, picking them up and manipulating them, are very challenging for robots to do. For example, robotics researchers at UC Berkeley have designed a robot that can fold laundered clothes, which is a major achievement because, although robots have long been able to interact with rigid objects, handling soft ‘deformable’ objects like a towel is very difficult (Prigg, 2015). However, the PR2 robot used cost US$280,000, and it still cannot reach into the dryer to take the clothes out. But advances will continue to be made over the next 25 years, so at some point we can expect robots that can navigate the layout and furniture of the classroom of the future and use its teaching and learning equipment, whatever that may be. We can also expect that, as more robots appear in our designed world, designers of objects might start to embed devices in those objects that would help robots identify and interact with them.
Another benefit of making a humanoid robot is that it is easier for humans to interact with, which would be important in the classroom from the point of view of both the teacher and the students. We humans need to learn how to operate non-humanoid robots like our robotic floor cleaners, and the interfaces are not always easy for us to come to grips with. As more and more robots appear in our lives over the coming quarter century, if there is no standardization of interfaces we will be driven crazy having to learn new ways of interacting with our car, home helper, gardener, and who-knows-what-else robots. The obvious way to standardize is to make them humanoid. We know how to interact with humans because we have done it all our lives; we are hard-wired to do so and practiced at doing so through the filters of our socio-cultural settings. If the robots in our lives operated within this humanistic socio-cultural setting by speaking and understanding our language, recognizing our gestures, and sensing our emotions, then they would be easier for us to interact with. This could be very important for learning in the classroom. If a robot is going to support a teacher, then she should be able to say to her cobot teacher aide, “I have noticed that the group of students in the corner is having trouble setting up the ramp experiment from the force and motion module. I am busy helping this group; can you go and make sure they set it up correctly and that they stay on task?” An AIED system embodied in such a robot would be able to move to the group, check the experimental setup and then engage in dialogic tutoring to assist learning. We already know how to do the tutoring part, but it would become even more powerful if it could be deployed in flexible ways.
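The classroom scenario above implies a simple pipeline: interpret the teacher's spoken request, move to the group, check the setup, then hand over to dialogic tutoring. The sketch below reduces that pipeline to stubs; every function and string here is a hypothetical placeholder (a real system would need speech recognition, language understanding, navigation and a tutoring engine behind each step), shown only to make the flow of control explicit.

```python
# Illustrative stubs only: none of these functions correspond to a
# real robot or AIED API; they just make the pipeline explicit.
def parse_request(utterance: str) -> dict:
    # Stand-in for speech recognition + language understanding:
    # extract which group and which task the teacher named.
    task = "ramp experiment" if "ramp experiment" in utterance else "unknown"
    return {"group": "corner group", "task": task}

def plan_assistance(request: dict) -> list:
    # The steps the embodied AIED system would carry out, in order.
    return [
        f"navigate to {request['group']}",
        f"check setup of {request['task']}",
        "begin dialogic tutoring",
    ]

steps = plan_assistance(parse_request(
    "Can you make sure the corner group sets up the ramp experiment correctly?"))
for step in steps:
    print(step)
```

The design point is that the final step, dialogic tutoring, is the part AIED already knows how to do; embodiment adds the navigation and physical inspection steps in front of it.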
The learners would also likely respond more naturally to the robot teacher aide. In a study conducted at the Temporal Dynamics of Learning Center (TDLC), one of the Science of Learning Centers funded by the US National Science Foundation, researchers found that young children engaged with a robot called Rubi initially for about 20 minutes before their attention went elsewhere. However, when the robot was programmed to react to their gaze and look at them, the attention span skyrocketed. We are biologically wired by evolution to respond to facial cues in interpersonal interactions. Rubi does not look much like a human, but researchers in Japan have been experimenting with much more lifelike robots. In a Tokyo elementary school, a talking humanoid robot named Saya, originally developed as a receptionist robot in 2004 by Professor Hiroshi Kobayashi of the Tokyo University of Science, has been used in the classroom in a limited way. Saya can speak multiple languages, and an array of motors in her head stretches the soft synthetic skin so that she can display emotions ranging from happiness and surprise to sadness and anger. But Saya could do little more than call out student names and give basic classroom-control commands like saying, “Be quiet.” The robot did prove popular with the students and provoked emotions among them, including one student who cried when told off by Saya. Imagine, though, if Saya had pedagogical skills facilitated by AIED research. Kiesler et al. (2008) investigated how humanoid a robot would need to be for effective use in socially assistive situations like the classroom and found that people preferred interacting with a robot over a comparable on-screen agent. They attributed stronger, more positive personality traits to the robot and felt it was more lifelike.
An interesting finding that could be relevant to the classroom was that participants were more inhibited with the robot than with the agent and showed less socially undesirable behaviour, as has been found in other research. Perhaps students will behave better with an Educational Cobot than with an on-screen agent, and this is one thing that could be investigated. As the field of robotics is, like AIED, relatively new, we humans have not had a lot of experience in interacting with robots. Some of the robots used in South Korean classrooms have been little more than a TV screen on wheels, with facial expressions handled simply as video, in contrast to the multiple motors of the Saya robot. Even with the clever synthetic appearance of Saya, it was obvious that she was a robot, and students went and poked the skin of her face, something they would never do with a real teacher. Once their curiosity was satisfied, though, and they knew what they were dealing with, they could interact with Saya.
Other researchers have investigated what roles a robot might play in the classroom. For example, Oshima et al. (2012) explored how Robovie-W, a communication robot, might support engagement as a partner in a cooperative learning group of pre-service teachers. They found that the style of student interactions with Robovie-W was similar to that with the human participants in the learning group and that adaptive learners interacted best with the robot. While these results are encouraging, the robot was not programmed with any intelligent learning software, simply turning its head towards a human who was speaking and reciting pre-scripted explanations of a text that was being discussed by the group. The intelligent interactions beyond that relied upon a human listening to student voices via a microphone and typing responses via a keyboard that were rendered as speech by the robot. So it was, in effect, a Wizard-of-Oz system for the most part. The next step could be to deploy full AIED systems via a robot. Shiomi et al. (2015) investigated whether the Robovie social robot might stimulate students’ curiosity about science if it were placed in a 4th- and 5th-grade elementary science classroom where students were free to interact with it during break time. As before, Robovie was used in a semi-autonomous mode where some interactions were automated but more complex ones were handled in a Wizard-of-Oz manner. In this study the robot was introduced as a new student in class, so it took on the role of a peer, rather than a teaching or teacher’s assistant role. Overall, the presence of the robot did not increase students’ science curiosity, although curiosity did increase among the students who asked Robovie questions.
Other off-the-shelf robots, like Nao from Aldebaran Robotics, have been used in classrooms and, although they have a basic human form (head, eyes, body, arms, hands, legs and feet), they are short and not human-sized. They are obviously robots, but the researcher Pierre Dillenbourg has programmed the Nao to help students learn to write letters of the alphabet (Loos, 2015). In that study the robot again took the role of a learning partner, and the students had to teach the robot to form letters correctly. Kennedy et al. (2015) also used the Nao robot, but in a tutoring role. Students participating in their study used a large touch screen to learn about prime numbers, and the robot was aware of the students’ moves via a feed from the touch screen, although it could not use the touch screen itself. The study investigated whether a more social robot would produce higher learning gains than one that was asocial. An unexpected finding was that the asocial robot produced higher learning gains, which is contrary to similar research. Clearly there is more research to be done. Both the Dillenbourg and Kennedy studies illustrate that AIED researchers could start with off-the-shelf robots and see what their intelligent software could do via the robot’s body before moving on to work with larger, perhaps purpose-built instructional robots that could assist teachers.