Introduction

Schools are places where much learning takes place in social and collaborative settings, and teachers play a large part in planning, shaping and steering the sequence of instruction. Many of the AIED systems developed to date are delivered via computers (with a few exceptions) and, to teachers, they are just a ‘black box’ in the sense that they leave teachers no real part to play in the use of the system. As a consequence, AIED currently forms little or no part of the activities in a typical classroom. This article takes a speculative look at how, over the next twenty-five years, AIED could break away from being delivered mainly through computers and pads so that it can engage with students in new ways and help teachers to teach more effectively.

One of the things that you do not hear questioned very much is why we are trying to use technologies like computers, designed primarily for office workers, as devices to promote learning in the classroom. Perhaps it is because we envision that we are preparing students for a modern workplace full of computers and tablets, so it makes sense for them to get used to the tools they will have to use in the future. More likely, no technology manufacturer sees education as its sole target market, because they can sell more “boxes” if those boxes are one-size-fits-all devices; they leave it to computer scientists to figure out how to make the technology perform a range of tasks. Of course, the flip side of the current situation is that schools and homes already have computers, tablets and smartphones, so designing AIED systems for these enables immediate reach. However, new generations of devices arrive quickly. The first iPhone appeared in 2007, followed by the first Android phone in 2008, and the iPad was launched in 2010. Smartphones and tablets are now commonplace technology, so it is conceivable that the technologies we have today will have been superseded by several generations of devices over the next 25 years.

We can see from Volume 1 of this special issue of IJAIED that the field of AIED has come a long way in 25 years and that advances in technology have really boosted what we can make a business machine do. This article examines two main areas of fast-moving research where AIED could be applied to support and enhance learning: the first is robotics and the second is the creation of ‘smart classrooms’ that make use of sensors and the Internet of Things (IoT). The application of robotics to education is proposed as a way to create systems that transcend the limitations of the traditional computer paradigm and provide more of the social interaction that fits our biological predispositions in learning. The application of AIED to the IoT is proposed as a way to provide new means of supporting learning and to leverage the big data from rich interactions with ‘smart’ technologies to produce value for the teacher and classroom. In addressing these potential futures, new areas of research in which AIED could be applied are discussed. These include (1) creating systems that provide social interaction (an extension of the Computer Supported Collaborative Learning that already exists), (2) exploring new modalities for interfaces between learners, teachers and supporting technologies and (3) investigating the application of the Internet of Things in education.

Some Assumptions

Let us imagine for a moment a future in which the technologies used in learning are designed for that purpose and not just adapted from business machines. What would a teacher want help with in the classroom? What would a student want from a learning system? And, how will AI be integrated in this future world of education?

Before we embark on this thought experiment, I should declare some of my assumptions about what will happen to education in the next 25 years. Firstly, it seems that the school as an institution is highly durable and has become an accepted part of society in developed and developing nations alike. I am using the term school loosely to mean a place where learners and teachers will gather, physically and virtually, to engage in teaching and learning activities. As agriculture became mechanised in the industrial revolution and the need for an educated workforce increased, schooling became increasingly mandatory, and over time the number of years that students spend at school has grown. If you were to remove schools from western society we would not be able to adapt easily, as parents would have to stay home to look after children even if those children could attend school virtually; and these days many families have two working parents who could not afford to halve their incomes. In short, unless there are some huge societal shifts, I think that schools in some form or other will be with us for a while.

My second assumption flows from the first. If “schools” continue to exist then so will “teachers”. By “teachers”, I mean adults who oversee and promote learning among the students. Ideally they will hold content and pedagogical qualifications, but even in current schools there are skills shortages and teachers teaching outside their subject expertise.

AIED in Educational Cobots

In parallel to the advances that have been made in the last 25 years in AIED, the field of robotics has also made considerable advances. Researchers in robotics are currently exploring how socially assistive robots can help with everyday tasks like housework, guiding shoppers at the mall or train station, as well as more specialised tasks like rehabilitation after injury. There has also been work on robots in the classroom. Over the next 25 years robots will appear in many aspects of our lives and it seems inevitable that education is a sector where they could be very helpful.

I believe that over the next quarter century we will develop what I would like to call Educational Cobots. A cobot is a robot co-worker that works alongside humans to help them perform their work, so an educational cobot is a robot designed to support human teachers. This idea of humans and robots working collaboratively is an area of interest in robotics (for example, Green et al., 2008). As discussed later in this paper, robots have the potential to capture and retain the attention of learners in the classroom, and this on its own would be useful to a teacher, as attention is a precursor to learning. Teachers typically have class sizes of 25 students or more, and even the most experienced teacher finds it challenging to differentiate instruction and have students working on different tasks at different paces. Having educational cobots in the classroom would help the teacher to differentiate instruction so that learners receive more tailored teaching. The work already done in AIED will be essential to achieving this vision. If educational cobots had AIED capabilities they could, for example, monitor learners as they engage with Intelligent Learning Environments (the ILEs could liaise directly with the cobot); flag and attend to learners who need extra help that the ILE cannot provide; keep learners engaged and interested; and answer questions that a learner might have.
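To make that liaison concrete, here is a minimal sketch, in Python, of how an ILE might hand a struggling learner over to a cobot. Everything in it, from the class names to the message fields and the priority scheme, is a hypothetical illustration rather than an existing system or API.

```python
# Hypothetical sketch: an Intelligent Learning Environment (ILE)
# flags learners to an educational cobot, which triages the requests.
from dataclasses import dataclass, field
from queue import PriorityQueue

@dataclass(order=True)
class HelpRequest:
    priority: int                        # 1 = urgent (ILE cannot help further)
    student_id: str = field(compare=False)
    topic: str = field(compare=False)
    reason: str = field(compare=False)   # e.g. "repeated errors", "disengaged"

class EducationalCobot:
    def __init__(self):
        self.requests = PriorityQueue()  # lowest priority number served first

    def on_ile_flag(self, request: HelpRequest) -> None:
        """Called by an ILE when a learner needs help it cannot provide."""
        self.requests.put(request)

    def run(self) -> None:
        while not self.requests.empty():
            self.attend_to(self.requests.get())

    def attend_to(self, req: HelpRequest) -> None:
        # A real cobot would navigate to the learner and start a tutoring
        # dialogue; here we simply report the intended action.
        print(f"Moving to {req.student_id}: {req.reason} on '{req.topic}'")

# Example: two learners are flagged; the urgent request is served first.
cobot = EducationalCobot()
cobot.on_ile_flag(HelpRequest(2, "student_07", "fractions", "slowing pace"))
cobot.on_ile_flag(HelpRequest(1, "student_03", "fractions", "repeated errors"))
cobot.run()
```

The priority queue is there simply because a cobot serving a whole class must triage: requests the ILE cannot handle itself should pre-empt routine check-ins.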

We already know a considerable amount about how to structure learning tasks and assist students through them in ways that produce lasting learning. We also have begun to work on detecting the affective state of learners to better understand their progress and to intervene when needed. At the same time, researchers in robotics are investigating the best ways for robots to be designed in terms of appearance and actions to interact effectively with people and there is research on how children interact with robots. Combining the fields of robotics and AIED has the potential to create true educational technologies that are designed for the specific purpose of assisting in the teaching and learning enterprise.

Why Physical Embodiment is Important

There are good reasons to port our AIED systems into robotic bodies. Some cognitive researchers argue that the computational components of intelligence, such as those that AIED has mainly focused on to date, are insufficient on their own to be regarded as the whole of human intelligence. They argue that human intelligence sits within a physical body whose sensorimotor systems play a strong part in how we make sense of the world around us and act in it. This embodied cognition approach proposes that our physical bodies influence our brains in the same way as the mind influences bodily actions. Human beings interact with one another through our visual, auditory and sensory systems, using skills that we have evolved over millions of years, such as recognizing objects, moving around physically, judging people’s motivations, recognizing a voice, setting appropriate goals, and paying attention to things that are interesting. These are also the kinds of skills that teachers use.

The Australian roboticist Rodney Allen Brooks (2002) argues that if we want robots to perform everyday tasks alongside humans, then their higher cognitive abilities should be based on sensorimotor actions in the surrounding environment and on the proprioceptive sense (a robot’s awareness of its position and movement in its surroundings), supported by coordination of visual, auditory and touch sensors. If a robot is to navigate the dynamic space of a modern classroom, in which a variety of furniture moves position and many human bodies race around, then it will have to have the kind of intelligence that enables this; but that intelligence will have to work in conjunction with its AIED intelligence too, so that it can use standard classroom equipment like whiteboards (or smartboards), paper (yes, that will probably still be around in 25 years) or specialist equipment like chemistry glassware.

Embedding AIED into robots would bring advantages over being constrained to an immobile box. Being able to move around the classroom as students are working on projects, recognizing the faces and voices of individual students, being able to point or gesture, and being capable of facial expressions would all enhance the interaction with students. Also, robots that could use equipment in particular subject areas such as physics or chemistry would be able to demonstrate phenomena in a physical way and not just via video or simulations.

Would robots need to be humanoid in appearance to be useful in the classroom? At present, the robots in use in our society are not humanoid in appearance. Robots on production lines look like machines, and the robot vacuum cleaners in our homes do not even look like conventional vacuum cleaners, let alone like human cleaners. For the most part, robots do not need to be humanoid, but would this be true of the classroom? There is evidence from neuroscience that our acceptance of intelligent technology is influenced by how human-like it appears. Krach et al. (2008) used functional magnetic resonance imaging (fMRI) to study the brain activity of participants who played a game against four different kinds of opponent: a computer, a functional robot, a human-like robot and a human. Unbeknown to the participants, all their opponents were just playing randomly. However, participants showed activation in the areas of the brain associated with ‘theory of mind’ (i.e., attribution of human intention) in an order of increasing human-like features (computer < functional robot < human-like robot < human). This suggests that embedding AIED in human-like robots would enhance their acceptance by the learner.

The strongest general argument for humanoid robots is that the designed world is made for humans. For example, cars have steering wheels that a human can grip and foot pedals that we can press. Doors are wide enough for humans to get through and can be opened by using our hands on the doorknobs. If a robot is made in a human-like form and size, with our limbs and basic physical capabilities, then it can navigate the existing designed world as we do, without the need to re-engineer everything from doorknobs to car controls. Of course, in the next 25 years cars will be driving themselves anyway!

However, some of the things that humans find simple, like recognizing objects in the surroundings, picking them up and manipulating them, are very challenging for robots. For example, robotics researchers at UC Berkeley have designed a robot that can fold laundered clothes, which is a major achievement because although robots have been able to interact with rigid objects, handling soft ‘deformable’ objects like a towel is very difficult (Prigg, 2015). However, the robot cost US$280,000 and the PR2 robot still cannot reach into the dryer to take the clothes out. But advances will continue to be made over the next 25 years, so at some point we can expect robots that can navigate the layout and furniture of a future classroom and use the teaching and learning equipment, whatever that may be. You can also expect that, as more robots appear in our designed world, designers of objects might start to embed devices in those objects that help robots identify and interact with them.

Another benefit of making a humanoid robot is that it is easier for humans to interact with, which would be important in the classroom from the point of view of both the teacher and the students. We humans need to learn how to operate non-humanoid robots like our robotic floor cleaners, and the interfaces are not always easy to come to grips with. As more and more robots appear in our lives over the coming quarter century, if there is no standardization of interfaces we will be driven crazy having to learn new ways of interacting with our car, home helper, gardener, and who-knows-what-else robots. The obvious way to standardize is to make them humanoid. We know how to interact with humans because we have done it all our lives; we are hard-wired to do so and practiced at doing so through the filters of our socio-cultural settings. If the robots in our lives operated within this humanistic socio-cultural setting, speaking and understanding our language, recognizing our gestures, and sensing our emotions, then they would be easier for us to work with. This could be very important for learning in the classroom. If a robot is going to support a teacher, then she should be able to say to her cobot teacher aide, “I have noticed that the group of students in the corner is having trouble setting up the ramp experiment from the force and motion module. I am busy helping this group; can you go and make sure they set it up correctly and that they are on task?” An AIED system embodied in such a robot would be able to move to the group, check the experimental setup and then engage in dialogic tutoring to assist learning. We already know how to do the tutoring part, but it would become even more powerful if it could be deployed in flexible ways.

The learners would also likely respond more naturally to a robot teacher aide. In a study conducted at the Temporal Dynamics of Learning Center (TDLC), one of the Science of Learning Centers funded by the US National Science Foundation, researchers found that young children initially engaged with a robot called Rubi for about 20 minutes before their attention went elsewhere. However, when the robot was programmed to react to their gaze and look at them, the attention span skyrocketed. We are biologically wired by evolution to respond to facial cues in interpersonal interactions.

Rubi does not look much like a human, but researchers in Japan have been experimenting with much more lifelike robots. In a Tokyo elementary school a talking humanoid robot named Saya, originally developed as a receptionist robot in 2004 by professor Hiroshi Kobayashi of the Tokyo University of Science, has been used in the classroom in a limited way. Saya can speak multiple languages, and an array of motors in her head stretches the soft synthetic skin so that she can display emotions ranging from happiness and surprise to sadness and anger. But Saya could do little more than call out student names and give basic classroom-control commands like “Be quiet.” The robot did prove popular with the students and provoked emotions among them, including one student who cried when told off by Saya. Imagine, though, if Saya had pedagogical skills facilitated by AIED research.

Kiesler et al. (2008) investigated how humanoid a robot would need to be for effective use in socially assistive situations like the classroom and found that people preferred interacting with a robot over a comparable on-screen agent. They attributed stronger, more positive personality traits to the robot and felt it was more lifelike. An interesting finding that could be relevant to the classroom was that participants were more inhibited with the robot than with the agent and showed less socially undesirable behaviour, as has been found in other research. Perhaps students will behave better with an Educational Cobot than with an on-screen agent, and this is one thing that could be investigated. As the field of robotics is relatively new, like AIED, we humans have not had a lot of experience interacting with robots. Some of the robots used in South Korean classrooms have been little more than a TV screen on wheels, with the facial expression handled simply as video, in contrast to the multiple motors of the Saya robot. Even with the clever synthetic appearance of Saya, it was obvious that she was a robot, and students went and poked the skin of her face, something they would never do with a real teacher. Once their curiosity was satisfied, though, and they knew what they were dealing with, they could interact with Saya.

Other researchers have investigated what roles a robot might play in the classroom. For example, Oshima et al. (2012) explored how Robovie-W, a communication robot, might support engagement as a partner in a cooperative learning group of pre-service teachers. They found that the style of student interactions with Robovie-W was similar to that with the human participants in the learning group and that adaptive learners interacted best with the robot. While these results are encouraging, the robot was not programmed with any intelligent learning software: it simply turned its head towards a human who was speaking and recited pre-scripted explanations of a text being discussed by the group. The intelligent interactions beyond that relied upon a human listening to student voices via a microphone and typing responses on a keyboard, which were rendered as speech by the robot. So it was, in effect, a Wizard-of-Oz system for the most part. The next step could be to deploy full AIED systems via a robot. Shiomi et al. (2015) investigated whether the Robovie social robot might stimulate students’ curiosity for science if it were placed in a 4th and 5th grade elementary science classroom where students were free to interact with it during break time. As before, Robovie was used in a semi-autonomous mode where some interactions were automated but more complex ones were handled in a Wizard-of-Oz manner. In this study the robot was introduced as a new student in class, so it took on the role of a peer rather than a teacher or teaching-assistant role. Overall, the presence of the robot did not increase the students’ science curiosity, although curiosity did increase among the students who asked Robovie questions.

Other, off-the-shelf robots like Nao from Aldebaran Robotics have been used in classrooms, and although they have a basic human form (head, eyes, body, arms, hands, legs and feet) they are short rather than human-sized. They are obviously robots, but Pierre Dillenbourg’s group at EPFL in Switzerland has programmed the Nao to help students learn to write letters of the alphabet (Loos, 2015). In that study the robot again took the role of a learning partner, and the students had to teach the robot to form letters correctly. Kennedy et al. (2015) also used the Nao robot, but in a tutoring role. Students participating in their study used a large touch screen to learn about prime numbers, and the robot was aware of the students’ moves via a feed from the touch screen, although it could not use the touch screen itself. The study investigated whether a more social robot would produce higher learning gains than one that was asocial. An unexpected finding was that the asocial robot produced higher learning gains, which is contrary to similar research. Clearly there is more research to be done. Both the Dillenbourg and Kennedy studies illustrate that AIED researchers could start with off-the-shelf robots and see what their intelligent software can do via the robot’s body before moving to work with larger, perhaps purpose-built instructional robots that could assist teachers.

AIED in the Learning Environment

Now that we have explored some possible directions for using AIED in the classroom via robots, let us think about how AIED might be deployed in the physical classroom itself to create a ‘smart classroom’ that can support the students, the teacher and, of course, the educational cobots within it.

In the last decade there has been a growth in the use of sensors to monitor our built environment (e.g., homes that control the lights in rooms we are using, garage doors that open as our car approaches, fridges that order groceries before we run out, etc.). This has led to what has come to be known as the Internet of Things (IoT), and companies in many industries are looking at how they can add value to their products by embedding sensors, chips and communicators in them so that they can gather and use data, and communicate with other objects or the Internet. A central tenet of the IoT is that objects, people or devices (like educational cobots) are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
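As a deliberately simplified illustration of that tenet, the sketch below shows a classroom object packaging its readings under a unique identifier for transfer over a network with no human in the loop. The message fields and the publish() stand-in are assumptions for illustration; a real deployment would use a transport such as an MQTT broker.

```python
# Sketch of the IoT tenet: a uniquely identified object reporting its
# sensor readings over a network without human involvement.
import json
import time
import uuid

DEVICE_ID = str(uuid.uuid4())   # unique identifier assigned to this object

def read_sensors() -> dict:
    # Placeholder for real sensor reads (position, orientation, touch...).
    return {"touched": True, "neighbours": ["block-17", "block-02"]}

def publish(topic: str, payload: str) -> None:
    # Stand-in for a real network transport; here we just print the
    # outgoing message so the sketch is self-contained.
    print(topic, payload)

message = {
    "device_id": DEVICE_ID,
    "timestamp": time.time(),
    "readings": read_sensors(),
}
publish("classroom/objects", json.dumps(message))
```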

What might this IoT approach look like in the classroom? What if manipulatives like the blocks of different sizes and colors that are used to develop understanding of base-ten counting were sensorized? Each block could ‘know’ its identity (e.g., I am a red block representing 10 units) and where it is in relation to other blocks (e.g., I am lined up with and touching two other red blocks and one green block representing one hundred). If the learner had been asked to pick out blocks that represented thirty, then each block would know that the pattern it was part of did not total thirty. The green block would know that its value alone exceeded the target total and could have an LED inside that flashed to indicate it was in the wrong place. No teacher or cobot would be needed in this interaction, but what would be needed is some AIED that governed the behaviour of the blocks (a simple sketch of such behaviour is given below).

This is just a simple example, but what would more complex examples look like? Imagine a student learning to play the guitar in music lessons at school but still at the stage of mastering the fingering of basic chords. The music score shown on their electronic pad could send a signal to the guitar about which chord is to be played next, and the finger positioning could be indicated directly on the fretboard of the instrument for the student to follow. Or the guitar fretboard might sense the student’s fingering of the chord and send that information to the score, which would light up the bar being played in green if the fingering matches or red if it does not. Again, no teacher is needed in this learning transaction. But what if the teacher wanted to know how that student had been doing while he was busy helping another student? The system could have recorded and analysed the student’s mistakes and advised the teacher that the student is still getting confused between the C and F chords; it might call the teacher over so that the student gets some personal attention, or it might just recommend additional practice on some music that features the C and F chord changes. What sort of AIED will be required for this kind of object-to-object-to-teacher-to-student tutoring, and how could the systems we already have be adapted to support it?
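Here is a minimal sketch of what the AIED governing the blocks might look like, under the assumptions of the example above (red = 10 units, green = 100, target = thirty). The SmartBlock class and its methods are hypothetical.

```python
# Hypothetical sketch: each sensorized block knows its own value and
# which blocks it is touching, and flashes its LED when the arrangement
# it belongs to cannot match the target total.
class SmartBlock:
    def __init__(self, block_id: str, colour: str, value: int):
        self.block_id = block_id
        self.colour = colour          # e.g. red = 10 units, green = 100
        self.value = value
        self.neighbours = []          # blocks this one is lined up with

    def group_total(self) -> int:
        """Sum the values of this block and every block touching it."""
        return self.value + sum(b.value for b in self.neighbours)

    def check_against_target(self, target: int) -> None:
        if self.value > target:
            self.flash_led()          # this block alone exceeds the target
        elif self.group_total() != target:
            self.flash_led()          # the arrangement does not total the target

    def flash_led(self) -> None:
        print(f"{self.block_id} ({self.colour}, {self.value}): LED flashing")

# The task: pick out blocks representing thirty.
red1 = SmartBlock("red-1", "red", 10)
red2 = SmartBlock("red-2", "red", 10)
green = SmartBlock("green-1", "green", 100)
red1.neighbours = [red2, green]
red2.neighbours = [red1, green]
green.neighbours = [red1, red2]
for block in (red1, red2, green):
    # Every block flashes (the arrangement totals 120, not 30), and the
    # green block is flagged first because its value alone exceeds 30.
    block.check_against_target(30)
```

Even this toy version raises a real design question: should the feedback logic live in each block, in a classroom hub, or in the learner model of a wider AIED system?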

The students and the room itself could have sensors that help the teacher with classroom management tasks as well as learning tasks. There is already a wearable badge, developed at MIT by Sandy Pentland’s team and produced by Sociometric Solutions, that tracks the location of the wearer, senses when other badge wearers are in range, and can detect emotion from the affective tone of the wearer’s voice. The badges currently cost about US$500, but as with all technologies they will doubtless get cheaper, and when they do, what could be done to help teachers in the classroom if all students were wearing one? A teacher could know, for example, if a student leaves the classroom without permission, and she could assign students to groups and get alerts if the vocal tone of any group seems to indicate that it is off task. It is easy for a teacher to spot that a group is noisier than the rest of the room, but harder to notice that the level of discussion in a group has dropped because its members have stopped talking to each other during a collaborative task, for example. Maybe the use of such badges over time would allow data to be gathered and mined so that typical patterns of small-group work could be detected and aberrant ones, where off-task behavior is being displayed, could be identified from the outliers. Then the badges might light up to show whether students are on task (green) or off task (red), so that the teacher could glance around the room and see easily which groups she needed to attend to as she wanders the class monitoring the learning activities. Or these data could be sent directly to an educational cobot that could then head for the group that needs guidance.
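The group-level badge idea might look something like the following sketch. The audio features (speech_ratio, overlap_ratio) and the thresholds are illustrative assumptions; in practice they would come from the mining of typical small-group work described above.

```python
# Hypothetical sketch: classify each group as on- or off-task from simple
# audio features aggregated across its members' badges, then set every
# badge LED in the group to green or red for the teacher to see at a glance.
from statistics import mean

def group_status(badge_readings: list) -> str:
    """badge_readings: one dict per badge, e.g. {'speech_ratio': 0.4, ...}"""
    speech = mean(r["speech_ratio"] for r in badge_readings)    # fraction of time someone talks
    overlap = mean(r["overlap_ratio"] for r in badge_readings)  # people talking over each other
    if speech < 0.15:       # the group has gone quiet mid-task
        return "off_task"
    if overlap > 0.5:       # chaotic cross-talk rather than discussion
        return "off_task"
    return "on_task"

def update_badges(groups: dict) -> None:
    for group_id, readings in groups.items():
        colour = "green" if group_status(readings) == "on_task" else "red"
        print(f"group {group_id}: badges set to {colour}")

update_badges({
    "corner_group": [{"speech_ratio": 0.05, "overlap_ratio": 0.00},
                     {"speech_ratio": 0.02, "overlap_ratio": 0.00}],
    "window_group": [{"speech_ratio": 0.40, "overlap_ratio": 0.10},
                     {"speech_ratio": 0.35, "overlap_ratio": 0.12}],
})
```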

High-definition cameras in classrooms might give other valuable support. Cameras combined with facial recognition technologies could allow camera systems to capture images and sense the emotional state of the learners, recognizing the faces of individuals as they do so. What if we were to data-mine the video streams from several well-run classrooms to identify the features of a productive learning space and build predictive models, and then build counterpoint models of classrooms where the overall activity is not productive, so that we could tell the difference automatically from the video stream alone? This kind of monitoring might be useful for beginning teachers, who are less experienced than veterans at classroom management and at maintaining the right blend of control and freedom. Such a system could give feedback to the teacher about how to bring the class back to a productive state. How could AIED model and support effective classroom teaching if data from a range of sensors were available in real time?
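A hedged sketch of how that modelling could start: train a simple classifier on features extracted from video of classrooms already labelled as productive or not, then score live footage. The features named here are hypothetical stand-ins for whatever computer vision would actually extract.

```python
# Sketch, under stated assumptions: a classifier over video-derived
# features separating productive from unproductive classroom states.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [movement_level, gaze_dispersion, noise_level] per time window.
X_train = np.array([
    [0.2, 0.3, 0.4],   # productive: moderate activity, focused gaze
    [0.1, 0.2, 0.3],
    [0.9, 0.9, 0.8],   # unproductive: high churn, scattered attention
    [0.8, 0.7, 0.9],
])
y_train = np.array([1, 1, 0, 0])   # 1 = productive, 0 = unproductive

model = LogisticRegression().fit(X_train, y_train)

# Score a live window of video-derived features.
live_window = np.array([[0.7, 0.8, 0.9]])
if model.predict(live_window)[0] == 0:
    print("Alert: classroom drifting away from a productive state")
```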

To obtain more fine-grained data that could help to monitor individual learners rather than the classroom as a whole, students’ behaviour and affect could be captured via sensors in the classroom to supplement the information about the learners from other sources. Motion sensors like the Kinect system, and biometric sensors like the wristbands that many people now wear for fitness purposes, might be used to gauge how engaged students are and to watch out for off-task behaviour. Data from biometric sensors tracking blood volume pulse and galvanic skin response could be used to detect changes in mood and in how focused a student is, things that we already know are related to learning. Another use might be to monitor students working in small groups to detect when a group is ‘in synch’ versus ‘unfocused’, and research on that is already being conducted at the Science of Learning Research Centre in Australia (slrc.org.au). With these potentially dynamic streams of data, AIED researchers will need to work out how to make sense of them in a learning context and what actions to take as a result (e.g., notify the human teacher or the educational cobot).
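As one example of making sense of such a stream, the sketch below flags a learner whose galvanic skin response drifts well away from their own recent baseline, the kind of event that might be routed to the teacher or a cobot. The window size and threshold are illustrative assumptions, not validated values.

```python
# Hypothetical sketch: flag a possible change in a learner's state when
# galvanic skin response (GSR) deviates sharply from a rolling baseline.
from collections import deque
from statistics import mean, stdev

class EngagementMonitor:
    def __init__(self, window: int = 30, z_threshold: float = 2.0):
        self.history = deque(maxlen=window)   # recent GSR samples
        self.z_threshold = z_threshold

    def add_sample(self, gsr: float) -> bool:
        """Return True if this sample deviates notably from the baseline."""
        flagged = False
        if len(self.history) >= 10:           # need a stable baseline first
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(gsr - mu) / sigma > self.z_threshold:
                flagged = True                # notify the teacher or the cobot
        self.history.append(gsr)
        return flagged

monitor = EngagementMonitor()
for sample in [0.31, 0.30, 0.32, 0.29, 0.31, 0.30, 0.33, 0.31, 0.30, 0.32, 0.95]:
    if monitor.add_sample(sample):
        print("Possible change in arousal/engagement detected")
```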

What is Needed to Move Towards This Vision?

The ideas proposed in this article are linked to those expressed by other authors in this issue. The article by Erin Walker and Amy Ogan focuses on the need to recognize the social role of technology use in learning. Educational cobots, if well designed, would take on a social role in the classroom, just like a teacher’s aide at present. John Self (1999) argued that AIED is the only field of advanced educational technology that is aimed at building “computer-based learning systems which attempt to adapt to the needs of learners and are therefore the only such systems which attempt to ‘care’ about learners in that sense.” Maybe we can extend this idea in AIED to go beyond the academic welfare of students and embrace a more general concern for student well-being. Research in neuroscience and the physiology of learning has shown that there is a strong relationship between emotion and cognition (Zull, 2002, 2011). Studies have shown that students whose teachers show a personal interest in them beyond the immediate subject being learned are more engaged and motivated and, conversely, that in the absence of a caring teacher, learning is impaired. Would a caring educational cobot have similar effects?

The ideas expressed here also have links to Nikol Rummel’s article about deeper computational understanding of social interactions, attunement, and other affect-heavy constructs. For educational cobots to be effective they would need to be good at interacting with all kinds of students for long periods of time, monitoring their learning across a range of topics. To do that the AIED research community should apply more effort to developing the kinds of skills that Rummel describes in her article.

Movement towards educational cobots in the classroom has already begun. Japan and South Korea, in particular, are pursuing this goal with larger and more lifelike robots, and Switzerland has started to investigate how small, off-the-shelf robots might be used in learning. AIED researchers will need to reach out to roboticists to find out what the state of the art in robotics is, and we need to let robotics researchers know what our field is capable of at present. AIED sits at the intersection of multiple disciplines, including computer science, education research, cognitive psychology, instructional design and psychometrics, and those involved in AIED know that doing cross-disciplinary work is really hard. To collaborate effectively, each team member needs to gain at least a rudimentary understanding of the fields of the other partners; that way communication can take place and the team can solve design and research problems together. Building this level of shared knowledge and collaboration takes time, so if AIED and robotics wish to come together to solve educational problems, then we need to arrange some initial exchanges of knowledge. Joint workshops or summits, visits to each other’s research labs, sabbaticals, or exchanges of doctoral students and postdocs would get the ball rolling. Until AIED and robotics researchers understand more about one another’s disciplines, we will not be able to identify ways that the fields can work together to develop educational cobots, or something else that at this point we are unable to foresee or imagine.

To move toward the vision expressed in this article there will need to be greater use of intermediary technologies that expand the kinds of human-technology interfaces beyond keyboard and mouse. Intermediary technologies like natural language processing of spoken dialogue, a key part of a naturalistic interface with robotic or other smart technologies, have advanced greatly in recent years, and with the advent of new approaches to machine learning like ‘deep learning’ they will undoubtedly continue to improve. Some researchers in AIED have already incorporated spoken dialogue into their systems; there was a 2009 AIED conference workshop on Educational Natural Language Processing that addressed the particular language processing issues in tutorial dialogues, and a review of progress towards conversational dialogue in intelligent tutoring systems appears in Rus et al. (2013). Other intermediary technologies like touch-based and tactile interfaces will also play a role in allowing interactions with cobots and smart classrooms, and advances in virtual and augmented reality devices like the Oculus Rift may have applications here too.

We need to persuade research-funding agencies across the world that this is a new field with potential and to obtain funding for proof-of-concept work to explore what is possible. As a first step, agencies could sponsor some of those ‘getting to know you’ events, and a step after that could be having a group of cross-disciplinary researchers identify what areas of joint research could be funded. The National Science Foundation in the USA did this kind of forward thinking in AIED to shape its funding of research in educational technology in the past, and it helped to spawn many of the latest developments in the field. In fact, NSF is already funding projects in this area. For example, AIED researcher Lewis Johnson (2015) is conducting a project called RALLe that is investigating how to design simulation-based learning experiences for language learning by developing a prototype lifelike robot that engages in conversations in a foreign language and studying its use in educational settings. See the article entitled Face-to-Face Interaction with Pedagogical Agents, Twenty Years Later by Johnson and Lester in Volume 1 of this special issue for more details. One way to push this forward could be to hold a workshop at a conference such as Human-Robot Interaction (HRI), Multimodal Interaction (ICMI), or AIED itself.

There is also a research conference on the Internet of Things, sponsored by the Institute of Electrical and Electronics Engineers (IEEE), which held its 5th gathering in 2015 (http://www.iot-conference.org/). The range of devices that are now ‘smart’ in some way is increasing, and if AIED researchers want to find out what combinations of them could support teachers and students in the classroom, then they may just have to purchase sets of likely sensors and see what can be done with them. However, AIED researchers should perhaps work with the electrical engineering departments at their own universities rather than trying to do this work alone. As with educational robotics, an injection of research funding to enable AIED researchers to learn what technologies are available and who is doing relevant work in other disciplines would be most helpful in pushing forward in this direction.

Conclusion

The message I have tried to convey here is that it is time for AIED to be let out of the box, meaning the computer, so that it can become embedded in other devices such as robots and in smart classrooms. The ideas presented here are meant as a provocation to start thinking about possible avenues for future AIED research, and I recognise that some researchers have already been working along these lines, so these ideas may not be new to you. I do not imagine that we will be able to build a fully effective educational cobot or develop a smart classroom in the next few years, but we could set a course in that direction. Embarking on this course would lead to a considerable expansion of the field and to new collaborations with disciplines such as robotics and electrical engineering. Themes of future research could include creating systems that provide social interaction (an extension of the Computer Supported Collaborative Learning that already exists); exploring new modalities for interfaces between learners, teachers and supporting technologies; and investigating the application of the Internet of Things in education. Much of what we have learned about modelling knowledge and what the learner knows, together with ways to provide feedback and pedagogy, could be transported to robots. Doing so would introduce more ways of sensing the learner’s affective state and of interacting with learners, and the new challenges this brings would be very stimulating for AIED and allow us to expand our models and methods. Embedding AIED into smart classrooms would produce real-time data streams from many sensors in the classroom focused on the learners; making sense of these would require a huge amount of Educational Data Mining and would lead to new models of how learners behave in the wider environment of the classroom, not just within the instructional packages we have developed to date. A practical step forward might be to hold a workshop on Educational Cobots and Smart Classrooms at AIED or a related conference such as Human-Robot Interaction. Now it is up to you where AIED goes next.