
Real-time orchestrational technologies in computer-supported collaborative learning: an introduction to the special issue

  • Camillia Matuk
  • Michael Tissenbaum
  • Bertrand Schneider

A CSCL perspective on real-time classroom data

This special issue on real-time orchestrational tools for CSCL classrooms arises from both a need and an opportunity. Increasingly, CSCL classrooms are turning toward open-ended inquiry learning, in which students’ trajectories can be divergent and unanticipated. At the same time, classroom sizes are expanding beyond what many teachers can reasonably manage. The task of knowing how and when to provide timely and specific guidance is becoming increasingly challenging for teachers (Dimitriadis 2012; Tissenbaum and Slotta 2015; Roschelle et al. 2013). Particularly in CSCL classrooms, teachers must monitor group progress on time-sensitive tasks, coordinate students’ changing group roles, provide group and individual guidance and assessment, and differentiate resources to groups that are working in tandem (Tissenbaum and Slotta 2015; Dimitriadis 2012). To support their students’ growth, teachers must quickly access and interpret data on their students’ learning (Kuhn 2005; Shute 2008), and make decisions about how to guide them both conceptually and logistically (Dillenbourg et al. 2009; Dillenbourg 2011; Schwarz et al. 2018). Yet typical assessments do not provide teachers with the information they need to address students’ learning in a timely manner and in ways that would impact those students’ outcomes (Pellegrino et al. 2001).

Meanwhile, dashboards are now common features of most learning platforms. These digital displays of data streams and feedback loops typically offer overviews of learners’ states and interactions after they have completed classroom activities (Verbert et al. 2014). While these are valuable for informing curricular adjustments between classes, they do little to help instructors with the many real-time tasks of teaching with technology (Baker and Inventado 2014). In contrast, real-time dashboards offer data on activities as they are occurring, allowing instructors to orchestrate in-class activities, including monitoring the status of the classroom and managing classroom workflow (Dillenbourg 2011, 2013).

Advances in data collection tools now offer unprecedented opportunities to capture learners’ and teachers’ behaviors in physical and digital collaborative learning environments (e.g., Raca et al. 2014; Blikstein and Worsley 2016), and to visualize these to support the real-time activities associated with learning and instruction (Baker and Inventado 2014). These tools not only make it possible to perform otherwise prohibitively challenging tasks, but also to augment the ways teachers would otherwise perform them (e.g., Martinez-Maldonado, this issue). They present opportunities to orchestrate entirely new collaborative activities, and to understand their impacts on teaching and learning in ways that were not possible before.

With some exceptions (e.g., Lonn et al. 2015), real-time dashboards have largely been considered from the perspective of human computer interaction (HCI), and less from a CSCL perspective (Verbert et al. 2014). Moreover, few empirical studies examine these technologies in authentic learning settings. As classroom-based, empirical studies begin to appear, so do questions about the actual affordances of these real-time technologies, and about the best strategies for incorporating them into teachers’ practices to promote collaborative learning. This special issue explores how dashboards can be designed to support teachers in real-time, in real-world classrooms. Additionally, it takes a critical look at some of the challenges related to using real-time data, which suggest avenues for future research.

How this special issue came about

The idea for this special issue began with a structured poster symposium at the 12th International Conference of the Learning Sciences in Singapore (Tissenbaum and Matuk 2016). This symposium highlighted the various ways that real-time dashboards are being used to support and understand student learning. To highlight and consolidate the work being done in this area, we solicited manuscripts from scholars within and beyond this first symposium, selecting those that met two criteria: (1) that they were empirical investigations of real-time dashboards; and (2) that they offered outcomes on either teacher or student learning. The resulting issue contains five contributions that are rigorous in their methodology (the manuscripts went through an internal review process among our guest editor team, and then through ijCSCL’s external double-blind peer review) and representative of a spectrum of perspectives and approaches to incorporating real-time data into CSCL environments. Since undertaking this special issue, we have discovered many other exciting examples of research on real-time orchestrational tools, which, while not represented here, indicate that this is a burgeoning field worthy of attention.

Summary of contributions

The contributions to this issue represent diverse theoretical perspectives, technological platforms, educational contexts, and design approaches to real-time data capture, aggregation, and visualization of collaborative learning. Collectively, they address themes related to the design, theory, and impact of real-time dashboards in authentic CSCL environments, and help advance the conversation by offering critical perspectives and empirical evidence for the value of real-time dashboards in orchestrating collaborative learning in classrooms. We summarize each paper and discuss some common themes below.

The study by van Leeuwen, Rummel, and van Gog examines how real-time data can support teacher noticing, a practice that involves identifying which students and behaviors require attention, and determining how to respond (Van Es and Sherin 2002, 2008). Given that teachers must quickly and accurately ascertain when and how they are needed during class time, the authors ask how information can best be presented on a dashboard to promote teachers’ speed in detecting, and accuracy in interpreting, issues with students’ collaborative work.

To explore this question, van Leeuwen et al. created a dashboard prototype for MathTutor, an environment that supports dyads in conceptual reasoning and procedural practice in mathematics. The researchers first engaged teachers in a co-design process to determine what specific information to display. Then, in a laboratory-based, between-subjects experiment, the authors compared teachers’ response times to, and interpretations of, dashboards showing fictitious classroom scenarios of students working on fractions problems. These scenarios were designed to reflect different combinations of social and cognitive issues. For example, one member of the dyad might be monopolizing the activity, students might be taking turns instead of collaborating, a dyad might be approaching the task through trial and error rather than through discussion, the dyad might be stuck on a particular kind of problem, and so forth. The authors compared teachers’ abilities to identify the social and/or cognitive issue of these situations given three different levels of real-time information: a mirroring dashboard showed the status of the classroom’s activities; an alerting dashboard showed the classroom status and also highlighted students in need of attention; and an advising dashboard showed the classroom status, highlighted students in need of attention, and also offered advice on how to address those potential issues.
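
To make the contrast among the three conditions concrete, the following minimal sketch (in Python) shows one way the mirroring, alerting, and advising levels of information might be represented for a single dyad. The data fields, thresholds, and advice strings are our own illustrative assumptions, not the actual design of van Leeuwen et al.'s dashboard.

```python
from dataclasses import dataclass
from enum import Enum

class DashboardLevel(Enum):
    MIRRORING = "mirroring"  # status of the dyad's activity only
    ALERTING = "alerting"    # status, plus a flag for dyads needing attention
    ADVISING = "advising"    # status, flag, and suggested ways to respond

@dataclass
class DyadStatus:
    dyad_id: str
    problems_solved: int
    error_rate: float    # share of incorrect attempts (illustrative)
    talk_balance: float  # 0.5 = equal participation; near 1.0 = one partner dominates

def render(dyad: DyadStatus, level: DashboardLevel) -> dict:
    """Return the information shown for one dyad under each display condition."""
    view = {"dyad": dyad.dyad_id,
            "solved": dyad.problems_solved,
            "errors": round(dyad.error_rate, 2)}
    needs_attention = dyad.error_rate > 0.5 or dyad.talk_balance > 0.8
    if level in (DashboardLevel.ALERTING, DashboardLevel.ADVISING):
        view["flagged"] = needs_attention
    if level is DashboardLevel.ADVISING and needs_attention:
        if dyad.talk_balance > 0.8:
            view["advice"] = "One partner may be monopolizing the task; prompt turn-taking."
        else:
            view["advice"] = "High error rate; suggest the dyad discuss a strategy before answering."
    return view

print(render(DyadStatus("dyad-A", 3, 0.6, 0.9), DashboardLevel.ADVISING))
```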

The authors found no significant differences between conditions in teachers’ speed and confidence in interpreting the classroom situations. However, teachers tended to offer social and affective interpretations of the situations that went beyond the initially identified cognitive or social issue. Van Leeuwen et al. also found that with the advising dashboards, teachers spent longer considering the information displayed, provided richer interpretations of the situations, and even questioned and disagreed with the advice that these dashboards displayed. These findings resonate with other research that highlights the importance of trust in recommendation systems for ensuring their adoption, as well as with research suggesting the benefit of guiding teachers to interpret data. Altogether, van Leeuwen et al.’s findings offer preliminary evidence for the value of advising dashboards, and suggest that such information can complement teachers’ own observations of their students, enrich their interpretations of classroom situations, and inform their decisions on how to act on that information.

In the second paper, Gerard, Kidron, and Linn ask how real-time guidance can help teachers support their students’ collaborative revision of science explanations. To do this, they use natural language processing (NLP) to automatically assess middle school students’ written science explanations in the Web-based Inquiry Science Environment (WISE). The system then generates written guidance based on a rubric informed by the Knowledge Integration perspective (Linn and Eylon 2011). This guidance encourages students to consider missing or inaccurate ideas and to revisit a relevant visualization in the unit to verify those ideas, and it can be customized by the teacher before being sent to students.
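
As a rough illustration of this pipeline, the sketch below pairs an automated score with rubric-keyed guidance that a teacher can edit before sending. The keyword-based scoring heuristic, the four-level rubric, and the guidance texts are hypothetical stand-ins; WISE’s actual NLP models and Knowledge Integration scoring are not reproduced here.

```python
from dataclasses import dataclass

# Hypothetical four-level, knowledge-integration-style rubric with canned guidance texts.
KI_GUIDANCE = {
    1: "Revisit the plate boundary visualization and look for evidence about how plates move.",
    2: "You have a relevant idea; check the visualization to see what drives plate motion.",
    3: "Add a link between your ideas: how does convection in the mantle move the plates?",
    4: "Strong explanation. Can you add evidence from the model that supports your claim?",
}

@dataclass
class GuidanceItem:
    student_id: str
    ki_score: int
    message: str

def auto_score(explanation: str) -> int:
    """Placeholder for the NLP scoring step (a toy keyword heuristic, not a trained model)."""
    keywords = {"plate", "convection", "boundary", "mantle"}
    hits = sum(word in explanation.lower() for word in keywords)
    return min(4, 1 + hits)

def propose_guidance(student_id: str, explanation: str) -> GuidanceItem:
    score = auto_score(explanation)
    return GuidanceItem(student_id, score, KI_GUIDANCE[score])

# The teacher reviews and customizes the proposed message before it is sent to the student.
item = propose_guidance("s17", "Plates move because of convection in the mantle.")
item.message += " Look again at the animation before you revise."
print(item)
```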

The authors conducted a classroom-based implementation of a plate tectonics unit with one teacher of 6th grade students. Based on audio and video recordings of teacher-student interactions, as well as students’ responses to a pretest, posttest, and embedded assessments, the authors identified different ways that the teacher used the real-time feedback to personalize the guidance she ultimately gave her students. For example, she directed students with partial understanding to revisit visualizations in order to gather more evidence, and prompted more advanced students to evaluate and identify missing ideas. The authors also found that students made more substantial revisions on the posttest than on the pretest, thus demonstrating that real-time data can support teachers in guiding their students to collaboratively revise their science explanations.

Resonant with the study by van Leeuwen et al., this study highlights the knowledge that teachers bring to their interpretations of classroom situations, and the need for a system to take that knowledge into account. It shows how, by integrating automated assessment and feedback into teachers’ instructional practices, a real-time system can augment teachers’ abilities to guide their students. In this case, pairing system-generated assessment with teachers’ personal knowledge of their students ensured that students received both timely and personalized guidance that contributed to their improved revision practices and learning outcomes.

In the third paper, Tissenbaum and Slotta developed and studied the role of real-time software agents in orchestrating collaborative inquiry in a high school physics classroom. Software agents can be programmed to respond to particular conditions in an environment, essentially mining data in real time, including artifacts, emergent metadata, and other traces of individual and collaborative learning.

Guided by the Collective Inquiry and Learning Communities framework (Slotta et al. 2018), the authors used a design-based research approach to implement a curriculum within a smart classroom environment. They integrated software agents to support various aspects of students’ collaborative activity, including coordinating their changing locations around the room, displaying their community-constructed knowledge base, and showing the time remaining on different tasks. This information was passed to the teacher’s tablet, which informed him of student groups’ progress through activities, allowed him to dynamically regroup students based on their previous interactions in the room, and facilitated the distribution of content from the students’ collectively developed knowledge base according to their real-time needs.
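
The following sketch illustrates the general condition-action pattern of such agents: monitor a group’s trace data and push an alert to the teacher’s tablet when a rule fires. The class names, rules, and thresholds are hypothetical and only gesture at the kind of logic described, not the authors’ actual implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GroupState:
    group_id: str
    task: str
    seconds_on_task: int
    notes_contributed: int  # e.g., entries added to the shared knowledge base

@dataclass
class TeacherTablet:
    alerts: List[str] = field(default_factory=list)

    def notify(self, message: str) -> None:
        self.alerts.append(message)
        print(f"[tablet] {message}")

def agent_step(group: GroupState, tablet: TeacherTablet,
               time_limit: int = 600, min_notes: int = 3) -> None:
    """One pass of a condition-action agent over a group's trace data (illustrative rules)."""
    if group.seconds_on_task > time_limit:
        tablet.notify(f"{group.group_id} has exceeded the time allotted for '{group.task}'.")
    if group.notes_contributed < min_notes and group.seconds_on_task > 300:
        tablet.notify(f"{group.group_id} has contributed little to the knowledge base; consider checking in.")

tablet = TeacherTablet()
for group in [GroupState("Group 1", "momentum problems", 650, 8),
              GroupState("Group 2", "momentum problems", 340, 1)]:
    agent_step(group, tablet)
```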

The authors found that by offloading managerial duties, the system allowed the teacher to act as a wandering facilitator of student learning in his classroom. They also found that the teacher’s access to real-time alerts about group work, provided at key moments during the activity, had a significant impact on students’ physics problem-solving approaches. Overall, this study shows how real-time data can support students and teachers during complex inquiry, particularly within environments designed to leverage both the physical and digital dimensions of collaboration.

In the fourth paper, Olsen, Rummel, and Aleven investigated the value of collaborative and individual work for elementary students’ learning about fractions. Their study focused on a collaborative intelligent tutoring system (CITS), which tracks students’ real-time activity and uses it to provide students with real-time cognitive and social support during their work. For example, the system might stop and redirect students who have proceeded too long in the wrong direction, provide a common focus for partners’ discussion, or offer corrective feedback on their responses. The CITS additionally incorporated group awareness and group accountability features to promote effective collaboration. Thus, student partners sit side by side but view different versions of the same activity on their screens, and, through a collaborative script, may be assigned different responsibilities on the same problem.
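
A minimal sketch of this kind of adaptive support appears below, with one rule-based function for cognitive feedback and one for a script-style social prompt. The rules, thresholds, and wording are illustrative assumptions rather than the CITS’s actual logic.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StepAttempt:
    student: str
    step: str
    correct: bool
    consecutive_errors: int

def cognitive_support(attempt: StepAttempt) -> Optional[str]:
    """Rule-based corrective feedback and redirection (illustrative, not the CITS's actual rules)."""
    if not attempt.correct and attempt.consecutive_errors >= 3:
        return (f"{attempt.student}: let's step back. Compare your answer with your partner's "
                f"before trying '{attempt.step}' again.")
    if not attempt.correct:
        return f"{attempt.student}: check the denominators on step '{attempt.step}'."
    return None

def social_support(role_a: str, role_b: str) -> str:
    """A minimal script-style prompt that assigns partners different responsibilities."""
    return (f"Partner A ({role_a}): explain your reasoning aloud. "
            f"Partner B ({role_b}): check the explanation against the number line.")

print(cognitive_support(StepAttempt("Ana", "compare 2/3 and 3/4", False, 3)))
print(social_support("explainer", "checker"))
```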

Olsen et al. conducted a quasi-experimental classroom-based study with 4th and 5th grade students. In their study, they compared the relative benefits for elementary students learning fractions with a CITS when working individually, collaboratively, or through activities that combined individual and collaborative work. The authors found various positive effects of collaboration. For example, students in the combined condition requested fewer hints, and made fewer errors than students in the collaboration-only and the individual-only conditions. These students also finished with higher learning gains than students who only worked collaboratively or who only worked individually, and also reported higher situational interest in the activity.

In contrast to the other studies in this issue, which focused on how real-time data can support teachers, Olsen et al. show how real-time data can serve students directly. By informing students of their partner’s state of knowledge, and by incorporating structures for accountability, this study shows how student-facing real-time data can play a role in enhancing students’ learning from, and interest in, collaborative problem solving.

In the fifth paper, Martinez-Maldonado documents university instructors’ perspectives on using a mobile orchestration tool in their information science classrooms. Through a two-year participatory design and evaluation process with the instructors, the author designed and developed a mobile dashboard to support them in orchestrating and assessing collaboration and progress in a multi-week interactive tabletop activity. The tablet provided visualizations that gave instructors insight into individual students’ participation and overall group progress in activities. It also allowed students’ tabletops to be remotely controlled, for example, paused for a whole-class announcement or advanced to the next activity. Additionally, the tablet provided real-time alerts to notify instructors when the time allocated to a task had run out or when a known misconception was detected.
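
The sketch below illustrates, in schematic form, the kinds of teacher-facing controls and alerts described: pausing or advancing all tabletops, and flagging groups when time runs out or a misconception is detected. The class and method names are hypothetical, not the system’s actual API.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TabletopGroup:
    group_id: str
    paused: bool = False
    activity_index: int = 0

class OrchestrationTablet:
    """Teacher-facing controls and alerts, sketched with hypothetical names."""

    def __init__(self, groups: List[TabletopGroup], task_minutes: int):
        self.groups = groups
        self.task_minutes = task_minutes

    def pause_all(self, reason: str) -> None:
        for group in self.groups:
            group.paused = True
        print(f"All tabletops paused: {reason}")

    def advance_all(self) -> None:
        for group in self.groups:
            group.paused = False
            group.activity_index += 1
        print("All tabletops advanced to the next activity.")

    def check_alerts(self, elapsed_minutes: float,
                     flagged_misconceptions: Dict[str, str]) -> List[str]:
        alerts = []
        if elapsed_minutes > self.task_minutes:
            alerts.append("Time allocated to this task has run out.")
        for group_id, misconception in flagged_misconceptions.items():
            alerts.append(f"{group_id}: possible misconception detected ({misconception}).")
        return alerts

tablet = OrchestrationTablet([TabletopGroup("Table 1"), TabletopGroup("Table 2")], task_minutes=15)
tablet.pause_all("whole-class announcement")
print(tablet.check_alerts(16.5, {"Table 2": "mixing up two related concepts"}))
tablet.advance_all()
```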

Martinez-Maldonado conducted a longitudinal study of four instructors using the mobile dashboard with 150 students over 72 classroom sessions during a 10-week period. A qualitative analysis of observations and interviews with instructors showed evidence for the potential of the technology for helping instructors to assess group collaboration, monitor class task progression, and highlight groups in need of the instructor’s assistance.

Notably, Martinez-Maldonado’s findings also point to the trade-offs of real-time data and the format in which data are delivered. For example, the instructors commented on the orchestrational load introduced by the various data streams and visualizations, raising the question of when, and in what format, more data becomes less, rather than more, helpful. As well, having the dashboard on a mobile device was both convenient, in that it allowed instructors to circulate around the classroom, and frustrating, in that it kept one hand constantly occupied. The findings also flagged the potential issue of instructors’ over-reliance on real-time data: such data give an inherently incomplete picture of the classroom, and their immediacy sometimes encourages reaction rather than reflection. Overall, Martinez-Maldonado’s study shows the value of seeking instructors’ perspectives following their long-term use of real-time tools, as these can provide more balanced views of their affordances and trade-offs.

Informing, inviting, and guiding action in CSCL classrooms

Together, these five contributions demonstrate how real-time supports are trending away from simply informing—or even attempting to replace—teachers’ functions, toward partnering with teachers in orchestrating CSCL (cf. Gerard et al. 2016). In crafting such partnerships, questions arise about the tasks that a computer system is best suited to assume, and those for which it is best positioned as an advisor. Questions also arise about how such functions are best integrated into teachers’ existing practices, such that the overall effect is to enhance CSCL teaching and learning.

The contributions each begin with the premise that by capturing and displaying real-time data of CSCL activities, teachers can see patterns in student learning, and offer more targeted and timely guidance and coordination. Collectively, they illustrate how different kinds, displays, granularities, modalities, and temporalities of real-time data support different teaching functions. For example, highlighting different aspects of students’ work enables teachers to provide different kinds of support. Making students’ problem-solving processes visible can allow teachers to determine appropriate procedural guidance (Olsen et al.). Seeing areas of students’ confusion can allow teachers to offer timely conceptual guidance (Martinez-Maldonado; Tissenbaum and Slotta 2015). Similarly, students can benefit from an awareness of their partners’ thinking (Olsen et al.). Knowing the status of progress in an activity can enable teachers, or the computer system, to coordinate the logistics of an activity, including shifting between social configurations (Olsen et al.; Tissenbaum & Slotta) and modalities (Dimitriadis 2012), timing the accessibility of key resources (Martinez-Maldonado; Simon et al. 2003; Tissenbaum & Slotta), or pacing class progress (Nussbaum et al. 2009; Roschelle et al. 2010).

Making data available at different times can also support different teaching functions. Real-time data for immediate use include information on the status of a student group’s understanding or progress at a given time in a collaborative activity. Knowing where students are, a teacher can distribute relevant materials when they are most needed, and reconfigure groups in the midst of an activity (Tissenbaum & Slotta). As well, when they are stuck in an unproductive state, students might be redirected to avoid frustration and wasted time (Olsen et al.).

Real-time data for post-activity use can inform teachers’ follow-up instruction to guide students’ ongoing work. For example, a teacher might inspect aggregate visualizations that show patterns in students’ ideas (Martinez-Maldonado; Tissenbaum & Slotta; van Leeuwen et al.) or that identify specific conceptual issues, and use these to inform adaptations to future whole-group or individual instruction that target those issues (Gerard et al.).

Real-time data for later use can support teachers in making improvements to instructional or curriculum materials for future classroom implementations. The contributions suggest that these data can promote teachers’ reflection, whether by encouraging them to customize automated student guidance (Gerard et al.) or to spend more time considering a dashboard’s automated advice (van Leeuwen et al.). These behaviors imply that teachers are considering the data and their accompanying guidance in relation to their personal teaching values, which can lead them to refine their instructional practices.

Importantly, the studies highlight how data are not simply informational, but an invitation to action. Moreover, the ways that data are offered invite different actions that can have different impacts on both teaching and learning. Accordingly, some of the contributions in this issue explore the value of coupling data with suggestions for action. Sometimes, these suggestions are implicit, such as when a dashboard visualizes the status of a CSCL activity in ways that highlight students in need of attention (van Leeuwen et al.). In other cases, these data are coupled with scripts for orchestrating activities (Martinez-Maldonado), with customizable, theory-informed guidance associated with students’ specific conceptual difficulties (Gerard et al.), and with advice on how to respond to certain CSCL situations (van Leeuwen et al.). Each of these approaches takes a theoretical stance on how data displays and their associated guidance should be designed to promote effective CSCL, whether Knowledge Integration (Gerard, Kidron & Linn), Knowledge Learning and Instruction (Olsen et al.), Teacher Noticing (van Leeuwen, Rummel and van Gog), or Collective Inquiry and Learning Communities (Slotta et al. 2018).

Each contribution critically explores the roles of real-time dashboards in CSCL classrooms relative to the roles of teachers. In most cases, the dashboard’s role is to enable teachers to perform the personal or conceptual functions at which they are best. For example, when a computer system takes over mundane or logistically challenging tasks, teachers are freed from certain classroom management responsibilities to actively facilitate classroom activities (Tissenbaum & Slotta). When the system takes on the role of interpreting students’ conceptual difficulties and formulating appropriate feedback, teachers are freed to spend their time personalizing this guidance for specific student groups (Gerard et al.).

A dashboard’s modality also matters for the teaching functions it enables. Some of the dashboards described in this issue ran on laptop computers (Gerard et al.; Olsen et al.; van Leeuwen et al.), a device that is familiar to most teachers and students in spite of its shortcomings (e.g., it physically pulls teachers’ attention away from important in-person classroom activity). Other contributions used handheld devices, which gave teachers the mobility to wander around the classroom and offer face-to-face assistance (Tissenbaum & Slotta), although these had shortcomings of their own (Martinez-Maldonado).

Importantly, the studies each illustrate various impacts of real-time tools on teaching and learning. These impacts include improving students’ conceptual understanding and metacognitive behaviors (Gerard et al.; Olsen et al.; Tissenbaum & Slotta), increasing students’ situational interest in CSCL activities (Olsen et al.), and enhancing teachers’ abilities to notice their students’ difficulties (Martinez-Maldonado; Olsen et al.). In designing technologies for these roles, the contributors emphasize the need to strike a balance between lifting the load of providing individualized guidance to large classes of students and maintaining teachers’ autonomy in their instructional goals (Gerard et al.).

Co-design and implementation in real contexts

The range of contexts in which the studies took place gives a sense of the possibilities for integrating real-time data into diverse CSCL environments. For example, Gerard et al.’s study took place in a public school, and shows how real-time data can help students and teachers succeed in typical classroom settings when they are provided with adequate support, both for using technology and for inquiry-based teaching. Van Leeuwen et al.’s study took place in a controlled laboratory setting, which allowed the researchers to capture detailed information on teachers’ reaction times, as well as in-depth, teacher-reported reflections on their uses of the dashboard, information that is difficult, if not impossible, to obtain during class time. Martinez-Maldonado’s study took place in a university classroom equipped with interactive tabletops and a dashboard on the teacher’s handheld device, and provides a contrasting case of real-time data incorporated into higher education. The physical-technical set-up described in Tissenbaum and Slotta’s study was implemented in a fee-based lab school committed to technological innovation. Although the generalizability of findings from such contexts to more typical ones may be low, such studies are important for demonstrating what is possible given adequate resources and support. They can provide ambitious benchmarks toward which researchers and designers can strive.

Importantly, each contribution highlights the value of teachers’ involvement in creating successful orchestrational technologies, whether through formal and informal consultation with teachers during professional development and long-term partnerships, through structured co-design sessions between researchers and designers, or through analyses that give voice to teachers’ first-hand experiences in using the tools. This participatory design process involves testing early prototypes (Martinez-Maldonado; van Leeuwen et al.), reflecting on failures (Tissenbaum & Slotta), improving designs based on evidence of how students and teachers interact with the computer system, and revealing “productive tensions” as foci for future research and development (Martinez-Maldonado). Ensuring that tools are taken up by teachers, effectively integrated into their existing classroom practices, and sustained in the long term requires that their designs incorporate teachers’ ideas and address teachers’ concerns (DiSalvo et al. 2017).

Questions for the future

Collectively, the contributions of this special issue move CSCL a step forward by offering a systematic examination of how real-time technologies can support the orchestration of collaborative learning. They describe behaviors that data can illuminate (e.g., students’ progress, confusion), when and how the data can be leveraged (e.g., for immediate, post-activity, or later use), how teaching guidance can be coupled with data (e.g., advice on how to respond to certain situations, theory-informed guidance, scripts for orchestrating activities), and the impacts on teachers and learners (e.g., enhancing teachers’ abilities to notice and interpret their students’ difficulties, and improving students’ conceptual understanding, metacognitive behaviors, and situational interest). Taken together, these studies offer insights into the design, measurement, and implementation of real-time technologies in CSCL environments.

More importantly, these contributions highlight critical considerations for designing effective real-time orchestration support. For instance, how much data is too much data? When are aggregate visualizations better than individual data points, particularly in seeking to reduce teachers’ orchestrational load while also enabling them to differentiate their instruction (Prieto et al. 2015; Tissenbaum and Slotta 2019)? What are the trade-offs of the real-time availability of data, when, for example, teachers may be inclined to act immediately on seeing it, although it might be better to spend time reflecting on it (Martinez-Maldonado)? When and how can real-time data support different teaching goals, and how can we facilitate their integration into classroom environments? Understanding what data teachers and students need and when, how to represent those data to promote meaningful action, and how to guide the effective use of these data are critical questions for CSCL (Wise and Schwarz 2017). Even as the contributions address these issues, they also emphasize that they remain goals for future research and design.

Real-time dashboards have the potential to reveal new insights into CSCL teaching and learning. However, it will be important that advances in real-time technology keep pace with what is known about effectively supporting teaching and learning in CSCL classrooms. Moreover, it will be important for researchers to innovate theoretical and methodological approaches for exploring the impacts of dashboards on teaching and learning in CSCL environments, as existing approaches may prove no longer adequate. Such efforts will inform ways to incorporate real-time data into CSCL classrooms that complement and enhance teachers’ roles while maintaining their autonomy. Including stakeholders in the design process, as Martinez-Maldonado (and others in this issue) explains, is necessary to achieve this goal.

Finally, we also need to acknowledge that classroom technologies will change. We are barely a decade into the technological boom that introduced smartphones and tablets, and their effects on classrooms are still being understood. Meanwhile, new technologies, such as wearables and augmented reality headsets (Holstein et al. 2018), are once again offering the potential to radically change how teachers and learners interact in the classroom. However, as a community of researchers, we need to resist the techno-centric hype of these technologies (Rosé et al. 2017) and focus instead on the broader implications that they have for learning and instruction. The five papers in this special issue consider the orchestrational roles that real-time technologies can play in supporting these goals. While we acknowledge that recent technological advances made this work possible in ways that would have been impossible in years past, we believe that the findings from these papers—from how, when, and what information should be provided, to whom, and why it should be provided—transcend the particular technologies used. We look forward to seeing how this work will inform future CSCL studies on the possibilities for real-time classroom orchestration.

About this special issue

This is the first special issue published in ijCSCL. It was important for the editors that the special issue went through a normal review process, including a double-blind independent review with experienced reviewers. The results should be an inspiration for the CSCL community to explore the questions, themes, and concerns raised in the five papers. Many scholars talk about data collection and data use in classrooms and in CSCL settings. Without advanced studies that build on a theoretical foundation, accepted methods, CSCL design, and empirical analysis, we will not move the field forward. All five papers contribute to the CSCL field in very productive ways.

I want to acknowledge the initiative and all the work that the guest editors Camillia Matuk, Mike Tissenbaum, and Bertrand Schneider have done to make this issue of ijCSCL a very important and interesting CSCL contribution.

Sten Ludvigsen, Editor-In-Chief.


References

  1. Baker, R. S., & Inventado, P. S. (2014). Educational data mining and learning analytics. In Learning analytics (pp. 61–75). New York: Springer.
  2. Blikstein, P., & Worsley, M. (2016). Multimodal learning analytics and education data mining: Using computational technologies to measure complex learning tasks. Journal of Learning Analytics, 3(2), 220–238.
  3. Dillenbourg, P. (Ed.) (2011). Trends in orchestration: Second research and technology scouting report. Report on orchestration trends of the European Stellar Network of Excellence in TEL. Retrieved from http://telearn.archives-ouvertes.fr/docs/00/72/24/75/PDF/20110818_stellar_d1.5_trends-in-orchestration.pdf.
  4. Dillenbourg, P. (2013). Design for classroom orchestration. Computers & Education, 69, 485–492.
  5. Dillenbourg, P., Järvelä, S., & Fischer, F. (2009). The evolution of research on computer-supported collaborative learning. In Technology-enhanced learning (pp. 3–19). Dordrecht: Springer.
  6. Dimitriadis, Y. A. (2012). Supporting teachers in orchestrating CSCL classrooms. In Research on e-learning and ICT in education (pp. 71–82). New York: Springer. https://doi.org/10.1007/978-1-4614-1083-6_6.
  7. DiSalvo, B., Yip, J., Bonsignore, E., & DiSalvo, C. (2017). Participatory design for learning. In Participatory design for learning (pp. 3–6). Routledge.
  8. Gerard, L., Matuk, C., & Linn, M. C. (2016). Technology as inquiry teaching partner. Journal of Science Teacher Education, 27, 1–9.
  9. Holstein, K., McLaren, B. M., & Aleven, V. (2018). Student learning benefits of a mixed-reality teacher awareness tool in AI-enhanced classrooms. In International conference on artificial intelligence in education (pp. 154–168). Cham: Springer.
  10. Kuhn, D. (2005). Education for thinking. Cambridge: Harvard University Press.
  11. Linn, M. C., & Eylon, B. S. (2011). Science learning and instruction: Taking advantage of technology to promote knowledge integration. Routledge.
  12. Lonn, S., Aguilar, S. J., & Teasley, S. D. (2015). Investigating student motivation in the context of a learning analytics intervention during a summer bridge program. Computers in Human Behavior, 47, 90–97.
  13. Nussbaum, M., Alvarez, C., McFarlane, A., Gomez, F., Claro, S., & Radovic, D. (2009). Technology as small group face-to-face collaborative scaffolding. Computers & Education, 52(1), 147–153.
  14. Pellegrino, J. W., Chudowsky, N., & Glaser, R. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.
  15. Prieto, L. P., Sharma, K., Wen, Y., & Dillenbourg, P. (2015). The burden of facilitating collaboration: Towards estimation of teacher orchestration load using eye-tracking measures. International Society of the Learning Sciences, Inc.
  16. Raca, M., Tormey, R., & Dillenbourg, P. (2014). Sleepers' lag - study on motion and attention. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 36–43). ACM.
  17. Roschelle, J., Rafanan, K., Estrella, G., Nussbaum, M., & Claro, S. (2010). From handheld collaborative tool to effective classroom module: Embedding CSCL in a broader design framework. Computers & Education, 55(3), 1018–1026.
  18. Roschelle, J., Dimitriadis, Y., & Hoppe, U. (2013). Classroom orchestration: Synthesis. Computers & Education, 69, 523–526.
  19. Rosé, C. P., Ludvigsen, S., Law, N., Cress, U., & Stahl, G. (2017). Divisive or facilitative: The two faces of technology in CSCL. International Journal of Computer-Supported Collaborative Learning, 12(3), 215–220.
  20. Schwarz, B. B., Prusak, N., Swidan, O., Livny, A., Gal, K., & Segal, A. (2018). Orchestrating the emergence of conceptual learning: A case study in a geometry class. International Journal of Computer-Supported Collaborative Learning, 13(2), 189–211.
  21. Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189.
  22. Simon, B., Miklós, Z., Nejdl, W., Sintek, M., & Salvachua, J. (2003). Smart space for learning: A mediation infrastructure for learning services. In Proceedings of the Twelfth International Conference on World Wide Web (pp. 20–24).
  23. Slotta, J. D., Quintana, R. M., & Moher, T. (2018). Collective inquiry in communities of learners. In International handbook of the learning sciences (pp. 308–317). Routledge.
  24. Tissenbaum, M., & Matuk, C. (Co-organizers). (2016). Real-time visualization of student activities to support classroom orchestration. Symposium conducted at the 12th International Conference of the Learning Sciences, Singapore.
  25. Tissenbaum, M., & Slotta, J. D. (2015). Scripting and orchestration of learning across contexts: A role for intelligent agents and data mining. In Seamless learning in the age of mobile connectivity (pp. 223–257). Singapore: Springer.
  26. Tissenbaum, M., & Slotta, J. D. (2019). Developing a smart classroom infrastructure to support real-time student collaboration and inquiry: A 4-year design study. Instructional Science, 1–40.
  27. Van Es, E. A., & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’ interpretations of classroom interactions. Journal of Technology and Teacher Education, 10(4), 571–596.
  28. Van Es, E. A., & Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the context of a video club. Teaching and Teacher Education, 24(2), 244–276.
  29. Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., & Klerkx, J. (2014). Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18(6), 1499–1514.
  30. Wise, A. F., & Schwarz, B. B. (2017). Visions of CSCL: Eight provocations for the future of the field. International Journal of Computer-Supported Collaborative Learning, 12(4), 423–467.

Copyright information

© International Society of the Learning Sciences, Inc. 2019

Authors and Affiliations

  • Camillia Matuk, New York University, New York, USA
  • Michael Tissenbaum, University of Illinois Urbana-Champaign, Champaign, USA
  • Bertrand Schneider, Harvard University, Cambridge, USA
