Technology, Knowledge and Learning, Volume 21, Issue 1, pp 1–3

Learning with Data: Visualization to Support Teaching, Learning, and Assessment

Editorial

Within the established paradigm of learning analytics, this special issue has a specific focus on factors of learning associated with interactive data visualizations. Learning analytics elicit, assess, and analyse dynamic information about learners and learning environments for the real-time modelling, prediction, and optimisation of learning processes, learning environments, and educational decision-making (Ifenthaler 2015). Among the opportunities of learning analytics are richer interactions between students and facilitators as well as the availability of personalised and adaptive help and feedback from peer learners in near real-time (Ifenthaler and Widanapathirana 2014).

The advent of big data requires new perspectives on data processing and analysis, including advanced methods and tools for visualising data in support of learning processes. The primary purpose of visualising big data is to communicate the complex patterns nested within it (Chen 2010). However, the visualisation of numeric information using dashboards is not the only important area of research. The visualisation of semantic content is an emerging field of research, opening up new perspectives on natural language processing and the in-depth analysis of text data (Ifenthaler 2014; Ifenthaler and Pirnay-Dummer 2014). This special issue of Technology, Knowledge and Learning features articles which showcase the latest advances in the field of data visualisation for learning.

1 Paper Selection Process

This special issue was assembled from an open call for full manuscripts due by the end of May 2015. Each submitted manuscript was assigned to at least two reviewers from the special issue review board. Based on the reviewers' comments, authors were asked to submit revised manuscripts by the end of December 2015, addressing the comments from both the reviewers and the editors. The final acceptance of manuscripts was completed by the end of January 2016.

2 Contributors to this Special Section

This special section begins with a work-in-progress article, Exploratory analysis in learning analytics. The authors, David C. Gibson (Curtin University) and Sara de Freitas (Murdoch University), summarize the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects.

Brett E. Shelton (Boise State University), Jui-Long Hung (Boise State University) and Sarah Baughman (Boise State University) examine, in Online Graduate Teacher Education: Establishing an EKG for Student Success Intervention, a statistical analysis technique that uses both static and dynamic variables to determine which students are at risk and when an intervention would be most helpful during a semester.

What We Can Learn from the Data: A Multiple-Case Study Examining Behavior Patterns by Students with Different Characteristics in Using a Serious Game by Min Liu (University of Texas at Austin), Jaejin Lee (Seoul National University), Jina Kang (University of Texas at Austin), and Sa Liu (University of Texas at Austin), investigates students’ behaviour patterns in interacting with a serious game environment using data visualization in order to understand how the patterns may vary according to students’ learning characteristics.

Florence Martin (University of North Carolina) and John C. Whitmer (Blackboard, Inc.) examine, in Applying Learning Analytics to Investigate Timed Release in Online Learning, the extent to which the adaptive release feature affected student behaviour in the online environment and course performance.

moocRP: an open learning analytics platform by Zachary A. Pardos (University of California, Berkeley), Anthony Whyte (University of Michigan), and Kevin Kao (University of California, Berkeley) addresses issues of transparency, modularity, and privacy with the introduction of an open-source, web-based data repository and analysis tool tailored to the Massive Open Online Course community.

Justin Olmanson (University of Nebraska Lincoln), Katrina Kennett (University of Illinois Urbana Champaign), Alecia Magnifico (University of New Hampshire), Sarah McCarthey (University of Illinois Urbana Champaign), Duane Searsmith (University of Illinois Urbana Champaign), Bill Cope (University of Illinois Urbana Champaign), and Mary Kalantzis (University of Illinois Urbana Champaign) explore, in Visualizing Revision: Leveraging Student-Generated Between-Draft Diagramming Data in Support of Academic Writing Development, how students use a semantic concept mapping tool to re-present the content and organization of the initial draft of an informational text.

Using Chronologically Oriented Representations of Discourse and Tool-related Activity as an Instructional Method with Student Teachers by Leslie Lopez (University of Hawaii at Manoa) presents a methodological tool for analyzing and deconstructing practice with student teachers prior to their entrance into the classroom. The student-coded data provided opportunities for micro- and macro-level overviews of the labor process of teaching.

The special issue concludes with an emerging technology report, Experience API: Flexible, Decentralized and Activity-Centric Data Collection, by Jonathan M. Kevan (University of Hawaii at Manoa) and Paul R. Ryan (University of Hawaii at Manoa). The authors describe a new e-learning specification designed to support the learning community in standardizing and collecting both formal and informal distributed learning activities.

The eight papers of this special issue demonstrate the many complex interactions between learning analytics, data visualisation, and educational technology. The insightful findings and innovative technologies showcase future directions for applications and new paradigms of research in big data and learning analytics.

References

  1. Chen, C. (2010). Information visualization. Wiley Interdisciplinary Reviews: Computational Statistics, 2(4), 387–403. doi:10.1002/wics.89
  2. Ifenthaler, D. (2014). Toward automated computer-based visualization and assessment of team-based performance. Journal of Educational Psychology, 106(3), 651–665. doi:10.1037/a0035505
  3. Ifenthaler, D. (2015). Learning analytics. In J. M. Spector (Ed.), The SAGE encyclopedia of educational technology (Vol. 2, pp. 447–451). Thousand Oaks, CA: Sage.
  4. Ifenthaler, D., & Pirnay-Dummer, P. (2014). Model-based tools for knowledge assessment. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 289–301). New York, NY: Springer.
  5. Ifenthaler, D., & Widanapathirana, C. (2014). Development and validation of a learning analytics framework: Two case studies using support vector machines. Technology, Knowledge and Learning, 19(1–2), 221–240. doi:10.1007/s10758-014-9226-4

Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. University of Mannheim, Mannheim, Germany
  2. Deakin University, Melbourne, Australia
  3. McKinsey Social Initiative, New York, USA