
The Bi-directional Effect Between Data and Assessments in the Digital Age

A chapter in Re-imagining University Assessment in a Digital World, part of the book series The Enabling Power of Assessment (EPAS, volume 7).

Abstract

Assessment and feedback, two essential elements of any learning experience in higher education, are being significantly disrupted by the emerging ecosystem of increasingly technology-mediated learning environments. Digital learning experiences produce data sets containing highly detailed accounts of the interactions among participants. This information offers unprecedented potential to shift the focus of assessment and feedback from the result to the process by which that result is attained. Such a shift may profoundly influence how students engage with their work, compare it against an appropriate standard, and develop their self-evaluative capacity. At the same time, these data sets pose substantial challenges for their integration into the design, deployment, and refinement of learning experiences. In this chapter, we describe the main elements that need to be considered to translate these rich data sets into actions and design decisions that have a positive effect on the student experience.
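To make the idea of translating interaction data into process-focused feedback concrete, consider a minimal sketch. This is our own illustration under stated assumptions, not the chapter's actual method: the Event structure, the action names, the thresholds, and the feedback messages are all hypothetical stand-ins for the far richer learning analytics models the chapter discusses.

```python
# Minimal, hypothetical sketch: deriving process-focused feedback from
# digital traces. Event names, thresholds, and messages are illustrative
# assumptions, not the chapter's actual method.

from collections import Counter
from dataclasses import dataclass
from typing import List


@dataclass
class Event:
    student_id: str
    action: str  # e.g. "video_play", "quiz_attempt", "forum_post"


def feedback_for(events: List[Event], student_id: str) -> str:
    """Map a student's interaction counts to a feedback message about
    their study process (rather than their results)."""
    counts = Counter(e.action for e in events if e.student_id == student_id)
    if counts["quiz_attempt"] == 0:
        return ("You have not attempted the practice questions yet; "
                "trying them early lets you compare your work against a standard.")
    if counts["video_play"] > 3 * counts["quiz_attempt"]:
        return ("You watch many videos relative to practice attempts; "
                "testing yourself after each video strengthens self-evaluation.")
    return "Your mix of study activities looks balanced; keep it up."


if __name__ == "__main__":
    log = [
        Event("s1", "video_play"), Event("s1", "video_play"),
        Event("s1", "video_play"), Event("s1", "video_play"),
        Event("s1", "quiz_attempt"),
        Event("s2", "forum_post"),
    ]
    for sid in ("s1", "s2"):
        print(sid, "->", feedback_for(log, sid))
```

Even this toy rule set shows the design questions the chapter raises: which interactions to log, which patterns warrant intervention, and how to phrase feedback so it targets the learning process rather than the final mark.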



Author information

Corresponding author

Correspondence to Abelardo Pardo.


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Pardo, A., Reimann, P. (2020). The Bi-directional Effect Between Data and Assessments in the Digital Age. In: Bearman, M., Dawson, P., Ajjawi, R., Tai, J., Boud, D. (eds) Re-imagining University Assessment in a Digital World. The Enabling Power of Assessment, vol 7. Springer, Cham. https://doi.org/10.1007/978-3-030-41956-1_12

  • DOI: https://doi.org/10.1007/978-3-030-41956-1_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41955-4

  • Online ISBN: 978-3-030-41956-1

  • eBook Packages: Education (R0)
