
Simulating Peer Assessment in Massive Open On-line Courses

  • Filippo Sciarrone
  • Marco Temperini
Conference paper
Part of the Springer Proceedings in Complexity book series (SPCOM)

Abstract

Peer assessment is a powerful tool for enhancing students' high-level meta-cognitive skills. In this paper we deal with a simulation framework (K-OpenAnswer) that supports peer assessment sessions, in which peers answer a question and assess some of their peers' answers, enriched with "teacher mediation". Teacher mediation is the possibility for the teacher to add information to the network of data built by the peer assessment, by grading some of the answers. This can enhance the automated grading functionality of an educational system supporting peer assessment. We present a software system that applies the K-OpenAnswer simulation framework to simulated Massive Open On-line Courses (MOOCs). The system guides the evolution of student models and grades according to the teacher's interventions, and makes it possible to observe and analyse these dynamics. The aim of this paper is to show the functionalities available to the teacher and their usefulness on simulated MOOCs, with a view to applying the same functionalities to real MOOCs.
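The mediated peer-assessment process summarised above can be illustrated with a small simulation. The sketch below is only an illustration under assumed parameters (cohort size, grading-noise model, number of teacher-graded answers, and a simple override rule for mediation); it does not reproduce the K-OpenAnswer algorithms described in the paper, but shows how a few exact teacher grades can be injected into an otherwise peer-derived grading estimate.

```python
import random
import statistics

# Toy simulation of a mediated peer-assessment session.
# NOTE: all parameters and rules below are illustrative assumptions,
# not the K-OpenAnswer model.

NUM_STUDENTS = 30      # simulated MOOC cohort (assumed size)
PEERS_PER_STUDENT = 3  # answers each student assesses (assumed)
TEACHER_GRADED = 5     # answers the teacher grades directly (assumed)

random.seed(42)

# Each student has a latent competence in [0, 1]; their answer quality
# and their reliability as an assessor both depend on it.
competence = [random.uniform(0.2, 1.0) for _ in range(NUM_STUDENTS)]
true_quality = [min(1.0, max(0.0, c + random.gauss(0, 0.1))) for c in competence]

# Peer assessment: each student grades PEERS_PER_STUDENT other answers;
# less competent assessors add more noise to their grades.
peer_grades = {s: [] for s in range(NUM_STUDENTS)}
for assessor in range(NUM_STUDENTS):
    targets = random.sample(
        [s for s in range(NUM_STUDENTS) if s != assessor], PEERS_PER_STUDENT
    )
    for target in targets:
        noise = random.gauss(0, 0.3 * (1.0 - competence[assessor]))
        grade = min(1.0, max(0.0, true_quality[target] + noise))
        peer_grades[target].append(grade)

# Automated grade before mediation: plain mean of received peer grades.
estimated = {s: statistics.mean(g) for s, g in peer_grades.items() if g}

# Teacher mediation (simplified): the teacher grades a few answers exactly,
# and those grades override the peer-based estimates. In a real system the
# teacher's grades would instead propagate through the assessment network.
teacher_picks = random.sample(range(NUM_STUDENTS), TEACHER_GRADED)
for s in teacher_picks:
    estimated[s] = true_quality[s]

err = statistics.mean(abs(estimated[s] - true_quality[s]) for s in estimated)
print(f"Mean absolute grading error after mediation: {err:.3f}")
```

Re-running the sketch with a larger TEACHER_GRADED value lowers the mean error, which is the intuition behind teacher mediation: each teacher-graded answer adds reliable information to the grading process.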


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Engineering, Roma TRE University, Rome, Italy
  2. DIAG - Dipartimento di Ingegneria Informatica, Automatica e Gestionale, Sapienza University of Rome, Rome, Italy
