Abstract
Massive Open Online Courses (MOOCs) have gained popularity in recent years as a new form of open learning. Unlike assessment in classroom settings, assessing learning in open environments such as MOOCs poses a major pedagogical challenge. Scalable assessment methods are therefore needed to accredit and recognize learning in MOOCs efficiently and effectively. Peer assessment is increasingly discussed in the recent MOOC literature as a potential solution to this challenge. The question remains, however, how to ensure the quality of peer assessment feedback. In this paper, we investigate the potential of rubric-based peer assessment to make the assessment process in blended MOOCs (bMOOCs) more effective in terms of transparency, validity, and reliability. Moreover, we explore which peer assessment model fits best in a bMOOC context.
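To illustrate how the validity of rubric-based peer grades can be checked in practice, the sketch below aggregates per-criterion rubric scores from several peer reviewers into one grade per submission and correlates those grades with tutor grades. This is a minimal illustrative sketch, not the method reported in the paper: the rubric criteria, weights, sample scores, and the choice of Pearson correlation as the agreement measure are all assumptions made here.

```python
# Minimal sketch: aggregating rubric-based peer scores and comparing them
# with tutor grades via Pearson correlation. All rubric criteria, weights,
# and sample data below are hypothetical, not taken from the paper.
from statistics import mean
from math import sqrt

# Hypothetical rubric: criterion -> weight (weights sum to 1.0).
RUBRIC_WEIGHTS = {"content": 0.4, "structure": 0.3, "originality": 0.3}

def weighted_score(ratings: dict) -> float:
    """Collapse one reviewer's per-criterion ratings (0-10) into a single grade."""
    return sum(RUBRIC_WEIGHTS[c] * r for c, r in ratings.items())

def peer_grade(reviews: list) -> float:
    """Average the weighted scores given by all peer reviewers of one submission."""
    return mean(weighted_score(r) for r in reviews)

def pearson(xs: list, ys: list) -> float:
    """Pearson correlation coefficient between two equally long score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    # Three submissions, each reviewed by two peers (hypothetical data).
    peer_reviews = [
        [{"content": 8, "structure": 7, "originality": 6},
         {"content": 7, "structure": 8, "originality": 7}],
        [{"content": 5, "structure": 6, "originality": 4},
         {"content": 6, "structure": 5, "originality": 5}],
        [{"content": 9, "structure": 9, "originality": 8},
         {"content": 8, "structure": 9, "originality": 9}],
    ]
    tutor_grades = [7.5, 5.0, 9.0]  # hypothetical reference grades
    peer_grades = [peer_grade(r) for r in peer_reviews]
    print("peer grades:", [round(g, 2) for g in peer_grades])
    print("peer vs. tutor (Pearson r):", round(pearson(peer_grades, tutor_grades), 3))
```

In this sketch, a correlation close to 1 would suggest that aggregated peer grades track the tutor grades closely; inter-rater reliability could be examined analogously by correlating the grades given by different reviewers of the same submissions.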
Keywords
- Massive open online courses
- MOOCs
- Blended MOOCs
- bMOOCs
- Peer assessment
- Collaborative learning
- Rubrics
- Peer feedback
Copyright information
© 2016 Springer International Publishing Switzerland
About this paper
Cite this paper
Yousef, A.M.F., Wahid, U., Chatti, M.A., Schroeder, U., Wosnitza, M. (2016). The Impact of Rubric-Based Peer Assessment on Feedback Quality in Blended MOOCs. In: Zvacek, S., Restivo, M., Uhomoibhi, J., Helfert, M. (eds) Computer Supported Education. CSEDU 2015. Communications in Computer and Information Science, vol 583. Springer, Cham. https://doi.org/10.1007/978-3-319-29585-5_27
DOI: https://doi.org/10.1007/978-3-319-29585-5_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-29584-8
Online ISBN: 978-3-319-29585-5
eBook Packages: Computer Science, Computer Science (R0)