Research trends and applications of technology-supported peer assessment: a review of selected journal publications from 2007 to 2016

  • Qing-Ke Fu
  • Chi-Jen Lin
  • Gwo-Jen Hwang


This study systematically reviewed 70 empirical studies on technology-supported peer assessment published in seven leading journals from 2007 to 2016. Several dimensions of these studies were investigated, including the adopted technologies, learning environments, application domains, peer-assessment modes, and research issues. Over the 10 years, the adopted technologies changed little, with most studies relying on traditional computers. In terms of learning environments, in the first 5 years most activities were conducted online after class, while in the second 5 years more activities were conducted in the classroom during school hours. Moreover, several researchers have come to regard peer assessment as a frequently adopted teaching strategy and have tried to integrate other learning strategies into peer-assessment activities to strengthen their effectiveness. At the same time, little research engaged students in developing peer-assessment rubrics; that is, most studies employed rubrics developed by teachers. In terms of research issues, developing students' higher-order thinking received the most attention. For future studies, it is suggested that researchers explore the value and effects of adopting emerging technologies (e.g., mobile devices) in peer assessment, as well as engaging students in the development of peer-assessment rubrics, which might enable them to deeply experience the tacit knowledge underlying the standard rubrics provided by the teacher.


Keywords: Applications in subject areas · Interactive learning environments · Pedagogical issues · Teaching/learning strategies



This study is supported in part by the Ministry of Science and Technology of the Republic of China under contract numbers MOST-105-2511-S-011-008-MY3 and MOST-106-2511-S-011-005-MY3.



Copyright information

© Beijing Normal University 2019

Authors and Affiliations

  1. School of Teacher Education, Huzhou University, Huzhou, People's Republic of China
  2. Graduate Institute of Digital Learning and Education, National Taiwan University of Science and Technology, Taipei, Taiwan
