“Free selection and invitation” online peer assessment of undergraduates’ research competencies, flow, motivation and interaction in a research methods course


Abstract

This study examined and compared the effects of two types of online peer assessment, namely “free selection and invitation” (FS&I) assessment and the commonly implemented “assigned-pair” (AP) assessment, on undergraduates’ research competencies, flow, motivation, and interaction, supported by the Cloud Classroom online learning system. Ninety-three undergraduates from a research methods course participated in this study. They were randomly divided into two groups: an FS&I group and an AP group. The two groups experienced exactly the same teaching conditions except for the type of online peer assessment. The study was conducted over one semester (16 weeks). Both quantitative and qualitative methods were used to examine the effects of the two types of online peer assessment. Regarding research competencies, represented here by conceptual knowledge and the quality of research proposals, the study found no significant difference between the FS&I and AP groups in conceptual knowledge scores, whereas the FS&I group produced significantly higher-quality research proposals than the AP group. The FS&I group also manifested higher levels of flow and motivation than the AP group. Additionally, social network analysis (SNA) revealed that the FS&I group exhibited more interactions and closer connections with peers than the AP group. These results suggest that FS&I online peer assessment provides effective scaffolding that can improve undergraduates’ research competencies, learning engagement, and willingness to interact. The implications of this study are also discussed.
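The abstract notes that social network analysis (SNA) was used to compare the interaction patterns of the two groups. As a minimal illustrative sketch only, not the authors' actual analysis pipeline, the snippet below shows how peer-assessment interactions, represented here as a hypothetical directed edge list of assessor-to-assessee pairs, could be summarized with common SNA indicators such as network density and degree centrality using the Python networkx library; the data, student IDs, and variable names are all assumptions.

```python
# Minimal, illustrative SNA sketch; the edge list below is hypothetical and does
# not come from the study. Each pair means "student X assessed student Y's work".
import networkx as nx

peer_assessment_edges = [
    ("S01", "S05"), ("S01", "S12"), ("S05", "S01"),
    ("S12", "S07"), ("S07", "S12"), ("S05", "S07"),
]

G = nx.DiGraph()
G.add_edges_from(peer_assessment_edges)

# Density: the proportion of possible directed ties that actually occur,
# a simple indicator of how interconnected the group is.
density = nx.density(G)

# Degree centrality: how actively each student gives (out) and receives (in) feedback.
out_centrality = nx.out_degree_centrality(G)
in_centrality = nx.in_degree_centrality(G)

print(f"Network density: {density:.3f}")
print("Most active assessors:",
      sorted(out_centrality, key=out_centrality.get, reverse=True)[:3])
print("Most assessed students:",
      sorted(in_centrality, key=in_centrality.get, reverse=True)[:3])
```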


Acknowledgements

The authors would like to thank Xing Li and Jue Wang for their valuable advice, which helped improve the article.

Funding

This study was funded by the National Natural Science Foundation of China (Grant No. 72274076), the Teacher Education College of Central China Normal University (Grant No. CCNUTEIII 2021-06), and the Faculty of Artificial Intelligence Education, CCNU (Grant No. 2021ZNZJ012).

Author information

Corresponding authors

Correspondence to Yi Zhang or Yuqin Yang.

Additional information


The preliminary findings of this study were presented at the 14th International Conference on Blended Learning (ICBL 2021), where the conference paper received the “Excellent Paper Award”. This study was invited for the Special Issue “Redefining the Learning Process through Educational and Technological Innovations” of the Journal of Computing in Higher Education.

Appendix

Evaluation rubric for research topic concept maps

Each criterion is scored from 5 (highest) to 1 (lowest).

Title

  5: The research title is very concise, clear and standard
  4: The research title is concise, clear and standard
  3: The research title is moderately concise, clear and standard
  2: The research title is not sufficiently concise, clear, or standard
  1: The research title is too complex and abstract

Elements

  5: The research topic contains sufficient key elements and adequately reflects the relationships between them
  4: The research topic contains sufficient key elements and reflects the relationships between them
  3: The research topic contains the basic key elements and reflects the relationships between them
  2: The research topic contains the basic key elements but does not reflect the relationships between them
  1: The research topic does not contain all the elements

Value

  5: The research is very valuable and gives a very good picture of a current educational topic
  4: The research has value and reflects a current educational topic
  3: The research is of moderate value and somewhat reflects a current educational topic
  2: The research is not sufficiently valuable or does not reflect an educational topic
  1: The research is not valuable or does not reflect an educational topic

Evaluation rubric for research proposals

Each criterion is scored from 5 (highest) to 1 (lowest).

Research topic

  5: The research title is very concise, clear and standard
  4: The research title is concise, clear and standard
  3: The research title is moderately concise, clear and standard
  2: The research title is not sufficiently concise, clear, or standard
  1: The research title is too complex and abstract

  5: The research topic contains sufficient key elements and adequately reflects the relationships between them
  4: The research topic contains sufficient key elements and reflects the relationships between them
  3: The research topic contains the basic key elements and reflects the relationships between them
  2: The research topic contains the basic key elements but does not reflect the relationships between them
  1: The research topic does not contain all the elements

  5: The research is very valuable and gives a very good picture of a current educational topic
  4: The research has value and reflects a current educational topic
  3: The research is of moderate value and somewhat reflects a current educational topic
  2: The research is not sufficiently valuable or does not reflect an educational topic
  1: The research is not valuable or does not reflect an educational topic

Literature review

  5: The literature is adequate and highly relevant
  4: The literature is adequate and relevant
  3: The literature is mostly adequate and relevant
  2: The literature is not adequate or not highly relevant
  1: The literature lacks unity and coherence

  5: The review adequately addresses previous research
  4: The review addresses previous research
  3: The review is mostly adequate
  2: The review is not adequate
  1: The literature lacks a clear organization and structure

  5: The review is very critical
  4: The review is critical
  3: There are some critical comments in the literature
  2: There are only a few critical comments in the literature
  1: There are no critical comments in the literature

Research question

  5: The research question is described in a precise and concise way
  4: The research question is described in a precise or concise way
  3: The research question is described in a concise way
  2: The research question is described in an imprecise or wordy way
  1: The research question is vague and convoluted

Research content and method

  5: The research content is highly feasible
  4: The research content is feasible
  3: The research content is basically feasible
  2: The research content is not very feasible
  1: The research content is disorganized and not feasible

  5: The research method is very scientific and suitable
  4: The research method is scientific and suitable
  3: The research method is basically scientific and suitable
  2: The research method is not sufficiently scientific or suitable
  1: The research method is under- or unspecified

Research plan

  5: The research plan is very reasonable and specific
  4: The research plan is reasonable and specific
  3: The research plan is moderately reasonable and specific
  2: The research plan is not sufficiently reasonable or specific
  1: The research plan is not workable

References

  5: The references are authoritative and new
  4: Most of the references are authoritative and new
  3: Some of the references are authoritative and new
  2: Only a few of the references are authoritative and new
  1: The references are outdated

  5: The references are of immediate relevance to the topic
  4: Most of the references are of immediate relevance to the topic
  3: Some of the references are of immediate relevance to the topic
  2: Only a few of the references are of immediate relevance to the topic
  1: The references are irrelevant or trivial

  5: The number of references is over 25
  4: The number of references is between 20 and 25
  3: The number of references is between 15 and 20
  2: The number of references is between 10 and 15
  1: The number of references is fewer than 10
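The article does not report how the per-criterion rubric scores were aggregated, so the following is only a hedged sketch of one plausible way to combine them: each rater assigns a 1 to 5 score per criterion of the research proposal rubric above, the scores are averaged within each rater and then across raters. The criterion keys, rater names, and the unweighted averaging are assumptions introduced purely for illustration.

```python
# Hypothetical aggregation of research proposal rubric scores (1-5 per criterion);
# the study does not specify this procedure, so names and weighting are assumed.
from statistics import mean

rater_scores = {
    "rater_A": {"research_topic": 4, "literature_review": 3, "research_question": 5,
                "content_and_method": 4, "research_plan": 3, "references": 4},
    "rater_B": {"research_topic": 5, "literature_review": 4, "research_question": 4,
                "content_and_method": 4, "research_plan": 4, "references": 3},
}

# Unweighted aggregation: average over criteria for each rater, then across raters.
per_rater_means = {rater: mean(scores.values()) for rater, scores in rater_scores.items()}
overall_quality = mean(per_rater_means.values())

print(per_rater_means)
print(f"Overall proposal quality score: {overall_quality:.2f} / 5")
```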


About this article


Cite this article

Lin, Y., Zhang, Y., Yang, Y. et al. “Free selection and invitation” online peer assessment of undergraduates’ research competencies, flow, motivation and interaction in a research methods course. J Comput High Educ (2023). https://doi.org/10.1007/s12528-023-09374-1
