
Using feedback requests to actively involve assessees in peer assessment: effects on the assessor’s feedback content and assessee’s agreement with feedback

Published in: European Journal of Psychology of Education

A Correction to this article was published on 07 May 2018

This article has been updated

Abstract

Criticizing the common approach of supporting peer assessment by providing assessors with an explication of assessment criteria, recent insights on peer assessment call for support that focuses on assessees, who often assume the passive role of receivers of feedback. Feedback requests, which require assessees to formulate their specific needs for feedback, have therefore been put forward as an alternative way of supporting peer assessment, even though little is known about their exact impact on feedback. Operationalizing effective feedback as feedback that (1) elaborates on the evaluation and (2) with which the receiver agrees, the present study examines how these two variables are affected by feedback requests, compared to an explanation of assessment criteria in the form of a content checklist. Situated against the backdrop of a writing task for 125 first-year students in an educational studies program at university, the study used a 2 × 2 factorial design resulting in four conditions: a control, feedback request, content checklist, and combination condition. The results underline the importance of taking message length into account when studying the effects of support for peer assessment. Although feedback requests did not have an impact on the raw number of elaborations, the proportion of informative elaborations within feedback messages was significantly higher in conditions that used a feedback request. In other words, the feedback request appears to have stimulated students to write more focused messages. In contrast to its effect on feedback content, however, the use of a feedback request did not have a significant effect on agreement with feedback.
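To make the length-controlled measure described above concrete, the following minimal Python sketch computes the proportion of informative elaborations per feedback message and averages it for each cell of the 2 × 2 design. The counts, condition labels, and variable names are invented for illustration and are not taken from the article's data or coding scheme.

```python
# Illustrative sketch (hypothetical data): the proportion of informative
# elaborations within each feedback message, averaged per condition of a
# 2 x 2 design (feedback request x content checklist).

from statistics import mean

# Hypothetical coded feedback messages; counts are invented for illustration only.
messages = [
    {"condition": ("no_request", "no_checklist"), "elaborations": 3, "units": 12},
    {"condition": ("request",    "no_checklist"), "elaborations": 4, "units": 8},
    {"condition": ("no_request", "checklist"),    "elaborations": 5, "units": 15},
    {"condition": ("request",    "checklist"),    "elaborations": 6, "units": 10},
]

def proportion(msg):
    """Share of informative elaborations among all coded units in one message."""
    return msg["elaborations"] / msg["units"]

# Average proportion per condition: a longer message is not automatically
# "better", which is why the raw count and the proportion can diverge.
by_condition = {}
for msg in messages:
    by_condition.setdefault(msg["condition"], []).append(proportion(msg))

for condition, proportions in by_condition.items():
    print(condition, round(mean(proportions), 2))
```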




Change history

  • 07 May 2018

    Due to an error on the publisher’s side, the authors’ corrections to the proofs were unfortunately not incorporated into the original article. As a result, all references to the authors’ own work remained blinded, and minor textual errors went unattended.


Author information


Corresponding author

Correspondence to Michiel Voet.

Additional information

Michiel Voet. Ghent University, Department of Educational Studies—Henri Dunantlaan 2, 9000 Gent, Belgium, michiel.voet@ugent.be

Current themes of research:

History education, inquiry-based learning, teacher training

Most relevant publications in the field of Psychology of Education:

1. Voet, M. & De Wever, B. (2017). History teachers’ knowledge of inquiry methods: an analysis of cognitive processes used during a historical inquiry. Journal of Teacher Education, 68(3), 312–329.

2. Voet, M. & De Wever, B. (2017). Preparing pre-service history teachers for organizing inquiry-based learning: the effects of an introductory training program. Teaching and Teacher Education, 63, 206–217.

3. Voet, M. & De Wever, B. (2016). History teachers’ conceptions of inquiry-based learning, beliefs about the nature of history, and their relation to the classroom context. Teaching and Teacher Education, 55, 57–67.

4. Voet, M. & De Wever, B. (2016). Towards a differentiated and domain-specific view of educational technology: an exploratory study of history teachers’ technology use. British Journal of Educational Technology. Advance online publication.

5. De Wever, B., Hämäläinen, R., Voet, M., & Gielen, M. (2015). A wiki task for first-year university students: the effect of scripting students’ collaboration. The Internet and Higher Education, 25, 37–44.

Mario Gielen. Hasselt University, School for Transportation Sciences—Wetenschapspark 5, 3590 Diepenbeek, Belgium, mario.gielen@uhasselt.be

Current themes of research:

Peer assessment and peer feedback, massive open online courses.

Most relevant publications in the field of Psychology of Education:

1. Gielen, M., & De Wever, B. (2015). Scripting the role of assessor and assessee in peer assessment in a wiki environment: impact on peer feedback quality and product improvement. Computers & Education, 88, 370–386.

2. Gielen, M., & De Wever, B. (2015). Structuring peer assessment: comparing the impact of the degree of structure on peer feedback content. Computers in Human Behavior, 52, 315–325.

3. Gielen, M., & De Wever, B. (2014). Structuring the peer assessment process: a multilevel approach for the impact on product improvement and peer feedback quality. Journal of Computer Assisted Learning, 31(5), 435–449.

4. De Wever, B., Hämäläinen, R., Voet, M., & Gielen, M. (2015). A wiki task for first-year university students: the effect of scripting students’ collaboration. The Internet and Higher Education, 25, 37–44.

Ruth Boelens. Ghent University, Department of Educational Studies—Henri Dunantlaan 2, 9000 Gent, Belgium, ruth.boelens@ugent.be

Current themes of research:

Blended learning, higher education, adult education, peer assessment, computer-supported collaborative learning

Most relevant publications in the field of Psychology of Education:

1. Boelens, R., De Wever, B., Rosseel, Y., Verstraete, A., & Derese, A. (2015). What are the most important tasks of tutors during the tutorials in hybrid problem-based learning curricula? BMC Medical Education, 15(84).

Bram De Wever. Ghent University, Department of Educational Studies—Henri Dunantlaan 2, 9000 Gent, Belgium, bram.dewever@ugent.be

Current themes of research:

Technology-enhanced learning, inquiry learning, computer-supported collaborative learning, peer assessment and peer feedback, blended learning

Most relevant publications in the field of Psychology of Education:

1. De Wever, B., Schellens T., Valcke, M., & Van Keer H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: a review. Computers and Education, 46, 6–28.

2. De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2007). Applying multilevel modelling on content analysis data: methodological issues in the study of the impact of role assignment in asynchronous discussion groups. Learning and Instruction, 17, 436–447.

3. De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2011). Assessing collaboration in a wiki: the reliability of university students’ peer assessment. The Internet and Higher Education, 14, 201–206.

4. Gielen, M., & De Wever, B. (2014). Structuring the peer assessment process: a multilevel approach for the impact on product improvement and peer feedback quality. Journal of Computer Assisted Learning, 31(5), 435–449.

5. Gielen, M., & De Wever, B. (2015). Structuring peer assessment: comparing the impact of the degree of structure on peer feedback content. Computers in Human Behavior, 52, 315–325.

Appendix

Table 7 Coding scheme for peer feedback content, and agreement with peer feedback


About this article


Cite this article

Voet, M., Gielen, M., Boelens, R. et al. Using feedback requests to actively involve assessees in peer assessment: effects on the assessor’s feedback content and assessee’s agreement with feedback. Eur J Psychol Educ 33, 145–164 (2018). https://doi.org/10.1007/s10212-017-0345-x

