The validity and value of peer assessment using adaptive comparative judgement in design driven practical education

  • Niall Seery
  • Donal Canty
  • Pat Phelan


This paper presents the response of the technology teacher education programmes at the University of Limerick to the assessment challenge created by the shift in the philosophy of the Irish national curriculum from a craft-based focus to design-driven education. The study observes two first-year modules of the undergraduate programmes that focused on the development of subject knowledge and practical craft skills. Broadening the educational experience and perspective of students to include design-based aptitudes demanded a clear alignment of educational approaches with learning outcomes. Because design is a complex, iterative learning process, it requires a dynamic assessment tool to facilitate and capture that process. Considering the critical role of assessment in the learning process, the study explored the relevance of individual student-defined assessment criteria and the validity of holistic professional judgement in assessing capability within a design activity. The kernel of the paper centres on the capacity of assessment criteria to change in response to how students align their work with evidence of capability. The approach also supported peer assessment: student-generated performance ranks provided insight not only into how effectively students evidenced capability but also into the extent to which their peers valued it. The study investigated the performance of 137 undergraduate teachers during an activity focusing on the development of design, processing and craft skills. The study validates the use of adaptive comparative judgement as a model of assessment by identifying a moderate to strong relationship with performance scores obtained by two different methods of assessment. The findings also present evidence of capability beyond the traditional measures; level of engagement, diversity, and problem solving also emerged as significant outcomes of the approach taken.
The strength of this paper lies in the capacity of student-defined criterion assessment to evidence learning, and it concludes by presenting a valid and reliable holistic assessment supported by comparative judgements.
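To make the mechanics of comparative judgement concrete: judges repeatedly compare two pieces of work and record which is better, and a rank order is then estimated statistically from the pairwise outcomes. The sketch below is illustrative only and is not the authors' implementation; it uses the Bradley-Terry model (a common statistical basis for pairwise-comparison ranking, closely related to the Thurstone and Rasch formulations referenced in this literature), fitted with a simple iterative (MM/Zermelo-style) update. The function name `acj_rank` and the example portfolio labels are assumptions for illustration.

```python
from collections import defaultdict

def acj_rank(judgements, iterations=200):
    """Estimate a rank order from pairwise judgements.

    `judgements` is a list of (winner, loser) pairs, one per comparison.
    A Bradley-Terry strength parameter is fitted for each item by an
    iterative majorisation update, then items are sorted by strength.
    """
    items = {i for pair in judgements for i in pair}
    wins = defaultdict(int)    # total wins per item
    pairs = defaultdict(int)   # number of comparisons per unordered pair
    for winner, loser in judgements:
        wins[winner] += 1
        pairs[frozenset((winner, loser))] += 1

    strength = {i: 1.0 for i in items}
    for _ in range(iterations):
        updated = {}
        for i in items:
            # Denominator of the MM update: sum over opponents j of
            # n_ij / (p_i + p_j), where n_ij counts comparisons of i and j.
            denom = sum(
                pairs[frozenset((i, j))] / (strength[i] + strength[j])
                for j in items
                if j != i and pairs[frozenset((i, j))]
            )
            updated[i] = wins[i] / denom if denom else strength[i]
        # Normalise so the parameters stay on a fixed scale.
        total = sum(updated.values())
        strength = {i: v * len(items) / total for i, v in updated.items()}

    return sorted(items, key=lambda i: strength[i], reverse=True)

# Hypothetical example: four judgements over three portfolios.
ranking = acj_rank([("A", "B"), ("A", "C"), ("B", "C"), ("A", "B")])
```

In practice (as in the e-scape work cited here), the "adaptive" element lies in choosing which pair to present next so that each comparison is maximally informative, and reliability is checked via judge-consistency statistics rather than the fixed judgement list used in this sketch.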


Keywords: Teacher education · Technology education · Holistic assessment · Comparative pairs



Copyright information

© Springer Science+Business Media B.V. 2011

Authors and Affiliations

  1. Department of Design and Manufacturing Technology, University of Limerick, Limerick, Ireland
