Abstract
Technology education, or design technology as it is known elsewhere, supports learners in developing technological literacy by providing them with the opportunity, among other things, to apply the prescribed design process to solve technological problems. The subject requires learners to develop authentic technological solutions, so assessment in this regard is essential. An analytical rubric is an assessment tool with explicit criteria that enables learners to understand what is expected of them. It is emphasised that knowledge without the pertinent skills to realise practical solutions is of little value. This study investigated the manner in which technology teachers use analytical rubrics to assess technological solutions. A qualitative research approach was employed to understand teachers’ perceptions of using analytical rubrics and to investigate how effectively these rubrics assess learners. A case study design was used to select the participants, and data were collected through semi-structured, face-to-face interviews, supplemented by document analysis. To ensure the credibility of the study, the same set of questions was put to all participants, and the data were analysed thematically. The study confirmed that teachers still do not explain to learners the key concepts and descriptors used in a rubric. Future research should focus on supporting technology teachers in developing their own analytical rubrics, so that learners become conversant with the terminology used in the rubric and subsequently develop a sense of academic direction.
Kola, I.M. Using analytical rubrics to assess technological solutions in the technology classroom. Int J Technol Des Educ 32, 883–904 (2022). https://doi.org/10.1007/s10798-020-09635-5