Using analytical rubrics to assess technological solutions in the technology classroom

International Journal of Technology and Design Education

Abstract

Technology education, known elsewhere as design and technology, develops learners' technological literacy by, among other things, giving them the opportunity to apply the prescribed design process to solve technological problems. The subject requires learners to develop authentic technological solutions, so sound assessment of those solutions is essential. An analytical rubric is an assessment tool whose explicit criteria enable learners to understand what is expected of them; knowledge without the pertinent skills to realise practical solutions is of little value. This study investigated how technology teachers use analytical rubrics to assess technological solutions. A qualitative research approach was employed to understand teachers' perceptions of analytical rubrics and to investigate how effectively the rubrics assess learners. A case study design was used to identify the participants, and data were collected through semi-structured, face-to-face interviews and a document analysis. To ensure the credibility of the study, the same questions were put to all of the participants. The data were analysed thematically. The study found that teachers still do not explain to learners the key concepts and descriptors used in a rubric. Future research should focus on supporting technology teachers to develop their own analytical rubrics, so that learners become conversant with the terminology used in the rubric and develop a clearer sense of academic direction.
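The abstract's distinction between a rubric's explicit criteria and its level descriptors can be made concrete with a small sketch of an analytical rubric as a data structure. The criterion names, descriptors, and point values below are hypothetical illustrations, not drawn from the study or from any curriculum document:

```python
# Minimal sketch of an analytical rubric as a data structure.
# Criteria, levels, and descriptors are hypothetical examples only.
RUBRIC = {
    "investigate": {  # criterion -> {level: descriptor}
        1: "restates the problem without analysing it",
        2: "identifies some constraints of the problem",
        3: "states the constraints and full design specifications",
    },
    "design": {
        1: "offers a single idea with no annotation",
        2: "offers two ideas with partial annotation",
        3: "offers several annotated ideas and justifies a choice",
    },
}

def score(rubric, selections):
    """Total a learner's mark. `selections` maps each criterion to the
    level the assessor judged the work to have reached."""
    return sum(selections[criterion] for criterion in rubric)

marks = score(RUBRIC, {"investigate": 3, "design": 2})  # -> 5
```

Because each criterion is scored separately against its own descriptors, an analytical rubric of this shape shows a learner which criterion cost them marks, rather than returning a single holistic judgement.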



Author information

Corresponding author

Correspondence to Isaac Malose Kola.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Kola, I.M. Using analytical rubrics to assess technological solutions in the technology classroom. Int J Technol Des Educ 32, 883–904 (2022). https://doi.org/10.1007/s10798-020-09635-5

