
Investigating the impacts of differentiated stimulus materials in a learning by evaluating activity

Published in: International Journal of Technology and Design Education

Abstract

Classroom research has demonstrated that engaging students in the evaluation of previously submitted work, as an intentional priming exercise for learning, can significantly influence student learning; we call this experience Learning by Evaluating (LbE). Expanding on current LbE research, we investigated the impact on student learning of intentionally varying the quality of the examples students evaluated using adaptive comparative judgement (ACJ). University design students (N = 468) were randomly assigned to one of three treatment groups; each group evaluated previously collected student work as an LbE priming activity, but the quality of the work evaluated differed by group. In this three-group experimental design, one group evaluated only high-quality examples, the second evaluated only low-quality examples, and the third evaluated a set of mixed-quality examples of the assignment they were about to complete. Following these LbE priming evaluations, students completed the assigned work, and their projects were then evaluated to determine whether student performance differed by treatment condition. Additional qualitative analysis of student LbE rationales explored similarities and differences in students' cognitive judgements across the intervention groups. No significant difference in achievement was found between the groups, but several differences in how the groups approached judgement were identified, and areas needing future investigation were highlighted.
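The adaptive comparative judgement (ACJ) process referenced above ranks student work through repeated pairwise comparisons rather than absolute marking. As a minimal illustrative sketch (not the study's actual software; the function name and toy data are hypothetical), pairwise wins can be converted into quality scores with a Bradley-Terry-style iterative fit, a common model family for comparative judgement:

```python
def bradley_terry(items, comparisons, iters=200):
    """Fit quality scores from pairwise (winner, loser) outcomes using
    the classic Bradley-Terry iterative update."""
    p = {i: 1.0 for i in items}      # current score estimates
    wins = {i: 0 for i in items}     # total wins per item
    pairs = {}                       # unordered pair -> comparison count
    for winner, loser in comparisons:
        wins[winner] += 1
        key = tuple(sorted((winner, loser)))
        pairs[key] = pairs.get(key, 0) + 1
    for _ in range(iters):
        new_p = {}
        for i in items:
            # sum over all pairs involving item i
            denom = sum(n / (p[a] + p[b])
                        for (a, b), n in pairs.items() if i in (a, b))
            new_p[i] = wins[i] / denom if denom > 0 else p[i]
        total = sum(new_p.values())  # renormalise each pass
        p = {i: v * len(items) / total for i, v in new_p.items()}
    return p

# Toy round-robin: A beats B and C; B beats C
scores = bradley_terry(["A", "B", "C"],
                       [("A", "B"), ("A", "C"), ("B", "C")])
print(sorted(scores, key=scores.get, reverse=True))  # ['A', 'B', 'C']
```

In ACJ systems the pairings themselves are also chosen adaptively (items with similar current scores are compared more often), which speeds convergence; the sketch above assumes the pairings are already given.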

Fig. 1


References

  • Baker, S. E., & Edwards, R. (2012). How many qualitative interviews is enough? Expert voices and early career reflections on sampling and cases in qualitative research. National Centre for Research Methods, pp. 1–42.

  • Baniya, S., Chesley, A., Mentzer, N., Bartholomew, S., Moon, C., & Sherman, D. (2019). Using adaptive comparative judgment in writing assessment: An investigation of reliability among interdisciplinary evaluators. Journal of Technology Studies, 45(2), 24–45.

    Google Scholar 

  • Bartholomew, S. R., Strimel, G. J., Garcia Bravo, E., Zhang, L., & Yoshikawa, E. (2018). Formative feedback for improved student performance through adaptive comparative judgment. Paper presented at the 125th ASEE conference, Salt Lake City, Utah.

  • Bartholomew, S. R., & Yoshikawa, E. (2018). A systematic review of research around adaptive comparative judgment (ACJ) in K-16 education. 2018 CTETE Monograph Series. https://doi.org/10.21061/ctete-rms.v1.c.1.

  • Bartholomew, S. R., & Yauney, J. (2022). The impact of differentiated stimulus materials in learning by evaluating. Pupils’ Attitudes Towards Technology 39th Annual Conference, St. John’s, Canada, 2022. https://par.nsf.gov/servlets/purl/10340851.

  • Bartholomew, S. R. (2017). Assessing open-ended design problems. Technology and Engineering Education Teacher, 76(6), 13–17.

    Google Scholar 

  • Bartholomew, S. R., Mentzer, N., Jones, M., Sherman, D., & Baniya, S. (2020). Learning by evaluating (LbE) through adaptive comparative judgment. International Journal of Technology and Design Education. https://doi.org/10.1007/s10798-020-09639-1

    Article  Google Scholar 

  • Bartholomew, S. R., Strimel, G. S., & Yoshikawa, E. (2018b). Using adaptive comparative judgment for student formative feedback and learning during a middle school open- ended design challenge. International Journal of Technology and Design Education, 29(2), 363–385.

    Article  Google Scholar 

  • Bartholomew, S. R., Zhang, L., Garcia Bravo, E., & Strimel, G. J. (2019). A tool for formative assessment and learning in a graphics design course: Adaptive comparative judgement. The Design Journal, 22(1), 73–95.

    Article  Google Scholar 

  • Bramley, T. (2015). Investigating the reliability of adaptive comparative judgment (p. 36). Cambridge Assessment.

    Google Scholar 

  • Buckley, J., Seery, N., & Kimbell, R. (2022). A review of the valid methodological use of adaptive comparative judgment in technology education research. Frontiers in Education, 7, 787926. https://doi.org/10.3389/feduc.2022.787926

    Article  Google Scholar 

  • Caniglia, J. (2020). Promoting self-reflection over re-teaching: Addressing students misconceptions with “my favorite no.” Journal of Mathematics Education, 5(2), 70–78.

    Google Scholar 

  • Canty, D. (2012). The impact of holistic assessment using adaptive comparative judgment of student learning, PhD Thesis, University of Limerick, Ireland.

  • Collins, A. (2022). Cognitive apprenticeship. Retrieved April 11, 2022 from https://www.isls.org/research-topics/cognitive-apprenticeship/.

  • Collins, R. (2014). Skills for the 21st Century: Teaching higher-order thinking. Curriculum and Leadership Journal, 12(14), 1–8.

    Google Scholar 

  • Dam, R. F., & Siang, T. Y. (2020). Stage 2 in the design thinking process: Define the problem and interpret the results. The Interaction Design Foundation. Retrieved November 15, 2021, from https://www.interaction-design.org/literature/article/stage-2-in-the-design-thinking-process-define-the-problem-and-interpret-the-results.

  • Johnson, C. C., Sondergeld, T. A., & Walton, J. B. (2019). A study of the implementation of formative assessment in three large urban districts. American Educational Research Journal, 56(6), 2408–2438. https://doi.org/10.3102/0002831219842347

    Article  Google Scholar 

  • Johnston, O., Wildy, H., & Shand, J. (2019). A Decade of teacher expectations research 2008–2018: Historical foundations, new developments, and future pathways. Australian Journal of Education, 63(1), 44–73. https://doi.org/10.1177/0004944118824420

    Article  Google Scholar 

  • Kimbell, R. (2018). Constructs of quality and the power of holism in Pupils attitudes towards technology 36th Conference Proceedings, pp. 181–186.

  • Kimbell, R. (2012). The origins and underpinning principles of e-scape. International Journal of Technology and Design Education, 22, 123–124.

    Article  Google Scholar 

  • Kimbell, R. (2021). Examining the reliability of adaptive comparative judgement (ACJ) as an assessment tool in educational settings. International Journal of Technology and Design Education. https://doi.org/10.1007/s10798-021-09654-w

    Article  Google Scholar 

  • Mentzer, N., Lee, W., & Bartholomew, S. R. (2021). Examining the validity of adaptive comparative judgment for peer evaluation in a design thinking course. Frontiers in Education, 6, 772832.

    Article  Google Scholar 

  • Miksza, P. (2011). A review of research on practicing: Summary and synthesis of the extant research with implications for a new theoretical orientation. Bulletin of the Council for Research in Music Education, 190, 51–92. https://doi.org/10.5406/bulcouresmusedu.190.0051

    Article  Google Scholar 

  • Pollitt, A. (2004). Let’s stop marking exams. Retrieved from http://www.cambridgeassessment.org.uk/images/109719-let-s-stop-marking-exams.pdf.

  • Pollitt, A. (2015). On ‘reliability’ bias in ACJ. Cambridge Exam Research. Retrieved April 23, 2020 from https://www.researchgate.net/publication/283318012_On_’Reliability’_bias_in_ACJ.

  • Pollitt, A. (2012). The method of adaptive comparative judgement. Assessment in Education: Principles, Policy and Practice, 19(3), 281–300.

    Google Scholar 

  • Rangel-Smith, C., & Lynch, D. (2018). Addressing the issue of bias in the measurement of reliability in the method of adaptive comparative judgment. In 36th pupils’ attitudes towards technology conference, Athlone, Ireland, pp. 378–387.

  • Robertson, S., Humphrey, S., & Steele, J. (2019). Using technology tools for formative assessments. The Journal of Educators Online. https://doi.org/10.9743/jeo.2019.16.2.11

    Article  Google Scholar 

  • Saldaña, J. (2015). The coding manual for qualitative researchers. SAGE.

    Google Scholar 

  • Schwartz, D. L., Chase, C. C., Oppezzo, M. A., & Chin, D. B. (2011). Practicing versus inventing with contrasting cases: The effects of telling first on learning and transfer. Journal of Educational Psychology, 103(4), 759.

    Article  Google Scholar 

  • Seery, N., & Canty, D. (2017). Assessment and learning: The proximal and distal effects of comparative judgment. Handbook of Technology Education. https://doi.org/10.1007/978-3-319-38889-2_54-1

    Article  Google Scholar 

  • Sherman, D., Mentzer, N., Bartholomew, S., et al. (2022). Across the disciplines: our gained knowledge in assessing a firstyear integrated experience. Int J Technol Des Educ, 32, 1369–1391. https://doi.org/10.1007/s10798-020-09650-6.

    Article  Google Scholar 

  • Thurstone, L. L. (1927). A law of comparative judgment. Psychological Review, 34, 273–286.

    Article  Google Scholar 

  • Wible, S. (2020). Using design thinking to teach creative problem solving in writing courses. College Composition and Communication, 71(3), 399–425.

    Article  Google Scholar 

Download references

Funding

This material is based upon work supported by the National Science Foundation under Grant 2101235.

Author information


Corresponding author

Correspondence to Scott Bartholomew.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Disclaimer

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Bartholomew, S., Yauney, J., Mentzer, N. et al. Investigating the impacts of differentiated stimulus materials in a learning by evaluating activity. Int J Technol Des Educ (2024). https://doi.org/10.1007/s10798-023-09871-5

