
Assessing the Load: Effects of Visual Representation and Task Features on Exam Performance in Undergraduate Molecular Life Sciences

Published in Research in Science Education.

Abstract

Current initiatives to transform undergraduate STEM education in the United States advocate for the use of multidimensional learning, wherein instruction provides opportunities for students to gain competency with fundamental or cross-disciplinary concepts and practices. To achieve this goal, the discipline-based education research (DBER) community is drawing on theories and approaches from the cognitive sciences to better understand how instructional choices may influence student cognition and performance. In this study, we investigate the extent to which adding visual representation to exam questions may alter the cognitive demand placed on the learner, thereby affecting exam performance. Exam questions were crafted in pairs – one form embedded some necessary information within a visual representation, while the other comprised only text – and distributed on exams across an undergraduate molecular life sciences curriculum. Comparison of analogous questions indicates that visual representation does affect performance on most questions; however, the nature of that effect depends on other features (e.g., format, cognitive level) of the task. Adding a visual representation to difficult open-response questions often decreased performance further, while adding a representation to similarly difficult forced-response items was more likely to increase performance. Drawing on cognitive theories of learning, we rationalize how the presence and interactivity of these elements may affect the cognitive load of the task and working memory efficiency. The findings of this study have implications for instructors regarding the design and interpretation of student assessments and call on researchers for deeper investigation of the relationship between student cognition and multidimensional assessments.


Data Availability

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.


Acknowledgements

We thank L. Gloss, W. B. Davis, P. Mixter, and N. Kelp for their feedback and contributions to question design and implementation.

Funding

This material is based on work supported by the National Science Foundation (NSF) Graduate Research Fellowship Program under grant no. DGE-1010619. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

Author information

Contributions

JBA and EGO conceived the experiment. JBA designed the analogous questions, coordinated with instructors to measure student performance, and analyzed the data. EGO supervised the project. JBA and EGO contributed to the writing of the manuscript.

Corresponding author

Correspondence to Jessie B. Arneson.

Ethics declarations

Competing Interests

The authors declare they have no competing interests.

Ethics

The WSU Office of Research Assurances determined that this study is exempt from IRB review because it satisfies the criteria for Exempt Research at 45 CFR 46.101(b)(1).


Cite this article

Arneson, J.B., Offerdahl, E.G. Assessing the Load: Effects of Visual Representation and Task Features on Exam Performance in Undergraduate Molecular Life Sciences. Res Sci Educ 53, 319–335 (2023). https://doi.org/10.1007/s11165-022-10057-7
