
Diagnosing Students’ Understanding of the Nature of Models

Abstract

Students’ understanding of models in science has been the subject of a number of investigations. The instruments researchers have used are suitable for educational research but, owing to their complexity, cannot be employed directly by teachers. This article presents forced-choice (FC) tasks which, assembled into a diagnostic instrument, are intended to measure students’ understanding of the nature of models efficiently while remaining sensitive enough to detect differences between individuals. To evaluate whether the diagnostic instrument is suitable for its intended use, we propose an approach that meets the demand to integrate students’ responses to the tasks into the validation process. Evidence for validity was gathered based on relations to other variables and on students’ response processes. Students’ understanding of the nature of models was assessed using three methods: FC tasks, open-ended tasks and interviews (N = 448). In addition, concurrent think-aloud protocols (N = 30) were collected. The results suggest that the assessment method and the age of the students have an effect on their understanding of the nature of models. A good understanding of the FC tasks, as well as a convergence of the findings across the three methods, was documented for grades eleven and twelve. This indicates that, for this group, teachers can use the diagnostic instrument for an efficient and, at the same time, valid diagnosis. Finally, the findings of this article may offer an explanation for divergent findings in previous studies as a consequence of the specific methods that were used.

Author information

Correspondence to Sarah Gogolin.

Cite this article

Gogolin, S., Krüger, D. Diagnosing Students’ Understanding of the Nature of Models. Res Sci Educ 47, 1127–1149 (2017). https://doi.org/10.1007/s11165-016-9551-9
