An Empirically Grounded Framework That Evaluates Argument Quality in Scientific and Social Contexts

International Journal of Science and Mathematics Education
Abstract

This study aimed to develop a general argumentation framework for evaluating the quality of causal arguments across scientific and social contexts. We designed a computer-delivered assessment containing four scenario-based argumentation tasks. Each task asks students to identify relevant evidence from provided data sources and to use that evidence to construct an argument that answers a causal question. One task addresses a social issue, while each of the other three tasks requires knowledge of a scientific concept (melting/evaporation, photosynthesis, or trophic cascade). The assessment was administered to 349 students from urban middle and high schools. Based on these data and prior research, we developed an empirically grounded argumentation framework with four qualitatively different levels: non-causal arguments, causal arguments lacking logical connections, causal arguments with weak reasoning, and causal arguments with strong reasoning. The qualitative results provide evidence for the existence of the argumentation levels, and the IRT analysis and Wright map provide evidence that the order of, and distinctions among, the levels are meaningful. Together, the qualitative and quantitative results support the viability of the framework.
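To make the quantitative method concrete, below is a minimal sketch (in Python, using only NumPy) of the kind of Rasch/IRT analysis and Wright map the abstract refers to. This is not the authors' analysis: the response data are simulated, each task is dichotomized for brevity (the study scored four polytomous levels, for which a partial credit model would be the natural choice), and the task difficulties and all parameter values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Simulate 349 students x 4 tasks under a Rasch model (illustrative only).
# A response of 1 stands for "causal argument with weak reasoning or better".
true_theta = rng.normal(0.0, 1.0, size=349)      # person abilities (logits)
true_b = np.array([-0.8, -0.2, 0.4, 1.0])        # assumed task difficulties
prob = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
X = (rng.random(prob.shape) < prob).astype(int)  # 0/1 response matrix

# Persons who answer all 0s or all 1s have no finite ML estimate; drop them.
keep = (X.sum(axis=1) > 0) & (X.sum(axis=1) < X.shape[1])
X = X[keep]

# Joint maximum likelihood: alternate damped Newton updates for abilities
# (theta) and difficulties (b); anchor the scale by centering b at zero.
theta = np.zeros(X.shape[0])
b = np.zeros(X.shape[1])
for _ in range(100):
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    W = P * (1.0 - P)
    theta += np.clip((X - P).sum(axis=1) / W.sum(axis=1), -1.0, 1.0)
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    W = P * (1.0 - P)
    b += np.clip((P - X).sum(axis=0) / W.sum(axis=0), -1.0, 1.0)
    b -= b.mean()

# Text Wright map: person distribution (left) against task difficulties
# (right) on one shared logit scale.
print(f"{'logit':>6} | {'persons':<30} | tasks")
for lo in np.arange(-3.0, 3.0, 0.5):
    n = int(((theta >= lo) & (theta < lo + 0.5)).sum())
    tasks = " ".join(f"T{i + 1}" for i, d in enumerate(b) if lo <= d < lo + 0.5)
    print(f"{lo:6.1f} | {'#' * min(n, 30):<30} | {tasks}")

Because the Wright map places students and tasks on the same logit scale, a set of levels is "meaningfully ordered" when its difficulty thresholds sit progressively higher on that scale; the dichotomous model above shows the mechanics in the simplest case.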


Funding

The work reported in this article was funded by Educational Testing Service under Challenge IX.

Author information

Corresponding author

Correspondence to Hui Jin.


About this article


Cite this article

Jin, H., Yan, D., Mehl, C.E. et al. An Empirically Grounded Framework That Evaluates Argument Quality in Scientific and Social Contexts. Int J of Sci and Math Educ 19, 681–700 (2021). https://doi.org/10.1007/s10763-020-10075-9
