Assessment of Context-Based Chemistry Problem-Solving Skills: Test Design and Results from Ninth-Grade Students

Research in Science Education

Abstract

This study aims to evaluate students’ ability to process the context information embedded in chemistry problems. To this end, a diagnostic measurement instrument was developed, comprising 28 short-answer items embedded in seven context-based chemistry tasks. Four hundred and ninety-three ninth-graders in Jiangsu, China, took part in the testing. The partial credit Rasch model was applied to establish evidence of the validity and reliability of the measurement instrument. Results showed that the instrument produced reliable and valid measures of students’ context-based chemistry problem-solving skills. Nearly half of the ninth-grade students had a high level of context-extracting skills and a moderate level of context-integrating skills. However, most ninth-graders had only a low level of context-reasoning or argumentation skills. In addition, most students had unbalanced skill levels: their skill levels decreased, or at best stayed the same, as the complexity of the skill components increased. By developing a reliable and valid measurement instrument, this study provides insight into the extent of students’ skills in dealing with context information in chemistry problems. The findings may help chemistry teachers improve their teaching practices, alert test developers to the care required in designing contexts, and enable further studies relating context-based problem-solving skills to chemistry concept learning and application.
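As background only (this is the standard formulation of the model named in the abstract, not reproduced from the article page itself): under the partial credit Rasch model, the probability that person n with ability θ_n scores in category k of item i, which has step difficulties δ_{i1}, …, δ_{im_i}, is

P_{nik} = \frac{\exp\sum_{j=0}^{k}(\theta_n - \delta_{ij})}{\sum_{h=0}^{m_i}\exp\sum_{j=0}^{h}(\theta_n - \delta_{ij})}, \quad k = 0, 1, \ldots, m_i, \qquad \text{with } \sum_{j=0}^{0}(\theta_n - \delta_{ij}) \equiv 0.

Person and item measures, reliability indices, and fit statistics are estimated from θ_n and δ_{ij} under this model; the article’s specific estimation settings are not shown on this page.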



Author information

Corresponding author

Correspondence to Shaohui Chi.

Ethics declarations

This work was supported by the Shanghai Pujiang Program (No. 2020PJC032) and the MOE Key Research Institute of Humanities and Social Sciences (No. 17JJD880007). The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix


Tables 8, 9, 10, 11, and 12

Table 8 Exemplary task for level E3

Task 3-3 (level E3) requires students to identify and extract the essential information implicitly expressed from a low transparent context. Because the explicit information about “reaction of ammonia synthesis” is not presented in the context, students must resist a bunch of irrelevant information (i.e., “it is possible to see how the atoms in a molecule move...’ or ‘the ability to monitor the movement of particles”) and identify the essential information from the context needed for the solution (i.e., “dynamical processes of chemical bond breaking and making can be imaged”).
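For reference (the reaction is only implied, not printed, in the excerpt above; the equation given here is common chemistry knowledge rather than material from the article), the ammonia synthesis the task alludes to is

N_2 + 3\,H_2 \rightleftharpoons 2\,NH_3

so the “bond breaking and making” that can be imaged refers to the N≡N and H–H bonds breaking and the N–H bonds forming.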

Table 9 Exemplary task for level I3

Task 1 was adapted from a released PISA 2015 science unit (Unit CS613). Task 1-2 (level I3) requires students to integrate relevant information presented in multiple representation formats in a context with low transparency. A correct response should describe a carbon dioxide cycle, pointing out that although power plants emit CO2 into the atmosphere when they burn biofuels, plants take up CO2 during photosynthesis and are subsequently converted into biofuels. Students must integrate both textual information (i.e., “Biofuel is the fuel produced from plants”) and diagrammatic information (i.e., plants take up CO2 during photosynthesis). In addition, they must disregard irrelevant information, such as the descriptions of ethanol or of the CO2 emissions released by burning fossil fuels.

Table 10 Exemplary task for level R3

Task 1-4 (level R3) requires students to use relevant information to explain complex cause-effect relations, focusing on the superposition and reciprocal influence of the different relations. The cause-effect relations between the fuel replacement and the required adjustments are complex. Students must first extract the relevant information about natural gas and propane from the text and the table, then use the table data to calculate the amount of O2 needed to burn the same amount of natural gas and of propane. Based on the calculation results, students then decide how to adjust the air and gas intakes: the gas intake should be opened to allow more natural gas, whereas the air intake should be narrowed to allow less air to mix with the natural gas, because burning the same amount of natural gas requires less O2 than burning the same amount of propane.
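The task’s own numbers appear in Table 10 (not reproduced on this page), but the underlying comparison can be sketched from basic combustion stoichiometry, assuming natural gas is treated as methane and “the same amount” means equal volumes (hence equal moles) of gaseous fuel:

\mathrm{CH_4 + 2\,O_2 \rightarrow CO_2 + 2\,H_2O} \qquad (2 mol O_2 per mol of fuel)

\mathrm{C_3H_8 + 5\,O_2 \rightarrow 3\,CO_2 + 4\,H_2O} \qquad (5 mol O_2 per mol of fuel)

For equal molar amounts of fuel, methane therefore requires only two-fifths of the oxygen that propane does, which is the quantitative basis for narrowing the air intake when propane is replaced by natural gas.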

Table 11 Exemplary task for level A3

Task 4-4 (level A3) requires students to apply multiple pieces of relevant information, whether data, warrants, or backings, together with an identifiable rebuttal, to support a claim. For this task, students must synthesize multiple pieces of information from three materials and combine them with their prior basic chemistry knowledge (e.g., the harmful effects of CO2 emissions on the environment) to support a given claim. They should make arguments founded on analysis, synthesis, and evaluation of the context information, discussing the advantages of using nitrogen (e.g., easy to obtain, low cost) and stating the disadvantages of using carbon dioxide from the perspective of environmental protection or energy consumption.

Table 12 Example of item revision (Task 1-2)

About this article

Cite this article

Chi, S., Wang, Z. & Liu, X. Assessment of Context-Based Chemistry Problem-Solving Skills: Test Design and Results from Ninth-Grade Students. Res Sci Educ 53, 295–318 (2023). https://doi.org/10.1007/s11165-022-10056-8
