Abstract
This study aims to evaluate students’ ability to process the context information embedded in chemistry problems. To this end, a diagnostic measurement instrument was developed, comprising 28 short-answer items embedded in seven context-based chemistry tasks. Four hundred and ninety-three ninth-graders in Jiangsu, China, took the test. The partial credit Rasch model was applied to establish evidence of the validity and reliability of the instrument. Results showed that the instrument produced reliable and valid measures of students’ context-based chemistry problem-solving skills. Nearly half of the ninth-graders had a high level of context-extracting skills and a moderate level of context-integrating skills, but most had only a low level of context-reasoning or argumentation skills. In addition, most students had unbalanced skill levels, and their skill levels decreased or stayed flat as the complexity of the skill components increased. By developing a reliable and valid measurement instrument, this study provides insight into the extent of students’ skills in dealing with context information in chemistry problems. The findings may help chemistry teachers improve their teaching practices, alert test developers to the care required in designing contexts, and enable further studies relating context-based problem-solving skills to chemistry concept learning and application.
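For reference, the partial credit Rasch model gives the probability that person n with ability θ_n obtains score x on item i with step difficulties δ_ik and maximum score m_i; this is the standard formulation of the model, included here for readers’ convenience rather than as a detail specific to this study’s analysis:

$$P(X_{ni}=x)=\frac{\exp\left[\sum_{k=0}^{x}(\theta_n-\delta_{ik})\right]}{\sum_{j=0}^{m_i}\exp\left[\sum_{k=0}^{j}(\theta_n-\delta_{ik})\right]},\qquad x=0,1,\ldots,m_i,$$

where by convention $\sum_{k=0}^{0}(\theta_n-\delta_{ik})\equiv 0$.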



Ethics declarations
This work was supported by the Shanghai Pujiang Program (No. 2020PJC032) and the MOE Key Research Institute of Humanities and Social Sciences (No. 17JJD880007). All the authors have no relevant financial or non-financial interests to disclose.
Appendix
Task 3-3 (level E3) requires students to identify and extract essential information that is expressed only implicitly in a low-transparency context. Because explicit information about the “reaction of ammonia synthesis” is not presented in the context, students must disregard irrelevant information (e.g., “it is possible to see how the atoms in a molecule move...” or “the ability to monitor the movement of particles”) and identify the essential information needed for the solution (i.e., “dynamical processes of chemical bond breaking and making can be imaged”).
Task 1 was adapted from a released PISA 2015 science unit (Unit CS613). Task 1-2 (level I3) requires students to incorporate relevant information expressed through multiple presentation types in a low-transparency context. A correct response should describe a carbon dioxide cycle, pointing out that while power plants burn biofuels and emit CO2 into the atmosphere, plants take up CO2 during photosynthesis and are in turn converted into biofuels. Students must integrate textual information (i.e., “Biofuel is the fuel produced from plants”) with diagrammatic information (i.e., plants take up CO2 during photosynthesis). In addition, they must disregard irrelevant information, such as the descriptions of ethanol or of CO2 emissions released by burning fossil fuels.
Task 1-4 (level R3) requires students to use relevant information to explain complex cause-effect relations, focusing on the superposition and reciprocal influence of the different relations. The cause-effect relations between the fuel replacement and the required adjustments are complex. Students must first locate the relevant information about natural gas and propane in the text and table, then use the table data to calculate the amount of O2 needed to burn equal amounts of natural gas and propane. Based on these calculations, students decide how to adjust the air and gas intakes: the gas intake should admit more natural gas, whereas the air intake should admit less air to mix with the natural gas, because burning the same amount of natural gas requires less O2 than burning propane.
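As a minimal sketch of the stoichiometry involved, assuming natural gas is modeled as pure methane (an illustrative assumption; the task supplies its own table values), the balanced combustion equations are

$$\mathrm{CH_4 + 2\,O_2 \rightarrow CO_2 + 2\,H_2O},\qquad \mathrm{C_3H_8 + 5\,O_2 \rightarrow 3\,CO_2 + 4\,H_2O},$$

so each mole of methane consumes 2 mol of O2 while each mole of propane consumes 5 mol, which is why switching to natural gas calls for admitting more gas and less air.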
Task 4-4 (level A3) requires students to apply multiple pieces of relevant information, whether data, warrants, or backings, together with an identifiable rebuttal, to support a claim. For this task, students must synthesize multiple pieces of information from three materials with their prior basic chemistry knowledge (e.g., the harmful effects of CO2 emissions on the environment) to support a given claim. They should build arguments on the analysis, synthesis, and evaluation of context information, discussing the advantages of using nitrogen (e.g., readily available, low-cost) and the disadvantages of using carbon dioxide from the perspective of environmental protection or energy consumption.
Cite this article
Chi, S., Wang, Z. & Liu, X. Assessment of Context-Based Chemistry Problem-Solving Skills: Test Design and Results from Ninth-Grade Students. Res Sci Educ 53, 295–318 (2023). https://doi.org/10.1007/s11165-022-10056-8


