
Assessing Students' Experimentation Processes in Guided Inquiry

Published in: International Journal of Science and Mathematics Education

Abstract

In recent science education, experimentation features ever more strongly as a method of inquiry in science classes rather than as a means to illustrate phenomena. Ideas and materials for teaching inquiry abound, yet tools for assessing students' achievement in their processes of experimentation are lacking. The present study assumes a basal, non-exclusive process model of inquiry in experimentation that can be considered a consensus across multiple approaches: (1) finding an idea/hypothesis, (2) planning and conducting an experiment, and (3) drawing conclusions from evidence. The study confronted 339 secondary-level students with three guided inquiry experimentation tasks on three days. Selected working groups were videotaped while experimenting. All students reported their processes in a structured report form concurrently with their progress. The generated videos and reports were analysed in two steps: (1) experimentation was coded according to the process model into process plots; (2) on the basis of these plots, process-focused performance scores were calculated that take into account the logical coherence and immediacy of the inquiry processes. Correlative analyses show, for two of the tasks, that the report format yielded performance scores comparable to those generated from video data once students had had the opportunity to learn the surveying formats (r_S > .80). A first suggestion of a process-oriented assessment tool for inquiry in experimentation can be drawn from this study. It might be used to inform and complement secondary science instruction.



Author information


Correspondence to Markus Emden.

About this article


Cite this article

Emden, M., Sumfleth, E. Assessing students' experimentation processes in guided inquiry. Int J of Sci and Math Educ 14, 29–54 (2016). https://doi.org/10.1007/s10763-014-9564-7

