
ASSESSING STUDENTS’ EXPERIMENTATION PROCESSES IN GUIDED INQUIRY

  • Markus Emden
  • Elke Sumfleth
Article

ABSTRACT

In recent science education, experimentation features ever more strongly as a method of inquiry in science classes rather than as a mere means of illustrating phenomena. Ideas and materials for teaching inquiry abound, yet tools for assessing students' achievement in their experimentation processes are lacking. The present study assumes a basic, non-exclusive process model of inquiry in experimentation that can be considered a consensus across multiple approaches: (1) finding an idea/hypothesis, (2) planning and conducting an experiment, and (3) drawing conclusions from evidence. The study confronted 339 secondary-level students with three guided inquiry experimentation tasks on 3 days. Selected working groups were videotaped while experimenting. All students documented their processes in a structured report form as they worked. The resulting videos and reports were analysed in two steps: (1) experimentation was coded according to the process model into process plots; (2) on the basis of these plots, process-focused performance scores were calculated that reflect the logical coherence and immediacy of the inquiry processes. Correlative analyses show for two of the tasks that, once students had had the opportunity to learn the survey formats, the report format yielded performance scores comparable to those generated from video data (r_S > .80). The study thus offers a first suggestion of a process-oriented assessment tool for inquiry in experimentation, which might be used to inform and complement secondary science instruction.

KEYWORDS

assessment · guided inquiry · scientific experimentation · secondary school



Copyright information

© Springer Science + Business Media B.V. 2015

Authors and Affiliations

  1. University of Education, Schwäbisch Gmünd, Germany
