Journal of Science Teacher Education, Volume 21, Issue 1, pp 13–30

Development of Instruments to Assess Teacher and Student Perceptions of Inquiry Experiences in Science Classrooms

  • Todd Campbell
  • Nor Hashidah Abd-Hamid
  • Heather Chapman

Abstract

This study describes the development of two instruments to investigate the extent to which students are engaged in scientific inquiry. Through the instrument development process employed, each finalized instrument consisted of 20 items separated into five categories. Both instruments were found to be internally consistent, with high reliability estimates. Factor analysis revealed two factors for each instrument that, while not clustering the items into the five categories, showed item clustering consistent with the research literature on students’ engagement in inquiry experiences. Based on the analyses completed, the instruments appear suitable for inclusion in comprehensive assessment packages used to gauge the extent to which students experience inquiry in science classrooms.
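
The full article details the reliability and factor analyses; as a rough illustration only, the sketch below shows how an internal consistency estimate (Cronbach's alpha) and a two-factor solution could be computed for a 20-item Likert-scale instrument in Python. The synthetic data, variable names, and use of scikit-learn's FactorAnalysis are assumptions made for this example and are not taken from the authors' procedure.

    # Illustrative sketch only: estimates Cronbach's alpha and a two-factor
    # solution for a 20-item Likert-scale instrument. Synthetic data stands in
    # for the actual teacher/student responses, which are not reproduced here.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    def cronbach_alpha(items: np.ndarray) -> float:
        """Internal-consistency estimate for an (n_respondents, n_items) matrix."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    rng = np.random.default_rng(0)
    # Hypothetical responses: 120 respondents x 20 items on a 1-5 Likert scale.
    responses = rng.integers(1, 6, size=(120, 20)).astype(float)

    alpha = cronbach_alpha(responses)

    # Two-factor exploratory solution, mirroring the two factors reported above.
    fa = FactorAnalysis(n_components=2, random_state=0)
    fa.fit(responses)
    loadings = fa.components_.T  # one row of loadings per item, one column per factor

    print(f"Cronbach's alpha: {alpha:.2f}")
    print("Item loadings on the two factors:\n", np.round(loadings, 2))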

Keywords

Scientific inquiry · Reform · Research instrument

References

  1. American Association for the Advancement of Science. (1989). Science for all Americans. New York: Oxford University Press.
  2. American Association for the Advancement of Science. (1993). Benchmarks for science literacy. Washington, DC: Author.
  3. Campbell, T., & Bohn, C. (2008). Science laboratory experiences of high school students across one state in the U.S.: Descriptive research from the classroom. Science Educator, 17(1), 36–48.
  4. Cavallo, A., & Laubach, T. (2001). Students’ science perceptions and enrollment decisions in differing learning cycle classrooms. Journal of Research in Science Teaching, 38, 1029–1062.
  5. Chang, C., & Mao, S. (1999). Comparison of Taiwan science students’ outcomes with inquiry-group versus traditional instruction. The Journal of Educational Research, 92, 340–346.
  6. Ertepinar, H., & Geban, O. (1996). Effect of instruction supplied with the investigative-oriented laboratory approach on achievement in a science course. Educational Research, 38, 333–341.
  7. Fairclough, N. (1995). Critical discourse analysis: The critical study of language. Harlow, England: Longman Group.
  8. Gorsuch, R. L. (1983). Factor analysis (2nd ed.). Hillsdale, NJ: Erlbaum.
  9. Gorsuch, R. L. (2003). Factor analysis. In I. B. Weiner, D. K. Freedheim, & J. A. Schinka (Eds.), Handbook of psychology (pp. 143–164). Hoboken, NJ: John Wiley & Sons, Inc.
  10. Hakkarainen, K. (2003). Progressive inquiry in a computer-supported biology class. Journal of Research in Science Teaching, 40, 1072–1088.
  11. Johnston, A. (2008). Demythologizing or dehumanizing? A response to Settlage and the ideals of open inquiry. Journal of Science Teacher Education, 19, 11–13.
  12. Lawson, A., Benford, R., Bloom, I., Carlson, M., Falconer, K., Hestenes, D., et al. (2002). Evaluating college science and mathematics instruction: A reform effort that improves teaching skills. Journal of College Science Teaching, 31, 388–393.
  13. Leong, F., & Austin, J. (2006). The psychology research handbook: A guide for graduate students and research assistants (2nd ed.). Thousand Oaks, CA: Sage Publications.
  14. MacIsaac, D., & Falconer, K. (2002). Reforming physics instruction via RTOP. The Physics Teacher, 40, 479–485.
  15. Madaus, G. F. (1999). The influence of testing on the curriculum. In M. J. Early & K. J. Rehage (Eds.), Issues in curriculum: A selection of chapters from past NSSE yearbooks. Chicago: The University of Chicago Press.
  16. Marek, E. A., Laubach, T. A., & Pedersen, J. (2003). Preservice elementary school teachers’ understanding of theory-based science education. Journal of Science Teacher Education, 14, 147–159.
  17. Marek, E. A., Maier, S. J., & McCann, F. (2008). Assessing understanding of the learning cycle: The ULC. Journal of Science Teacher Education, 19, 375–389.
  18. Mathison, S. (1988). Why triangulate? Educational Researcher, 17, 13–17.
  19. National Research Council (NRC). (1996). National science education standards. Washington, DC: National Academy Press.
  20. National Research Council (NRC). (2005). America’s lab report: Investigations in high school science. Washington, DC: National Academy Press.
  21. National Science Teachers Association (NSTA). (2007). NSTA position statement: The integral role of laboratory investigations in science instruction. Retrieved October 23, 2009, from http://www.nsta.org/about/positions/laboratory.aspx.
  22. Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
  23. O’Sullivan, C. Y., & Weiss, A. R. (1999). Student work and teacher practices in science (NCES 1999-455). Washington, DC: United States Department of Education, Office of Educational Research and Improvement, National Center for Education Statistics.
  24. Paris, S., Yambor, K., & Packard, B. (1998). Hands-on biology: A museum-school-university partnership for enhancing students’ interest and learning in science. The Elementary School Journal, 98, 267–289.
  25. Piburn, M., Sawada, D., Turley, J., Falconer, K., Benford, R., Bloom, I., & Judson, E. (2000). Reformed teaching observation protocol (RTOP): Reference manual (ACEPT Technical Report No. IN00-3). Tempe, AZ: Arizona Collaborative for Excellence in the Preparation of Teachers (ERIC Document Reproduction Service, ED 447 205).
  26. Sawada, D., Piburn, M., Judson, E., Turley, J., Falconer, K., Benford, R., et al. (2002). Measuring reform practices in science and mathematics. School Science and Mathematics, 102, 245–253.
  27. Schwartz, R., Lederman, N., & Crawford, B. (2004). Developing views of nature of science in an authentic context: An explicit approach to bridging the gap between nature of science and scientific inquiry. Science Education, 88, 610–645.
  28. Settlage, J. (2007). Demythologizing science teacher education: Conquering the false ideal of open inquiry. Journal of Science Teacher Education, 18, 461–467.
  29. Smith, K. (1993). Development of the primary teacher questionnaire. Journal of Educational Research, 87(1), 23–29.
  30. Smolleck, L., & Yoder, E. (2008). Further development and validation of the Teaching Science as Inquiry (TSI) instrument. School Science & Mathematics, 108, 291–297.
  31. Statistical Package for the Social Sciences for Windows (SPSS), Rel. 16.0.1. (2007). Chicago: SPSS Inc.
  32. Taylor, P. C., Fraser, B., & Fisher, D. (1997). Monitoring constructivist classroom learning environments. International Journal of Educational Research, 27, 293–302.
  33. Taylor, P., & Maor, D. (2000). Assessing the efficacy of online teaching with the constructivist online learning environment survey. In A. Herrmann & M. M. Kulski (Eds.), Flexible futures in tertiary teaching: Proceedings of the 9th Annual Teaching Learning Forum. Perth, Australia: Curtin University of Technology. Retrieved October 23, 2009, from http://lsn.curtin.edu.au/tlf/tlf2000/taylor.html.
  34. Windschitl, M. (2003). Inquiry projects in science teacher education: What can investigative experiences reveal about teacher thinking and eventual classroom practice? Science Education, 87, 112–143.

Copyright information

© Springer Science+Business Media, B.V. 2009

Authors and Affiliations

  • Todd Campbell (1)
  • Nor Hashidah Abd-Hamid (2)
  • Heather Chapman (1)
  1. Utah State University, Logan, USA
  2. University of Iowa, Iowa City, USA
