International Assessments of Science Learning: Their Positive and Negative Contributions to Science Education

  • Peter J. Fensham
Chapter

Abstract

The establishment and continuity of two international comparative assessments of science learning—the IEA’s TIMSS project and the OECD’s PISA project—have meant that there are now high-status reference points for other national and more local approaches to assessing the efficacy of science teaching and learning. Both projects, albeit with very different senses of what the outcome of science learning should be, have contributed both positively and negatively to the current state of assessment of school science. The TIMSS project looks back at the science that is commonly included in the curricula of the participating countries; it is thus about established school science, not about innovations in it. PISA is highly innovative, looking prospectively forward to see how students can use their science learning in everyday life situations. In this chapter some of these positives and negatives are discussed.

Keywords

Science Education · Science Learning · Scientific Competence · Intended Curriculum · School Science Education

Copyright information

© Springer Science+Business Media B.V. 2013

Authors and Affiliations

  1. Monash University & Queensland University of Technology, Clayton, Australia