
FROM PISA TO EDUCATIONAL STANDARDS: THE IMPACT OF LARGE-SCALE ASSESSMENTS ON SCIENCE EDUCATION IN GERMANY

  • Knut Neumann
  • Hans E. Fischer
  • Alexander Kauertz

Abstract

The German education system does not traditionally rely on standardized testing. However, when the Programme for International Student Assessment (PISA) study revealed only average performance by German students compared to other participating countries, a particularly high proportion of low-performing students, and remarkable disparities between the federal states, German policy makers decided on a major reform of the education system. A core piece of this reform was the introduction of National Education Standards. For science education, these standards were heavily influenced by the PISA results and the study's underlying framework. That is, with the standards, a paradigm shift took place from the German notion of Bildung towards the Anglo-American notion of literacy. The introduction of these standards also created a new field of empirical educational research: research on models of scientific literacy, or competency models, as a basis for benchmarking the standards. This article describes the German education system before PISA, summarizes the major findings from PISA, and explains how these findings informed the formulation of the performance standards for science education. It also details the measures undertaken to benchmark these standards. Finally, it provides insight into the issues involved in developing and benchmarking performance standards and points out future areas of research on evidence-based decision making in educational policy.
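The benchmarking of standards mentioned above typically rests on psychometric scaling of test items. As an illustrative sketch only, and not the authors' specific procedure, the dichotomous Rasch model commonly used in such large-scale assessments expresses the probability that student p solves item i as a function of the student's ability \theta_p and the item's difficulty \beta_i:

P(X_{pi} = 1 \mid \theta_p, \beta_i) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}

In such approaches, items of increasing difficulty are grouped into proficiency levels against which performance expectations such as standards can be compared.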

Key words

competencies, Germany, PISA, science education, standards

Copyright information

© National Science Council, Taiwan 2010

Authors and Affiliations

  • Knut Neumann: Department of Physics Education, Leibniz-Institute for Science and Mathematics Education (IPN), Kiel, Germany
  • Hans E. Fischer: University Duisburg–Essen, Essen, Germany
  • Alexander Kauertz: University of Education of Weingarten, Weingarten, Germany
