
Zeitschrift für Erziehungswissenschaft, Volume 15, Issue 1, pp. 95–109

Learning progressions: significant promise, significant challenge

  • Alicia C. Alonzo
Schwerpunkt (Focus section)

Abstract

Over the past five years or so, the concept of a learning progression has gained traction in the United States (US) science education community. Learning progressions have been touted as a means of addressing a number of vexing and persistent problems facing science education in the US (and around the world). In particular, there is great excitement over the potential of learning progressions to provide much-needed coherence to standards, curricula, and assessments. Although relatively new, this promising approach is already having a significant impact on the landscape of US science education. However, there are also concerns that this influence may be premature. A substantial challenge lies in the validation of learning progressions: how to ensure that these tools, which are rapidly being incorporated into policy and practice, have sufficient empirical justification. This paper discusses both the promise and the challenge of validating learning progressions for use in science education reform.

Keywords

Learning progression · Science education · Validation

Learning progressions: great promises, great challenges

Summary

Over roughly the past five years, the concept of learning progressions has gained prominence in science education in the US. Learning progressions raise great hopes for addressing the complex and persistent challenges facing science education in the US and worldwide. Above all, learning progressions are expected to contribute to coherence among standards, curricula, and assessment. Although this promising approach is still relatively new, it has already exerted considerable influence on science teaching in the US. However, there are also concerns that this influence may be premature, because a central challenge is the validation of learning progressions: How can it be ensured that an approach taken up so quickly into educational policy and pedagogical practice is empirically well founded? This contribution addresses both the potential of learning progressions and the challenges of validating them as a means of reforming science teaching. A learning progression on the topic of "force and motion" is used as an illustration.

Keywords

Learning progression · Science learning · Validation


Copyright information

© VS Verlag für Sozialwissenschaften 2012

Authors and Affiliations

  1. Department of Teacher Education, Michigan State University, East Lansing, USA
