
Eliciting Student Responses Relative To A Learning Progression

Assessment Challenges

Chapter in Learning Progressions in Science

Abstract

The assessing strand is critical to work on learning progressions. Obtaining evidence to support or revise a proposed learning progression requires assessments (methods to elicit student responses relative to the learning progression) in order to test hypotheses about student thinking and its evolution over time. In addition, many proposed applications of learning progressions involve assessments—either directly or indirectly. The most recent science framework for the National Assessment of Educational Progress (NAEP) calls for the inclusion of learning progressions in this influential national test (National Assessment Governing Board [NAGB], 2008).


References

  • Alonzo, A. C. (2009, April). Design criteria for learning progressions to support teachers' formative assessment practices. In A. C. Alonzo (Chair), Learning progressions in science: Tools for teaching, learning, and assessment. Symposium conducted at the annual meeting of the American Educational Research Association, San Diego, CA.

  • Alonzo, A. C. (2010). Discourse as a lens for reframing consideration of learning progressions. In K. Gomez, L. Lyons, & J. Radinsky (Eds.), Learning in the disciplines: Proceedings of the 9th International Conference of the Learning Sciences (ICLS 2010), Vol. 1: Full papers (pp. 588–595). Chicago, IL: International Society of the Learning Sciences.

  • Alonzo, A. C. (2011). Learning progressions that support formative assessment practices. Measurement: Interdisciplinary Research and Perspectives, 9, 124–129.

  • Alonzo, A. C., & Steedle, J. T. (2009). Developing and assessing a force and motion learning progression. Science Education, 93, 389–421.

  • Black, P., Wilson, M., & Yao, S.-Y. (2011). Roadmaps for learning: A guide to the navigation of learning progressions. Measurement: Interdisciplinary Research and Perspectives, 9, 71–123.

  • Briggs, D. C., Alonzo, A. C., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11, 33–63.

  • Common Core State Standards Initiative. (2010). Common Core State Standards for mathematics. Retrieved from http://www.corestandards.org/assets/CCSSI_Math%20Standards.pdf

  • Corcoran, T., Mosher, F. A., & Rogat, A. (2009, May). Learning progressions in science: An evidence-based approach to reform (CPRE Research Report #RR-63). Philadelphia, PA: Consortium for Policy Research in Education.

  • Daro, P., Mosher, F. A., & Corcoran, T. (2011, January). Learning trajectories in mathematics: A foundation for standards, curriculum, assessment, and instruction (CPRE Research Report #RR-68). Philadelphia, PA: Consortium for Policy Research in Education.

  • diSessa, A. A. (1993). Toward an epistemology of physics. Cognition and Instruction, 10, 105–225.

  • diSessa, A. A., Gillespie, N. M., & Esterly, J. B. (2004). Coherence versus fragmentation in the development of the concept of force. Cognitive Science, 28, 843–900.

  • Educational Testing Service. (2009, December). Response to request for input on the Race to the Top assessment program. Retrieved from http://www.ets.org/Media/Home/pdf/ETS_Response_RacetotheTopAssessment.pdf

  • Embretson, S., & Gorin, J. (2001). Improving construct validity with cognitive psychology principles. Journal of Educational Measurement, 38, 343–368.

  • Gotwals, A. W., & Songer, N. B. (2010). Reasoning up and down a food chain: Using an assessment framework to investigate students' middle knowledge. Science Education, 94, 259–281.

  • Ioannides, C., & Vosniadou, S. (2001). The changing meanings of force: From coherence to fragmentation. Cognitive Science Quarterly, 2(1), 5–62. Retrieved from University of Athens website: http://www.cs.phs.uoa.gr/el/staff/vosniadou/force.pdf

  • Lehrer, R., & Schauble, L. (2010, March). Seeding evolutionary thinking by engaging children in modeling its foundations. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Philadelphia, PA.

  • Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York, NY: Macmillan.

  • Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50, 741–749.

  • Mislevy, R. J., Steinberg, L. S., & Almond, R. G. (2003). On the structure of educational assessments. Measurement: Interdisciplinary Research and Perspectives, 1, 3–62.

  • National Assessment Governing Board. (2008, September). Science framework for the 2009 National Assessment of Educational Progress. Retrieved from http://www.nagb.org/publications/frameworks/science-09.pdf

  • National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press.

  • National Research Council. (2006). Systems for state science assessment. Washington, DC: The National Academies Press.

  • National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: The National Academies Press.

  • National Research Council. (2011). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: The National Academies Press.

  • Schmidt, W. H., McKnight, C. C., & Raizen, S. A. (1997). A splintered vision: An investigation of U.S. science and mathematics education. Dordrecht, The Netherlands: Kluwer Academic Publishers.

  • Shavelson, R. J., Yin, Y., Furtak, E. M., Ruiz-Primo, M. A., Ayala, C. C., Young, D. B., ... Pottenger, F. (2008). On the role and impact of formative assessment on science inquiry teaching and learning. In J. E. Coffey, R. Douglas, & C. Stearns (Eds.), Assessing science learning: Perspectives from research and practice (pp. 21–36). Arlington, VA: NSTA Press.

  • Smith, C. L., Wiser, M., Anderson, C. W., & Krajcik, J. (2006). Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory. Measurement: Interdisciplinary Research and Perspectives, 4, 1–98.

  • Stevens, S. Y., Delgado, C., & Krajcik, J. S. (2010). Developing a hypothetical multi-dimensional learning progression for the nature of matter. Journal of Research in Science Teaching, 47, 687–715.

  • Vosniadou, S., & Brewer, W. F. (1992). Mental models of the earth: A study of conceptual change in childhood. Cognitive Psychology, 24, 535–585.

  • Wainer, H. (2000). Computerized adaptive testing: A primer (2nd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.

  • Wainer, H. (2010). Computerized adaptive testing. In I. B. Weiner & W. E. Craighead (Eds.), Corsini encyclopedia of psychology. doi:10.1002/9780470479216.corpsy0213

  • Webb, N. L. (1997). Criteria for alignment of expectations and assessments in mathematics and science education (Research Monograph No. 6). Retrieved from University of Wisconsin, Wisconsin Center for Education Research website: http://facstaff.wcer.wisc.edu/normw/WEBBMonograph6criteria.pdf

  • Wilson, M. (Ed.). (2004). Towards coherence between classroom assessment and accountability: The 103rd yearbook of the National Society for the Study of Education, Part II. Chicago, IL: The University of Chicago Press.

  • Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates.


Editor information

Alicia C. Alonzo and Amelia Wenk Gotwals

Copyright information

© 2012 Sense Publishers

About this chapter

Cite this chapter

Alonzo, A.C. (2012). Eliciting Student Responses Relative To A Learning Progression. In: Alonzo, A.C., Gotwals, A.W. (eds) Learning Progressions in Science. SensePublishers, Rotterdam. https://doi.org/10.1007/978-94-6091-824-7_11

