Educational Psychology Review, Volume 26, Issue 4, pp 543–560

Assessment of Critical-Analytic Thinking

  • Nathaniel J. S. Brown
  • Peter P. Afflerbach
  • Robert G. Croninger
Review Article

Abstract

National policy and standards documents, including the National Assessment of Educational Progress frameworks, the Common Core State Standards, and the Next Generation Science Standards, assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for developers of both large-scale and classroom assessments: Current CAT assessments often suffer from questionable item contexts, subjective rubrics, and underdeveloped construct formulations. Attention to these aspects of assessment would improve understanding of the development of students’ CAT and provide tools for helping teachers teach and students learn. We discuss these challenges within the context of several content areas and highlight the importance of developing formative assessments that capture the development of CAT in different domains of learning.

Keywords

Critical-analytic thinking · Formative assessment · Large-scale assessment · Classroom assessment · National Assessment of Educational Progress · Common Core State Standards · Next Generation Science Standards

References

  1. Afflerbach, P. (2012). Understanding and using reading assessment, K-12 (2nd edn.). Newark, DE: International Reading Association.
  2. Alexander, P. A. (2014). Thinking critically-analytically about critical-analytic thinking: an introduction. Educational Psychology Review. doi:10.1007/s10648-014-9283-1.
  3. Arum, R., & Roksa, J. (2011). Academically adrift: limited learning on college campuses. Chicago: University of Chicago Press.
  4. Bailin, S. (2002). Critical thinking and science education. Science & Education, 11(4), 361–375.
  5. Billing, D. (2007). Teaching for transfer of core/key skills in higher education: cognitive skills. Higher Education, 53(4), 483–516.
  6. Bloom, B. (Ed.). (1956). Taxonomy of educational objectives, the classification of educational goals–handbook I: cognitive domain. New York: McKay.
  7. Brown, N. J. S., & Wilson, M. (2011). A model of cognition: the missing cornerstone of assessment. Educational Psychology Review, 23, 221–234.
  8. Brown, N. J. S., Nagashima, S. O., Fu, A., Timms, M., & Wilson, M. (2010). A framework for analyzing scientific reasoning in assessments. Educational Assessment, 15, 142–174.
  9. Byrnes, J. P., & Dunbar, K. (2014). The nature and development of critical-analytic thinking. Educational Psychology Review. doi:10.1007/s10648-014-9284-0.
  10. Cobb, P., & Jackson, K. (2011). Assessing the quality of the common core state standards for mathematics. Educational Researcher, 40(4), 183–185.
  11. Common Core State Standards Initiative. (2010). Common Core State Standards for English language arts and literacy in history/social studies, science, and technical subjects. Retrieved from www.corestandards.org/ELA-Literacy. Accessed 1 Oct 2014.
  12. De La Paz, S., Felton, M., Monte-Sano, C., Croninger, R., Jackson, C., Deogracias, J., & Hoffman, B. (2014). Developing historical reading and writing with adolescent readers: effects on student learning. Theory and Research in Social Education, 42(2), 228–274.
  13. Dewey, J. (1933). How we think, a restatement of the relation of reflective thinking to the educative process. Boston: D. C. Heath.
  14. Dray, A. J., Brown, N. J. S., Lee, Y., Diakow, R., & Wilson, M. (2011). The assessment of reading comprehension in adolescents: the San Diego striving readers project (final report to the Institute of Education Sciences). Berkeley, CA: University of California, Berkeley Evaluation and Assessment Research Center.
  15. Facione, P. (2000). The disposition toward critical thinking: its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.
  16. Farah, M. (2010). Mind, brain and education in socioeconomic context. In M. Ferarri & L. Vuletic (Eds.), The developmental relations of mind, brain and education (pp. 243–256). New York: Springer.
  17. Guthrie, J. T., Wigfield, A., & Perencevich, K. C. (Eds.). (2004). Motivating reading comprehension: concept-oriented reading instruction. Mahwah, NJ: Lawrence Erlbaum Associates.
  18. Guthrie, J., Klauda, S., & Ho, A. (2013). Modeling the relationships among reading instruction, motivation, engagement, and achievement for adolescents. Reading Research Quarterly, 48, 9–26.
  19. Halpern, D. (1998). Teaching critical thinking for transfer across domains: dispositions, skills, structure training and metacognitive monitoring. American Psychologist, 53(4), 449–455.
  20. Halpern, D. (2001). Assessing the effectiveness of critical thinking instruction. The Journal of General Education, 50(4), 270–286.
  21. Kennedy, M., Fisher, M., & Ennis, R. (1991). Critical thinking: literature review and needed research. In L. Idol & B. Jones (Eds.), Educational values and cognitive instruction: implications for reform (pp. 11–40). Hillsdale, NJ: Lawrence Erlbaum Associates.
  22. Knapp, M., & Associates. (1995). Teaching for meaning in high-poverty classrooms. New York: Teachers College Press.
  23. McDonald, T., Thornley, C., Staley, R., & Moore, D. W. (2009). The San Diego striving readers’ project: building academic success for adolescent readers. Journal of Adolescent and Adult Literacy, 52(8), 720–722.
  24. Murphy, K., Wilkinson, I., Soter, A., Hennessey, M., & Alexander, J. (2009). Examining the effects of classroom discussion on students’ comprehension of text: a meta-analysis. Journal of Educational Psychology, 101, 740–764.
  25. National Assessment Governing Board. (2010). Reading framework for the 2011 National Assessment of Educational Progress. Retrieved from www.nagb.org/publications/frameworks.html. Accessed 1 Oct 2014.
  26. National Research Council. (2001). Knowing what students know: the science and design of educational assessment. Washington, DC: The National Academies.
  27. National Research Council. (2012). A framework for K-12 science education: practices, crosscutting concepts, and core ideas. Washington, DC: The National Academies.
  28. NGSS Lead States. (2013). Next Generation Science Standards: for states, by states. Retrieved from http://www.nextgenscience.org/next-generation-science-standards. Accessed 1 Oct 2014.
  29. Noddings, N. (2006). Critical lessons: what our schools should teach. Cambridge, UK: Cambridge University Press.
  30. Partnership for Assessment of Readiness for College and Careers. (2013). Advances in the PARCC ELA/Literacy summative assessment. Retrieved from www.parcconline.org/samples/ELA. Accessed 1 Oct 2014.
  31. Partnership for Assessment of Readiness for College and Careers. (2014). Grades 6–11 condensed scoring rubric for prose constructed response items. Retrieved from www.parcconline.org/samples/english-language-artsliteracy/grades-6-11-generic-rubrics. Accessed 1 Oct 2014.
  32. Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011). Common core standards: the new U.S. intended curriculum. Educational Researcher, 40(3), 103–116.
  33. Quellmalz, E. S., & Haertel, G. D. (2004). Use of technology-supported tools for large-scale science assessment: implications for assessment practice and policy at the state level. Washington, DC: National Research Council.
  34. Quellmalz, E. S., Timms, M. J., Silberglitt, M. D., & Buckley, B. C. (2012a). Science assessments for all: integrating science simulations into balanced state science assessment systems. Journal of Research in Science Teaching, 49(3), 363–393.
  35. Quellmalz, E., Davenport, J., & Timms, M. (2012b). 21st century science assessments. Washington, DC: American Association for the Advancement of Science.
  36. Schraw, G., & Robinson, D. (2012). Assessment of higher order thinking skills. Charlotte, NC: Information Age Publishers.
  37. Shavelson, R. J., Baxter, G. P., & Pine, J. (1992). Performance assessments: political rhetoric and measurement reality. Educational Researcher, 21(4), 22–27.
  38. Tienken, C., & Zhao, Y. (2013). How common standards and standardized testing widen the opportunity gap. In P. Carter & K. Welner (Eds.), Closing the opportunity gap: what America must do to give every child an even chance (pp. 111–122). New York: Oxford University Press.
  39. VanSledright, B. A. (2013). Assessing historical thinking and understanding: innovative designs for new standards. New York: Routledge.
  40. Vygotsky, L. S. (1978). Mind in society: the development of higher psychological processes. Cambridge, MA: Harvard University Press.
  41. Wentzel, K. (2009). Students’ relationships with teachers as motivational contexts. In K. Wentzel & A. Wigfield (Eds.), Handbook of motivation at school (pp. 301–322). Mahwah, NJ: Lawrence Erlbaum Associates.
  42. Willingham, D. (2007, Summer). Critical thinking: why is it so hard to teach? American Educator, 8–19.
  43. Wilson, M. (2005). Constructing measures: an item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates.

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Nathaniel J. S. Brown (1)
  • Peter P. Afflerbach (2)
  • Robert G. Croninger (2)
  1. Educational Research, Measurement, and Evaluation, Lynch School of Education, Boston College, Chestnut Hill, USA
  2. Teaching and Learning, Policy and Leadership, College of Education, University of Maryland, College Park, USA