Do Curriculum Outcomes and Assessment Activities in Science Encourage Higher Order Thinking?

  • Article
  • Canadian Journal of Science, Mathematics and Technology Education

Abstract

The curriculum for two science units in each of Grades 6 and 7 was analyzed to determine the cognitive levels of the outcomes and their cognitive alignment with the assessments corresponding to these outcomes. This was done for British Columbia, Alberta, Ontario, and Atlantic Canada. The outcomes and assessments included a variety of higher and lower order thinking skills, though several jurisdictions had distinctly fewer higher order than lower order outcomes and assessments. The cognitive alignment between outcomes and assessments ranged from 42% to 71%. Strong alignment between outcomes and classroom assessment increases students’ opportunity to learn and become good thinkers.

Résumé

The curriculum covered by two science units each, for Grades 6 and 7, was analyzed to determine the cognitive level of the outcomes and their cognitive alignment with the assessments corresponding to these outcomes. This analysis was carried out for British Columbia, Alberta, Ontario, and the Atlantic provinces. The outcomes and assessments included a range of lower and higher order thinking skills, and in several jurisdictions the outcomes and assessments included distinctly fewer higher order skills than lower order skills. The cognitive alignment between outcomes and assessments ranged from 42% to 71%. Strong alignment between outcomes and classroom assessments increases students’ opportunities for learning and cognitive growth.


Author information

Corresponding author

Correspondence to Beverly FitzPatrick.

About this article

Cite this article

FitzPatrick, B., Schulz, H. Do Curriculum Outcomes and Assessment Activities in Science Encourage Higher Order Thinking? Can J Sci Math Techn 15, 136–154 (2015). https://doi.org/10.1080/14926156.2015.1014074
