
An Analysis of Standards-Based High School Physics Textbooks of Finland and the United States

Chapter in Critical Analysis of Science Textbooks

Abstract

This study examines how well two curricula align with reform standards by analyzing their questioning style and the level of inquiry of their activities, two key components of the inquiry milieu described in the National Science Education Standards [NSES] (National Research Council [NRC], 1996). Two countries’ high school physics textbooks were chosen for analysis: Physica (meaning physics in Greek) of Finland and Active Physics of the United States, both products of reform efforts in science education. In 2003, Finland undertook a major change in its curriculum at the national level, which produced the “National Core Curriculum” (FNBE, 2003), whereas the United States went through a major reform of science education in the past decade, which produced the “National Science Education Standards” (NRC, 1996). Physica was developed as a high school physics textbook based on Finland’s National Core Curriculum for science education, and Active Physics was developed as a US high school curriculum based on the National Science Education Standards. The United States developed its new curriculum from the national standards as an alternative to the traditional curriculum, whereas Finnish Physica was developed from a “traditional” national-level curriculum that includes aims for upper secondary physics and short descriptions of core content (FNBE, 2003). However, how these particular curricula meet the visions espoused by the National Science Education Standards has yet to be studied.


References

  • American Association for the Advancement of Science. (1989). Science for all Americans. New York: Oxford University Press.

  • American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

  • Association for Supervision and Curriculum Development. (1997). Education Update, 39(1).

  • Ball, D. L., & Cohen, D. K. (1996). Reform by the book: What is–or might be–the role of curriculum materials in teacher learning and instructional reforms? Educational Researcher, 25(9), 6–8, 14.

  • Beatty, I., Gerace, W., Leonard, W., & Dufresne, R. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1).

  • Bereiter, C., & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 361–392). Hillsdale, NJ: Lawrence Erlbaum Associates.

  • Bryant, J. (2000). A comparison of pre-college earth science teaching practices in Iowa during the decades of 1975–1985 and 1985–1995. Unpublished doctoral dissertation, The University of Iowa, Iowa City, IA.

  • Chiang-Soong, B. (1988). An analysis of the most used science textbooks in secondary schools in the United States. Unpublished doctoral dissertation, The University of Iowa, Iowa City, IA.

  • Chiappetta, E. L., Fillman, D. A., & Sethna, G. H. (1991). A method to quantify major themes of scientific literacy in science textbooks. Journal of Research in Science Teaching, 28(8), 713–725.

  • Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science education. Review of Educational Research, 63(1), 1–49.

  • Davis, E. A., & Krajcik, J. S. (2005). Designing educative curriculum materials to promote teacher learning. Educational Researcher, 34(3), 3–14.

  • Doran, R. L., & Sheard, D. M. (1974). Analyzing science textbooks. School Science and Mathematics, 74(1), 31–39.

  • Ebel, R. L., & Frisbie, D. A. (1991). Essentials of educational measurement (5th ed.). Englewood Cliffs, NJ: Prentice Hall.

  • Eisenkraft, A. (1998). Active physics. New York: It’s About Time.

  • Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press.

  • FNBE. (2003). National core curriculum for upper secondary school 2003. Vammala: National Board of Education (NBE)/Vammalan kirjapaino.

  • Harms, N. C., & Yager, R. E. (1981). What research says to the science teacher (Vol. 3). Washington, DC: National Science Teachers Association.

  • Hatakka, J., Saari, H., Sirviö, J., Viiri, J., & Yrjänäinen, S. (2005). Physica: Lämpö. Helsinki: WSOY.

  • Herron, M. D. (1971). The nature of scientific enquiry. School Review, 79(2), 171–212.

  • Hurd, P. D. (1994). New minds for a new age: Prologue to modernizing the science curriculum. Science Education, 78(1), 103–116.

  • Kahl, S., & Harms, N. (1981). Project synthesis: Purpose, organization and procedures. In N. Harms & R. E. Yager (Eds.), What research says to the science teacher (Vol. 3). Washington, DC: National Science Teachers Association.

  • Kleinman, G. (1965). Teacher’s questions and student understanding of science. Journal of Research in Science Teaching, 3(4), 307–317.

  • Kulm, G., Roseman, J., & Treistman, M. (1999). A benchmarks-based approach to textbook evaluation. Science Books & Films, 35(4), 147–153. Retrieved August 5, 2011, from http://www.project2061.org/publications/textbook/articles/approach.htm.

  • Lavonen, J., & Laaksonen, S. (2009). Context of teaching and learning school science in Finland: Reflections on PISA 2006 results. Journal of Research in Science Teaching, 46(8), 922–944.

  • Lawson, A. E., & Renner, J. W. (1975). Relationship of science subject matter and developmental levels of learners. Journal of Research in Science Teaching, 12(4), 347–358.

  • Lowery, L. F., & Leonard, W. H. (1978a). Development and methods for use of an instrument designed to assess textbook questioning style. School Science and Mathematics, 78(5), 393–400.

  • Lowery, L. F., & Leonard, W. H. (1978b). A comparison of questioning styles among four widely used high school biology textbooks. Journal of Research in Science Teaching, 15(1), 1–10.

  • Marbach-Ad, G., & Sokolove, P. G. (2000). Can undergraduate biology students learn to ask better questions? Journal of Research in Science Teaching, 37(8), 854–870.

  • Martin, J. (1979). Effects of teacher higher-order questions on student process and product variables in a single-classroom study. The Journal of Educational Research, 72(4), 183–187.

  • Martin, M., Mullis, I., Gregory, K., Hoyle, C., & Shen, C. (2000). Effective schools in science and mathematics. Chestnut Hill, MA: International Study Center of Lynch School of Education, Boston College.

  • Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction – What is it and does it matter? Results from a research synthesis years 1984 to 2002. Journal of Research in Science Teaching, 47(4), 474–496.

  • National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.

  • National Research Council. (2007). Taking science to school: Learning and teaching science in grades K–8. Washington, DC: National Academies Press.

  • National Research Council. (2011). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academy Press.

  • Organization for Economic Cooperation and Development (OECD). (2005). PISA 2003 (Technical Report). Paris: Author.

  • Organization for Economic Cooperation and Development (OECD). (2007). PISA 2006. Science competencies for tomorrow’s world. Vol. I: Analysis. Paris: Author.

  • Organization for Economic Cooperation and Development (OECD). (2010). PISA 2009 results: What students know and can do – Student performance in reading, mathematics and science (Vol. I). Paris: Author.

  • Park, D. (2005). Differences between a standards-based curriculum and traditional textbooks in high school earth science. Journal of Geoscience Education, 53(5), 540–547.

  • Park, D., Yager, R., & Smith, M. (2005). Implementing EarthComm: Teacher professional development and its impact on student achievement scores in a standards-based earth science curriculum. Electronic Journal of Science Education, 9(3). Retrieved September 15, 2011, from http://wolfweb.unr.edu/homepage/crowther/ejse/ejsev9n3.html

  • Piaget, J. (1964). Development and learning. Journal of Research in Science Teaching, 2(3), 176–186.

  • Pizzini, E. L., & Shepardson, D. P. (1991). Student questioning in the presence of the teacher during problem solving in science. School Science and Mathematics, 91(8), 348–352.

  • Renner, J. W. (1972). The laboratory and science teaching. In J. W. Renner & D. G. Stafford (Eds.), Teaching science in secondary schools. New York: Harper and Row.

  • Rowe, M. (1986). Waiting time: Slowing down may be a way of speeding up. Journal of Teacher Education, 37, 43–50.

  • Schauble, L., Glaser, R., Duschl, R. A., Schulze, S., & John, J. (1995). Students’ understanding of the objectives and procedures of experimentation in the science classroom. The Journal of the Learning Sciences, 4, 131–166.

  • Schneider, R. M., & Krajcik, J. (2002). Supporting science teacher learning: The role of educative curriculum materials. Journal of Science Teacher Education, 13, 221–245.

  • Schwab, J. (1962). The teaching of science as enquiry. Cambridge, MA: Harvard University Press.

  • Shymansky, J. A., & Kyle, W. C., Jr. (1992). Establishing a research agenda: Critical issues of science curriculum reform. Journal of Research in Science Teaching, 29(8), 749–778.

  • Stake, R. E., & Easley, J. (1978). Case studies in science education (Vol. 13). Center for Instructional Research and Curriculum Evaluation. Urbana-Champaign, IL: University of Illinois.

  • Staver, J. R., & Bay, M. (1987). Analysis of the project synthesis goal cluster orientation and inquiry emphasis of elementary science textbooks. Journal of Research in Science Teaching, 24(7), 629–643.

  • Stavy, R. (1990). Pupil’s problems in understanding conservation of matter. International Journal of Science Education, 12(5), 501–512.

  • Stinner, A. (1995). Science textbooks: Their present role and future form. In S. M. Glynn & R. Duit (Eds.), Learning science in the schools: Research reforming practice. Mahwah, NJ: Lawrence Erlbaum Associates.

  • Strangman, N., & Hall, T. (2003). Text transformations. Wakefield, MA: National Center on Accessing the General Curriculum. Retrieved September 20, 2011, from http://aim.cast.org/learn/historyarchive/backgroundpapers/text_transformations.

  • Tamir, P. (1976). The role of the laboratory in science teaching (Technical Report 10). The University of Iowa, Iowa City, IA: Science Education Center.

  • Vellom, R. P., & Anderson, C. W. (1999). Reasoning about data in middle school science. Journal of Research in Science Teaching, 36(2), 179–199.

  • Weiss, I. R. (1978). Report of the 1977 national survey of science, mathematics and social studies education. Research Triangle Park, NC: Center for Educational Research and Evaluation, Research Triangle Institute.

  • Yager, R. E. (1980). Crisis in science education (Technical Report 21). The University of Iowa, Iowa City, IA: Science Education Center.

  • Yager, R. E. (1983). The importance of terminology in teaching K-12 science. Journal of Research in Science Teaching, 20(6), 577–588.

  • Yager, R. E. (1992). Viewpoint: What we did not learn from the 60’s about science curriculum reform. Journal of Research in Science Teaching, 29(8), 905–910.

  • Yager, R. E. (1996). Scope, sequence, and coordination: A national reform effort in the U.S. – The Iowa Project. Paper presented at “History and Philosophy in Science Teaching – A Means to Improve Scientific Literacy?” Evangelische Akademie Loccum, Rehburg-Loccum, Germany.

  • Zoller, U. (1987). The fostering of question-asking capability: A meaningful aspect of problem-solving in chemistry. Journal of Chemical Education, 64(6), 510–512.


Author information


Correspondence to Do-Yong Park, Ph.D.


Appendices

Appendix: Textbook Questioning Style Assessment Instrument (TQSAI)

A. Not experiential

   1. Rhetorical     2. Direct information     3. Focusing     4. Open-ended     5. Valuing

B. Experiential

                         2. Direct information     3. Focusing     4. Open-ended     5. Valuing
   a. Observing
   b. Communicating
   c. Comparing
   d. Organizing
   e. Experimenting
   f. Inferring
   g. Applying

(The cells of the form are left blank for tallying questions.)

Definitions of Types of Questions

On the TQSAI, the question categories are sequenced along a horizontal dimension. The categories are defined as follows:

  • Rhetorical questions: Questions that do not expect any participation by the reader. Such questions never require the student to do anything; they are therefore non-experiential and are tallied in a special cell on the TQSAI.

  • Direct-information questions: Questions that ask the reader to recall or recognize specific information (concepts, principles, laws, and so on) read, heard, or previously discussed.

  • Focusing questions: Questions that contain clues that suggest what the expected response is to be. Such questions guide the student toward the answer the author has in mind, to be developed in the student’s own terms.

  • Open-ended questions: Questions that do not indicate one expected answer. Such questions invite exploration of relationships and consideration of meaning or implications.

  • Valuing questions: Questions that ask the reader to make a cognitive or an affective judgment or to explain the criteria used in an evaluation.
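
Taken together with the seven science/learning processes listed on the form, these categories give the TQSAI its two-dimensional structure. The following is a minimal sketch of that structure as a tally grid in Python; it is not part of the original instrument, and the names (NON_EXPERIENTIAL_CATEGORIES, EXPERIENTIAL_CATEGORIES, PROCESSES, new_tqsai_form) are illustrative assumptions:

```python
# A sketch of the TQSAI tally form as a data structure.
# The names here are illustrative, not taken from the chapter.

NON_EXPERIENTIAL_CATEGORIES = [
    "Rhetorical", "Direct information", "Focusing", "Open-ended", "Valuing",
]
EXPERIENTIAL_CATEGORIES = ["Direct information", "Focusing", "Open-ended", "Valuing"]
PROCESSES = [
    "Observing", "Communicating", "Comparing", "Organizing",
    "Experimenting", "Inferring", "Applying",
]

def new_tqsai_form() -> dict:
    """Return an empty tally form mirroring Parts A and B of the instrument."""
    return {
        # Part A: non-experiential questions, tallied per category only.
        "not_experiential": {c: 0 for c in NON_EXPERIENTIAL_CATEGORIES},
        # Part B: experiential questions, tallied per (process, category) cell.
        "experiential": {
            p: {c: 0 for c in EXPERIENTIAL_CATEGORIES} for p in PROCESSES
        },
    }
```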

The science/learning processes are sequenced along a vertical dimension on the TQSAI. They are defined as follows:

  • Observing: Questions that ask the reader to look, listen, touch, taste, smell, and the like. Such questions may ask the reader how she/he felt or what thoughts were elicited.

  • Communicating: Questions that ask the reader to verbalize, write, picture, and the like. Such questions may ask the reader to furnish a name, offer a descriptive term, or verbalize a rule. They may ask the reader what was hopeful or to identify the words that elicited a feeling.

  • Comparing: Questions that ask the reader to compare lengths, weights, capacities, or times. Such questions may ask the reader to identify similarities, to measure, to count parts, or to state a preference and the reason for the preference.

  • Organizing: Questions that ask the reader to seriate, order, sequence, group, or classify. The reader may be asked to sort into groups, to identify the basis for a grouping, or to provide criteria for a grouping.

  • Experimenting: Questions that ask the reader to hypothesize or to control and manipulate variables. The reader may be asked to identify conditions necessary for results or whether and when to change his/her attitudes on the basis of new evidence.

  • Inferring: Questions that ask the reader to synthesize, abstract, analyze, recognize patterns, predict, generalize, or to formulate a theoretical model. The reader may be asked to furnish a reason for an occurrence, provide a conclusion, or to identify the generalizations that apply.

  • Applying: Questions that ask the reader to use his/her knowledge or to invent. The reader may be asked to embark upon a course of action based upon a choice of alternatives.
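
Under these definitions, a coded question carries one category and, when it is experiential, one of the seven processes; rhetorical questions are non-experiential by definition and go into the special cell of Part A. A hedged sketch of how a single coded question might be tallied into the form built in the sketch above (tally_question and the example questions are hypothetical, not part of the TQSAI):

```python
from typing import Optional

# Reuses new_tqsai_form() from the earlier sketch.

def tally_question(form: dict, category: str, process: Optional[str] = None) -> None:
    """Tally one coded question into the TQSAI form.

    A question coded with a process is experiential; rhetorical questions
    never take a process because they require no action from the reader.
    """
    if process is None:
        form["not_experiential"][category] += 1
        return
    if category == "Rhetorical":
        raise ValueError("Rhetorical questions are non-experiential by definition.")
    form["experiential"][process][category] += 1


# Hypothetical usage:
form = new_tqsai_form()
# "Design an experiment to test whether the mass of the bob changes the period
# of a pendulum."  -> open-ended question, experimenting process (experiential).
tally_question(form, "Open-ended", process="Experimenting")
# "Isn't physics all around us?"  -> rhetorical, tallied in the special cell.
tally_question(form, "Rhetorical")
```

Row and column totals of Part B then give per-process and per-category counts, the kind of summary a comparison of textbook questioning styles would draw on.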


Copyright information

© 2013 Springer Science+Business Media B.V.

About this chapter

Cite this chapter

Park, DY., Lavonen, J. (2013). An Analysis of Standards-Based High School Physics Textbooks of Finland and the United States. In: Khine, M. (eds) Critical Analysis of Science Textbooks. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-4168-3_11
