
Why Students Answer TIMSS Science Test Items the Way They Do

Research in Science Education

Abstract

The purpose of this study was to explore how Year 8 students answered Third International Mathematics and Science Study (TIMSS) questions and whether the test questions represented these students' scientific understanding. One hundred and seventy-seven students were tested using written questions taken from the TIMSS science test. This paper presents the degree to which a sample of 38 children represented their understanding of the topics in the written test, compared with the level of understanding that could be elicited by an interview. By exploring student responses in the interview situation, the study sought insight into the science knowledge that students held and whether or not the test items had successfully elicited this knowledge. We question the usefulness and quality of data from large-scale summative assessments on their own to represent students' scientific understanding, and conclude that large-scale written test items, such as those in TIMSS, are not on their own a valid way of exploring students' understanding of scientific concepts. Considerable caution is therefore needed when drawing on the outcomes of international achievement testing to inform educational policy changes, or when using TIMSS data alone to represent student understanding.




About this article

Cite this article

Harlow, A., Jones, A. Why Students Answer TIMSS Science Test Items the Way They Do. Research in Science Education 34, 221–238 (2004). https://doi.org/10.1023/B:RISE.0000033761.79449.56
