Measuring Changing Educational Contexts in a Changing World: Evolution of the TIMSS and PIRLS Questionnaires

  • Ina V. S. Mullis
  • Michael O. Martin
  • Martin Hooper
Chapter
Part of the Methodology of Educational Measurement and Assessment book series (MEMA)

Abstract

With each TIMSS and PIRLS assessment, IEA's TIMSS & PIRLS International Study Center at Boston College has improved the quality of the context questionnaire data collected about educational policies and practices. Over the 20 years that TIMSS and PIRLS have measured trends in educational achievement, the questionnaires have evolved toward measuring a stable set of policy-relevant constructs. With trend data on valid and reliable context questionnaire scales, changes in students' achievement from one assessment cycle to the next can be examined in relation to changes in the policies and practices of interest, to determine whether there are patterns. TIMSS 2015 provided trend results for about a dozen such scales (e.g., Instruction Affected by Resource Shortages, Safe and Orderly School, and Early Literacy and Numeracy Activities), and PIRLS 2016 is expected to provide similar results.
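The kind of analysis the abstract describes, relating cycle-to-cycle changes in a context questionnaire scale to cycle-to-cycle changes in achievement at the country level, can be illustrated with a minimal first-differences sketch. Everything below is an illustrative assumption (hypothetical countries, scale values, and column names), not the study's actual data or analysis pipeline.

```python
# Minimal sketch (hypothetical data, not the study's actual pipeline):
# relate country-level changes in a context questionnaire scale to
# changes in achievement between two assessment cycles.
import pandas as pd

# Hypothetical country-level means for two TIMSS cycles:
# 'safe_orderly' = mean score on a Safe and Orderly School scale,
# 'math' = mean mathematics achievement.
cycle_2011 = pd.DataFrame(
    {"country": ["A", "B", "C", "D"],
     "safe_orderly": [9.8, 10.4, 9.1, 10.9],
     "math": [495, 512, 478, 530]}
).set_index("country")

cycle_2015 = pd.DataFrame(
    {"country": ["A", "B", "C", "D"],
     "safe_orderly": [10.1, 10.3, 9.6, 11.2],
     "math": [503, 509, 489, 536]}
).set_index("country")

# First differences within countries remove stable country-level
# confounders, which is why trend scales support stronger inferences
# than a single cross-sectional correlation.
diff = cycle_2015 - cycle_2011

# Correlate the change in the scale with the change in achievement.
r = diff["safe_orderly"].corr(diff["math"])
print(f"Correlation of changes across countries: r = {r:.2f}")
```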


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Ina V. S. Mullis (1)
  • Michael O. Martin (1)
  • Martin Hooper (1)

  1. TIMSS & PIRLS International Study Center, Boston College, Chestnut Hill, MA, USA
