Time in Internationally Comparative Studies

Chapter
Part of the SpringerBriefs in Education book series (BRIEFSEDUCAT)

Abstract

In this chapter, illustrative internationally comparative data on time at school, time spent in out-of-school programs, and homework/individual study time are presented. The first section is descriptive, while the second and third sections discuss the association between the various indicators of instruction time and student performance, both between and within countries. The overall conclusion is that results from international comparative studies concerning the association of time with educational achievement should be interpreted with considerable caution. Negative associations between facets of time and student achievement at the country level could mean that the causal direction is reversed: more investment in time occurs as a reaction to low performance rather than as a cause of higher performance. The finding that negative associations persisted in the secondary analyses of the PISA 2009 data-set, when change in time investment was related to change in performance between countries, indicates that this phenomenon is not just an artifact of cross-sectional research design but a matter of reactive policy (more time investment when achievement results are low) that compensates insufficiently for more important sources of low achievement, such as low SES composition.

Keywords

Student Achievement · Reading Achievement · Regular School · Instruction Time · Instructional Time

References

  1. Baker, D. P., Fabrega, R., Galindo, C., & Mishook, J. (2004). Instructional time and national achievement: Cross-national evidence. Prospects, XXXIV, 311–334.
  2. Baker, D. P., & LeTendre, G. K. (2005). National differences, global similarities. Stanford: Stanford University Press.
  3. Bishop, J. (1997). The effect of national standards and curriculum-based exams on achievement. The American Economic Review, 87(2), 260–264.
  4. Fuller, B. (1987). What factors raise achievement in the Third World? Review of Educational Research, 57, 255–292.
  5. Gustavsson, J. E. (2010, August). Causal inference in educational effectiveness research: A comparison of three methods to investigate effects of homework on student achievement. Invited keynote address at the second meeting of EARLI SIG 18, Centre for Evaluation and Educational Effectiveness, University of Leuven, Belgium.
  6. Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., Gregory, K. D., Garden, R. A., O’Connor, K. M., Chrostowski, S. J., & Smith, T. A. (2000). TIMSS 1999 international mathematics report: Findings from IEA’s repeat of the Third International Mathematics and Science Study at the eighth grade. Boston College, The International Study Center.
  7. OECD (2001). Education at a glance: OECD indicators. Paris: OECD Publishing.
  8. OECD (2007). Education at a glance: OECD indicators. Paris: OECD Publishing.
  9. OECD (2009). Education at a glance: OECD indicators. Paris: OECD Publishing.
  10. OECD (2010). PISA 2009 results: What students know and can do: Student performance in reading, mathematics and science (Vol. I). Paris: OECD Publishing.
  11. OECD (2011). Quality time for students: Learning in and out of school. Paris: OECD Publishing.
  12. OECD (2012). Education at a glance 2012: OECD indicators. Paris: OECD Publishing.
  13. Scheerens, J. (2004). The evaluation culture. Studies in Educational Evaluation, 30, 105–124.
  14. Schildkamp, K. (2007). The utilisation of a self-evaluation instrument for primary education. Enschede: University of Twente.
  15. Scheerens, J., Glas, C., & Thomas, S. (2003). Educational evaluation, assessment and monitoring. Lisse: Swets & Zeitlinger.
  16. Scheerens, J., Glas, C. W., Jehangir, K., Luyten, H., & Steen, R. (2012). System level correlates of educational performance: Thematic report based on PISA 2009 data. Enschede: University of Twente, Department of Educational Organisation and Management.
  17. Woessmann, L., Luedemann, E., Schuetz, G., & West, M. R. (2009). School accountability, autonomy and choice around the world. Cheltenham: Edward Elgar.

Copyright information

© The Author(s) 2014

Authors and Affiliations

  1. Department of Educational Sciences, University of Twente, Enschede, The Netherlands
  2. Department of Research Methodology, Measurement and Data Analysis, University of Twente, Enschede, The Netherlands
