Implications of International Studies for National and Local Policy in Mathematics Education

  • John A. Dossey
  • Margaret L. Wu
Chapter
Part of the Springer International Handbooks of Education book series (SIHE, volume 27)

Abstract

This chapter examines large-scale comparative studies of mathematics education focussed on student achievement in an attempt to explain how such investigations influence the formation and implementation of policies affecting mathematics education. In doing so, we review the nature of comparative studies and policy research. Bennett’s (1991) formulation of policy development and implementation is used in examining national reactions to the results of international studies. Focus is given to the degree to which mathematics educators and others have played major roles in determining related policy outcomes affecting curriculum and the development and interpretations of the assessment instruments and processes themselves.

Keywords

Policy convergence; Mathematics education community

References

  1. Alexiadou, N. (2007). The Europeanisation of education policy: Researching a changing governance and “new” modes of coordination. Research in Comparative and International Education, 2(2), 102–116.
  2. American Institutes for Research. (2005). What the United States can learn from Singapore’s world-class mathematics system (and what Singapore can learn from the United States): An exploratory study. Washington, DC: AIR. Retrieved from http://www.air.org/reports-products/index.cfm?fa=viewContent&content_id=598
  3. Astala, K., Kivelä, S. K., Koskela, P., Martio, O., Näätänen, M., & Tarvainen, K. (2005, December). The PISA survey tells only a partial truth of Finnish children’s mathematical skills. Matilde, 29, 9.
  4. Beaton, A. E., Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., Kelly, D. L., & Smith, T. A. (1996). Mathematics achievement in the middle school years: IEA’s Third International Mathematics and Science Study. Chestnut Hill, MA: Center for the Study of Testing, Evaluation, and Educational Policy, Boston College.
  5. Bennett, C. J. (1991). What is policy convergence and what causes it? British Journal of Political Science, 21(2), 215–233.
  6. Bidwell, J. K., & Clason, R. G. (Eds.). (1970). Readings in the history of mathematics education. Washington, DC: National Council of Teachers of Mathematics.
  7. Blum, W., & Kaiser, G. (2004). Kassel Project in Germany. Budapest, Hungary: Wolters−Kluwer.
  8. Bohl, T. (2004). Empirische Unterrichtsforschung und Allgemeine Didaktik. Entstehung, Situation und Konsequenzen eines prekären Spannungsverhältnisses im Kontext der PISA-Studie [Empirical research on teaching and general didactics: A precarious tension and consequences in the context of the PISA study]. Die Deutsche Schule, 96(4), 414–425.
  9. Burghes, D., Geach, R., & Roddick, M. (Eds.). (2004). IPMA. Budapest, Hungary: Wolters−Kluwer.
  10. Burghes, D., Kaur, B., & Thompson, D. R. (Eds.). (2004). Kassel Project—Final report. Budapest, Hungary: Wolters−Kluwer.
  11. Burstein, L. (Ed.). (1993). The IEA Study of Mathematics III: Student growth and classroom processes. Oxford, UK: Pergamon.
  12. Carnoy, M. (1999). Globalization and educational reform: What planners need to know. Paris, France: United Nations Education, Scientific, and Cultural Organisation.
  13. Carnoy, M. (2006). Rethinking the comparative and the international. Comparative Education Review, 50(4), 551–570.
  14. Casassus, J., Froemel, J. E., Palafox, J. C., & Cusato, S. (1998). First international comparative study of language, mathematics, and associated factors in third and fourth grades. Santiago, Chile: Latin American Laboratory for Evaluation of the Quality of Education.
  15. Chatterji, M. (2002). Models and methods for examining standards-based reforms and accountability initiatives: Have the tools of inquiry answered pressing questions on improving schools? Review of Educational Research, 72(3), 345–386.
  16. College Entrance Examination Board. (1959). Program for college preparatory mathematics. New York, NY: Author.
  17. Council of Chief State School Officers & National Governors Association. (2010). Common core state standards for mathematics. Retrieved from http://www.corestandards.org/assets/CCSSI_Math%20Standards.pdf.
  18. Cussó, R., & D’Amico, S. (2005). From development comparativism to globalization comparativism: Towards more normative international education statistics. Comparative Education, 41(1), 1–16.
  19. Dossey, J. A., Mullis, I. V. S., Lindquist, M. M., & Chambers, D. L. (1988). The mathematics report card: Are we measuring up? Princeton, NJ: Educational Testing Service.
  20. Eaton, J., & Kortum, S. (1996). Trade in ideas: Patenting and productivity in the OECD. Journal of International Economics, 40(3–4), 251–278.
  21. Ertl, H. (2006). Educational standards and the changing discourse on education: The reception and consequences of the PISA study in Germany. Oxford Review of Education, 32(5), 619–634.
  22. Figazzolo, L. (2009). Impact of PISA 2006 on the education policy debate. Education International. Retrieved from http://www.ei-ie.org/research/en/documentation.php.
  23. Greaney, V., & Kellaghan, T. (2008). Assessing national achievement levels in education. Washington, DC: International Bank for Reconstruction and Development/The World Bank.
  24. Grek, S. (2009). Governing by numbers: The PISA “effect” in Europe. Journal of Education Policy, 24(1), 23–37.
  25. Grek, S., Lawn, M., Lingard, B., Rinne, R., Segerholm, C., & Simola, H. (2011). National policy brokering and the construction of the European Education Space in England, Sweden, Finland and Scotland. In J. Ozga, P. Dahler-Larsen, C. Segerholm, & H. Simola (Eds.), Fabricating quality in education: Data and governance in Europe (pp. 47–65). London, UK: Routledge.
  26. Grier, K., & Grier, R. (2007). Only income diverges: A neoclassical anomaly. Journal of Development Economics, 84, 25–45.
  27. Hautamäki, J., Harjunen, E., Hautamäki, A., Karjalainen, T., Kupiainen, S., Laaksonen, S., Lavonen, J., Pehkonen, E., Rantanen, P., Scheinin, P., Halinen, I., & Jakku-Sihvonen, R. (2008). PISA 2006 Finland: Analyses, reflections and explanations. Helsinki, Finland: Ministry of Education.
  28. Hopmann, S. T., & Brinek, G. (2007). Introduction: PISA according to PISA—Does PISA keep what it promises? In S. T. Hopmann, G. Brinek, & M. Retzel (Eds.), PISA according to PISA—Does PISA keep what it promises (pp. 9–19). Vienna, Austria: LIT-Verlag.
  29. Husén, T. (Ed.). (1967). International Study of Achievement in Mathematics: A comparison of twelve countries (2 vols.). Stockholm, Sweden: Almqvist & Wiksell.
  30. Hutchison, D., & Schagen, I. (2007). Comparisons between PISA and TIMSS—Are we the man with two watches? In T. Loveless (Ed.), Lessons learned—What international assessments tell us about math achievement (pp. 227–261). Washington, DC: Brookings Institution Press.
  31. International Association for the Evaluation of Educational Achievement. (2011). Brief history of the IEA. Retrieved from http://www.iea.nl/brief_history_of_iea.html.
  32. International Commission on Mathematical Instruction. (2011a). A historical sketch of ICMI. Retrieved from http://www.mathunion.org/icmi/about-icmi/a-historical-sketch-of-icmi/.
  33. International Commission on Mathematical Instruction. (2011b). The first century of the International Commission on Mathematical Instruction (1908–2008). Retrieved from http://www.icmihistory.unito.it/timeline.php.
  34. International Project on Mathematical Attainment. (2011). CIMT research projects and publications. Retrieved from http://www.cimt.plymouth.ac.uk/menus/research.htm.
  35. Jones, P. W. (2004). Taking the credit: Financing and policy linkages in the educational portfolio of the World Bank. In G. Steiner-Khamsi (Ed.), The global politics of educational borrowing and lending (pp. 188–200). New York, NY: Teachers College Press.
  36. Jones, G. A. (2005). What do studies like PISA mean to the mathematics education community? In H. L. Chick & J. L. Vincent (Eds.), Proceedings of the 29th Conference of the International Group for the Psychology of Mathematics Education (Vol. 1, pp. 71–74). Melbourne, Australia: PME.
  37. Jones, P. W., & Coleman, D. (2005). The United Nations and education: Multilateralism, development and globalization. London, UK: Routledge/Falmer.
  38. Jones, P. S., & Coxford, A. F., Jr. (Eds.). (1970). A history of mathematics education in the United States and Canada (Thirty-second Yearbook of the National Council of Teachers of Mathematics). Washington, DC: National Council of Teachers of Mathematics.
  39. Kang, H. J. (2009). A cross-cultural curriculum study on U.S. elementary mathematics textbooks. In S. L. Swars, D. W. Stinson, & S. Lemons-Smith (Eds.), Proceedings of the 31st Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education (Vol. 5, pp. 379–386). Atlanta, GA: Georgia State University.
  40. Kaur, B. (2009). Performance of Singapore students in Trends in International Mathematics and Science Studies (TIMSS). In K. Y. Wong, P. P. Lee, B. Kaur, F. P. Yee, & N. S. Fong (Eds.), Mathematics education: The Singapore journey (pp. 439–463). Singapore: World Scientific.
  41. Kellaghan, T., Greaney, V., & Murray, T. S. (2009). Using the results of a national assessment of educational achievement. Washington, DC: World Bank.
  42. Kerr, C. (1983). The future of industrial societies: Convergence or continuing diversity? Cambridge, MA: Harvard University Press.
  43. Kilpatrick, J. (1971). Some implications of the International Study of Achievement in Mathematics for mathematics educators. Journal for Research in Mathematics Education, 2(2), 164–171.
  44. Kilpatrick, J. (2009, February–March). TIMSS 2007 mathematics: Where are we? MAA Focus, 4–7.
  45. Kilpatrick, J., Mesa, V., & Sloane, F. (2007). U.S. algebra performance in an international context. In T. Loveless (Ed.), Lessons learned: What international assessments tell us about math achievement (pp. 85–126). Washington, DC: Brookings Institution Press.
  46. Kimmelman, P., Kroze, D., Schmidt, W., van der Ploef, A., McNeely, M., & Tan, A. (1999). A first look at what we can learn from high-performing school districts: An analysis of TIMSS data from the First in the World Consortium. Washington, DC: National Institute on Student Achievement, Curriculum, and Assessment.
  47. Klieme, E., Avenarius, H., Blum, W., Döbrich, P., Gruber, H., Prenzel, M., Reiss, K., Riquarts, K., Rost, J., Tenorth, H., & Vollmer, H. (2003). Zur Entwicklung nationaler Bildungsstandards [Toward the development of national standards for education]. Berlin, Germany: Bundesministerium für Bildung und Forschung.
  48. Kline, M. (1961, October). Math teaching assailed as peril to U.S. scientific progress. New York University Alumni News, n.p.
  49. Kupiainen, S., & Pehkonen, E. (2008). PISA 2006 mathematical literacy assessment. In J. Hautamäki, E. Harjunen, A. Hautamäki, T. Karjalainen, S. Kupiainen, S. Laaksonen, J. Lavonen, E. Pehkonen, P. Rantanen, P. Scheinin, I. Halinen, & R. Jakku-Sihvonen (Eds.), PISA 2006 Finland: Analyses, reflections and explanations (pp. 117–143). Helsinki, Finland: Ministry of Education.
  50. Lapointe, A. E., Mead, N. A., & Askew, J. M. (1992). Learning mathematics. Princeton, NJ: Educational Testing Service.
  51. Lapointe, A. E., Mead, N. A., & Phillips, G. W. (1988). A world of differences: An international assessment of mathematics and science. Princeton, NJ: Educational Testing Service.
  52. Latin American Laboratory for Assessment of the Quality of Education. (2001). First international comparative study of language, mathematics, and associated factors for third and fourth grade primary school students. Santiago, Chile: Latin American Laboratory for Assessment of the Quality of Education.
  53. Lester, F. K., Jr., Dossey, J. A., & Lindquist, M. M. (2007). Challenges and opportunities in the analysis of NAEP mathematics results. In P. Kloosterman & F. K. Lester Jr. (Eds.), Results and interpretations of the 2003 mathematics assessment of the National Assessment of Educational Progress (pp. 311–331). Reston, VA: National Council of Teachers of Mathematics.
  54. Luhmann, N. (1990). Essays on self-reference. New York, NY: Columbia University Press.
  55. Malaty, G. (2006, December). What are the reasons behind the success of Finland in PISA? Matilde, 29, 4–8.
  56. Mathematical Sciences Education Board. (1989). Everybody counts. Washington, DC: National Academy Press.
  57. Mayer-Foulkes, D. (2010). Divergences and convergences in human development (Human Development Research Paper 2010/20). New York, NY: United Nations Development Programme.
  58. McGaw, B. (2008). The role of the OECD in international comparative studies of achievement. Assessment in Education: Principles, Policy & Practice, 15(3), 223–243.
  59. McKnight, C. C., Crosswhite, F. J., Dossey, J. A., Kifer, E., Swafford, J. O., Travers, K. J., & Cooney, T. J. (1987). The underachieving curriculum: Assessing U.S. school mathematics from an international perspective. Champaign, IL: Stipes Publishing.
  60. McLeod, D. B., Stake, R. E., Schappelle, B. P., Mellissinos, M., & Gierl, M. J. (1996). Setting the standards: NCTM’s role in the reform of mathematics education. In S. A. Raizen & E. D. Britton (Eds.), Bold ventures: Case studies of U.S. innovations in mathematics education (Vol. 3, pp. 13–132). Dordrecht, The Netherlands: Kluwer.
  61. Miserable Noten für deutsche Schüler [Abysmal marks for German students]. (2001, December 4). Frankfurter Allgemeine.
  62. Moskowitz, J. H., & Stephens, M. (Eds.). (2004). Comparing learning outcomes: International assessments and educational policy. London, UK: Routledge/Falmer.
  63. Mullis, I. V. S., Martin, M. O., Foy, P., Olson, J. F., Preuschoff, C., Erberber, E., Arora, A., & Galia, J. (2008). TIMSS 2007 international mathematics report: Findings from IEA’s Trends in International Mathematics and Science Study at the fourth and eighth grades. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
  64. Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., & Chrostowski, S. J. (2004). TIMSS 2003 international mathematics report. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
  65. Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., Gregory, K. D., Garden, R. A., O’Connor, K. M., Chrostowski, S. J., & Smith, T. A. (2000). TIMSS 1999 international mathematics report. Chestnut Hill, MA: International Study Center, Lynch School of Education, Boston College.
  66. Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., O’Connor, K. M., Chrostowski, S. J., Gregory, K. D., Garden, R. A., & Smith, T. A. (2001). Mathematics benchmarking report: TIMSS 1999—Eighth grade. Chestnut Hill, MA: TIMSS International Study Center, Lynch School of Education, Boston College.
  67. Mullis, I. V. S., Martin, M. O., Ruddock, G. J., O’Sullivan, C. Y., & Preuschoff, C. (2009). TIMSS 2011 assessment frameworks. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
  68. Murphy, S. (2010). The pull of PISA: Uncertainty, influence, and ignorance. Interamerican Journal of Education for Democracy, 3(1), 28–44.
  69. National Commission on Excellence in Education. (1983). A nation at risk: The imperatives of education reform. Washington, DC: Government Printing Office.
  70. National Council of Teachers of Mathematics. (1964). The revolution in school mathematics. Washington, DC: Author.
  71. National Council of Teachers of Mathematics. (1961). An analysis of new mathematics programs. Washington, DC: Author.
  72. National Council of Teachers of Mathematics. (1980). An agenda for action. Reston, VA: Author.
  73. National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.
  74. National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.
  75. Nguyen, M., Elliott, J., Terlouw, C., & Pilot, A. (2009). Neocolonialism in education: Cooperative learning, Western pedagogy in an Asian context. Comparative Education, 45(1), 109–130.
  76. Niss, M. A. (1999). Kompetencer og uddannelsesbeskrivelse [Competencies and description of education]. Uddannelse, 9, 21–29.
  77. Nóvoa, A., & Yariv-Masal, T. (2003). Comparative research in education: A mode of governance or a historical journey? Comparative Education, 39(4), 423–438.
  78. Olson, L. (2004, September 22). No Child Left Behind Act changes weighed. Education Week, 24(4), 31, 34.
  79. On the mathematics curriculum of the high school. (1962). American Mathematical Monthly, 69(3), 189–193.
  80. Organisation for Economic Co-operation and Development. (2001). Knowledge and skills for life: First results from PISA 2000. Paris, France: Directorate for Education, OECD.
  81. Organisation for Economic Co-operation and Development. (2002). PISA in the news in Germany: Dec 2001–Jan 2002. Paris, France: Directorate for Education, OECD.
  82. Organisation for Economic Co-operation and Development. (2003). The PISA 2003 assessment framework: Mathematics, reading, science and problem solving knowledge and skills. Paris, France: Directorate for Education, OECD.
  83. Organisation for Economic Co-operation and Development. (2004a). Learning for tomorrow’s world: First results from PISA 2003. Paris, France: Directorate for Education, OECD.
  84. Organisation for Economic Co-operation and Development. (2004b). What makes school systems perform? Seeing school systems through the prism of PISA. Paris, France: Directorate for Education, OECD.
  85. Organisation for Economic Co-operation and Development. (2007). PISA 2006: Science competencies for tomorrow’s world (Vol. 1: Analysis). Paris, France: Directorate for Education, OECD.
  86. Organisation for Economic Co-operation and Development. (2010a). PISA 2009 results: What students know and can do—Student performance in reading, mathematics and science (Vol. 1). Paris, France: Directorate for Education, OECD.
  87. Organisation for Economic Co-operation and Development. (2010b). Draft of PISA 2012 mathematics framework. Paris, France: Directorate for Education, OECD.
  88. Organisation for Economic Co-operation and Development. (2010c). Education at a glance 2010: OECD indicators. Paris, France: Directorate for Education, OECD.
  89. Owen, E., Stephens, M., Moskowitz, J., & Gil, G. (2004). Toward education improvement: The future of international assessment. In J. H. Moskowitz & M. Stephens (Eds.), Comparing learning outcomes: International assessments and educational policy (pp. 3–23). London, UK: Routledge/Falmer.
  90. Pashley, P. J., & Phillips, G. W. (1993). Toward world class standards: A research study linking international and national assessments. Princeton, NJ: Educational Testing Service.
  91. Pellegrino, J. W., Jones, L. R., & Mitchell, K. J. (Eds.). (1999). Grading the nation’s report card: Evaluating NAEP and transforming the assessment of educational progress. Washington, DC: National Academy Press.
  92. Peters, M. (2002). Education policy research and the global knowledge economy. Educational Philosophy and Theory, 34(1), 91–102.
  93. Phillips, D., & Ochs, K. (2003). Processes of policy borrowing in education: Some explanatory and analytic devices. Comparative Education, 39(4), 451–464.
  94. Porter, A. C., & Gamoran, A. (Eds.). (2002). Methodological advances in cross-national surveys of educational achievement. Washington, DC: National Academy Press.
  95. Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011, April). Common core standards: The new U.S. intended curriculum. Educational Researcher, 40(3), 103–116.
  96. Prais, S. J. (2003). Cautions on OECD’s recent educational survey (PISA). Oxford Review of Education, 29(2), 139–163.
  97. Rautalin, M., & Alasuutari, P. (2009). The uses of the national PISA results by Finnish officials in central government. Journal of Education Policy, 24(5), 539–556.
  98. Ripley, A. (2011). The world’s schoolmaster: How a German scientist is using test data to revolutionize global learning. Atlantic Magazine, 308(1), 109–110.
  99. Robitaille, D. F., & Beaton, A. E. (Eds.). (2002). Secondary analysis of the TIMSS data. Dordrecht, The Netherlands: Kluwer.
  100. Robitaille, D. F., & Garden, R. A. (Eds.). (1989). The IEA Study of Mathematics II: Contexts and outcomes of school mathematics. Oxford, UK: Pergamon.
  101. Rutkowski, D. (2008). Towards an understanding of educational indicators. Policy Futures in Education, 6(4), 470–481.
  102. Sahlberg, P. (2010, December 27). Learning from Finland: How one of the world’s top educational performers turned around. Boston Globe, A9.
  103. Schmidt, W. H., Jorde, D., Cogan, L. S., Barrier, E., Gonzalo, I., Moser, U., Shimizu, K., Sawada, T., Valverde, G., McKnight, D., Prawat, R., Wiley, D. E., Raizen, S., Britton, E. D., & Wolfe, R. G. (1996). Characterizing pedagogical flow: An investigation of mathematics and science teaching in six countries. Dordrecht, The Netherlands: Kluwer.
  104. Schmidt, W. H., McKnight, C., Cogan, L. S., Jakwerth, P. M., & Houang, R. T. (1999). Facing the consequences: Using TIMSS for a closer look at U.S. mathematics and science education. Dordrecht, The Netherlands: Kluwer.
  105. Schmidt, W. H., McKnight, C. C., Houang, R. T., Wang, H. C., Wiley, D. E., Cogan, L. S., & Wolfe, R. G. (2001). Why schools matter: A cross-national comparison of curriculum and schooling. San Francisco, CA: Jossey-Bass.
  106. Schmidt, W. H., McKnight, C., & Raizen, S. (1997). A splintered vision: An investigation of U.S. science and mathematics education. Dordrecht, The Netherlands: Kluwer.
  107. Schmidt, W. H., McKnight, C., Valverde, G. A., Houang, R. T., & Wiley, D. E. (1997). Many visions, many aims (Vol. 1: A cross-national investigation of curricular intentions in school mathematics). Dordrecht, The Netherlands: Kluwer.
  108. Schriewer, J. (1990). The method of comparison and the need for externalization: Methodological criteria and sociological concepts. In J. Schriewer & B. Holmes (Eds.), Theories and methods in comparative education (pp. 25–83). Frankfurt am Main, Germany: Lang.
  109. Sjøberg, S. (2007). PISA and “real life challenges”: Mission impossible? In S. T. Hopmann, G. Brinek, & M. Retzel (Eds.), PISA according to PISA—Does PISA keep what it promises? (pp. 203–224). Vienna, Austria: LIT-Verlag.
  110. Sloane, P. F. E., & Dilger, B. (2005). The competence clash—Dilemmata bei der Übertragung des “Konzepts der nationalen Bildungsstandards” auf die berufliche Bildung [The competence clash: Dilemmas in the transmission of the “concept of national education standards” to vocational training]. Berufs- und Wirtschaftspädagogik. Retrieved from http://www.bwpat.de.
  111. Smith, D. E. (1909). The teaching of arithmetic. Boston, MA: Ginn.
  112. Southern and Eastern Africa Consortium for Monitoring Educational Quality. (2011). SACMEQ: 1995–2010. Retrieved from http://www.sacmeq.org/.
  113. Stanat, P., Artelt, C., Baumert, J., Klieme, E., Neubrand, M., Prenzel, M., Schiefele, U., Schneider, W., Schümer, G., Tillmann, K., & Weiß, M. (2002). PISA 2000: Overview of the study—Design, method and results. Berlin, Germany: Max Planck Institute for Human Development.
  114. Steiner-Khamsi, G. (2006). The economics of policy borrowing and lending: A study of late adopters. Oxford Review of Education, 32(5), 665–678.
  115. Steiner-Khamsi, G. (2007). International knowledge banks and the production of educational crises. European Educational Research Journal, 6(3), 285–292.
  116. Thomas, J. (2001). Globalization and the politics of mathematics education. In B. Atweh, B. Nebres, & H. Forgasz (Eds.), Sociocultural research on mathematics education: An international perspective (pp. 95–112). Mahwah, NJ: Erlbaum.
  117. Travers, K. J., & Westbury, I. (Eds.). (1989). The IEA Study of Mathematics II: The analysis of mathematics curricula. Oxford, UK: Pergamon.
  118. UNESCO. (1990). World declaration on education for all. Retrieved July 15, 2011 from http://www.unesco.org/education/efa/ed_for_all/background/jomtien_declaration.shtml.
  119. UNESCO. (2011). The hidden crisis: Armed conflict and education. Paris, France: United Nations Educational, Scientific, and Cultural Organization.
  120. United States Bureau of Education. (1911). Mathematics in the elementary schools of the United States. Washington, DC: Government Printing Office.
  121. Välijärvi, J., Linnakylä, P., Kupari, P., Reinikainen, P., & Arffman, I. (2002). The Finnish success in PISA—And some reasons behind it. Jyväskylä, Finland: Finnish Institute for Educational Research, University of Jyväskylä.
  122. Valverde, G. A., Bianchi, L. J., Wolfe, R. G., Schmidt, W. H., & Houang, R. T. (2002). According to the book: Using TIMSS to investigate the translation of policy into practice through the world of textbooks. Dordrecht, The Netherlands: Kluwer.
  123. Vithal, R., Adler, J., & Keitel, C. (Eds.). (2005). Researching mathematics education in South Africa: Perspectives, practices and possibilities. Cape Town, South Africa: Human Sciences Research Press.
  124. Wong, K. Y., Lee, P. P., Kaur, B., Yee, F. P., & Fong, N. S. (Eds.). (2009). Mathematics education: The Singapore journey. Singapore: World Scientific.
  125. World Bank. (2005, October 12). Education for all—Fast track initiative. Fact sheet: About aid effectiveness. Washington, DC: World Bank.
  126. Wu, M. (2010a). Comparing the similarities and differences of PISA 2003 and TIMSS (OECD Education Working Paper No. 32). Paris, France: Directorate for Education, OECD.
  127. Wu, M. (2010b). Measurement, sampling, and equating errors in large-scale assessments. Educational Measurement: Issues and Practice, 29(4), 15–27.

Copyright information

© Springer Science+Business Media New York 2012

Authors and Affiliations

  1. Illinois State University, Normal, USA
  2. Victoria University, Melbourne, Australia
