Abstract
This chapter provides an overarching framework for context assessment in international large-scale assessments (ILSAs) and applies it to discuss the relationships between context assessment and cognitive assessment. One of the most critical differences between these two types of assessment is the variety of topics, perspectives, and levels of the education system that must be taken into account. The chapter first points out similarities in context assessments across ILSA programs, thereby defining a set of common content in context assessment. Although there are important conceptual similarities across ILSAs, there are good reasons why this lowest common denominator should be enriched according to the goals and designs of the respective programs. The chapter discusses some possible directions and, further, offers suggestions as to how the scope of ILSAs may be broadened to provide better information for education research and policy in the future. Although the framework model is applicable to learning contexts worldwide, context assessments in ILSAs need to take into account the many similarities and differences among education systems. A final aim of this chapter is therefore to discuss some critical issues that arise from an international perspective in ILSAs.
Notes
- 1.
This and the three sections that follow are based on the OECD draft framework for context assessments for PISA 2015 (2013: http://www.oecd.org/pisa/pisaproducts/PISA-2015-draft-questionnaire-framework.pdf), which was written by the authors of this chapter.
© 2016 Springer International Publishing Switzerland
Cite this chapter
Kuger, S., Klieme, E. (2016). Dimensions of Context Assessment. In: Kuger, S., Klieme, E., Jude, N., Kaplan, D. (eds) Assessing Contexts of Learning. Methodology of Educational Measurement and Assessment. Springer, Cham. https://doi.org/10.1007/978-3-319-45357-6_1
DOI: https://doi.org/10.1007/978-3-319-45357-6_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-45356-9
Online ISBN: 978-3-319-45357-6