
Comparison of Studies: Comparing Design and Constructs, Aligning Measures, Integrating Data, Cross-validating Findings

  • Living reference work entry
International Handbook of Comparative Large-Scale Studies in Education

Part of the book series: Springer International Handbooks of Education ((SIHE))

Abstract

Even after six decades of international student assessments, we still have only a weak understanding of how and why educational systems change in the long run. One reason is the diversity of studies, which differ in design, sampling, conceptualization (e.g., the research constructs covered), and measures. Such variation can be found both between and within long-standing programs of student assessment such as TIMSS and PISA. This chapter aims to show the similarities and differences between ILSAs and to clarify what can and cannot safely be compared and combined, with the goal of finding common ground for future research. Throughout, the lower secondary samples from TIMSS and PISA serve as the main examples.

Section 1 compares the conceptual foundations and designs of ILSAs, especially the selection and definition of the constructs covered, including cognitive tests as well as questionnaire-based measures of student background, educational processes, and noncognitive outcomes. Section 2 examines when and how empirical measures can be matched and common scales established. Section 3 looks at approaches for integrating data from separate ILSAs into complex analyses, such as longitudinal analyses at the individual, class, or school level, and multilevel analyses combining information from multiple studies. Section 4 discusses the cross-validation of trend information from TIMSS and PISA.

Using ILSAs’ “Big Data” without considering the details of conceptualization, measurement, and data structure may lead to erroneous findings and policy conclusions. Recently, there has been some convergence in design and methods across ILSA studies and programs. Yet more systematic approaches to instrument development and study design are still needed.

Author information

Correspondence to Eckhard Klieme.

Copyright information

© 2022 Springer Nature Switzerland AG

About this entry

Cite this entry

Klieme, E. (2022). Comparison of Studies: Comparing Design and Constructs, Aligning Measures, Integrating Data, Cross-validating Findings. In: Nilsen, T., Stancel-Piątak, A., Gustafsson, JE. (eds) International Handbook of Comparative Large-Scale Studies in Education. Springer International Handbooks of Education. Springer, Cham. https://doi.org/10.1007/978-3-030-38298-8_20-1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-38298-8

  • Online ISBN: 978-3-030-38298-8

  • eBook Packages: Springer Reference Education, Reference Module Humanities and Social Sciences, Reference Module Education
