
STATISTICAL TECHNIQUES UTILIZED IN ANALYZING PISA AND TIMSS DATA IN SCIENCE EDUCATION FROM 1996 TO 2013: A METHODOLOGICAL REVIEW

  • Pey-Yan Liou
  • Yi-Chen Hung

Abstract

We conducted a methodological review of articles using Programme for International Student Assessment (PISA) or Trends in International Mathematics and Science Study (TIMSS) data published in SSCI-indexed science education journals, such as the International Journal of Science and Mathematics Education, the International Journal of Science Education, the Journal of Research in Science Teaching, and Science Education, from 1996 to 2013. A total of 51 empirical articles from 8 journals were identified as the sample for this study. These articles were analyzed in terms of 2 essential statistical techniques for analyzing international large-scale assessment (ILSA) data: sampling weights and design effects. The study also summarized the quantitative methods most commonly used to analyze PISA and TIMSS data in these articles. The results indicate that sampling weights and design effects, essential adjustments for analyzing large-scale data, were reported in fewer than half of the studies. Suggestions regarding the use and reporting of appropriate techniques, as well as data analysis methods, are made for science education researchers who use ILSA data in their research.
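The two adjustments the review focuses on can be illustrated in a few lines. The sketch below uses invented toy data — the scores, weights, and DEFF value are not PISA or TIMSS figures — to show how sampling weights shift a point estimate and how a design effect inflates a standard error that was computed under a simple-random-sampling assumption.

```python
import math

# Toy student scores and sampling weights (illustrative only, not real PISA/TIMSS data).
scores  = [480.0, 520.0, 455.0, 610.0, 543.0, 498.0]
weights = [1.2,   0.8,   2.5,   0.6,   1.0,   1.9]  # inverse-probability weights

n = len(scores)

# Unweighted mean treats every sampled student as equally representative.
unweighted_mean = sum(scores) / n

# Weighted mean re-scales each student by how many population members they represent.
weighted_mean = sum(w * y for w, y in zip(weights, scores)) / sum(weights)

# Naive standard error assumes simple random sampling (SRS).
variance = sum((y - unweighted_mean) ** 2 for y in scores) / (n - 1)
se_srs = math.sqrt(variance / n)

# Under the two-stage cluster sampling used by ILSAs, the effective sample size
# shrinks; a design effect (DEFF) inflates the SRS standard error accordingly:
# SE_adj = SE_srs * sqrt(DEFF).
deff = 2.4  # illustrative value; real DEFFs come from the survey's technical report
se_adjusted = se_srs * math.sqrt(deff)
```

Ignoring either adjustment biases results in the direction the review warns about: skipping the weights distorts the point estimate, and skipping the design effect understates the standard error, making significance tests anti-conservative.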

Key words

data analysis, reporting, design effects, literature review, PISA, TIMSS, weights


Supplementary material

ESM 1 (DOCX 33 kb)


Copyright information

© Ministry of Science and Technology, Taiwan 2014

Authors and Affiliations

  1. Graduate Institute of Learning and Instruction & Center of Teacher Education, National Central University, Jhongli City, Taoyuan, Republic of China
  2. Graduate Institute of Learning and Instruction, National Central University, Jhongli City, Taoyuan, Republic of China
