Journal of Computing in Higher Education, Volume 26, Issue 1, pp 87–122

A meta-analysis of blended learning and technology use in higher education: from the general to the applied

  • Robert M. Bernard
  • Eugene Borokhovski
  • Richard F. Schmid
  • Rana M. Tamim
  • Philip C. Abrami

Abstract

This paper serves several purposes. First and foremost, it is devoted to developing a better understanding of the effectiveness of blended learning (BL) in higher education. This is achieved through a meta-analysis of a sub-collection of comparative studies of BL and classroom instruction (CI) from a larger systematic review of technology integration (Schmid et al. in Comput Educ 72:271–291, 2014). In addition, the methodology of meta-analysis is described and illustrated by examples from the current study. The paper begins with a summary of the experimental research on distance education (DE) and online learning (OL), encapsulated in meta-analyses that have been conducted since 1990. Then it introduces the Bernard et al. (Rev Educ Res 79(3):1243–1289, 2009) meta-analysis, which attempted to alter the DE research culture of always comparing DE/OL with CI by examining three forms of interaction treatments (i.e., student–student, student–teacher, student–content) within DE, using the theoretical framework of Moore (Am J Distance Educ 3(2):1–6, 1989) and Anderson (Rev Res Open Distance Learn 4(2):9–14, 2003). The rest of the paper revolves around the general steps and procedures (Cooper in Research synthesis and meta-analysis: a step-by-step approach, 4th edn, SAGE, Los Angeles, CA, 2010) involved in conducting a meta-analysis. This section is included to provide researchers with an overview of precisely how meta-analyses can be used to respond to more nuanced questions that speak to underlying theory and inform practice—in other words, not just answers to the “big questions.” In this instance, we know that technology has an overall positive impact on learning (g+ = +0.35, p < .01, Tamim et al. in Rev Educ Res 81(3):4–28, 2011), but the sub-questions addressed here concern BL interacting with technology in higher education. The results indicate that, in terms of achievement outcomes, BL conditions exceed CI conditions by about one-third of a standard deviation (g+ = 0.334, k = 117, p < .001) and that the kind of computer support used (i.e., cognitive support vs. content/presentational support) and the presence of one or more interaction treatments (e.g., student–student/–teacher/–content interaction) serve to enhance student achievement. We examine the empirical studies that yielded these outcomes, work through the methodology that enables evidence-based decision-making, and explore how this line of research can improve pedagogy and student achievement.
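
For readers unfamiliar with the mechanics behind a summary statistic such as g+ = 0.334, the sketch below shows one standard way such a value is produced: a Hedges' g effect size is computed for each BL-versus-CI comparison, and the set is then averaged under a random-effects (DerSimonian–Laird) model. This is a minimal illustration under simplified assumptions; the study values and the helper names hedges_g and random_effects_mean are hypothetical placeholders, not data or code from this review, whose references cite the Comprehensive Meta-Analysis software (Borenstein et al., 2005).

# Minimal sketch (Python): Hedges' g per study, then a DerSimonian-Laird
# random-effects weighted mean (g+). All study values below are
# hypothetical placeholders, not data from this review.
import math

def hedges_g(m_t, m_c, sd_t, sd_c, n_t, n_c):
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (m_t - m_c) / sp                      # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)         # small-sample correction J
    g = j * d
    v = (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))  # sampling variance
    return g, v

def random_effects_mean(effects):
    # effects: list of (g, variance) pairs
    w = [1 / v for _, v in effects]           # fixed-effect weights
    g_fe = sum(wi * g for wi, (g, _) in zip(w, effects)) / sum(w)
    q = sum(wi * (g - g_fe)**2 for wi, (g, _) in zip(w, effects))  # Q statistic
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1 / (v + tau2) for _, v in effects]    # random-effects weights
    return sum(wi * g for wi, (g, _) in zip(w_re, effects)) / sum(w_re)

# Hypothetical comparisons: (mean_BL, mean_CI, sd_BL, sd_CI, n_BL, n_CI)
studies = [(78, 72, 12, 13, 40, 38), (81, 79, 10, 11, 55, 60), (70, 66, 15, 14, 30, 33)]
g_plus = random_effects_mean([hedges_g(*s) for s in studies])
print(f"g+ = {g_plus:.3f}")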

Keywords

Blended learning · Technology use · Higher education · Meta-analysis

References

  1. Note: A list of studies included in the meta-analysis is available upon request from the authors.
  2. Abrami, P. C. (2010). On the nature of support in computer supported collaborative learning using gStudy. Computers in Human Behavior, 26(5), 835–839. doi:10.1016/j.chb.2009.04.007.
  3. Abrami, P. C., & Bernard, R. M. (2006). Research on distance education: In defense of field experiments. Distance Education, 27(1), 5–26.
  4. Abrami, P. C., & Bernard, R. M. (2012). Statistical control versus classification of study quality in meta-analysis. Effective Education, 4(1), 43–72. doi:10.1080/19415532.2012.761889.
  5. Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M., Tamim, R. M., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage one meta-analysis. Review of Educational Research, 78(4), 1102–1134. doi:10.3102/0034654308326084.
  6. Abrami, P. C., Bernard, R. M., Bures, E. M., Borokhovski, E., & Tamim, R. (2011). Interaction in distance education and online learning: Using evidence and theory to improve practice. Journal of Computing in Higher Education, 23(2/3), 82–103. doi:10.1007/s12528-011-9043-x.
  7. Albrecht, B. (2006). Enriching student experience through blended learning. ECAR Research Bulletin, 12.
  8. Allen, M., Bourhis, J., Burrell, N., & Mabry, E. (2002). Comparing student satisfaction with distance education to traditional classrooms in higher education: A meta-analysis. American Journal of Distance Education, 16(2), 83–97. doi:10.1207/S15389286AJDE1602_3.
  9. Allen, M., Bourhis, J., Mabry, E., Burrell, N., Timmerman, E., & Titsworth, S. (2006). Comparing distance education to face-to-face methods of education. In B. Gayle, R. Preiss, N. Burrell, & M. Allen (Eds.), Classroom and communication education research: Advances through meta-analysis (pp. 229–241). Hillsdale, NJ: Erlbaum.
  10. Allen, M., Mabry, E., Mattrey, M., Bourhis, J., Titsworth, S., & Burrell, N. (2004). Evaluating the effectiveness of distance learning: A comparison using meta-analysis. Journal of Communication, 54(3), 402–420. doi:10.1111/j.1460-2466.2004.tb02636.x.
  11. Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. International Review of Research in Open and Distance Learning, 4(2), 9–14. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/149.
  12. Arabasz, P., & Baker, M. (2003). Respondent Summary: Evolving campus support models for e-learning courses. EDUCAUSE Center for Applied Research. http://www.educause.edu/ir/library/pdf/EKF/ekf0303.pdf.
  13. Azevedo, R., & Bernard, R. M. (1995). A meta-analysis of the effects of feedback in computer-based instruction. Journal of Educational Computing Research, 13(2), 111–127. doi:10.2190/9LMD-3U28-3A0G-FTQT.
  14. Bele, J. L., & Rugelj, J. (2007). Blended learning—an opportunity to take the best of both worlds. International Journal of Emerging Technologies in Learning, 2(3). doi:10.3991/ijet.v2i3.133.
  15. Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., et al. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289. doi:10.3102/0034654309333844.
  16. Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379–439. doi:10.3102/00346543074003379.
  17. Bernard, R. M., Rojo de Rubalcava, B., & St-Pierre, D. (2000). Collaborative online distance education: Issues for future practice and research. Distance Education, 21(2), 260–277. doi:10.1080/0158791000210205.
  18. Bernard, R. M., Zhang, D., Abrami, P. C., Sicoly, F., Borokhovski, E., & Surkes, M. (2008). Exploring the structure of the Watson-Glaser critical thinking appraisal: One scale or many subscales? Thinking Skills and Creativity, 3, 15–22. doi:10.1016/j.tsc.2007.11.001.
  19. Bliuc, A. M., Goodyear, P., & Ellis, R. A. (2007). Research focus and methodological choices in studies into students’ experiences of blended learning in higher education. The Internet and Higher Education, 10(4), 231–244. doi:10.1016/j.iheduc.2007.08.001.
  20. Bonk, C. J., & Graham, C. R. (Eds.). (2006). The handbook of blended learning: Global perspectives, local designs. San Francisco, CA: Pfeiffer.
  21. Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. (2005). Comprehensive meta-analysis version 2.2.048. Englewood, NJ: Biostat.
  22. Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. (2009). Introduction to meta-analysis. Chichester, UK: Wiley.
  23. Borokhovski, E., Tamim, R. M., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are contextual and design student–student interaction treatments equally effective in distance education? A follow-up meta-analysis of comparative empirical studies. Distance Education, 33(3), 311–329. doi:10.1080/01587919.2012.723162.
  24. Campbell, D., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.Google Scholar
  25. Cavanaugh, C. S. (2001). The effectiveness of interactive distance education technologies in K-12 learning: A meta-analysis. International Journal of Educational Telecommunications, 7(1), 73–88. Norfolk, VA: AACE. Retrieved March 13, 2007 from http://www.editlib.org/p/8461.
  26. Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Naperville, IL: Learning Point Associates.
  27. Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445–459. doi:10.3102/00346543053004445.
  28. Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21–29. doi:10.1007/BF02299088.
  29. Clark, R. E., Yates, K., Early, S., & Moulton, K. (2009). An analysis of the failure of electronic media and discovery-based learning: Evidence for the performance benefits of guided training methods. In K. H. Silber & R. Foshay (Eds.), Handbook of training and improving workplace performance, Volume I: Instructional design and training delivery. Washington, DC: ISPI.
  30. Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
  31. Cook, D. A. (2009). The failure of e-learning research to inform educational practice and what we can do about it. Medical Teacher, 31(2), 158–162. doi:10.1080/01421590802691393.
  32. Cook, D. A., Levinson, A. J., Garside, S., Dupras, D. M., Erwin, P. J., & Montori, V. M. (2008). Internet-based learning in the health professions: A meta-analysis. Journal of the American Medical Association, 300(10), 1181–1196. doi:10.1001/jama.300.10.1181.
  33. Cooper, H. M. (2010). Research synthesis and meta-analysis: A step-by-step approach (4th ed.). Los Angeles, CA: SAGE Publications.
  34. Cooper, H. M., Hedges, L. V., & Valentine, J. C. (Eds.). (2009). The handbook of research synthesis and meta-analysis (2nd ed.). New York, NY: Russell Sage Foundation.
  35. Driscoll, M., & Carliner, S. (2005). Advanced web-based training strategies. Blended learning as a curriculum design strategy (pp. 87–116). New York, NY: ASTD Press.
  36. Duval, S., & Tweedie, R. (2000). Trim and fill: A simple funnel-plot–based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455–463. doi:10.1111/j.0006-341X.2000.00455.x.
  37. Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines. San Francisco, CA: Jossey-Bass.
  38. Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3–8.
  39. Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
  40. Graham, C. R. (2005). Blended learning systems. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs. San Francisco, CA: Pfeiffer.
  41. Hammerstrøm, K., Wade, C. A., & Jørgensen, A.-M. K. (2010). Searching the literature: A guide to information retrieval for Campbell systematic reviews 2010: Supplement 1. Oslo, Norway: The Campbell Collaboration. doi:10.4073/csrs.2010.1.
  42. Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.
  43. Hedges, L. V., Shymansky, J. A., & Woodworth, G. (1989). A practical guide to modern methods of meta-analysis. (Stock Number PB-52). Washington, DC: National Science Teachers Association. (ERIC Document Reproduction Service No. ED309952).
  44. Higgins, J. P. T., Thompson, S. G., Deeks, J. J., & Altman, D. G. (2003). Measuring inconsistency in meta-analysis. British Medical Journal, 327, 557–560. doi:10.1136/bmj.327.7414.557.
  45. Hunt, M. (1997). How science takes stock: The story of meta-analysis. New York, NY: Russell Sage Foundation.
  46. Hunter, J. E., & Schmidt, F. L. (1990). Methods of meta-analysis: Correcting error and bias in research findings. Newbury Park, CA: SAGE Publications.
  47. Jahng, N., Krug, D., & Zhang, Z. (2007). Student achievement in online education compared to face-to-face education. European Journal of Open, Distance and E-Learning. Retrieved from http://www.eurodl.org/materials/contrib/2007/Jahng_Krug_Zhang.htm.
  48. Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story: Social interdependence theory and cooperative learning. Educational Researcher, 38(5), 365–379. doi:10.3102/0013189X09339057.
  49. Keegan, D. (1996). Foundations of distance education (3rd ed.). London: Routledge.
  50. Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. New York, NY: Sage Publications.
  51. Lou, Y., Abrami, P. C., & D’Appollonia, S. (2001). Small group and individual learning with technology: A meta-analysis. Review of Educational Research, 71, 449–521. doi:10.3102/00346543071003449.
  52. Lou, Y., Bernard, R. M., & Abrami, P. C. (2006). Media and pedagogy in undergraduate distance education: A theory-based meta-analysis of empirical literature. Educational Technology Research and Development, 54, 141–176. doi:10.1007/s11423-006-8252-x.
  53. Machtmes, K., & Asher, J. W. (2000). A meta-analysis of the effectiveness of telecourses in distance education. American Journal of Distance Education, 14(1), 27–46. doi:10.1080/08923640009527043.
  54. Marquis, C. (2004). WebCT survey discovers a blend of online learning and classroom-based teaching is the most effective form of learning today. WebCT.com. Retrieved from http://www.webct.com/service/ViewContent?contentID=19295938.
  55. Means, B., Toyama, Y., Murphy, R. F., & Baki, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3), 1–47. Retrieved from http://www.tcrecord.org/library/content.asp?contentid=16882.
  56. Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Technical report. Washington, DC: U.S. Department of Education.
  57. Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1–6. doi:10.1080/08923648909526659.
  58. Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Oxford, UK: Blackwell Publishing.
  59. Pigott, T. D. (2012). Advances in meta-analysis. New York, NY: Springer.
  60. Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in learning and teaching contexts. Journal of Educational Psychology, 95(4), 667–686. doi:10.1037/0022-0663.95.4.667.
  61. Rosen, Y., & Salomon, G. (2007). The differential learning achievements of constructivist technology-intensive learning environments as compared with traditional ones: A meta-analysis. Journal of Educational Computing Research, 36(1), 1–14. doi:10.2190/R8M4-7762-282U-554J.
  62. Ross, S. M., & Morrison, G. R. (1989). In search of a happy medium in instructional technology research: Issues concerning external validity, media replications, and learner control. Educational Technology Research and Development, 37(1), 19–33. doi:10.1007/BF02299043.
  63. Ross, S. M., Morrison, G. R., & Lowther, D. L. (2010). Educational technology research past and present: Balancing rigor and relevance to impact school learning. Contemporary Educational Technology, 1(1), 17–35. Retrieved from http://www.cedtech.net/articles/112.pdf.
  64. Rothstein, H. R., Sutton, A. J., & Borenstein, M. (Eds.). (2005). Publication bias in meta-analysis: Prevention, assessment and adjustments. Chichester, UK: Wiley.
  65. Scammacca, N., Roberts, G., & Stuebing, K. K. (2013). Meta-analysis with complex research designs: Dealing with dependence from multiple measures and multiple group comparisons. Review of Educational Research. Published online 13 September 2013. doi:10.3102/0034654313500826.
  66. Schmid, R. F., Bernard, R. M., Borokhovski, E., Tamim, R. M., Abrami, P. C., Surkes, M. A., et al. (2014). The effects of technology use in postsecondary education: A meta-analysis of classroom applications. Computers & Education, 72, 271–291. doi:10.1016/j.compedu.2013.11.002.
  67. Shachar, M., & Neumann, Y. (2003). Differences between traditional and distance education academic performances: A meta-analytical approach. International Review of Research in Open and Distance Learning, 4(2). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/153/704.
  68. Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.
  69. Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59(3), 623–664. doi:10.1111/j.1744-6570.2006.00049.x.
  70. Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research, 81(3), 4–28. doi:10.3102/0034654310393361.
  71. Ungerleider, C. S., & Burns, T. C. (2003). Information and communication technologies in elementary and secondary education: A state of the art review. International Journal of Educational Policy, Research & Practice, 3(4), 27–54.
  72. Valentine, J. C., & Cooper, H. (2003). Effect size substantive interpretation guidelines: Issues in the interpretation of effect sizes. Washington, DC: What Works Clearinghouse.
  73. Valentine, J. C., & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The Study Design and Implementation Assessment Device (Study DIAD). Psychological Methods, 13(2), 130–149. doi:10.1037/1082-989X.13.2.130.
  74. Watson, G., & Glaser, E. M. (1980). Watson–Glaser critical thinking appraisal: Forms A and B. San Antonio, TX: PsychCorp.
  75. Williams, S. L. (2006). The effectiveness of distance education in allied health science programs: A meta-analysis of outcomes. American Journal of Distance Education, 20(3), 127–141. doi:10.1207/s15389286ajde2003_2.
  76. Zhao, Y., Lei, J., Yan, B., & Tan, S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Retrieved from http://ott.educ.msu.edu/literature/report.pdf.
  77. Zhao, Y., & Breslow, L. (2013). Literature review on hybrid/blended learning. Unpublished manuscript. Retrieved from http://tll.mit.edu/sites/default/files/library/Blended_Learning_Lit_Reveiw.pdf.
  78. Zimmerman, B. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts & P. R. Pintrich (Eds.), Handbook of self-regulation (pp. 13–39). New York, NY: Academic Press.

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Robert M. Bernard (1)
  • Eugene Borokhovski (1)
  • Richard F. Schmid (1)
  • Rana M. Tamim (2)
  • Philip C. Abrami (1)

  1. Centre for the Study of Learning and Performance (CSLP), Concordia University, Montreal, Canada
  2. Zayed University, Dubai, United Arab Emirates
