
Journal of Science Education and Technology, Volume 24, Issue 1, pp 91–102

Meta-analytic Evaluation of a Virtual Field Trip to Connect Middle School Students with University Scientists

  • Omolola A. Adedokun (corresponding author)
  • Jia Liu
  • Loran Carleton Parker
  • Wilella Burgess

Abstract

Although virtual field trips are becoming popular, there are few empirical studies of their impacts on student outcomes. This study reports on a meta-analytic evaluation of the impact of a virtual field trip on student perceptions of scientists. Specifically, the study examined the summary effect of zipTrips broadcasts on evaluation participants’ perceptions of scientists, as well as the moderating effect of program type on program impact. The results showed a statistically significant effect of each broadcast, as well as a statistically significant summary (combined) effect of zipTrips on evaluation participants’ perceptions of scientists. Results of the moderation analysis showed that the effect was greater for students who participated in the evaluation of the 8th grade broadcasts, providing additional insight into the role of program variation in predicting differential program impact. This study illustrates how meta-analysis, a methodology that should be of interest to STEM education researchers and evaluation practitioners, can be used to summarize the effects of multiple offerings of the same program. Other implications for STEM educators are discussed.
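To make the analytic approach concrete, the sketch below shows how a summary (combined) effect can be pooled across several offerings of the same program under a standard random-effects model (DerSimonian–Laird estimator). The per-broadcast effect sizes and variances are illustrative placeholders, not the study’s data, and the code is a minimal example of the general technique rather than the authors’ exact procedure.

```python
import numpy as np

# Hypothetical per-broadcast effect sizes (standardized mean differences)
# and their variances -- illustrative placeholders, NOT the study's data.
effects = np.array([0.35, 0.42, 0.28, 0.51])    # one value per broadcast
variances = np.array([0.010, 0.012, 0.009, 0.015])

# Fixed-effect weights and the Q statistic (heterogeneity across broadcasts)
w = 1.0 / variances
fixed_mean = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - fixed_mean) ** 2)
df = len(effects) - 1

# DerSimonian-Laird estimate of between-broadcast variance tau^2
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)

# Random-effects summary (combined) effect and its 95% confidence interval
w_star = 1.0 / (variances + tau2)
summary = np.sum(w_star * effects) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
ci_low, ci_high = summary - 1.96 * se, summary + 1.96 * se

# I^2: percentage of total variability attributable to true heterogeneity
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

print(f"Summary effect: {summary:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")
print(f"Q = {Q:.2f} on {df} df, tau^2 = {tau2:.4f}, I^2 = {I2:.1f}%")
```

A moderator analysis of the kind reported here (grade level of the broadcast) can be approximated by running the same pooling within each subgroup, for example 7th- versus 8th-grade broadcasts, and testing whether the subgroup summary effects differ.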

Keywords

Meta-analysis · Virtual field trips · Student–scientist interactions · Student perceptions of scientists

Acknowledgments

Purdue zipTrips™ is a trademark of Purdue University. The project was developed with partial support from the Howard Hughes Medical Institute (Grant #51006097; PI: J. Paul Robinson). The contents of this paper are the authors’ own and do not necessarily represent the views or policies of the Howard Hughes Medical Institute. The authors acknowledge the assistance of Jamie Loizzo, Joan Crow, Carol McGrew, Steve Doyle, Sharon Katz, Julianne Bell, Rebecca Goetz, Ann Bessenbacher, Laurent Couetil, Lisa Hillard, Lori Corriveau, the Indiana Higher Education Telecommunication System, the Indiana Public Broadcasting Service, and the teachers and administrators from several school corporations who assisted in the development of Purdue zipTrips.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Omolola A. Adedokun (corresponding author)¹
  • Jia Liu¹
  • Loran Carleton Parker¹
  • Wilella Burgess¹

  1. Discovery Learning Research Center, Purdue University, West Lafayette, USA
