Teacher Implementation and the Impact of Game-Based Science Curriculum Materials
Research-based digital games hold great potential to be effective tools in supporting next-generation science learning. However, as with all instructional materials, teachers significantly influence their implementation and contribute to their effectiveness. To more fully understand the contributions and challenges of teacher implementation of digital games, we studied the replacement of existing high school biology genetics lessons over a 3- to 6-week period with Geniverse, an immersive, game-like learning environment designed to be used in classrooms. The Geniverse materials infuse virtual experimentation in genetics with a narrative of a quest to heal a genetic disease; incorporate the topics of meiosis and protein synthesis with inheritance; and include the science practices of explanation and argumentation. The research design involved a quasi-experiment with 48 high school teachers and about 2000 students, student science content knowledge and argumentation outcome measures, and analysis using hierarchical linear modeling. Results indicate that when Geniverse was implemented as the designers intended, student learning of genetics content was significantly greater than in the comparison, business-as-usual group. However, a wide range of levels of Geniverse implementation resulted in no significant difference between the groups as a whole. Students’ abilities to engage in scientific explanation and argumentation were greater in the Geniverse group, but these differences were not statistically significant. Observation, survey, and interview data indicate a range of barriers to implementation and teacher instructional decisions that may have influenced student outcomes. Implications for the role of the teacher in the implementation of game-based instructional materials are discussed.
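The nested design described above (students within teachers) is the kind of data structure hierarchical linear modeling handles. The sketch below is purely illustrative, using synthetic data and a random-intercept mixed model via `statsmodels`; the variable names (`posttest`, `condition`) and effect sizes are assumptions, not the study's actual data or model specification.

```python
# Hypothetical sketch of a two-level HLM: students (level 1) nested within
# teachers (level 2), with a fixed treatment effect and a random teacher
# intercept. All data are synthetic; names and values are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

n_teachers = 48        # mirrors the study's 48 teachers
students_per = 40      # ~2000 students in total

rows = []
for t in range(n_teachers):
    condition = t % 2                      # 0 = comparison, 1 = treatment
    teacher_effect = rng.normal(0, 0.5)    # level-2 random intercept
    for _ in range(students_per):
        score = 50 + 2.0 * condition + teacher_effect + rng.normal(0, 5)
        rows.append({"teacher": t, "condition": condition, "posttest": score})

df = pd.DataFrame(rows)

# Random-intercept model: posttest ~ condition, grouped by teacher
model = smf.mixedlm("posttest ~ condition", df, groups=df["teacher"])
result = model.fit()
print(result.summary())
```

A model of this shape accounts for the clustering of students under a shared teacher, which is why the study's teacher-level implementation differences could be tested rather than averaged away.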
Keywords: Educational games · Game-based learning · Genetics · Argumentation · Teacher implementation · Fidelity of implementation
We are grateful to Randy von Smith, Paul Szauter, and the Jackson Laboratories for crafting drake genes and genotypes, to Lisa Carey at BSCS for her help with data collection, to the Geniverse research teachers for their hard work and timely feedback, and to Arthur Libby for his design work on advanced genetics challenges. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
This work was funded by the National Science Foundation under Grant No. DRL-0733264.