Immersive Games and Expert-Novice Differences

  • Amanda J. H. Bond
  • Jay Brimstin
  • Angela Carpenter
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 498)

Abstract

Immersive game-based training has been used effectively for years across numerous domains. Immersive simulations and games, however, are typically reserved for the pinnacle of instruction, even though research shows that game- and simulation-based training platforms are consistently more effective than traditional training across all phases of instruction. Game-based training offers a potentially limitless set of variables on which instruction can be adapted: troop efficacy can change, weather can turn, and equipment can malfunction. Understanding the relationships between these adaptive variables is key to effective game design that distinguishes expert from novice performers for assessment. This paper describes the development of a simulation-based game that uses distributed concept maps for expertise categorization. The resulting expert models were incorporated into a real-time strategy game intended to train and assess understanding of, and adherence to, Army doctrine. Preliminary validation data are also presented comparing the game to traditional Interactive Multimedia Instruction (IMI) courseware.
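The abstract states that learner concept maps were compared against expert models to categorize expertise, but the paper's scoring method is not reproduced on this page. The sketch below is therefore only an illustrative assumption, not the authors' implementation: it represents a concept map as a set of (concept, linking phrase, concept) propositions, scores a learner map by its proposition overlap with an expert reference map, and bins that score into coarse expertise bands. The function names, thresholds, and doctrine-flavored example propositions are all hypothetical.

    # Illustrative sketch only: assumes proposition-overlap scoring of concept maps,
    # a common approach, not necessarily the one used in the paper.
    from typing import Set, Tuple

    Proposition = Tuple[str, str, str]  # (source concept, linking phrase, target concept)

    def overlap_score(learner: Set[Proposition], expert: Set[Proposition]) -> float:
        """Fraction of the expert map's propositions that the learner's map reproduces."""
        if not expert:
            return 0.0
        return len(learner & expert) / len(expert)

    def categorize(score: float) -> str:
        """Map a similarity score to a coarse expertise band (thresholds are assumptions)."""
        if score >= 0.75:
            return "expert-like"
        if score >= 0.40:
            return "intermediate"
        return "novice"

    if __name__ == "__main__":
        # Hypothetical expert reference map and learner map.
        expert_map = {
            ("support by fire", "enables", "maneuver element"),
            ("obstacle", "shapes", "enemy avenue of approach"),
            ("reconnaissance", "informs", "scheme of maneuver"),
        }
        learner_map = {
            ("support by fire", "enables", "maneuver element"),
            ("obstacle", "blocks", "friendly movement"),
        }
        s = overlap_score(learner_map, expert_map)
        print(f"overlap = {s:.2f} -> {categorize(s)}")

In this toy run the learner reproduces one of three expert propositions (overlap 0.33) and would be binned as a novice; in an adaptive game such a score could drive which scenario variables are varied next.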

Keywords

Serious games · Expert-novice differences · Adaptive training · Scenario-based training


Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Amanda J. H. Bond (1)
  • Jay Brimstin (2)
  • Angela Carpenter (1)
  1. Cubic Global Defense, Orlando, USA
  2. Maneuver Center of Excellence, Fort Benning, USA
