Alternate Rubric for Performance Assessment of Infantry Soldier Skills Training

  • Douglas Maxwell
  • Jonathan Stevens
  • Crystal Maraj
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9744)


Abstract

Gauging the impact of simulation-based training (SBT) technology has historically been straightforward in domains such as pilot training and ground vehicle operator training. In the dismounted infantry soldier skills domain, the low-hanging fruit for effective use of SBT is weapons and equipment operations training. However, the complexities of the operational environment are often too difficult to replicate in current virtual environments to provide accurate or effective training for skills that require identifying enemy activity or reacting to enemy contact. This paper discusses the need for an alternate method of performance assessment when comparing traditional training means to SBT.


Keywords

Simulation-based training · Infantry soldier training · Rubric · Return on investment



Acknowledgements

The U.S. Army Research Laboratory and the University of Central Florida research team would like to express our profound gratitude to the dedicated men and women of the 2/124th and 211th, without whom this work could not have been performed. We thank you for your service to our country.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Douglas Maxwell, U.S. Army Research Laboratory, Orlando, USA
  • Jonathan Stevens, University of Central Florida, Orlando, USA
  • Crystal Maraj, University of Central Florida, Orlando, USA
