A Quantitative Approach for Developing Serious Games for Aptitude and Trait Assessment

  • Brenton M. Wiernik
  • Michael D. Coovert
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11899)


We describe a development process for serious games that yields psychometrically rigorous measures of individual aptitudes (abilities, skills) and traits (habits, tendencies, behaviors). We begin with a discussion of serious games and how they can instantiate the cognitive states needed for relevant aptitudes and traits to manifest; game-based assessment of this kind can offer numerous advantages over traditional assessment modalities. We then describe an iterative approach to aptitude and trait measurement that emphasizes (1) careful definition and specification of the traits and aptitudes to be measured, (2) rigorous assessment of reliability and validity, and (3) revision of gameplay elements and metrics to improve measurement properties.
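Step (2) of the approach, assessing reliability, can be illustrated with a minimal sketch. The example below is not from the paper: the data are simulated (a single latent trait driving five hypothetical game-derived metrics), and Cronbach's alpha is used as one common internal-consistency estimate; the paper's own analyses (e.g., bifactor and latent change models) are more elaborate.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(0)
# Simulate 200 players: one latent trait drives five game-derived metrics,
# each contaminated by measurement error.
trait = rng.normal(size=(200, 1))
metrics = trait + rng.normal(scale=0.8, size=(200, 5))
print(round(cronbach_alpha(metrics), 2))
```

In an iterative development cycle, metrics with low item-total correlations or alpha contributions would be candidates for gameplay revision before the next round of data collection.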


Keywords: Assessment · Validity · Factor analysis · Personality · Cognitive ability



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of South Florida, Tampa, USA
