Guidelines for the Design and Implementation of Game Telemetry for Serious Games Analytics

Part of the book series: Advances in Game-Based Learning (AGBL)

Abstract

The design of game telemetry requires careful attention to the chain of reasoning that connects low-level behavioral events to inferences about players’ learning and performance. Measuring performance in serious games is often difficult because direct measures of the desired outcomes seldom exist in the game. Game telemetry is conceived as the fundamental element from which measures of player performance are developed. General psychometric issues are raised for game-based measurement, and data issues are raised around format, context, and increasing the meaningfulness of the data itself. Practical guidelines for the design of game telemetry are presented, including targeting in-game behaviors that reflect cognitive demands, recording data at the finest usable grain size, representing the data in a format usable by the largest number of people, and recording descriptions of behavior, not inferences, with as much contextual information as practical. A case study is presented on deriving measures in a serious game intended to teach fraction concepts.
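
To make the guidelines concrete, the following is a minimal sketch (not the chapter's actual logging format) of a single telemetry event record in Python. It illustrates recording a description of what the player did at a fine grain size, with contextual state, in a widely readable format (JSON). All names and values here (log_event, the field names, the fraction-game example) are hypothetical illustrations.

```python
# Hypothetical telemetry event record following the guidelines above:
# describe behavior (not inferences), at a fine grain size, with
# contextual information, in a broadly usable format (JSON).
import json
import time
import uuid

def log_event(event_type, player_id, level_id, game_state, details):
    """Build one telemetry event as a flat, self-describing JSON record."""
    event = {
        "event_id": str(uuid.uuid4()),  # unique key for sequencing/joining events
        "timestamp": time.time(),       # when the behavior occurred
        "player_id": player_id,         # who produced the behavior
        "event_type": event_type,       # what happened, e.g., "piece_placed"
        "level_id": level_id,           # where in the game it happened
        "game_state": game_state,       # context: relevant state at event time
        "details": details,             # event-specific description, not a judgment
    }
    return json.dumps(event)

# Record *what the player did* ("placed 1/4 at grid position [2, 1]"),
# not an inference ("player misunderstands unit fractions") --
# inferences are derived later from the raw behavioral record.
print(log_event(
    event_type="piece_placed",
    player_id="p-042",
    level_id="fractions-3",
    game_state={"attempt": 2, "pieces_remaining": 3},
    details={"piece_value": "1/4", "grid_position": [2, 1]},
))
```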


Acknowledgments

The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305C080015 to the National Center for Research on Evaluation, Standards, and Student Testing (CRESST). The opinions expressed are those of the author and do not represent views of the Institute or the U.S. Department of Education.

Author information

Correspondence to Gregory K. W. K. Chung.



Copyright information

© 2015 Springer International Publishing Switzerland

Cite this chapter

Chung, G.K.W.K. (2015). Guidelines for the Design and Implementation of Game Telemetry for Serious Games Analytics. In: Loh, C., Sheng, Y., Ifenthaler, D. (eds) Serious Games Analytics. Advances in Game-Based Learning. Springer, Cham. https://doi.org/10.1007/978-3-319-05834-4_3

