
How to measure and model QoE for networked games?

A case study of World of Warcraft


Abstract

In this paper, we investigate methodologies for modeling quality of experience (QoE) for networked video games, focusing on massively multiplayer online role-playing games (MMORPGs) and using Blizzard Entertainment's World of Warcraft (WoW) as a case study. In two user studies involving a total of 104 players, we investigate system, user, and context parameters and evaluate their impact on QoE and related quality features. We also discuss methodological questions related to measuring gaming QoE, which can serve as guidelines for future gaming QoE studies, and analyze a set of quality metrics "beyond MOS". After comparing different modeling techniques, we present and evaluate four linear statistical models and three (non-linear) machine learning models for estimating MMORPG QoE. Finally, we make our datasets available to the research community to foster further analysis and reproducibility of results.
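To make the linear statistical modeling approach mentioned above concrete, the sketch below fits a simple one-predictor QoE model (MOS as a linear function of network latency) by ordinary least squares. The data points and variable names here are illustrative assumptions, not values from the paper's user studies, and the real models in the paper use more predictors.

```python
import numpy as np

# Hypothetical (latency in ms, mean opinion score) pairs -- illustrative
# placeholders only, not measurements from the WoW studies.
latency = np.array([0, 50, 100, 200, 400, 800], dtype=float)
mos = np.array([4.6, 4.4, 4.1, 3.5, 2.8, 1.9])

# Ordinary least squares fit of a simple linear QoE model:
#   MOS ~ a + b * latency
# np.polyfit returns coefficients from highest degree down: (slope, intercept).
b, a = np.polyfit(latency, mos, deg=1)

def predict_mos(latency_ms: float) -> float:
    """Predict MOS on the 1-5 ACR scale, clipped to the valid range."""
    return float(np.clip(a + b * latency_ms, 1.0, 5.0))
```

A fitted model of this shape lets an operator estimate, for a given network delay, the expected player rating; the paper's machine learning models play the same role but capture non-linear effects.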




Notes

  1. The dataset is available upon request; please visit for details.

  2.

  3. Further details on the scenario design are provided together with the dataset; please visit

  4. Hovering over the computer icon in the main menu of WoW opens a pop-up window showing the latency estimated by the WoW client.





Acknowledgements

This work has been fully supported by the Croatian Science Foundation under the projects 8065 (HUTS) and UIP-2014-09-5605 (Q-MANIC). We wish to thank the anonymous reviewers for their thoughtful comments and efforts towards improving our manuscript. We also thank Tanja Kauric for her help in conducting the user studies.

Author information



Corresponding author

Correspondence to Lea Skorin-Kapov.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

At the time of writing this publication, Aleksandra Cerekovic was an independent scholar; she was previously affiliated with the University of Zagreb, Faculty of Electrical Engineering and Computing.

Communicated by M. Claypool.


About this article


Cite this article

Suznjevic, M., Skorin-Kapov, L., Cerekovic, A. et al. How to measure and model QoE for networked games? A case study of World of Warcraft. Multimedia Systems 25, 395–420 (2019).



Keywords

  • Quality of experience
  • Networked games
  • QoE assessment and modeling
  • Machine learning