
Multimedia Tools and Applications, Volume 72, Issue 1, pp 749–775

Platform for real-time subjective assessment of interactive multimedia applications

  • Bert Vankeirsbilck
  • Dieter Verslype
  • Nicolas Staelens
  • Pieter Simoens
  • Chris Develder
  • Piet Demeester
  • Filip De Turck
  • Bart Dhoedt

Abstract

With the advent of cloud computing and the remote execution of interactive applications, there is a need to evaluate Quality of Experience (QoE) and how it is influenced by network condition variations, media encoding parameter settings, and related optimization algorithms. Current QoE assessment, however, focuses mainly on audiovisual quality in non-interactive applications such as video-on-demand services. Where experiments do aim to quantify interactive quality, they typically target games, using ad-hoc test setups to assess the impact of network variations on the playing experience. In this paper, we present a novel platform enabling the assessment of a broad range of interactive applications (e.g., thin client remote desktop systems, remotely rendered engineering applications, games). Dynamic reconfiguration of media encoding and decoding is built into the system, allowing the media encoding to adapt dynamically to the network conditions and the application characteristics. Evaluating the influence of these automatic adaptations is a key asset of our approach. We discuss a range of possible use cases, along with a performance study of our implementation, showing that the platform is capable of highly controllable subjective user assessment. Furthermore, we present results obtained by applying the platform to a subjective evaluation of an interactive multimedia application: specifically, the influence of visual quality and frame rate on interactive QoE was assessed for a remotely executed race game.
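To illustrate the kind of dynamic encoder reconfiguration the abstract describes, the sketch below maps measured network conditions to encoder settings. This is not the paper's implementation: all class names, thresholds, and the adaptation policy are hypothetical, chosen only to show the shape of such a controller.

```python
# Hypothetical sketch of network-driven encoder adaptation; the policy,
# names, and thresholds below are illustrative, not the paper's method.
from dataclasses import dataclass


@dataclass
class NetworkSample:
    rtt_ms: float        # measured round-trip time, milliseconds
    loss_pct: float      # packet loss, percent
    bandwidth_kbps: int  # estimated available bandwidth


@dataclass
class EncoderConfig:
    bitrate_kbps: int
    frame_rate: int
    quantizer: int       # lower value = higher visual quality


def adapt_encoder(sample: NetworkSample) -> EncoderConfig:
    """Pick encoder settings from a network measurement (example policy)."""
    # Leave ~20% headroom so the stream does not saturate the link.
    bitrate = int(sample.bandwidth_kbps * 0.8)
    # Under high latency or loss, trade frame rate for per-frame quality:
    # for interactive use, frames queuing on a congested link hurt QoE
    # more than a lower update rate does.
    if sample.rtt_ms > 100 or sample.loss_pct > 1.0:
        return EncoderConfig(bitrate_kbps=bitrate, frame_rate=15, quantizer=30)
    return EncoderConfig(bitrate_kbps=bitrate, frame_rate=30, quantizer=22)


# Example: a high-latency link triggers the reduced-frame-rate profile.
cfg = adapt_encoder(NetworkSample(rtt_ms=150, loss_pct=0.2, bandwidth_kbps=4000))
print(cfg.frame_rate, cfg.bitrate_kbps)  # → 15 3200
```

In a real evaluation platform, such a controller would run in a feedback loop, re-measuring conditions periodically and pushing new settings to both encoder and decoder, so that test subjects experience the adaptation in real time.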

Keywords

Interactive media quality assessment · Quality of experience · Interactivity · Subjective quality · Thin client computing


Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Bert Vankeirsbilck (1)
  • Dieter Verslype (1)
  • Nicolas Staelens (1)
  • Pieter Simoens (1, 2)
  • Chris Develder (1)
  • Piet Demeester (1)
  • Filip De Turck (1)
  • Bart Dhoedt (1)
  1. Department of Information Technology (INTEC), Internet Based Communication Networks and Services (IBCN) – iMinds, Ghent University, Ghent, Belgium
  2. Department INWE, Ghent University College, Ghent, Belgium
