Defining an Indicator for Navigation Performance Measurement in VE Based on ISO/IEC 15939

  • Ahlem Assila
  • Jeremy Plouzeau
  • Frédéric Merienne
  • Aida Erfanian
  • Yaoping Hu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10324)

Abstract

Navigation is a key factor for immersion and exploration in a virtual environment (VE). Nevertheless, measuring navigation performance is not an easy task, especially when analyzing and interpreting the heterogeneous results of the measures involved. To that end, we propose in this paper a new indicator for measuring navigation performance in VE, based on the ISO/IEC 15939 standard. This indicator enables the effective integration of heterogeneous results while retaining their raw values, and it provides a new method offering a comprehensive graphical visualization of the data for interpreting the results. An experimental study has shown the feasibility of this indicator and its contribution compared with statistical results.
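To illustrate the layering that ISO/IEC 15939 prescribes (base measures feeding derived measures, which an analysis model turns into an indicator), the sketch below shows one way such a navigation-performance indicator could be assembled. It is a minimal illustration only, not the indicator defined in this paper: the measure names (completion time, path efficiency, collision count) and the aggregation are hypothetical placeholders.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical base measures recorded for one navigation trial in the VE.
# The paper's actual base measures may differ; these only illustrate the
# ISO/IEC 15939 chain: base measures -> derived measures -> indicator.
@dataclass
class Trial:
    completion_time_s: float   # time to reach the navigation target
    path_length_m: float       # distance actually travelled
    optimal_length_m: float    # shortest possible path to the target
    collisions: int            # collisions with the environment

def path_efficiency(trial: Trial) -> float:
    """Derived measure: optimal over travelled path length, in (0, 1]."""
    return trial.optimal_length_m / trial.path_length_m

def navigation_indicator(trials: list[Trial]) -> dict[str, float]:
    """Analysis model: aggregate each derived measure separately, keeping
    raw values side by side instead of collapsing them into a single
    weighted score, so heterogeneous results stay interpretable."""
    return {
        "mean_completion_time_s": mean(t.completion_time_s for t in trials),
        "mean_path_efficiency": mean(path_efficiency(t) for t in trials),
        "total_collisions": float(sum(t.collisions for t in trials)),
    }

if __name__ == "__main__":
    trials = [Trial(42.0, 130.0, 100.0, 2), Trial(35.5, 110.0, 100.0, 0)]
    print(navigation_indicator(trials))
```

Each aggregated value could then be plotted on its own axis (for example, on a radar chart) to give the kind of comprehensive graphical visualization the indicator is meant to support.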

Keywords

Virtual Environment (VE) · Measure · Performance · Navigation · Evaluation

Acknowledgments

This research work was supported by the region of Bourgogne-Franche-Comté.

References

  1. Louka, M.N.: Augmented and virtual reality research in Halden 1998–2008. In: Skjerve, A., Bye, A. (eds.) Simulator-Based Human Factors Studies Across 25 Years. Springer, London (2010)
  2. Plouzeau, J., Paillot, D., Chardonnet, J.R., Merienne, F.: Effect of proprioceptive vibrations on simulator sickness during navigation task in virtual environment. In: International Conference on Artificial Reality and Telexistence / Eurographics Symposium on Virtual Environments, Japan, pp. 10–28 (2015)
  3. LaViola, J.J.: A discussion of cybersickness in virtual environments. SIGCHI Bull. 32, 47–56 (2000)
  4. Walker, B.N., Lindsay, J.: Navigation performance in a virtual environment with bonephones. In: Proceedings of ICAD 2005 – Eleventh Meeting of the International Conference on Auditory Display, Limerick, Ireland, pp. 260–263 (2005)
  5. Terziman, L., Marchal, M., Emily, M., Multon, F.: Shake-your-head: revisiting walking-in-place for desktop virtual reality. In: Proceedings of ACM VRST 2010, Hong Kong, pp. 27–34. ACM (2010)
  6. Bliss, J.P., Tidwell, P.D., Guest, M.A.: The effectiveness of virtual reality for administering spatial navigation training to firefighters. Presence 6(1), 73–86 (1997)
  7. Kopper, R., Ni, T., Bowman, D., Pinho, M.S.: Design and evaluation of navigation techniques for multiscale virtual environments. In: Proceedings of the IEEE Virtual Reality Conference (VR 2006), pp. 175–182. IEEE Computer Society, Washington, DC (2006)
  8. Ardouin, J., Lécuyer, A., Marchal, M., Marchand, E.: Navigating in virtual environments with 360° omnidirectional rendering. In: IEEE Symposium on 3D User Interfaces (3DUI), Orlando, FL, USA, pp. 95–98 (2013)
  9. Zielasko, D., Horn, S., Freitag, S., Weyers, B., Kuhlen, T.W.: Evaluation of hands-free HMD-based navigation techniques for immersive data analysis. In: 2016 IEEE Symposium on 3D User Interfaces (3DUI), Greenville, South Carolina, USA, pp. 113–119 (2016)
  10. Plouzeau, J., Erfanian, A., Chiu, C., Merienne, F., Hu, Y.: Navigation in virtual environments: design and comparison of two anklet vibration patterns for guidance. In: 2016 IEEE Symposium on 3D User Interfaces (3DUI), Greenville, South Carolina, USA, pp. 263–264 (2016)
  11. Suetendael, N.V., Elwell, D.: Software Quality Metrics. Federal Aviation Administration Technical Center, Atlantic City International Airport, New Jersey (1991)
  12. McGarry, J., Card, D., Jones, C., Layman, B., Clark, E., Dean, J., Hall, F.: Practical Software Measurement: Objective Information for Decision Makers. Addison-Wesley, Boston (2002)
  13. ISO/IEC 15939: Systems and Software Engineering – Measurement Process (ISO/IEC 15939:2007(E)). International Organization for Standardization, Geneva, Switzerland (2007)
  14. ISO/IEC 14598-1: Information Technology – Software Product Evaluation – Part 1: General Overview. International Organization for Standardization, Geneva, Switzerland (1999)
  15. ISO/IEC 9126-1: Software Engineering – Product Quality – Part 1: Quality Model. International Organization for Standardization, Geneva, Switzerland (2001)
  16. ISO/IEC 25022: Systems and Software Engineering – Systems and Software Quality Requirements and Evaluation (SQuaRE) – Measurement of Quality in Use. International Organization for Standardization, Geneva, Switzerland (2012)
  17. ISO/IEC 25010: Systems and Software Engineering – Systems and Software Quality Requirements and Evaluation (SQuaRE) – System and Software Quality Models. International Organization for Standardization, Geneva, Switzerland (2011)
  18. Abran, A., Al-Qutaish, R., Cuadrado-Gallego, J.: Analysis of the ISO 9126 on software product quality evaluation from the metrology and ISO 15939 perspectives. WSEAS Trans. Comput. 5(11), 2778–2786 (2006)
  19. ISO/IEC 25000: Systems and Software Engineering – Systems and Software Quality Requirements and Evaluation (SQuaRE) – Guide to SQuaRE. International Organization for Standardization, Geneva, Switzerland (2014)
  20. ISO/IEC 25021: Systems and Software Engineering – Systems and Software Quality Requirements and Evaluation (SQuaRE) – Quality Measure Elements. International Organization for Standardization, Geneva, Switzerland (2012)
  21. Antolić, Ž.: An example of using key performance indicators for software development process efficiency evaluation. R&D Center Ericsson Nikola Tesla (2008)
  22. Feyh, M., Petersen, K.: Lean software development measures and indicators – a systematic mapping study. In: Fitzgerald, B., Conboy, K., Power, K., Valerdi, R., Morgan, L., Stol, K.-J. (eds.) LESS 2013. LNBIP, vol. 167, pp. 32–47. Springer, Heidelberg (2013). doi:10.1007/978-3-642-44930-7_3
  23. Monteiro, L., Oliveira, K.: Defining a catalog of indicators to support process performance analysis. J. Softw. Maint. Evol.: Res. Pract. 23(6), 395–422 (2011)
  24. Moraga, M.A., Calero, C., Bertoa, M.F.: Improving interpretation of component-based systems quality through visualization techniques. IET Softw. 4(1), 79–90 (2010)
  25. Staron, M., Meding, W., Hansson, J., Höglund, C., Niesel, K., Bergmann, V.: Dashboards for continuous monitoring of quality for software product under development. In: Mistrik, I., Bahsoon, R., Eeles, P., Roshandel, R., Stal, M. (eds.) Relating System Quality and Software Architecture, pp. 209–229. Morgan Kaufmann, Waltham (2014)
  26. Assila, A., Oliveira, K., Ezzedine, H.: Integration of subjective and objective usability evaluation based on ISO/IEC 15939: a case study for traffic supervision systems. Int. J. Hum.-Comput. Interact. 32(12), 931–955 (2016)
  27. Assila, A., Oliveira, K., Ezzedine, H.: An environment for integrating subjective and objective usability findings based on measures. In: IEEE 10th International Conference on Research Challenges in Information Science, Grenoble, France, pp. 645–656. IEEE (2016)
  28. ISO 9241-11: Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs) – Part 11: Guidance on Usability. International Organization for Standardization, Geneva, Switzerland (1998)
  29. Van Baren, J., IJsselsteijn, W.: Measuring presence: a guide to current measurement approaches. Deliverable 5 of the OmniPres project (2004)
  30. Thubaasini, P., Rusnida, R., Rohani, S.M.: Efficient comparison between Windows and Linux platforms applicable in a virtual architectural walkthrough application. In: Sobh, T., Elleithy, K. (eds.) Innovations in Computing Sciences and Software Engineering, pp. 337–342. Springer, Dordrecht (2009)
  31. Mengoni, M., Germani, M.: Virtual reality systems and CE: how to evaluate the benefits. In: Ghodous, P., et al. (eds.) Leading the Web in Concurrent Engineering, pp. 853–862. IOS Press, Amsterdam (2006)
  32. Witmer, B.G., Singer, M.J.: Measuring presence in virtual environments: a presence questionnaire. Presence 7(3), 225–240 (1998)
  33. Gerhard, M., Moore, D.J., Hobbs, D.J.: Continuous presence in collaborative virtual environments: towards a hybrid avatar-agent model for user representation. In: Antonio, A., Aylett, R., Ballin, D. (eds.) IVA 2001. LNCS, vol. 2190, pp. 137–155. Springer, Heidelberg (2001). doi:10.1007/3-540-44812-8_12
  34. Staron, M., Meding, W., Nilsson, C.: Framework for developing measurement systems and its industrial evaluation. Inf. Softw. Technol. 51(4), 721–737 (2009)
  35. Wohlin, C., Runeson, P., Höst, M., Ohlsson, M.C., Regnell, B., Wesslén, A.: Experimentation in Software Engineering. Springer, Heidelberg (2012)

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Ahlem Assila (1)
  • Jeremy Plouzeau (1)
  • Frédéric Merienne (1)
  • Aida Erfanian (2)
  • Yaoping Hu (2)

  1. LE2I, Arts et Métiers, CNRS, University of Bourgogne Franche-Comté, Chalon-sur-Saône, France
  2. Department of Electrical and Computer Engineering, Schulich School of Engineering, University of Calgary, Calgary, Canada
