
A Critical Analysis of Human-Subject Experiments in Virtual Reality and 3D User Interfaces

  • Carlos Andujar
  • Pere Brunet
Chapter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8844)

Abstract

This paper examines the major peculiarities and difficulties encountered when trying to validate research results in fields such as virtual reality (VR) and 3D user interfaces (3DUI). We review the steps of the empirical method and discuss a number of challenges in conducting human-subject experiments. These challenges include the number of independent variables that must be controlled to obtain useful findings, the within-subjects versus between-subjects dilemma, hard-to-collect data, experimenter effects, ethical issues, and the lack of background in the community for proper statistical analysis and interpretation of results. We show that experiments involving human subjects hinder the adoption of traditional experimental principles (comparison, repeatability, reproducibility, justification, and explanation), and we propose some ideas to improve the reliability of findings in the VR and 3DUI disciplines.
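To make the within-subjects versus between-subjects dilemma mentioned above concrete, the sketch below (not taken from the chapter; the technique names and timing data are hypothetical) illustrates how the design choice alone changes the appropriate significance test for a simple two-technique comparison of task completion times.

```python
# Minimal sketch, assuming made-up completion times (seconds) for two
# hypothetical 3D selection techniques, A and B. Not from the chapter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical completion times for 12 trials per technique.
times_a = rng.normal(loc=6.0, scale=1.0, size=12)
times_b = rng.normal(loc=5.4, scale=1.0, size=12)

# Within-subjects design: the same 12 participants use both techniques,
# so the samples are paired and a paired t-test is appropriate.
t_within, p_within = stats.ttest_rel(times_a, times_b)

# Between-subjects design: two independent groups of 12 participants,
# so an independent-samples t-test is used instead.
t_between, p_between = stats.ttest_ind(times_a, times_b)

print(f"within-subjects : t = {t_within:.2f}, p = {p_within:.3f}")
print(f"between-subjects: t = {t_between:.2f}, p = {p_between:.3f}")
```

The dilemma arises because the within-subjects analysis removes individual differences between participants (often yielding higher statistical power with fewer participants) but exposes the experiment to order, learning, and fatigue effects that the between-subjects design avoids.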

Keywords

Experimentation · Virtual reality · 3D user interfaces

Notes

Acknowledgments

This work was greatly influenced by the contributions to the Workshop on the Role and Relevance of Experimentation in Informatics, held prior to the 8th European Computer Science Summit (ECSS 2012) of Informatics Europe, November 19, 2012, Barcelona, and by the discussions of the Dagstuhl Seminar 13241 "Virtual Realities", June 9–14, 2013, Dagstuhl. This work has been partially funded by the Spanish Ministry of Science and Innovation under Grant TIN2010-20590-C02-01.

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. ViRVIG-MOVING Research Group, Universitat Politècnica de Catalunya, Barcelona, Spain