User Experience Evaluation in Virtual Reality for Autism: A Systematic Literature Review

  • Conference paper
Universal Access in Human-Computer Interaction (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14020)

Abstract

Virtual reality (VR) for people with autism spectrum disorder (ASD) has been gaining attention among researchers. The rich interactivity provided by VR makes it a powerful tool for various purposes, one of which is supporting interventions for people with ASD. Since people with ASD have a different sensory experience, the user experience (UX) of VR applications for ASD becomes a crucial aspect. There have been past literature reviews of VR applications for ASD; however, to the best of our knowledge, hardly any focus on the methods used to evaluate users' experience of the interaction techniques or interface elements incorporated in these applications. In this review, 24 studies met predetermined PICo (Population, Interest, Context) criteria in a search of three relevant databases, namely PubMed NCBI, Dimensions.ai, and IEEE Xplore. Direct examination by observers appears to be the most popular method for assessing the experiences of people with ASD in using VR, followed by self-report questionnaires. In addition, although all of the included studies mentioned their UX evaluation method, only a few evaluations were specifically directed at their interaction techniques or user interface elements. This review is expected to provide insights and considerations for designing UX evaluation methods for VR applications for people with ASD in the future.
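As a purely illustrative sketch of the screening step described above (merging search hits from PubMed NCBI, Dimensions.ai, and IEEE Xplore, de-duplicating them, and shortlisting records against PICo-style keywords), the Python snippet below shows one way such a pre-screen could be automated. This is not the authors' pipeline; the file names, column names, and keyword lists are hypothetical, and final inclusion in a systematic review still rests on manual title, abstract, and full-text assessment.

```python
# Illustrative sketch only: merge and screen database exports against
# PICo-style inclusion criteria (Population: ASD; Interest: UX evaluation;
# Context: VR). File names, columns, and keyword lists are hypothetical.
import csv

PICO_KEYWORDS = {
    "population": ["autism", "asd", "autistic"],
    "interest": ["user experience", "ux", "usability", "evaluation"],
    "context": ["virtual reality", "vr", "head-mounted display"],
}

def load_records(paths):
    """Read CSV exports (title, abstract, doi columns assumed) from each database."""
    records = []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            records.extend(csv.DictReader(f))
    return records

def deduplicate(records):
    """Drop duplicates across databases, matching on DOI if present, else title."""
    seen, unique = set(), []
    for rec in records:
        key = (rec.get("doi") or rec.get("title", "")).strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

def meets_pico(rec):
    """Shortlist a record only if every PICo facet appears in its title or abstract."""
    text = f"{rec.get('title', '')} {rec.get('abstract', '')}".lower()
    return all(any(kw in text for kw in kws) for kws in PICO_KEYWORDS.values())

if __name__ == "__main__":
    exports = ["pubmed.csv", "dimensions.csv", "ieee_xplore.csv"]  # hypothetical exports
    candidates = deduplicate(load_records(exports))
    included = [r for r in candidates if meets_pico(r)]
    print(f"{len(included)} of {len(candidates)} records pass the PICo keyword screen")
```

A keyword screen of this kind only shortlists candidates; the PRISMA-style eligibility decision is made by the reviewers when reading the shortlisted records.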

References

  1. Kail, R.V.: Children and Their Development, 6th edn. Pearson, Upper Saddle River (2011)

  2. Kapp, K.M.: The Gamification of Learning and Instruction. John Wiley & Sons, Nashville (2012)

  3. Leekam, S.R., Nieto, C., Libby, S.J., Wing, L., Gould, J.: Describing the sensory abnormalities of children and adults with autism. J. Autism Dev. Disord. 37(5), 894–910 (2007). https://doi.org/10.1007/s10803-006-0218-7

  4. Bozgeyikli, L., Raij, A., Katkoori, S., Alqasemi, R.: A survey on virtual reality for individuals with autism spectrum disorder: design considerations. IEEE Trans. Learn. Technol. 11(2), 133–151 (2018)

  5. Valencia, K., Rusu, C., Quiñones, D., Jamet, E.: The impact of technology on people with autism spectrum disorder: a systematic literature review. Sensors (Basel) 19(20), 4485 (2019)

  6. Mesa-Gresa, P., Gil-Gómez, H., Lozano-Quilis, J.-A., Gil-Gómez, J.-A.: Effectiveness of virtual reality for children and adolescents with autism spectrum disorder: an evidence-based systematic review. Sensors (Basel) 18(8), 2486 (2018)

  7. Glaser, N., Schmidt, M.: Systematic literature review of virtual reality intervention design patterns for individuals with autism spectrum disorders. Int. J. Hum. Comput. Interact. 38, 1–36 (2021)

  8. Yuan, Y., Hunt, R.H.: Systematic reviews: the good, the bad and the ugly. Am. J. Gastroenterol. 104(5), 1086–1092 (2009)

  9. Merriam-Webster: Virtual reality definition (2022). www.merriam-webster.com/dictionary/virtual%20reality. Accessed 13 Mar 2022

  10. Steuer, J.: Defining virtual reality: dimensions determining telepresence. J. Commun. 42(4), 73–93 (1992)

  11. Brooks, F.P.: What’s real about virtual reality? IEEE Comput. Graph. Appl. 19(6), 16–27 (1999)

  12. Berg, L.P., Vance, J.M.: Industry use of virtual reality in product design and manufacturing: a survey. Virtual Reality 21(1), 1–17 (2016). https://doi.org/10.1007/s10055-016-0293-9

  13. Branda, E.: Review: oculus rift. J. Soc. Archit. Hist. 74(4), 526–528 (2015)

  14. Milgram, P., Kishino, F.: A taxonomy of mixed reality visual displays. In: IEICE Transactions on Information Systems, vol. E77-D, no.12, pp. 1321–1329 (1994)

  15. Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R.V., Hart, J.C.: The CAVE: audio visual experience automatic virtual environment. Commun. ACM 35(6), 64–72 (1992)

  16. Vargas Gonzalez, A.N., Kapalo, K., Koh, S., LaViola, J.: Exploring the virtuality continuum for complex rule-set education in the context of soccer rule comprehension. Multimodal Technol. Interact. 1(4), 30 (2017)

  17. Diaz, D., Boj, C., Portalés, C.: HybridPLAY: a new technology to foster outdoors physical activity, verbal communication and teamwork. Sensors (Basel) 16(4), 586 (2016)

  18. Oyelere, S.S., Bouali, N., Kaliisa, R., Obaido, G., Yunusa, A.A., Jimoh, E.R.: Exploring the trends of educational virtual reality games: a systematic review of empirical studies. Smart Learn. Environ. 7(1), 1–22 (2020). https://doi.org/10.1186/s40561-020-00142-7

  19. Chan, J.C.P., Leung, H., Tang, J.K.T., Komura, T.: A virtual reality dance training system using motion capture technology. IEEE Trans. Learn. Technol. 4(2), 187–195 (2011)

  20. Lee, S.: A showcase of medical, therapeutic and pastime uses of virtual Reality (VR) and how VR is impacting the dementia sector. Adv. Exp. Med. Biol. 1156, 135–141 (2019)

  21. Leeb, R., Pérez-Marcos, D.: Brain-computer interfaces and virtual reality for neurorehabilitation. Handb. Clin. Neurol. 168, 183–197 (2020)

  22. van Bennekom, M.J., de Koning, P.P., Gevonden, M.J., Kasanmoentalib, M.S., Denys, D.: A virtual reality game to assess OCD symptoms. Front. Psychiatry 11, 550165 (2020)

  23. ISO: ISO 9241-11:2018: Ergonomics of human-system interaction—Part 11: Usability: Definitions and concepts. International Organization for Standardization, Geneva (2018)

  24. Norman, D., Nielsen, J.: The definition of user experience (UX) (2022). https://www.nngroup.com/articles/definition-user-experience/. Accessed 13 Mar 2022

  25. Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., PRISMA Group: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 6(7), e1000097 (2009)

  26. Dimensions: Dimensions (2022). https://www.dimensions.ai/. Accessed 28 Mar 2022

  27. IEEE Xplore: About IEEE Xplore (2022). https://ieeexplore.ieee.org/Xplorehelp/overview-of-ieee-xplore/about-ieee-xplore. Accessed 28 Mar 2022

  28. PubMed NCBI: PubMed.gov (2022). https://pubmed.ncbi.nlm.nih.gov/. Accessed 28 Mar 2022

  29. Stern, C., Jordan, Z., McArthur, A.: Developing the review question and inclusion criteria. Am. J. Nurs. 114(4), 53–56 (2014)

  30. Ravindran, V., Osgood, M., Sazawal, V., Solorzano, R., Turnacioglu, S.: Virtual reality support for joint attention using the floreo Joint attention module: usability and feasibility pilot study. JMIR Pediatr. Parent. 2(2), e14429 (2019)

  31. Feng, S., et al.: The uncanny valley effect in typically developing children and its absence in children with autism spectrum disorders. PLoS ONE 13(11), e0206343 (2018)

  32. Mei, C., Zahed, B.T., Mason, L., Quarles, J.: Towards joint attention training for children with ASD - a VR game approach and eye gaze exploration. In: 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Reutlingen, March 2018

  33. Lahiri, U., Trewyn, A., Warren, Z., Sarkar, N.: Dynamic eye gaze and its potential in Virtual Reality based applications for children with autism spectrum disorders. Autism Open Access 1(1) (2011)

  34. Lahiri, U., Welch, K.C., Warren, Z., Sarkar, N.: Understanding psychophysiological response to a Virtual Reality-based social communication system for children with ASD. In: 2011 International Conference on Virtual Rehabilitation, Zurich, Switzerland, June 2011

  35. Nuguri, S.S., et al.: vSocial: a cloud-based system for social virtual reality learning environment applications in special education. Multimed. Tools Appl. 80(11), 16827–16856 (2020). https://doi.org/10.1007/s11042-020-09051-w

  36. Kuriakose, S., Sarkar, N., Lahiri, U.: A step towards an intelligent human computer interaction: physiology-based affect-recognizer. In: 2012 4th International Conference on Intelligent Human Computer Interaction (IHCI), Kharagpur, India, December 2012

  37. Di Mascio, T., Tarantino, L., De Gasperis, G., Pino, C.: Immersive virtual environments: a comparison of mixed reality and virtual reality headsets for ASD treatment. In: Gennari, R., et al. (eds.) Methodologies and Intelligent Systems for Technology Enhanced Learning, 9th International Conference, pp. 153–163. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-23990-9_19

  38. Bernardes, M., Barros, F., Simoes, M., Castelo-Branco, M.: A serious game with virtual reality for travel training with Autism Spectrum Disorder. In: 2015 International Conference on Virtual Rehabilitation (ICVR), Valencia, Spain, June 2015

  39. Bozgeyikli, E., Bozgeyikli, L., Raij, A., Katkoori, S., Alqasemi, R., Dubey, R.: Virtual reality interaction techniques for individuals with autism spectrum disorder: design considerations and preliminary results. In: Kurosu, M. (ed.) HCI 2016. LNCS, vol. 9732, pp. 127–137. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-39516-6_12

  40. Bozgeyikli, E., Bozgeyikli, L.L., Alqasemi, R., Raij, A., Katkoori, S., Dubey, R.: Virtual reality interaction techniques for individuals with autism spectrum disorder. In: Antona, M., Stephanidis, C. (eds.) UAHCI 2018. LNCS, vol. 10908, pp. 58–77. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-92052-8_6

  41. Bozgeyikli, E., Raij, A., Katkoori, S., Dubey, R.: Locomotion in virtual reality for individuals with autism spectrum disorder. In: Proceedings of the 2016 Symposium on Spatial User Interaction, New York, NY, USA, October 2016

  42. Mei, C., Mason, L., Quarles, J.: “I Built It!”—Exploring the effects of customizable virtual humans on adolescents with ASD. In: 2015 IEEE Virtual Reality (VR), Arles, Camargue, Provence, France, March 2015

  43. Finkelstein, S., Barnes, T., Wartell, Z., Suma, E.A.: Evaluation of the exertion and motivation factors of a virtual reality exercise game for children with autism. In: 2013 1st Workshop on Virtual and Augmented Assistive Technology (VAAT), Lake Buena Vista, FL, USA, March 2013

  44. Halabi, O., Abou El-Seoud, S., Alja’am, J., Alpona, H., Al-Hemadi, M., Al-Hassan, D.: Design of immersive virtual reality system to improve communication skills in individuals with autism. Int. J. Emerg. Technol. Learn. 12(05), 50 (2017)

  45. Bozgeyikli, L.L., Bozgeyikli, E., Katkoori, S., Raij, A., Alqasemi, R.: Effects of virtual reality properties on user experience of individuals with autism. ACM Trans. Access. Comput. 11(4), 1–27 (2018)

  46. Pearl, A.M., Edwards, E.M., Murray, M.J.: Comparison of self-and other-report of symptoms of autism and comorbid psychopathology in adults with autism spectrum disorder. Contemp. Behav. Health Care 2(1), 1–8 (2017)

  47. Newbutt, N., Sung, C., Kuo, H.-J., Leahy, M.J., Lin, C.-C., Tong, B.: Brief report: a pilot study of the use of a virtual reality headset in autism populations. J. Autism Dev. Disord. 46(9), 3166–3176 (2016). https://doi.org/10.1007/s10803-016-2830-5

  48. Li, C., Ip, H.H.S., Ma, P.K.: A design framework of virtual reality enabled experiential learning for children with autism spectrum disorder. In: Cheung, S.K.S., Lee, L.-K., Simonova, I., Kozel, T., Kwok, L.-F. (eds.) Blended Learning: Educational Innovation for Personalized Learning: 12th International Conference, ICBL 2019, Hradec Kralove, Czech Republic, July 2–4, 2019, Proceedings, pp. 93–102. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-21562-0_8

  49. Junaidi, A.R., Alamsyah, Y., Hidayah, O., Mulyawati, N.W.: Development of virtual reality content to improve social skills in children with low function autism. In: 2020 6th International Conference on Education and Technology (ICET), Malang, Indonesia, October 2020

  50. Malihi, M., Nguyen, J., Cardy, R.E., Eldon, S., Petta, C., Kushki, A.: Data-driven discovery of predictors of virtual reality safety and sense of presence for children with autism spectrum disorder: a pilot study. Front. Psychiatry 11, 669 (2020)

  51. Newbutt, N., Bradley, R., Conley, I.: Using virtual reality head-mounted displays in schools with autistic children: views, experiences, and future directions. Cyberpsychol. Behav. Soc. Netw. 23(1), 23–33 (2020)

  52. De Luca, R., et al.: Innovative use of virtual reality in autism spectrum disorder: a case-study. Appl. Neuropsychol. Child 10(1), 90–100 (2021)

  53. Schmidt, M., Schmidt, C., Glaser, N., Beck, D., Lim, M., Palmer, H.: Evaluation of a spherical video-based virtual reality intervention designed to teach adaptive skills for adults with autism: a preliminary report. Interact. Learn. Environ. 29(3), 345–364 (2021)

  54. Zhao, H., Swanson, A., Weitlauf, A., Warren, Z., Sarkar, N.: A novel collaborative virtual reality game for children with ASD to foster social interaction. In: Antona, M., Stephanidis, C. (eds.) Universal Access in Human-Computer Interaction. Users and Context Diversity: 10th International Conference, UAHCI 2016, Held as Part of HCI International 2016, Toronto, ON, Canada, July 17-22, 2016, Proceedings, Part III, pp. 276–288. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-40238-3_27

  55. Schmidt, M., Beck, D., Glaser, N., Schmidt, C., Abdeen, F.: Formative design and evaluation of an immersive learning intervention for adults with autism: design and research implications. In: Beck, D., et al. (eds.) Immersive Learning Research Network: 5th International Conference, iLRN 2019, London, UK, June 23–27, 2019, Proceedings, pp. 71–85. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-23089-0_6

Acknowledgments

Aulia would like to sincerely thank the Indonesian Endowment Fund for Education (Lembaga Pengelola Dana Pendidikan, LPDP) for its financial support during the work on this article.

Author information

Corresponding author

Correspondence to Aulia Hening Darmasti.

Appendix

Each entry below summarizes one reviewed study by: Author, Year; Intervention/Therapy/Treatment; User interface/Interaction technique; UX Evaluation Method; Output/Variable Measured; Interesting Insights.

Li et al. (2019) [48]
- Intervention/Therapy/Treatment: Social stories using Kolb's experiential learning model
- User interface/Interaction technique: Interface components: visual hints, task list. Interaction: verbal response. 3D interaction: selecting objects (tapping the in-VR rating button)
- UX evaluation method: Observation by observers (onsite and video); direct interview with participants
- Output/Variable measured: Case report per individual (qualitative)
- Interesting insights: In-VR task lists, hints, and real-time feedback are effective for providing in-VR facilitation, which is needed when delivering treatment to ASD users

Junaidi et al. (2020) [49]
- Intervention/Therapy/Treatment: Social scenario: selecting a food menu, seating, and putting away the dirty dishes
- User interface/Interaction technique: Interface components: visual hints, task list. Interaction: verbal response
- UX evaluation method: Observation by observers; observer-completed questionnaire
- Output/Variable measured: Borg and Gall model: convenience of using the HMD, object readability in VR, understanding the instructions in VR. Output: statistics of user experience (quantitative)
- Interesting insights: The VR content used has not been able to address the problems of users with low-functioning autism (LFA); LFA users want to use VR devices, but with some limitations

Di Mascio et al. (2020) [37]
- Intervention/Therapy/Treatment: Scenes (space, forest, historic site) and games (virtual blocks)
- User interface/Interaction technique: Interface components: scenes. Interaction: remote control, gamepad, virtual hand (Leap Motion). 3D interaction: selecting and manipulating an object
- UX evaluation method: Concurrent Think Aloud (CTA); observation by observers; direct interview with participants; self-report questionnaire/survey (but not used)
- Output/Variable measured: Acceptability and usability data (quantitative) and engagement (qualitative)
- Interesting insights: The evaluation framework could form the foundation of an innovative immersive virtual environment (IVE) evaluation framework

Malihi et al. (2020) [50]
- Intervention/Therapy/Treatment: Social scenario (on a school bus) and scenes (Blue Planet)
- User interface/Interaction technique: Interaction: exploring the VR scene by moving the head (VR) or by mouse control
- UX evaluation method: Self-report questionnaire/survey; elastic nets, random forests, AdaBoost, and neural networks used to predict the user-reported ratings of sense of presence
- Output/Variable measured: Demographics; sense of presence (spatial presence, naturalness, engagement) and safety (cybersickness, anxiety). Output: correlation of IQ and anxiety with sense of presence and safety
- Interesting insights: The most accurately predicted target was spatial presence, followed by engagement; IQ and anxiety traits were identified as critical predictors of spatial presence and engagement

Ravindran et al. (2019) [30]
- Intervention/Therapy/Treatment: Social scenario consisting of reciprocal interaction
- User interface/Interaction technique: Interface components: scenes, avatar. Interaction: verbal response
- UX evaluation method: Observation by observers (onsite and video); observer-completed questionnaire and direct interview with participants; self-report questionnaire/survey
- Output/Variable measured: Joint attention; participant's mood; self-reported condition (alertness, eye discomfort, clarity of vision, headache, stomachache, balance, enjoyment); observation results (participant's tolerance of the HMD, enjoyment, adverse side effects, value from Floreo)
- Interesting insights: Observation by the observer was the primary evaluation method; participants' questionnaire responses were not consistent with the observers' notes due to limited communication skills

Feng et al. (2018) [31]
- Intervention/Therapy/Treatment: Showing virtual faces on a screen
- User interface/Interaction technique: Interface components: avatars with varying degrees of realism and pupil size. Interaction: mouse control
- UX evaluation method: Tracking participants' eye movements (regions of interest) combined with each user's preference between two faces differing in realism and eye size
- Output/Variable measured: Region of interest (ROI) from the participant's eye gaze; participant's preference
- Interesting insights: Children with ASD were rather indifferent to the manipulation of facial features

Newbutt et al. (2020) [51]
- Intervention/Therapy/Treatment: Scenes (Moon, historical place) and a game (throwing a ball)
- User interface/Interaction technique: Interface components: scenes. Interaction: virtual hand (controller). 3D interaction: selecting objects
- UX evaluation method: Direct interview with participants conducted by the observer; self-report questionnaire/survey
- Output/Variable measured: Enjoyment/usefulness, physical experience, and preference
- Interesting insights: The most preferred device was the high-end HMD

De Luca et al. (2021) [52]
- Intervention/Therapy/Treatment: Scenario games with cognitive behavioral therapy (CBT)
- User interface/Interaction technique: Interface components: scene. 3D interaction: full-body movement
- UX evaluation method: Observation by observers using GARS; psychometric assessment; parental distress (self-report questionnaire)
- Output/Variable measured: Psychometric measures (nonverbal fluid intelligence, attention processes, visual-spatial functions)
- Interesting insights: Combined rehabilitation using CBT with VR may be promising for improving cognition

Schmidt et al. (2021) [53]
- Intervention/Therapy/Treatment: Training public transportation skills
- User interface/Interaction technique: Interface components: scenes. Interaction: selecting objects
- UX evaluation method: Expert review; observation by observers; observer-completed questionnaire; self-report questionnaire/survey
- Output/Variable measured: Usability question set using the SUS and Bangor's adjective scale
- Interesting insights: The spherical video-based virtual reality (SVVR) intervention is easy to use and received generally positive user-experience feedback

Bozgeyikli et al. (2019) [39]
- Intervention/Therapy/Treatment: Vocational tasks: cleaning, shelving, environmental awareness, loading, money management, and social skills
- User interface/Interaction technique: Interface components: scenes. 3D interaction: object selection and manipulation (tangible object manipulation, haptic device, touch and snap, touchscreen); locomotion (real walking, walking in place)
- UX evaluation method: Observation by observers; self-report questionnaire/survey; participants' questionnaire responses processed with ANOVA
- Output/Variable measured: User experience (difficulty in understanding, difficulty in operating, being in control, enjoyment, effort, tiredness, being overwhelmed, frustration) and task completion
- Interesting insights: Participants preferred the touchscreen and tangible interaction techniques and real walking; participants had more difficulty with gesture-based and more abstract interaction techniques

Bozgeyikli et al. (2018) [40]
- Intervention/Therapy/Treatment: Vocational tasks: cleaning, shelving, environmental awareness, loading, money management, and social skills
- User interface/Interaction technique: Display methods: HMD, curtain screen. 3D interaction: object selection and manipulation (tangible object manipulation, haptic device, touch and snap, touchscreen); locomotion (real walking, walking in place)
- UX evaluation method: Observation by observers; self-report questionnaire/survey; Likert-scale questionnaire processed with one-way ANOVA
- Output/Variable measured: Preference, cybersickness, user experience (ease of interaction, enjoyment, frustration, tiredness, immersion)
- Interesting insights: Same points as the preliminary results; the curtain display was preferred over the HMD; participants preferred more realistic interaction techniques that can be linked to real life

Bozgeyikli et al. (2018) [45]
- Intervention/Therapy/Treatment: Scene in a virtual warehouse with a realistic appearance; the user is asked to move around without colliding with obstacles
- User interface/Interaction technique: Interface components: avatar. Interface attributes: instruction methods, visual fidelity, view zoom, clutter, motion
- UX evaluation method: Observation by observers; self-report questionnaire/survey; questionnaire processed with ANOVA and Mauchly's sphericity test
- Output/Variable measured: User experience, presence, motion sickness, user comments
- Interesting insights: Use animated rather than verbal instructions, low visual fidelity, standard view zoom, no clutter, and no motion in VR training applications for high-functioning (HF) ASD users

Bozgeyikli et al. (2016) [41]
- Intervention/Therapy/Treatment: Scene in a virtual warehouse with a realistic appearance; the user is asked to move around without colliding with obstacles
- User interface/Interaction technique: Interface components: avatar. 3D interaction: three commonly used locomotion techniques (redirected walking, walk-in-place, joystick), two unexplored techniques (stepper machine, point & teleport), and three techniques selected for ASD (flying, flapping, trackball)
- UX evaluation method: Observation by observers; self-report questionnaire/survey (Likert scale); questionnaire processed with ANOVA and Mauchly's sphericity test
- Output/Variable measured: Ease of understanding, ease of operating, required effort, tiredness, being in control, enjoyment, being overwhelmed and frustrated, motion sickness, and presence
- Interesting insights: Joystick, point & teleport, redirected walking, and walk-in-place are suitable VR locomotion techniques; hand-gesture-based and automatic-movement locomotion are not convenient for HF ASD users

Mei et al. (2015) [42]
- Intervention/Therapy/Treatment: Game: imagination soccer (hand-eye coordination)
- User interface/Interaction technique: Interface components: customizable virtual human (CVH) and non-CVH
- UX evaluation method: Self-report questionnaire/survey; question set using PIFF2 and a Likert scale
- Output/Variable measured: Task success rate, user experience (presence, involvement, and flow)
- Interesting insights: The CVH can increase hand-eye coordination performance; the CVH improved the UX (presence, involvement, and flow) of users with ASD

Nuguri et al. (2021) [35]
- Intervention/Therapy/Treatment: Social scenario on an orientation day: recognizing facial expressions, sharing ideas, turn-taking in conversations
- User interface/Interaction technique: Interface components: scenes, bubbles, pop-up boxes, status bars. 3D interaction: locomotion (teleportation in VR)
- UX evaluation method: Self-report questionnaire/survey (using the SUS); observation by observers of the application; observation by observers of the user's real-time brain signals
- Output/Variable measured: Immersiveness, frustration, engagement, task completion
- Interesting insights: Teleportation in VR is preferred over moving with mouse and keyboard; VR is preferred over desktop; network speed affects UX satisfaction

Mei et al. (2018) [32]
- Intervention/Therapy/Treatment: Game: imagination soccer (hand-eye coordination)
- User interface/Interaction technique: Interface components: scenes, avatar (CVH)
- UX evaluation method: Eye-gaze logging; paired t-tests (CVH vs. non-CVH comparison) with Bonferroni correction
- Output/Variable measured: Total acquired joint attention; time spent gazing at regions of interest (ROI)
- Interesting insights: Including the CVH resulted in slower reactions to joint-attention requests; the CVH helps users with ASD gaze less at unimportant areas

Zhao et al. (2016) [54]
- Intervention/Therapy/Treatment: Puzzle game, collection game, and delivery game
- User interface/Interaction technique: 3D interaction: object manipulation (hold, move, and drop)
- UX evaluation method: Participants fill in a questionnaire/survey
- Output/Variable measured: User experience, task performance
- Interesting insights: Participants were engaged and motivated to play well; participants were not well adapted to the Leap Motion device

Finkelstein et al. (2013) [43]
- Intervention/Therapy/Treatment: Game with physical activity: dodging, ducking, jumping; collecting points
- User interface/Interaction technique: Interface components: scene. 3D interaction: full-body movement
- UX evaluation method: Self-report questionnaire/survey; observation by observers
- Output/Variable measured: Demographics, total energy burnt, user experience (enjoyment, replayability, amount of exercise, preference)
- Interesting insights: Two non-verbal participants could not provide feedback, and some participants received parental assistance in filling in the questionnaire; participants showed more interest in familiar themes

Schmidt et al. (2019) [55]
- Intervention/Therapy/Treatment: Training to use public transportation (Applied Behavior Analysis, immersive technologies, and a special-education curriculum)
- User interface/Interaction technique: Interface components: scene
- UX evaluation method: Observation by observers; direct interview with participants; self-report questionnaire/survey (SUS); clustering of semantic coding categories using affinity-mapping techniques
- Output/Variable measured: Affect (joy/fun/excitement, willingness to return), accessibility (physical, cognitive, cybersickness), general (usefulness, realism, real-world connections)
- Interesting insights: Categorization and operationalization of the qualitative codes (see output)

Bernardes et al. (2015) [38]
- Intervention/Therapy/Treatment: Travel training: validating the ticket, sitting in the right place, pressing the stop button
- User interface/Interaction technique: Interface components: scene
- UX evaluation method: Concurrent Think Aloud (CTA); task performance (completion time)
- Output/Variable measured: Technology acceptance, interface comprehension, task performance
- Interesting insights: The ASD group took longer to finish the tasks than the typically developing (TD) control group; serious-game intervention is needed

Kuriakose et al. (2012) [36]
- Intervention/Therapy/Treatment: Bidirectional social conversation
- User interface/Interaction technique: Interface components: scene, avatar
- UX evaluation method: Observation by observers of user performance; observation by observers of physiological responses (heart rate and skin temperature)
- Output/Variable measured: Affective states (engagement and liking) through physiological features (HR and SKT)
- Interesting insights: The use of wired physiological sensors possibly induces additional anxiety and limits the user's freedom of movement

Halabi et al. (2017) [44]
- Intervention/Therapy/Treatment: Conversation (role-play and turn-taking)
- User interface/Interaction technique: Interface components: scene, avatar. Interaction: verbal response (speech recognition), waving (gesture recognition)
- UX evaluation method: Self-report questionnaire/survey
- Output/Variable measured: Voice and physical motion, task completion time
- Interesting insights: Level of immersion: CAVE > HMD > desktop; the most liked display method was the CAVE

Lahiri et al. (2012) [33]
- Intervention/Therapy/Treatment: Bidirectional social conversation
- User interface/Interaction technique: Interface components: scene, 3D avatar. Interaction: mouse
- UX evaluation method: Observation by observers; physiological signal acquisition
- Output/Variable measured: ROI (face, context-relevant objects, and other) and attention duration
- Interesting insights: The system provides feedback based on quantitative measurement of the user's performance and viewing pattern

Lahiri et al. (2011) [34]
- Intervention/Therapy/Treatment: Bidirectional social conversation
- User interface/Interaction technique: Interface components: scene, avatar
- UX evaluation method: Observation by observers; physiological signal acquisition
- Output/Variable measured: Physiological data and affective states (anxiety, enjoyment, and engagement of participants)
- Interesting insights: The system can predict the user's affective states from objective measures (physiological signals)
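Several of the reviewed studies quantify self-reports with the System Usability Scale (SUS) [35, 53, 55] or compare Likert-scale ratings of interaction techniques with one-way ANOVA [39, 40, 41, 45]. The sketch below is a minimal illustration of those two analysis steps, not code from any of the reviewed papers; the response data are invented, and the SUS scoring follows the standard formula (odd items contribute score minus 1, even items 5 minus score, summed and multiplied by 2.5).

```python
# Illustrative sketch only: SUS scoring and a one-way ANOVA over Likert ratings,
# two quantitative steps that recur in the reviewed studies. All data are invented.
from scipy import stats

def sus_score(responses):
    """Standard SUS scoring: 10 items rated 1-5, alternating positive/negative wording."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd items are positively worded
    return total * 2.5  # rescale to 0-100

# Hypothetical SUS responses from three participants
surveys = [
    [4, 2, 5, 1, 4, 2, 4, 2, 5, 1],
    [3, 3, 4, 2, 4, 2, 3, 3, 4, 2],
    [5, 1, 5, 2, 5, 1, 4, 2, 5, 1],
]
print("SUS scores:", [sus_score(s) for s in surveys])

# Hypothetical 5-point enjoyment ratings per locomotion technique
joystick = [4, 5, 4, 3, 4]
teleport = [5, 5, 4, 5, 4]
walk_in_place = [3, 2, 3, 3, 2]

f_stat, p_value = stats.f_oneway(joystick, teleport, walk_in_place)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
```

Note that the repeated-measures designs in [41, 45] additionally report Mauchly's sphericity test, which this simple between-groups sketch does not cover.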

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Hening Darmasti, A., Pinkwart, N., Zender, R. (2023). User Experience Evaluation in Virtual Reality for Autism: A Systematic Literature Review. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. HCII 2023. Lecture Notes in Computer Science, vol 14020. Springer, Cham. https://doi.org/10.1007/978-3-031-35681-0_36

  • DOI: https://doi.org/10.1007/978-3-031-35681-0_36

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35680-3

  • Online ISBN: 978-3-031-35681-0
