Abstract
Virtual Reality (VR) and Augmented Reality (AR) afford new forms of work and leisure. While affordable and effective VR and AR headsets are now available, neither technology has achieved widespread user adoption. However, we predict continual technological advances and cost reductions are likely to lead to wider diffusion in society. Bridging the chasm from the early adopters to the early majority will require careful consideration of the needs of a more casual and diverse user population. In particular, it is desirable to minimise the exclusion of potential users based on their unique needs and maximise the inclusion of users in these novel immersive experiences. Ensuring equitable access to the emerging metaverse further reinforces the need to consider the diverse needs of users. We refer to this objective of maximising the accessibility and enjoyment potential of users of VR, AR and the metaverse as Inclusive Immersion. This paper reviews the research and commercial landscape seeking to address the accessibility needs of users in VR and AR. The survey provides the basis for a synthesis of the emerging strategies for maximising the inclusiveness of VR and AR applications. Finally, we identify several unaddressed accessibility challenges requiring further research attention. Our paper consolidates disparate efforts related to promoting accessible VR and AR and delivers directions for advancing research in this area.
1 Introduction
Virtual and Augmented Reality loom as the next major technology wave in personal computing. At present, it is largely early adopters and technology enthusiasts who are using these technologies as consumers (Foxman 2018; Sánchez-Cabrero et al. 2019). There is also an increasing number of specialised verticals in which Virtual and Augmented Reality are being deployed (Li et al. 2018; Masood and Egger 2019; Wedel et al. 2020). While the technologies remain constrained to niche applications and discerning but ultimately forgiving user groups, a narrow consideration of specific user needs can be tolerated. However, as the technology matures further and sees adoption by the early majority, the wider usability concerns of this expanded user group must be accommodated. Significant among these concerns is accessibility. As Hedvall (2009) notes, consideration of accessibility tends to lag behind broader advancement in usability and interaction design in human–computer interaction (HCI) and therefore demands concerted and targeted research attention.
In this paper, we review research and commercial efforts aimed at improving the accessibility of Virtual Reality (VR) and Augmented Reality (AR). Our broad scope is chosen in direct response to the fact that most prior work examines specific forms of disability or impairment and their influence on VR and AR usability. In practice, analysis by Waller et al. (2010) indicates that more than 60% of individuals with one form of ability loss also have at least one other form of ability loss. Although there is clear merit in targeted focus on specific forms of disability, the frequent co-occurrence of ability loss also highlights the need for a broad appreciation of the accessibility challenges encountered in VR and AR. We address this gap in the literature by synthesising the related work covering different forms of disability in different application areas into a set of emergent design strategies and unaddressed accessibility challenges. This paper therefore offers a primer for researchers seeking a holistic perspective on proven solutions, remaining challenges and general design strategies for accessible VR and AR.
Our focus on the diversity of accessibility needs among users with a disability takes direct influence from the concept of inclusive design (Clarkson et al. 2007). Inclusive design emphasises how an understanding of user diversity can inform design decisions that maximise usability for as broad a population as possible. To reflect this influence, we subsequently refer to the objective of maximising the inclusiveness of VR and AR technologies as Inclusive Immersion.
The pursuit of Inclusive Immersion is motivated by four key factors: (i) a moral imperative to make technology advances accessible to all; (ii) VR and AR have demonstrated value as assistive and rehabilitative technologies; (iii) the commercial benefits of reaching the broadest user base possible; and (iv) good design typically yields better usability for all. Expanding on this last factor, there are circumstances in which users may not have a disability but may be situationally impaired. As Wobbrock et al. (2011, pg. 7) observe, “Situational impairments arise when aspects of a user’s environment adversely affect his or her ability to perform certain activities”. A design that addresses a non-situational impairment may thus also improve the usability for those who are situationally impaired.
The lens of human–computer interaction also informs the perspective applied in this paper to the synthesis of the literature. Similar efforts by Sears and Young (2002) and Brulé et al. (2020) have made a valuable contribution to designers. Sears and Young provide a highly informative paper identifying physical impairments that can hinder interaction with standard computing technologies. With a more constrained scope, Brulé et al. review quantitative studies evaluating technology for visually impaired users. The survey presented in this paper expands on this prior work by reviewing the unique accessibility challenges and opportunities afforded by VR and AR across the diverse forms of ability loss.
We suggest that it is useful to address the common usability issues exposed by VR and AR together, despite the many other important differences between the two technologies. Their joint examination serves to highlight how some solutions identified in VR may also be applicable in AR and vice versa. Milgram and Kishino (1994) describe how there is in fact a continuum between entirely virtual and entirely real environments, and that VR and AR sit at different points along this spectrum.
It is timely to address accessibility issues in VR and AR given the current maturation level of the technology. There is a dearth of evidence-based design guidance for building VR and AR applications, even without considering the needs of specific user groups. This presents a challenge but also an opportunity to embed inclusive thinking in the emerging principles of design for VR and AR. Accessibility in VR and AR deserves specific attention due to the very fact that the user is immersed (or partially immersed) in a virtual environment. The World Health Organization’s (WHO) International Classification of Functioning, Disability and Health (ICF) model of disability (WHO 2013) explicitly reflects the role that environmental factors play in an individual’s experience of disability. The social model of disability (Shakespeare 2006) also stresses how people are disabled by barriers encountered in society and the physical environment. This model is in contrast to the medical model of disability which concentrates on the loss of ability as a consequence of a physiological or psychological impairment. The social model helps to highlight how factors external to the individual can be redesigned or reconfigured to address disability. As a trivial example, a ramp can replace a stair to eliminate (or at least mitigate) access issues for the user of a wheelchair. Likewise, given the designer’s complete control over the virtual environment in which they choose to embed the user, there are potentially many simple substitutions or configurable features they can support that will deliver broader accessibility for their application.
This paper has two key objectives: first, to capture and synthesise the disparate latest research and commercial efforts that highlight the unique accessibility challenges and solutions for delivering more inclusive user engagement with VR and AR; and second, to provide a helpful reference for designers to promote awareness of the specific needs for and strategies related to delivering more inclusive immersive applications. These two objectives directly relate to the three primary contributions of this paper:
1. A concise survey of both research and commercial efforts relevant to the objectives of Inclusive Immersion.
2. A discussion of the outstanding challenges that must be overcome to improve accessibility in VR and AR.
3. A synthesis of demonstrated strategies for improving the inclusiveness of VR and AR applications and technologies.
The scope of this survey concentrates predominantly on the design of immersive content. This covers the design of the interfaces and interactions exposed in that content. Less attention is given to the accessibility considerations related to the supporting hardware providing the vehicle for these immersive experiences. This scoping is a reflection of the rapidly evolving and currently disparate VR and AR hardware product space which would quickly date any specific observations.
In scoping this work, we also make an important distinction between efforts enhancing the accessibility of immersive content, and the application of VR and AR technology as a form of assistive technology. This distinction reflects the fact that assistive technologies typically focus on addressing the specific needs of a particular user group, while mainstream immersive content should, ideally, be inclusive of a broad user base. We do not limit our scope to particular forms of immersive content but several mainstream application areas naturally emerge from the reviewed literature. These application areas include VR and AR experiences for entertainment (e.g. gaming, cultural experiences and broadcast), socialisation, education, health care and work.
1.1 Outline
Prior to our survey of the literature, we first define the key terms in Sect. 2 to ensure a common frame of reference. Section 3 describes the methodology employed in selecting the preliminary set of literature for review. The survey proper is then presented in Sects. 4 and 5, with accessibility-related efforts categorised according to their application to Virtual Reality or Augmented Reality. In Sect. 6, we survey research efforts focused on the metaverse and social VR as a key emerging application of VR and AR. In Sect. 7, we also briefly cover relevant efforts that are not strictly in the area of VR and AR but for which the lessons are readily transferable.
The literature and commercial efforts reviewed throughout Sects. 4, 5, 6 and 7 are synthesised to compile a set of generalisable strategies for delivering more accessible and inclusive VR and AR experiences. This compiled set of strategies, presented in Sect. 8, is intended to provide a useful reference for designers seeking to improve the accessibility of their immersive content. We then distil a common set of three key challenges in supporting Inclusive Immersion; these challenges are described, with examples, in Sect. 9. Finally, Sect. 10 revisits the stated objectives and summarises the core contributions of this paper.
2 Definitions
In this section, we define the key terms used throughout this paper. This serves to establish a common lens through which the related work is viewed. The definitions of the key terms are summarised in Table 1.
3 Survey of efforts in Inclusive Immersion
The conceptualisation of disability as an emergent consequence of a health condition and contextual factors (as per the social model of disability and the WHO ICF model) is central to the analytical perspective applied in this review. The social model of disability (Shakespeare 2006) encourages thinking in terms of what barriers exist, and how these might be removed, rather than focusing on what a particular individual cannot do and the associated implication that this stems from a shortcoming on their part. The presumption is then that effective design can mitigate or even eliminate disability. This suggests a proactive rather than reactive approach to considering the potentially diverse capabilities of users of VR and AR.
The term Inclusive Immersion is new but the concept is founded in established thought, and particularly the concepts of accessibility and inclusive design. In the following sections, we present a survey of the literature and commercial efforts in which specific attention has been given to the goal of promoting the usability of VR and AR among disabled users. The methodology employed to structure this survey is described in the following subsection.
3.1 Methodology
To assemble a preliminary set of the literature for review, we conducted a Scopus search with the following query: (AUTHKEY("Virtual Reality") OR AUTHKEY("Augmented Reality") OR AUTHKEY("Mixed Reality")) AND (AUTHKEY("Accessibility")) AND (LIMIT-TO(LANGUAGE, "English")). In Scopus search syntax, AUTHKEY designates author-assigned keywords. This query returned 177 documents on 20 July 2022. The titles and abstracts of these documents were reviewed for relevancy and categorised by the first author. These preliminary relevancy decisions and category assignments were then reviewed by the second author. Any disagreements were resolved through discussion between the first two authors. The final categories described below were iteratively refined to accurately reflect the distinct foci of the reviewed documents. Of the 177 documents, 120 were found to be relevant to the broad application of Virtual and Augmented Reality for users with a disability or impairment. Papers that used "accessible" in the broader sense of making the technology more widely available, for example, were not considered relevant. Demonstration and editorial papers related to a main paper were also excluded.
Relevant papers were categorised into one of six groups:
- Accessibility: Papers core to our review that specifically seek to include features improving the accessibility of VR or AR for users with a disability. The goal of these papers is to make VR or AR more usable for disabled users.
- Disability simulation: Papers whose core focus is simulating the experience of having a disability using VR or AR. Such simulations are primarily targeted at non-disabled users.
- Skills training: Papers whose core focus is using VR or AR to help develop the skills of a user with a disability or impairment. These papers may include features added for accessibility, but their purpose is to demonstrate an application for improving a particular skill.
- Assistive technology: Papers that seek to use VR or AR to deliver some form of assistive technology. In contrast with the Accessibility category, these papers use VR or AR as a tool to allow users to do something (typically) in the physical world, rather than focusing primarily on what users can do or enjoy in the virtual world.
- Health: Papers that use VR or AR as a clinical tool to perform an assessment, help diagnose a particular health condition, deliver a therapy, support rehabilitation or otherwise improve health outcomes.
- Review: Papers that survey or review research work in one of the above categories.
The number of papers in each category is summarised in Table 2. Our focus in this paper is the identification of solutions, challenges and design strategies for accessible VR and AR. The 73 papers within the Accessibility category are most relevant to this focus and form the basis of the survey. We also expand our coverage of related work by identifying relevant papers either cited in or cited by this core set of 73 papers. Papers in the other four main categories of Disability simulation, Skills training, Assistive technology and Health provide insight into how the technologies and associated content can benefit users with a disability but focus primarily on the application as opposed to interface or interaction considerations. Therefore, we draw key examples from these categories rather than cover them exhaustively.
In addition to the formal literature captured by the Scopus search, we also review prominent commercial efforts. The inclusion of commercial efforts in this survey reflects the rapid maturation of the technology in this space. The demand for immersive hardware and content means that relevant research lags behind commercial efforts in certain areas. It is therefore useful to document the various strategies applied, despite potentially less attention given to theoretical grounding or validation.
The survey splits Virtual Reality and Augmented Reality into different sections. This is followed by a brief survey of research efforts focused on the metaverse, and social VR in particular, given its growing relevance as a key application of VR and AR. Finally, an additional top-level category of “Other Relevant Non-Immersive Efforts” serves to capture work that does not specifically target VR and AR but which could be easily ported to this setting.
4 Virtual reality
Virtual Reality produces a simulated environment in which a user’s abilities and perception can be extended. The control exercised over the virtual environment suggests that VR can provide a productive tool and enjoyable outlet for users whose interaction with a real environment might otherwise be impaired.
Although recent technological advances have significantly broadened the availability of low-cost consumer VR, there has been ongoing research in this area stemming from much earlier waves of interest in VR. The International Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT) has been held biennially since 1996. As the necessary hardware has recently gained greater consumer penetration, however, a broader research community has similarly taken greater interest in the challenges and opportunities afforded by VR for disabled users. The 3rd International Workshop on Virtual and Augmented Assistive Technology (VAAT) was held at IEEE VR in 2015 (see, for example, Maidenbaum and Amedi 2015). A workshop on Accessibility in VR was also held at the International Symposium on Mixed and Augmented Reality (ISMAR) in 2019 (Mott et al. 2019). XR Access (XR Access 2020) is a recently established community of university and industry partners focused on addressing the accessibility challenges encountered with VR and AR technologies. XR Association (XRA) is the trade association formed to represent VR and AR device manufacturers. An October 2020 report from XRA (XR Association 2020) provides explicit guidance on the development of VR and AR applications that are accessible to disabled users. Oculus has also introduced Virtual Reality Check (VRC) guidelines related to accessibility (Oculus 2021). On behalf of the Information Technology and Innovation Foundation (ITIF) think tank, Dick (2021) provides a primer for policymakers on the accessibility issues encountered in VR and AR and offers various recommendations for promoting inclusion. A growing number of researchers and organisations are therefore actively investigating systems and strategies for improving the accessibility of VR and AR.
Researchers have examined the requirements for accessible VR from multiple perspectives. We briefly survey some of these perspectives along with the high-level insights they offer before focusing on concrete efforts addressing particular forms of access barriers. Gerling and Spiel (2021) argue that VR is an "inherently ableist" technology that fails to accommodate the needs of users with a disability. From a perspective based on Surrogate Body theory (i.e. a theory describing how users effectively lend their body to an immersive experience), they examine several aspects of VR use, what these imply about human bodies, and how this might be incompatible with the experiences of those with a physical disability. Their analysis highlights various usability concerns for users with a disability and provides theory-derived insight into the empirical observations made in the various other studies reviewed in this paper. The seven principles of universal design were created in 1997 with the aim of developing universal environments, products and communications (Preiser and Smith 2011). Dombrowski et al. (2019) review these seven principles against the current state of VR technology and the experiences of users with different disabilities. These seven principles and their reinterpretation in the context of VR by Dombrowski et al. are: equitable use (i.e. usable by people with diverse abilities); flexibility in use (i.e. accommodate varying preferences and abilities); simple and intuitive use; perceptible information (i.e. effective communication of information to the user); tolerance for error (i.e. minimise the effect of accidental actions); low physical effort; and size and space for approach and use (i.e. appropriate tracking and space configuration, as well as sizing of content).
Ciccone et al. (2021) provide a brief overview of accessibility and ergonomics issues with VR. They point out that the accessibility features available on modern smartphones, e.g. screen readers, are not available in current headset operating systems. Related to this point, Teófilo et al. (2018b) investigated how existing mobile accessibility guidelines can be effectively applied to improve the accessibility of Virtual Reality for disabled users. The study by Teófilo et al. found that accessibility features such as zooming and subtitles were also useful in a VR setting. Captions were observed to increase the participants' understanding of content, while a voice assistant was helpful for navigation and comprehension purposes. However, Teófilo et al. suggest that such accessibility features should be optional and the user should be empowered to decide if and when they are needed. These findings are in line with an earlier study by Teófilo et al. (2016) which explored how assistive features in Virtual Reality might be integrated and accessed by users with different physical and mental conditions. For example, a colour-inversion feature may particularly need to be easy to toggle. This proposal suggests that it is inappropriate to simply bury all accessibility features deep within a system menu. Also offering some high-level guidance, Jimenez-Mixco et al. (2009) report on their design of a living lab for assessing the accessibility of smart home features using VR. For example, they propose that every interaction be possible via at least two modalities, that users be given both visual and audible feedback on their interactions, and that various options be personalisable.
In the following subsections, we categorise efforts based on the particular access barrier being addressed in the work. The high-level categories of access difficulty covered include perception, cognition and movement. Within perception, we separately review work focused on blind and low-vision users (Sect. 4.1), and work focused on users with hearing difficulties (Sect. 4.2). Within cognition we chiefly cover neurodiversity and users with cognitive impairments (Sect. 4.3) and under movement, users with balance and motor impairments (Sect. 4.4). Later subsections specifically examine accessibility efforts in the related targeted domains of broadcast VR (Sect. 4.5), health care (Sect. 4.6), and work training and disability simulation (Sect. 4.7). Finally, various related commercial efforts addressing aspects of accessibility in VR are also reviewed in Sect. 4.8.
4.1 PERCEPTION: blind and low-vision users
Spaces and objects presented within a virtual environment lack some of the key physical and acoustic qualities that might otherwise be exploited by blind and low-vision users to perceive and interact with their surroundings. One potential strategy is to replicate some of these qualities, such as by providing haptic or force feedback when interacting with objects or navigating through virtual environments. Roberts et al. (2002) present a very early design concept for a haptic glove with proposed use cases in accessibility and VR. This early concept is now technically feasible, and haptic gloves have been demonstrated as a means to support object perception in VR for blind and visually impaired users (Kreimeier et al. 2020b).
To support spatial understanding in VR, several studies have explored a strategy of replicating the experience of using a white cane. Canetroller (Zhao et al. 2018) was designed to assist users in learning virtual environments by allowing them to navigate both the physical and virtual world. The Canetroller comprises a controller, an HTC Vive tracker, a slider, a brake and a voice coil (providing haptic feedback). The haptic and audible feedback provided by the Canetroller approximately replicates the experience of navigating through a physical space. A user study with nine vision-impaired participants evaluated the Canetroller in both a virtual indoor and outdoor scenario. All participants were able to explore and map the virtual environment themselves thanks to the vibrotactile and auditory feedback, and all virtual objects in the indoor scene were recognised by the participants. Navigation in the outdoor scenario was successful for most of the participants, but a subset found it confusing. Kim (2020) also presents a cane-based system for use in VR that adds functionality replicating the experience of encountering braille blocks while walking. Similarly, the Virtual-EyeCane (Maidenbaum et al. 2013) leverages sensors and audio cues to help users with vision impairments perceive distance to virtual objects and orient themselves within virtual environments. When the virtual cane approaches an object in the virtual environment, the user hears beeps emitted by the cane at an increasing rate, indicating the proximity of the obstacle.
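The distance-to-beep behaviour described for the Virtual-EyeCane can be illustrated with a minimal sketch. The function name, maximum range and interval bounds below are our own illustrative assumptions, not values reported in the paper:

```python
def beep_interval(distance_m, max_range_m=3.0,
                  min_interval_s=0.05, max_interval_s=1.0):
    """Map the distance to an obstacle to the time between beeps.

    Closer obstacles produce faster beeping; beyond max_range_m the
    virtual cane stays silent (returns None).
    """
    if distance_m >= max_range_m:
        return None  # no obstacle within range: no beeps
    t = max(distance_m, 0.0) / max_range_m  # 0.0 at contact .. 1.0 at range edge
    return min_interval_s + t * (max_interval_s - min_interval_s)
```

A rendering loop would then schedule the next beep this many seconds after the previous one, so the beep rate rises smoothly as the cane nears an object.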
In some VR experiences or physical spaces, it may be infeasible to use an integrated white cane. Kreimeier et al. (2020a) evaluated a range of conventional methods and devices for controlling VR locomotion with blind and visually impaired users. The evaluated methods included locomotion controlled by tracking markers attached to the user, use of treadmills and use of a controller’s joystick. Kreimeier et al. found that joystick-based locomotion was considered the safest by participants and delivered the highest performance.
For low-vision users, it may be possible to improve perception of VR content by offering various tools that allow users to leverage any residual vision they may have. SeeingVR (Zhao et al. 2019a) implements various features designed to improve the perception of content in VR for visually impaired users. It consists of 14 tools including magnification lens, bifocal lens, brightness lens, contrast lens, edge enhancement, peripheral remapping, text augmentation, text to speech, depth measurement, object recognition, highlight, guideline, recolouring and assistive applications. The components can scale up the view, increase the brightness or contrast of the content, outline and highlight objects, invert colours of the environment, provide speech and subtitles, measure the depth of objects, direct users’ attention to important objects using guidelines and show objects that are out of the centre of the user’s visual field. If SeeingVR is used in conjunction with Microsoft’s Seeing AI (Microsoft 2020) and VizWiz (Bigham et al. 2010), the system can also recognise and verbally describe a virtual scene for users if requested. Hoppe et al. (2020) proposed allowing for these different forms of visual augmentations to be rapidly triggered using controller buttons, given that different tools may be required in different circumstances and frequently navigating complicated accessibility menus is highly undesirable.
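Hoppe et al.'s suggestion of binding visual augmentations to controller buttons, rather than nesting them in a menu, could be sketched as a simple tool cycler. The tool names and class below are illustrative only and do not reflect SeeingVR's actual implementation:

```python
# Hypothetical set of vision-enhancement tools; "none" disables augmentation.
TOOLS = ["none", "magnify", "high_contrast", "edge_enhance", "recolour"]

class ToolCycler:
    """Cycle through vision-enhancement tools with a single button press."""

    def __init__(self, tools=TOOLS):
        self.tools = tools
        self.index = 0  # start with no augmentation active

    def on_button_press(self):
        """Advance to the next tool, wrapping back to 'none' at the end."""
        self.index = (self.index + 1) % len(self.tools)
        return self.tools[self.index]
```

The design point is that switching tools costs one button press rather than a trip through an accessibility menu, which matters when different tools are needed in quick succession.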
Weir et al. (2020) introduce a VR application to support reading for users with visual impairments. A key observation guiding the design of their application is that "additional visual noise will often decrease acuity for people with severe vision loss". In response, their application uses simple graphics and allows background elements to be disabled. Wu et al. (2021) also propose a set of design principles regarding the use of VR to support accessible reading activities such as reading a newspaper. These principles include the provision to adjust text and layout appearance, smart contrasting and simple interactions for navigation and control. Interestingly, Powell et al. (2020) found that existing off-the-shelf VR hardware can be made usable for low-vision users, without any substantial hardware or software modification, simply by increasing luminance. This suggests that it may not always be necessary to introduce bespoke features or tools to improve accessibility.
Carefully combining audio and haptic cues can provide some degree of spatial perception to support blind users in completing particular tasks in VR. Wedoff et al. (2019) built a VR game called Virtual Showdown targeted at visually impaired youth. Virtual Showdown is an augmented version of a real-world accessible game in which players hit a ball to each other across a table, as in table hockey. Virtual Showdown supports players by providing verbal and vibration cues and hints related to the trajectory of the ball. Wedoff et al. found that verbal cues alone outperformed the combined use of verbal and vibration cues. DualPanto (Schneider et al. 2018) applies an alternative strategy for supporting blind users' interaction in virtual worlds for gaming. Two force feedback handles are used to provide the player with a spatial understanding of their current location as well as the target object. Players can move their avatar using one handle, while the target object is automatically updated. This allows, for example, blind users to perceive and move their avatar towards the ball in a football game. Similarly, Morański and Materka (2014) use a haptic force feedback input device to permit users with visual impairments to perceive a virtual environment. A key recommendation arising from Morański and Materka's study is to pair such interactions with audible feedback about objects and/or reference points in the scene, given the ease with which one can lose one's sense of orientation in the space. Using sound alone to support recognition of visual content may be feasible in some circumstances, but cues must be well designed. Salamon et al. (2014) demonstrate a method for sonifying object trajectories based on music, where the x-position of an object is represented by a note and the y-position by an octave. The technique was not evaluated with non-sighted individuals, but recognition rates of non-trivial trajectories by blindfolded sighted individuals were relatively poor.
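The note/octave mapping that Salamon et al. describe can be sketched minimally, assuming a standard 12-note octave and screen-like coordinates; the function name, octave range and MIDI conversion are our assumptions, not details from the paper:

```python
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def sonify_position(x, y, width, height, octaves=4, base_octave=3):
    """Map a 2D object position to a musical pitch.

    The x-coordinate selects a note within the octave; the y-coordinate
    selects the octave. Returns (note name, octave, MIDI note number).
    """
    note_idx = min(int(x / width * len(NOTES)), len(NOTES) - 1)
    octave = base_octave + min(int(y / height * octaves), octaves - 1)
    midi = 12 * (octave + 1) + note_idx  # MIDI convention: C-1 is note 0
    return NOTES[note_idx], octave, midi
```

Playing the resulting pitch at each animation frame turns an object's trajectory into a melodic contour: horizontal motion is heard as stepwise pitch change, vertical motion as octave jumps.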
4.2 PERCEPTION: users with hearing difficulties
Jain et al. (2021a) offer a taxonomy of sounds in VR for the purpose of representing different sounds to users who are deaf or hard of hearing. The taxonomy categorises sounds along two key dimensions: (i) sound source (e.g. localised speech, non-localised speech, interaction sounds) and (ii) sound intent (e.g. sounds conveying critical information versus sounds generating an affective state). This taxonomy establishes a standard "language" of sound in VR for pairing with alternative feedback channels such as haptic or visual indicators. Jain et al. (2021b) then leverage this taxonomy and the clarity it offers in understanding the design space to develop and evaluate several prototype sound-to-visual/haptic mappings. Significantly, Jain et al. also performed an evaluation with developers to assess how easily these prototypes can be integrated into their applications.
EarVR (Mirzaei et al. 2020) is a system designed to help individuals with hearing impairments recognise sound direction. It locates 3D sounds for users and converts them into vibrations indicating the source direction. Vibro-motors are placed near each of the user's ears, and the vibration frequency differs between the left and right ear to indicate sound from different directions. This difference between the two ears enables precise recognition of direction. Mirzaei et al. demonstrated that the system enables people with hearing problems to perform simple sound-based VR tasks, such as locating objects emitting a sound, as well as people without an auditory deficit.
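The kind of left/right intensity mapping implied by EarVR's two vibro-motors can be sketched as a simple panning function. The sine-based panning law below is our assumption for illustration, not the published implementation (which modulates vibration frequency rather than a normalised intensity):

```python
import math

def ear_vibration(source_angle_deg):
    """Map the horizontal angle of a sound source to per-ear intensities.

    Angle convention: 0 degrees is straight ahead, positive angles are to
    the user's right. Returns (left, right) intensities in [0, 1].
    """
    pan = math.sin(math.radians(source_angle_deg))  # -1 full left .. +1 full right
    right = (1 + pan) / 2
    left = 1 - right
    return left, right
```

A source directly ahead drives both motors equally, while a source at 90 degrees drives only the right motor, giving the user a continuous directional cue to turn towards.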
Perhaps the most established method for communicating sound and verbal content to users with hearing impairments is through captions. The design space for presenting captions in VR has been explored extensively within the domain of immersive broadcast and 360-degree video and is covered later in Sect. 4.5.
Verbal content can also be delivered to deaf or hard-of-hearing users through sign language. For instance, Paudyal et al. (2019) present a concept for augmenting online lectures presented in VR with sign language interpreters for deaf and hard-of-hearing users. Other researchers have sought to standardise the procedure for transcribing text into signing avatars and to streamline the process of integrating this into applications (Jaballah 2012; Do Amaral et al. 2011; Do Amaral and De Martino 2010). McCloskey (2022) describes a sign language learning concept in VR in which user hand gestures are recognised to provide feedback to learners. A conceptual extension of this work could be supporting in-game interaction via sign language recognition.
4.3 COGNITION: neurodiversity and users with cognitive impairments
Guitton et al. (2019) present an emerging set of principles for immersive serious games to accommodate users with cognitive difficulties. These include: support personalised configurations, manage level of detail, improve comprehension by avoiding implicit meaning, provide assistance in managing time and support simplified controls. For neurodiverse users, the ability to exercise control over stimuli would appear to be critical (Boyd 2019; Wasserman et al. 2019; Boyd et al. 2019; Lukava et al. 2022). Boyd (2019) describes the concept of sensory-inclusive techniques and suggests providing neurodiverse users with control over sensory stimuli, e.g. by allowing volume or game play speed adjustment.
In their review paper, Standen and Brown (2006) suggest design guidelines for educational VR software serving individuals with intellectual or developmental disabilities (IDD). Although the guidelines are based on less mature non-immersive VR systems, some insights remain useful for developers in creating VR content as a learning tool for those with IDD. Standen and Brown recommend that both users and specialists be involved in the design process to develop text and symbols together. Developers should, for example, also consider adjusting building features such as doorways to be larger than in the physical world to facilitate navigation in VR. Another important consideration is the need to accommodate the processing time of different users, for example, when they are presented with information dialogues (Standen and Brown 2006). Related to these considerations, and representing a potential dynamic assistance approach for users with IDD, Hong et al. (2021) examined whether eye fixation and scanning behaviour could be used to determine when additional guidance should be presented in a VR job training application.
Garzotto et al. (2017a) considered how the usability of a technology may depend on whether that technology can be adapted to a user’s specific characteristics or requirements. Therefore, they designed a VR storytelling tool called XOOM that satisfied diverse user needs by allowing parents or caregivers to customise the VR experience without needing to rely on a developer. The tool was leveraged to deliver more effective social story therapy for children with neurodevelopmental disorders (NDD). Parents or caregivers were able to select a 360-degree video, personalise the interaction between children and the virtual world, observe and control the interface and collect data about the interactions. Building on this prior work, Garzotto et al. (2018) describe the development of Wearable Immersive Social Stories. This is an immersive version of typical social stories used to help people with neurodevelopmental disorders learn social skills. As observed in the context of XOOM, there is again a requirement to be able to personalise the VR content and interactive features for each user. Nabors et al. (2020) provide a scoping review of VR as an intervention for skill development for individuals with intellectual disabilities and similarly highlight the need for caregiver and technology support. Bailey et al. (2021) provide a systematic review of social communication interventions targeted at individuals with communication and neurodevelopmental disorders. Bailey et al.’s review also echoes the observation that a high degree of customisation is necessary to accommodate different users and their specific needs across a range of parameters such as task difficulty and user controls.
Boger et al. (2018) undertook a participatory design process with individuals with mild cognitive impairments to develop a VR exercise application. They offer a set of design guidelines identified through their participatory design approach. One of the design aspects identified by Boger et al. as being particularly important was the introduction of a calibration task to help manage the limitations in users’ range of motion. Importantly, they also highlight the fact that the range of motion is not necessarily the same on each side of the user. This particular observation also relates to Sect. 4.4 and highlights the need to consider common co-occurrence of disability.
VR has been shown to deliver positive experiences in various gaming and cultural applications for users with IDD (Shaker et al. 2020; Wang et al. 2021; Giaconi et al. 2021). Shaker et al. (2020) developed a VR tour system for users with IDD and observed a preference for gaze-based interaction over controller-based interaction due to its simpler mapping to objectives. Participants in their experiment also expressed interest in being provided with a dedicated guide within the VR experience. Given that autism spectrum disorder (ASD) can be associated with sensory integration disorders, Glaser et al. (2022) examined whether ASD users of VR were more prone to cybersickness but found no strong evidence that this was the case.
4.4 MOVEMENT: users with balance and motor impairments
Common user interactions in VR require hand and wrist movement while gripping controllers. However, the gripping posture can constrain finger movement and may further restrict hand dexterity. Cook et al. (2019) assess a range of VR controllers in terms of their accessibility and demands on dexterity, particularly focusing on older adults. Cook et al. suggest “the need to reform consistent controller design in terms of ergonomic and dexterity-based interaction”.
For users with dexterity- and/or mobility-related impairments there may, therefore, be value in supporting natural interactions more similar to how people interact within the real world. VR gloves and hand tracking functionality have seen recent advances, but some gloves are bulky and may demand greater motor skills than using a controller. Li et al. (2020) create a pen-style controller for people with motor disabilities to enable input using mainly wrist movement. This type of controller leverages users’ established familiarity with pen-shaped joysticks or standard pens.
Other users may have impairments that prevent them from using their hands or arms at all. Such users may currently be excluded from the majority of conventional VR experiences that demand the use of controllers or arm movements to both interact with content and navigate the environment. However, some VR navigation methods, such as being automatically transported through the environment by a vehicle, can potentially include users facing this form of access barrier. Di Luca et al. (2021) provide an index of different VR locomotion techniques and include tags indicating low, medium and high accessibility. An alternative strategy is to enable interaction via other modalities such as gaze or voice.
Users who have impaired mobility may benefit from control schemes or interaction methods that avoid the one-to-one mapping between physical and virtual movements typically required in VR experiences. Minakata et al. (2019) evaluated different pointing methods including mouse, gaze, head and foot with a head-mounted display (HMD) and ran a portion of the study with eight individuals with movement disorders. Head-pointing was the most efficient method after mouse-pointing, but the study highlights the potential of alternative pointing methods when an impairment prevents neck or upper-body movement. Franz et al. (2021) present Nearmi, a framework for designing accessible techniques that help users with limited mobility locate and observe points of interest (POI). Wang et al. (2019) present a wearable device that can be worn with a VR or AR headset and incorporates non-invasive biosensors to detect eye movements and facial expressions to enable alternative interactions.
Mott et al. (2020) interviewed 16 users with limited mobility to obtain their perspectives on the accessibility of VR. The seven main accessibility barriers identified by Mott et al. were: (i) setting up a VR system; (ii) putting on/taking off the HMD; (iii) adjusting the HMD; (iv) cord management; (v) manipulating dual controllers; (vi) inaccessible controller buttons; and (vii) maintaining controllers in tracking volume. In direct response to these identified barriers, Mott et al. proposed a number of strategies that they hypothesise may help to mitigate these barriers. The proposed strategies include: (i) design for interdependence (i.e. facilitate collaboration between the user and others who can provide assistance); (ii) design for customisation (i.e. allow for more customisation of devices, control and interactions); (iii) design for diversity (i.e. accommodating variation in user capability can benefit disabled and non-disabled users alike); and (iv) leverage ability-based design (i.e. consider what users can do rather than focusing on what they cannot do).
HMD users sometimes lose their balance when fully immersed in a virtual environment, and people with mobility and movement impairments may be more likely to encounter balance problems in virtual environments. Ferdous et al. (2018b, 2018a) observed that postural instability in VR is worse for people with balance impairments. The factors most strongly associated with degraded postural stability for people with balance impairments are field of view and frame rate. Ferdous et al. (2016) evaluated a virtual static rest frame in VR as a means to assist HMD users with balance impairments to maintain balance control and avoid cybersickness. The static rest frame consisted of a central crosshair and corner markers similar to a viewfinder on a camera and was effectively rigidly attached to the user’s head and consistently overlaid on the user’s view of the environment. The static rest frame was tested in a hitting game with participants with multiple sclerosis (MS). The visual feedback of the static rest frame improved the depth perception of the participants and helped them maintain balance. The frame enabled the participants to better judge the motion and location of virtual objects during the game. Interestingly, Mahmud et al. (2022) found that providing spatial audio was also beneficial to users with balance impairments. These findings highlight the opportunities for serious games in VR targeted at individuals with MS to aid with physical therapy. Indeed, Arafat et al. (2016) found that users with MS experienced similar levels of cybersickness to users without MS, suggesting that VR may be a feasible therapeutic tool.
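The essential property of such a static rest frame is that its markers are defined in head space and rigidly follow the head pose, so they remain fixed in the user’s view regardless of head movement. A minimal 2D (yaw-only) sketch of this transform, with coordinate conventions and names entirely our own:

```python
import math

def head_locked_to_world(local_points, head_pos, head_yaw_deg):
    """Transform rest-frame marker points defined in head space
    (x lateral, z forward, in metres) into world space. Because the
    current head pose is re-applied every frame, the markers never
    move relative to the user's view."""
    yaw = math.radians(head_yaw_deg)
    c, s = math.cos(yaw), math.sin(yaw)
    return [(head_pos[0] + c * x - s * z,
             head_pos[1] + s * x + c * z) for x, z in local_points]
```

For example, a crosshair defined 1 m ahead in head space stays 1 m ahead wherever the user looks; only its world-space position changes as the head turns.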
Gerling et al. (2020) undertook a survey of 25 wheelchair users to examine their reasons for and against engaging with VR. Based on the survey outcomes, Gerling et al. derived three main design implications: offer flexible control schemes based on movement but allow sedentary input as a fallback; consider the wheelchair when designing interactions; and avoid mandatory representation of disability. These design implications were subsequently validated by Gerling et al. (2020) in three different VR game prototypes aimed at wheelchair users. Hansen et al. (2019) introduced a simple low-cost device enabling locomotion in VR by sensing the wheel movements of a wheelchair.
A simple accessibility feature for wheelchair users or other users who may prefer to remain seated in VR is the ability to apply a fixed height offset that can correct for this displacement from the expected user height. Modern VR HMDs such as the Quest 2 include this feature, and Ganapathi et al. (2022) found no detrimental effects associated with applying such an offset.
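The offset itself is simple arithmetic applied to the tracked head height each frame; a sketch (names are ours, not drawn from any particular SDK):

```python
def seated_offset(standing_eye_height_m, seated_eye_height_m):
    """Fixed vertical offset needed to present a seated user at their
    preferred (e.g. standing) eye height in the virtual world."""
    return standing_eye_height_m - seated_eye_height_m

def virtual_eye_height(tracked_height_m, offset_m):
    """Apply the fixed offset to the tracked head height each frame."""
    return tracked_height_m + offset_m
```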
Finally, various studies have demonstrated how VR can either support assessment of ergonomics (Palaniappan et al. 2019) or adjust interactions and interfaces to reflect ergonomic factors (Kartick et al. 2020; Evangelista Belo et al. 2021). Kartick et al. (2020) presented a preliminary study seeking to accommodate individual upper limb ergonomic factors into the design of grasping interactions in VR. Liu et al. (2022) introduced a novel interaction device that provides force feedback using a wind-blowing propeller. They discuss how the device can be used in serious games with users who may have balance disorders, modulating the level of force feedback to control the difficulty encountered. More ambitiously, Evangelista Belo et al. (2021) offer a system that can theoretically adjust interfaces based on individual ergonomics and user impairments by placing interface elements at points in space requiring minimal physical effort to reach.
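The underlying idea of effort-aware placement can be illustrated with a toy cost model (entirely our own invention, not the model used by Evangelista Belo et al.): score each candidate position by its distance from a resting hand position, penalise above-shoulder reaches, and pick the cheapest candidate.

```python
def placement_cost(point, rest_pos, shoulder_height, height_penalty=2.0):
    """Toy effort heuristic: Euclidean distance from the hand's resting
    position, plus an extra penalty for positions above shoulder height
    (which are tiring to reach). Positions are (x, y, z) in metres."""
    dist = sum((p - r) ** 2 for p, r in zip(point, rest_pos)) ** 0.5
    above = max(0.0, point[1] - shoulder_height)
    return dist + height_penalty * above

def best_placement(candidates, rest_pos, shoulder_height):
    """Select the candidate interface position with the lowest cost."""
    return min(candidates,
               key=lambda p: placement_cost(p, rest_pos, shoulder_height))
```

In a real system the cost model would be calibrated per user (e.g. from a range-of-motion calibration task such as the one Boger et al. describe), but the argmin-over-candidates structure is the same.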
4.5 Accessibility in broadcast VR
Many mainstream broadcasters, such as the BBC and Sky, are seeking to develop immersive experiences for their viewers (Sky 2020; BBC R&D 2020). At the BBC, multiple projects have been conducted with the goal of producing 360-degree video content suitable for both TV screens and cardboard VR-type headsets (BBC R&D 2020).
In addition to addressing the technological issues, design solutions considering accessibility in 360-degree video content have been developed based on the accessible design guidelines for subtitles, sign language, and audio in traditional videos. Subtitles used in broadcasting help audience members with hearing impairments perceive information while also assisting non-hearing-impaired viewers to better comprehend auditory information. BBC R&D have examined the design of subtitles in 360-degree video content (Brown and Patterson 2017) and have evaluated different ways to place and present subtitles (Brown et al. 2018). Brown and Patterson (2017) found that placing subtitles at 120° intervals around the video sphere was an easy strategy for developers to implement, but could be difficult to read at some viewing angles. The always-visible alternative of displaying subtitles fixed in front of the viewer accommodates head movements but may affect the viewer’s perception of information on the screen and may cause simulator sickness. A slight variation on this mode is adjusting subtitle position with some delay. For example, the subtitles may move slower than the head movement, or the subtitles may disappear when the head turns and re-appear when the head is static. Using this approach, however, the audience may have difficulty reading the subtitles during the delayed placement update.
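The delayed-follow behaviour can be sketched as a simple exponential smoothing of the subtitle anchor’s yaw toward the current head yaw, a common games technique; the smoothing factor and function names below are our own illustrative choices, not the BBC’s implementation:

```python
def lazy_follow_step(subtitle_yaw, head_yaw, smoothing=0.1):
    """One per-frame update: move the subtitle anchor a fixed fraction
    of the remaining angle toward the head yaw (degrees), so subtitles
    trail head movement rather than snapping with it."""
    return subtitle_yaw + smoothing * (head_yaw - subtitle_yaw)

def settle(head_yaw, steps, smoothing=0.1, start_yaw=0.0):
    """Run several frames with a stationary head, showing the anchor
    converging back to a position in front of the viewer."""
    yaw = start_yaw
    for _ in range(steps):
        yaw = lazy_follow_step(yaw, head_yaw, smoothing)
    return yaw
```

The trade-off discussed above is visible in the maths: a small smoothing factor makes the subtitles comfortable to track but prolongs the period in which they lag the view and are hard to read.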
The Immersive Accessibility (ImAc) project (Montagud et al. 2018), consisting of a consortium of nine partners with funding from the EU, seeks to develop accessibility tools for immersive media with a particular focus on 360-degree video. This project has produced an extensive body of research as well as platforms and tools that facilitate the creation of subtitles and other accessible features for 360-degree videos, including audio descriptions of non-dialogue plot elements and sign language (Agulló et al. 2019; Montagud et al. 2020a, c, b, d, 2021; Matamala 2021). In terms of subtitle content, Oncins et al. (2020) compared subtitles designed for deaf and hard-of-hearing individuals with simplified subtitles aimed at people with cognitive disabilities. In their evaluation with an immersive 360-degree video opera experience, the simplified subtitles were generally preferred as they caused less distraction and permitted greater focus on the primary visual content. Climent et al. (2021) synthesised these various findings to produce 19 requirements for subtitles in 360-degree video. One of the ImAc tools, ImAc Player, allows the viewer to customise subtitles, audio descriptions, and sign language according to their individual preferences. Subtitles can be configured to indicate the position of the speaker in the video using an arrow or a dynamic radar. Similarly, the viewer can configure the sign language interpretation to show the name of the speaker with an arrow or radar. For visually impaired viewers, the player also supports interaction using voice controls. Fidyka and Matamala (2018, 2021) explored audio descriptions in 360-degree video and ran a series of focus groups. One idea proposed was splitting the 360-degree video scene into multiple sections with distinct audio descriptions. Also considered important was the ability to pause the main narration in order to focus on alternative tracks.
In the context of live broadcasts or performances, there are additional potential challenges related to rapidly incorporating captions into feeds. Solving these challenges, however, can potentially widen the accessibility of live events. For instance, Teófilo et al. (2018a) demonstrate the use of automatic speech recognition to provide text displayed in a VR device to support hearing impaired audience members of live theatre.
The design guidelines created by Epelde et al. (2013) establish standard practice for customising the interface of immersive broadcast content. These guidelines specifically seek to address the needs of elderly viewers with visual, auditory, cognitive or mobility decline. Included in the guidelines are specific design recommendations for icons, graphics, text, and colours in the interface as well as for the audio. An example of this guidance is to avoid condensed letters and to use uppercase for better legibility. Other intuitive recommendations include presenting only essential graphical information to avoid distraction, ensuring changes in brightness are subtle, and utilising high-contrast colours. For viewers with motor impairments, the guidelines suggest that the interface should be insensitive to erratic input, the requirement for fine motor control should be avoided, and interface interactions should be designed with consideration of the movement accuracy and movement control capability of elderly viewers.
4.6 VR in health care
Virtual Reality has shown good promise for use in health care, and there has been a significant amount of research examining its application to treatment and rehabilitation. These applications are relevant to the broader concept of Inclusive Immersion due to the need for a consideration of user impairments and recovering or degenerating capabilities. A detailed survey of VR applications in health care is beyond the scope of this paper, and the reader is referred to other recent review papers targeting particular aspects of this domain (e.g. Bryant et al. 2020; Zhu et al. 2021; Nabors et al. 2020; Bailey et al. 2021; Miller et al. 2013). In this section, we briefly highlight the spectrum of health care applications investigated and identify any general insights offered regarding accessibility and inclusion.
Prominent application areas of VR in health care include: motor skills training and mobility rehabilitation (Levin 2011; Kaminer et al. 2014; Chen et al. 2017; Tsoupikova et al. 2015; Saposnik and Levin 2011; Achanccaray et al. 2018; Zhang et al. 2010; Oña et al. 2019; Kim et al. 2017; Devigne et al. 2017; John et al. 2018); treatment of eye impairments (Waddingham et al. 2006; Blaha and Gupta 2014; Vedamurthy et al. 2016; Hurd and Kurniawan 2019; Vivid Vision 2020); pain relief (Benham et al. 2019; Murray et al. 2007; Trost and Parsons 2014; Hoffman et al. 2004; Sharar et al. 2008); cognitive testing and treatment of brain damage (Rose et al. 2005; Corriveau Lecavalier et al. 2020; Morris et al. 2000; Davison et al. 2018; Klinger et al. 2004; Werner et al. 2009; Kizony et al. 2012; Lamash et al. 2017); and learning for individuals with intellectual and developmental disabilities (Standen and Brown 2006; Kandalaft et al. 2013; Boyd et al. 2018; Didehbani et al. 2016; Tianwu et al. 2016; Poyade et al. 2017; Nabors et al. 2020; Bailey et al. 2021; Garzotto et al. 2017b). A key advantage of VR in health care is that it enables flexible and systematic clinical testing and training programmes by immersing patients in controllable 3D stimulus environments and simultaneously permitting the logging of performance for capability tracking and assessment (Rizzo 2002). Kizony et al. (2003) observed that VR made the rehabilitation process more enjoyable and engaging for individuals with motor and cognitive disabilities and was effective in transferring skill learning to real-world abilities. Establishing a high sense of presence in the virtual environment can produce a realistic experience for patients that triggers their natural motor patterns, while the virtual content also enables dangerous situations to be tested which would otherwise be unfeasible in the real world (Kim et al. 2017). 
These observations suggest that VR applications seeking to serve dual purposes of entertainment and exercise or rehabilitation should target high levels of realism and immersion.
4.7 Work training and disability simulation
VR clearly offers opportunities for job training more broadly but has specific advantages for disabled individuals, for whom work tasks may be more difficult to complete or master. Piovesan et al. (2013) sought to increase the inclusion of people with disabilities in the work market by training them in an artificial environment that simulates a real company. VR4VR (Bozgeyikli et al. 2014) is an immersive VR training system for vocational rehabilitation targeted at people with ASD, traumatic brain injuries, and severe mobility impairments. The system facilitates skill acquisition in tasks such as cleaning, loading the back of a truck, money management, and shelving.
Conversely, VR has also been demonstrated as an empathy tool to provide non-disabled users with an understanding of the various challenges faced by disabled users. Disability simulation in VR supports learning about disabilities in order to build empathy and eliminate biases (Flower et al. 2007). Autismity (Autismity 2020) is a VR simulator for educational organisations, medical specialists and family members to gain a better understanding of the experience of having autism. It demonstrates factors, such as lights, colours, sounds and a crowd, that lead to feelings of sensory overload, and mimics the feeling and reaction of a person with autism to the stressors. Empath-D (Kim et al. 2018) allows mobile application developers to experience visual, hearing and motor control impairments within VR. McIntosh et al. (2020) seek to enable empathetic architectural design of corridors and waiting rooms by using VR to simulate the experience of visitors with a disability. To promote empathy in medical students, simulations have also been developed for the experience of a wheelchair user (Meijer and Batalas 2020) and the experience of an individual with a hearing impairment (McLaughlin et al. 2020).
Several studies have developed VR-based simulations of various visual impairments (Ates et al. 2015; Jones et al. 2020), including temporary visual impairments experienced with migraines (Misztal et al. 2020). SIMVIZ (Ates et al. 2015) is a VR system built with the aim of identifying accessibility issues caused by vision impairments. Ates et al. (2015) leverage the level of control that can be exercised over the rendering of the virtual scene to simulate symptoms of common vision problems such as glaucoma, cataracts and colour blindness. SIMVIZ provides the option to adjust the simulation intensity to allow different degrees of vision impairment to be studied.
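A minimal flavour of this rendering-based approach can be given with a greyscale image standing in for a rendered frame. The attenuation model below (darkening pixels with distance from the centre, scaled by an intensity parameter) is a toy approximation of peripheral field loss of our own devising, not the actual SIMVIZ shaders:

```python
def simulate_field_loss(image, intensity):
    """Approximate peripheral field loss: attenuate each pixel by its
    distance from the image centre, scaled by an intensity in [0, 1]
    (0 = no impairment, 1 = periphery fully dark). `image` is a list
    of rows of greyscale values in [0, 1]."""
    h, w = len(image), len(image[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2
    max_d = (cx ** 2 + cy ** 2) ** 0.5
    out = []
    for y, row in enumerate(image):
        new_row = []
        for x, px in enumerate(row):
            d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
            keep = max(0.0, 1.0 - intensity * (d / max_d))
            new_row.append(px * keep)
        out.append(new_row)
    return out
```

The adjustable intensity parameter mirrors the SIMVIZ design goal of letting researchers study different degrees of a given impairment with the same simulation.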
4.8 Commercial efforts
There are a growing number of commercial efforts to improve the accessibility of VR hardware and applications. Game developers in particular seem to have been forward thinking in this endeavour. A 2019 gaming blog (Bertiz 2019) reviews five games that give special consideration to users with disabilities: Crystal Rift, Persistence, Moss, Arca’s Path VR and Island 359. Bertiz (2019) highlights various accessibility features offered in these games for people with vision, hearing, cognitive and mobility impairments. Bertiz notes that six-degrees-of-freedom head tracking is a simple yet helpful feature allowing short-sighted users to better perceive virtual content. For example, the user can move closer to a menu to effectively view the text at an enlarged size.
The game Persistence has an Assist Mode aimed at accommodating different skill levels. The Assist Mode allows the user to configure settings including Enemy Speed, Unlimited Teleport and See Enemies Through Walls. Users can also customise the control inputs of the game to access an easier play mode. The ability to adjust challenge level is a typical affordance in gaming and can benefit both disabled and non-disabled users alike. For instance, a user with a condition that varies in severity from day to day may wish to adjust difficulty in a manner similar to a user returning to the game after a long absence, and with associated degradation in skills. However, there are aspects of challenge that may manifest differently in the context of certain access difficulties, and this may therefore demand greater resolution in challenge control than is typically afforded. An example of this would be a user who can reliably fend off enemies attacking from the front but has difficulty with enemies attacking from the side or behind due to mobility issues. For individuals with hearing impairments, the game Persistence can also convey the direction and location of enemies using a visible indicator.
Moss offers inclusive features for hearing-impaired users by not only providing non-obstructive subtitles but also through the main character providing game-related hints using American Sign Language. Arca’s Path VR can be played without controllers by simply moving one’s head to manipulate a ball rolling on a path. This offers a simplified control input suitable for some users while retaining the challenging and entertaining features of the game. Island 359 offers multiple settings that benefit players who have mobility difficulties, such as wheelchair users. Wheelchair players may not have the ability to perform full free rotations to adjust their perspective in the virtual world. To address this issue, users can enable Bump Turns, which allows view rotation via the controller. A Reach Assist mode also helps wheelchair users to remotely reach virtual objects on the floor or beyond a certain distance, and the Seated Mode can be activated to increase the apparent height of the wheelchair player in the virtual world.
In 2017, an online survey was conducted by the Disability Visibility Project in partnership with ILMxLAB (a division of Lucasfilm who make several Star Wars VR game titles) to examine the experience of disabled users with VR. The survey (Wong et al. 2017) collected feedback from 79 users in 12 different countries with a range of different disabilities. The survey provides an excellent overview of the many issues faced by disabled users in VR. The five most frequent VR activities with which respondents noted having difficulty were: balancing while standing (43%), crouching (43%), standing (41%), locomotion (37%) and raising/extending/moving arms (29%). The survey report offers 17 recommendations derived from the participant responses for VR designers and developers. A recommendation echoed by multiple respondents was the need to accommodate users who must or prefer to sit for comfort reasons.
The W3C Accessible Platform Architectures Working Group has also reviewed accessibility issues for VR (W3C 2017). A significant unaddressed issue they note is the provision of alternative means of movement for people with mobility impairments, i.e. a person with a constrained range of motion should be provided with the ability to reach objects beyond this range if required by the experience. The same group has more recently released a note on XR accessibility user requirements (W3C 2021). The document establishes standard definitions for key terms and outlines an emerging list of user needs and requirements.
At the time of writing, there appears to be limited system-level accessibility support provided by major VR HMD manufacturers. Addressing the lack of native accessibility features, the WalkinVR Driver (WalkinVR 2020) can be used with standard VR hardware to remap controllers and adjust offsets and sensitivities to enable more freedom in movement and interactions in VR. It can adjust the controllers’ position and range of motion, allowing users to complete VR tasks by making smaller-amplitude movements in the real world.
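The core remapping idea (translating tracked motion and amplifying it about a per-user origin) reduces to a line or two of arithmetic. The sketch below and its parameter names are illustrative, not WalkinVR’s actual implementation:

```python
def remap_controller(raw_pos, origin, gain, offset):
    """Amplify the controller's motion about a calibrated origin by
    `gain` and shift it by a fixed `offset`, so small real-world
    movements cover a larger virtual range. All positions are
    (x, y, z) tuples in metres."""
    return tuple(o + gain * (p - o) + d
                 for p, o, d in zip(raw_pos, origin, offset))
```

With a gain of 2, for instance, a 10 cm real movement produces a 20 cm virtual movement, which is precisely the property that lets users with a constrained range of motion reach the full interaction volume.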
Google Daydream Labs developed a prototype system to help people with vision impairments navigate and interact in VR (Palos 2017). The system leverages a controller implemented as a 3D audio laser module that can be pointed at objects in the environment to query their name or description. With the help of the prototype, people with impaired vision are able to learn the environment quickly and determine the relative location of virtual objects.
At the intersection of commercial and educational applications of VR, the Web Accessibility group at the University of Melbourne has built a web resource (The University of Melbourne 2019) documenting the pros and cons of VR for disabled users. The site provides a number of helpful suggestions. For example, to assist mobility impaired users, VR should remove time limits on tasks and make input devices and input controls simple and tolerant of user errors. Other recommendations include: allowing people with hearing loss to mute the ambient noise in order to hear basic interface sounds; avoiding light or sound stressors for users with cognitive disabilities; and for users with low vision, supporting amplification and contrast adjustment.
5 Augmented reality
An Augmented Reality experience can be delivered through a variety of techniques as illustrated in Fig. 1. The injection of virtual content into the user’s view of the physical environment can be achieved by one of three main methods: (1) using a video see-through (VST) approach on a mobile device (i.e. a tablet or smartphone) or a dedicated HMD; (2) an optical see-through (OST) approach with a dedicated HMD or other low-tech approximation; or (3) projecting onto the real environment.
Most modern mid- to high-end smartphones can deliver a VST AR experience in which the user will typically hold the device to look around the environment and view overlaid virtual content, as illustrated in Fig. 1a. OST AR, as illustrated in Fig. 1d, is currently largely restricted to use by developers or highly specialised industrial applications. However, there are some low-tech solutions that can provide a coarse OST AR experience using a smartphone mounted into a head-mounted frame with reflective lenses (see Fig. 1c). A recent trend in the AR/VR HMD market is the incorporation of video-passthrough capability that also enables VST AR experiences, as illustrated in Fig. 1b. Projection-based AR has recently seen less attention due to significant improvements in the other more compelling and immersive alternatives. The immaturity of AR technology and applications means that there is less guidance related to inclusivity than in VR.
The different strategies for delivering AR experiences clearly impose different demands on the user. As a trivial example, smartphone- or tablet-based VST AR requires the user to hold up the device as if looking through a window and use this view to inspect virtual content placed in the real environment. Even this minimum requirement may be prohibitive for some users with mobility impairments. Similarly, currently available OST HMDs provide narrow effective display regions and are typically designed such that content is presented in the middle of the user’s field of view. This may be problematic for users with certain visual impairments.
In the remainder of this section, we survey research efforts grouped by access difficulty before briefly summarising applications of AR in health care as well as notable commercial efforts at accessibility in AR.
5.1 PERCEPTION: blind and low-vision users
For visually impaired or blind users, an AR experience may need to be delivered entirely through non-visual means. Coughlan and Miele (2017) discuss the specific application domain of AR to assist people with low vision or blindness. They refer to this domain as Augmented Reality for Visual Impairment (AR4VI). For blind users, the augmentation of the environment typically takes the form of audio feedback. Coughlan and Miele describe two applications developed to assist blind users in navigation and understanding physical objects through the delivery of contextual audio feedback. Cassidy et al. (2019) evaluate several radar-based metaphors for conveying spatial directional information through audio. However, these metaphors were evaluated with sighted individuals and so it is unclear how these findings might translate to blind or low-vision users. Haptics offers another feasible non-visual means for conveying information to users about virtual and physical elements in their vicinity. As an example, Bertram et al. (2013) present a concept for providing spatial information through a haptic display integrated in a helmet.
Tactile maps with projected visuals and audio as a means to supporting spatial understanding and developing the navigation skills of visually impaired users have also been explored (Albouys-Perrois et al. 2018; Thevin and Brock 2018; Thevin et al. 2019, 2021). Tooteko (D’Agnano et al. 2015) is a system based on a wearable ring with which visually impaired users can interact with a 3D surface and obtain contextualised audio information. Applying a related strategy but eliminating the requirement for a ring, Kane et al. (2013) demonstrate a system for blind users that can read text in a given document via a camera and then allows the user to touch on the document to trigger the system to read out this text. Such a system could theoretically be deployed to work with both physical and virtual textual content.
Ahmetovic et al. (2021) describe the process of developing an AR application to enable users with visual impairments to better experience artworks in a gallery setting. In the developed system, verbal descriptions were complemented by visual contours tracing the region of the artwork being described. Comments from users suggested that these contours were not always easy to perceive and Ahmetovic et al. subsequently hypothesised that “customization (according to users’ needs) and adaptation (e.g. to the background image) can help improving visibility”. Fogli et al. (2018) present an AR application for use in museums and galleries and describe an accessibility feature where users can specify if they have a hearing or visual impairment and the application adapts its interaction features. Also focused on enhancing interaction and engagement in a museum setting, Guedes et al. (2020) demonstrate an AR application including basic accessibility features such as text-to-speech. Paciulli et al. (2020) applied a subset of W3C’s Web Content Accessibility Guidelines in the context of an AR application concept for supporting chemistry teaching for visually impaired users, although this was largely limited to the provision of audio descriptions.
More advanced strategies for improving accessibility in AR rely on either embedded meta-data or dynamically identifying content of interest to the user. Herskovitz et al. (2020) examine the challenge and potential solutions for including meta-data in virtual objects and thereby allowing screen readers to function in mobile AR. Kaul et al. (2021) demonstrate an AR concept application in which objects in the scene can be dynamically detected and then associated with the spatial map. This can facilitate the task of locating physical objects in an environment for users with visual impairments in a way that is potentially seamless with virtual objects as per an ideal AR experience.
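The meta-data approach can be illustrated with a short sketch. The code below is not drawn from the implementations of Herskovitz et al. or Kaul et al.; all class and field names are hypothetical, and it merely shows how a registry might associate scene objects, virtual or dynamically detected, with spoken labels that a screen reader could query:

```python
from dataclasses import dataclass

@dataclass
class AccessibleObject:
    """Hypothetical accessibility meta-data attached to a scene object."""
    object_id: str
    label: str                          # short name spoken first by a screen reader
    description: str = ""               # longer contextual description, on demand
    position: tuple = (0.0, 0.0, 0.0)   # anchor in the spatial map (metres)

class SceneAccessibilityRegistry:
    """Associates virtual or detected physical objects with spoken meta-data."""
    def __init__(self):
        self._objects = {}

    def register(self, obj: AccessibleObject):
        self._objects[obj.object_id] = obj

    def announce(self, object_id: str, verbose: bool = False) -> str:
        """Return the utterance a screen reader would speak for this object."""
        obj = self._objects[object_id]
        if verbose and obj.description:
            return f"{obj.label}. {obj.description}"
        return obj.label

registry = SceneAccessibilityRegistry()
registry.register(AccessibleObject(
    "cup01", "Coffee cup", "On the table, 1 metre ahead.", (0.2, 0.9, 1.0)))
print(registry.announce("cup01", verbose=True))
```

The key design point is that detected physical objects and authored virtual objects share one registry, so assistive output treats them uniformly, as per an ideal AR experience.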
Even with low vision, many visually impaired users can still perceive and utilise some visual cues. Lang and Machulla (2021) evaluated a set of visual augmentations presented in AR, such as stationary and moving guidelines, designed to support visually impaired users in using touchscreen interfaces. These augmentations were designed to make use of the residual vision of visually impaired users. This same approach could theoretically be used to aid visually impaired users to interact with virtual interfaces. Mengoni et al. (2015) describe a rule-based system, designed to cover a range of potential impairments, for adapting an AR interface for instructing work tasks according to the user’s known capabilities, such as contrast sensitivity.
AR can be used to warn users of potential hazards and to improve spatial awareness. Zhao et al. (2019b) specifically explored the design space around providing augmented cues to low-vision users in order to support stair navigation. In testing with the Microsoft HoloLens, Zhao et al. found a favourable combination of visual and auditory cues leading to users feeling safer and more secure than when navigating without such cues. This work highlights the potential benefit for visually impaired users of providing additional hazard-related warnings in AR as well as VR.
Zhao et al. (2020) investigate wayfinding guidance supported by AR with both visual and audio cues. An important observation in this study was the fact that low-vision users tend to rely on monocular cues rather than binocular cues to perceive depth which is problematic for stereoscopic AR displays. Nevertheless, as Zhao et al. note, monocular cues of the depth of virtual elements can still be provided by change in apparent size and occlusion. More generally, participants in the study also observed a steeper learning curve with understanding and utilising visual cues compared with audio cues. Clew (Yoon et al. 2019) is another AR application designed to assist people with visual disabilities in indoor navigation.
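The apparent-size cue noted by Zhao et al. can be made concrete with simple geometry: an object’s angular size shrinks predictably with distance, so consistent scaling alone signals depth without requiring binocular disparity. The sketch below is illustrative and not taken from the cited work:

```python
import math

def apparent_size_deg(height_m: float, distance_m: float) -> float:
    """Angular size (degrees) of an object of a given physical height at a
    given distance. Rendering virtual elements with this scaling provides a
    monocular depth cue for users who cannot exploit stereoscopic cues."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

# A 1 m object at 1 m subtends roughly 53 degrees; at 2 m, roughly 28 degrees.
print(apparent_size_deg(1.0, 1.0), apparent_size_deg(1.0, 2.0))
```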
For low-vision users, Augmented Reality provides one mechanism of presenting image and text magnification to aid reading. Zhao et al. (2017) evaluated low-vision users’ visual perception of simple content presented in AR glasses and offer some general guidelines around the type of content, its colour and size. Stearns et al. (2018) explore the design space afforded by incorporating an AR HMD as the display device in a magnification tool. This exploration involved evaluating magnification of the stream from a camera attached to the finger and alternatively the stream from a standard smartphone camera. An observed advantage of this approach over conventional magnification tools is the ability to separate the camera from the display.
5.2 PERCEPTION: users with hearing difficulties
In traditional visual media, verbal information is typically communicated to deaf or hard-of-hearing users by means of subtitles or sign language. Jain et al. (2018a) explore the potential for AR to support real-time captioning for deaf or hard-of-hearing users. AR can theoretically support captioning that is unobtrusive yet easily glanceable, allowing users to better focus on other important conversational cues. Extending this concept still further, Jain et al. (2018b) explore accessible captions using AR in moving contexts. While AR-based captions were found to generally assist in conversation, some participants found the captions distracting and an impediment to focusing on navigation. A number of informative design implications were also presented (Jain et al. 2018b): position text to avoid splitting attention, adapt text to background context, optionally disable transcription of wearer’s speech, provide additional contextual information (e.g. speaker name, location, etc.) and allow for customisation of caption appearance.
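The implication of positioning text to avoid splitting attention can be sketched as a simple clamping rule: keep the caption as close to the speaker’s direction as possible while staying within the display’s usable field of view. This is a hypothetical simplification, not the placement logic of Jain et al.:

```python
def place_caption(speaker_dir_deg: float,
                  fov_deg: float = 40.0,
                  margin_deg: float = 5.0) -> float:
    """Return the horizontal angle (degrees, 0 = straight ahead) at which to
    render a caption: at the speaker's direction when possible, otherwise
    clamped to the edge of the display FOV minus a safety margin, so the
    caption remains glanceable without the wearer losing the speaker."""
    half = fov_deg / 2.0 - margin_deg
    return max(-half, min(half, speaker_dir_deg))

print(place_caption(8.0))    # speaker inside FOV: caption follows them
print(place_caption(60.0))   # speaker outside FOV: caption pinned to the edge
```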
Aljowaysir et al. (2019) describe an AR application concept for supporting education, with some consideration of accessibility features such as the ability to provide real-time captions of the teacher’s speech and the ability to monitor student focus. Applying a slightly different approach, Peng et al. (2018) leveraged AR to present spatial speech bubbles as opposed to captions to support group conversations for deaf and hard-of-hearing users. Vinayagamoorthy et al. (2019) explore the design space around sign language interpreters presented via AR synchronised with broadcast TV. Vinayagamoorthy et al. noted that results were largely consistent with established understanding for presenting sign language interpreters within a broadcast but stress the need for personalisation given different individual preferences.
A survey conducted by Findlater et al. (2019) investigating the preferences of deaf and hard-of-hearing users as they relate to wearable assistive technologies serves to highlight significant perceived benefits associated with HMDs in providing captions. Interestingly, respondents were asked to create an imaginary ideal multi-modal set-up and among the potential combinations, the use of an HMD for visual feedback and a smartwatch for haptic feedback was the most favoured.
5.3 COGNITION: neurodiversity and users with cognitive impairments
Vona et al. (2020) explored the potential of a virtual assistant as a means to support individuals with a cognitive disability when engaging in Mixed Reality experiences. Vona et al. observed that “The impairments of [individuals with a cognitive disability] are incredibly varied, and therefore it is challenging to create a way to communicate with them that is effective for everyone”. The evidence for the benefits provided by the virtual assistant is somewhat inconclusive.
Keselj et al. (2021) demonstrated an AR application for teaching geometry with a range of accessibility features, including the ability to adjust contrast and font size and style. The adjustment of font style is particularly relevant to accommodating users with dyslexia who can encounter greater difficulty with reading some fonts than others.
AR has been investigated as a possible aid to people with aphasia by providing conversation cues (Williams et al. 2015). Using an HMD for this purpose was hypothesised by Williams et al. (2015) to support better focus than referring to more traditional external tools. The study results suggest that providing glanceable vocabulary support allowed users to maintain attention and achieve better engagement with the conversation partner.
5.4 MOVEMENT: users with balance and motor impairments
Malu and Findlater (2015) investigated the usability of Google Glass for people with upper-body mobility impairments. They found that the default input mechanism supported, whereby users touch an input surface on the sides of the head-mounted display, was difficult or impossible for a subset of motor-impaired users. Malu and Findlater noted that participants were positive about voice commands serving as an appropriate substitute for direct touch-based input on the HMD. Another positive observation from the study participants was the potential additional freedom and convenience afforded by having the device worn on the head, as opposed to a phone held in the hand (which can be difficult to pick up) or a laptop/tablet in the near vicinity (which can be easily dropped or knocked over). An alternative strategy where a separate touchpad could be attached to the body or wheelchair in a custom location was also explored. This strategy was shown to greatly improve usability by allowing users to place input surfaces within comfortable reach.
Alternative input schemes may also be necessary in AR to ensure they are accessible to a broad set of users. For example, Guerrero et al. (2020) offer voice and gaze-based interaction as an alternative to hand-based interaction. Seeking to support selection of out-of-reach objects, Morita et al. (2020) evaluated a virtual hand technique that combined head orientation and hand movement with respect to a tablet. Moving the head would adjust the projection area (projection-based AR) and then the tablet could be used to move the virtual hand within that projection area. Brain–computer interaction has also been proposed (Gang et al. 2019) as a potential additional mode in multi-modal interaction for AR.
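The general pattern running through these efforts, offering redundant input modalities so that users can rely on whichever ones they can use, can be sketched as a simple fallback resolver. This is a hypothetical illustration, not the mechanism of any cited system:

```python
def resolve_selection(inputs: dict, preference_order=("hand", "gaze", "voice")):
    """Return (modality, target) from the first available modality in the
    user's configured preference order. Modalities the user cannot operate
    simply never appear in `inputs`, so the next preference takes over."""
    for modality in preference_order:
        target = inputs.get(modality)
        if target is not None:
            return modality, target
    return None, None

# A user who cannot perform hand input still selects via gaze:
print(resolve_selection({"gaze": "button_ok", "voice": "button_ok"}))
```

The per-user `preference_order` is the important design choice: the same application serves hand-first, gaze-first or voice-first users without separate code paths.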
Seaborn et al. (2016) also proposed a set of guidelines for making Mixed Reality games more accessible to users with powered wheelchairs. Although the concepts examined by Seaborn et al. largely involved non-immersive location-based Mixed Reality, many of the guidelines can be extrapolated more broadly. Significant among these guidelines are the need to consider how control and attention should not be monopolised given the user’s need to maintain control of their chair, as well as the potential benefits of, and need for, multi-modal presentation of information.
By preserving the view of the physical world, AR may in some cases be slightly less problematic for users with balance or mobility issues that affect moving about in space. Nevertheless, similar to their work in VR (Ferdous et al. 2016), Ferdous (2017) propose including a virtual static rest frame in AR to support individuals with balance impairments.
5.5 AR in health care
Given the promising applications of VR in health care, there has been an increasing number of studies seeking to apply the unique potential benefits of AR. A comprehensive review of health care applications of AR is beyond the scope of this paper although we briefly highlight several illustrative examples in this section.
Richard et al. (2007) suggest that some children may be too sensitive to the effects of VR, and that AR offers a viable alternative, providing safer and more intuitive interaction with virtual objects in the real world. Richard et al. conducted a study using projection-based AR and found that children with cognitive impairments engaged well with this method and improved their perception and understanding of objects through the virtual presentation in AR.
Google Glass has been proposed as a potential aid for people with Parkinson’s disease (McNaney et al. 2014) as well as an assistive device for older adults (Kunze et al. 2014). Also using Google Glass, Voss et al. (2016) demonstrated an AR system called Superpower Glass for behavioural therapy in children with autism spectrum disorder (ASD). It uses the camera in Google Glass to capture and recognise facial expressions during conversation and provides visual and auditory cues to enhance the social understanding of children with ASD. Voss et al. found that children with ASD improved their eye contact after using the Superpower Glass.
SayWAT (Boyd et al. 2016) is a system developed for autistic adults providing communication skills training, in particular around the use of prosody. Atypical prosody, such as sarcasm, may influence the real meaning of a sentence, but adults with autism have difficulty interpreting such variations. SayWAT aims to increase awareness of atypical prosody in a conversation based on detected variations in volume and pitch. The tool provides visual and audio cues to the user using Google Glass and triggers alerts indicating when the user’s volume or pitch exceeds a threshold. Boyd et al. (2016) evaluated SayWAT and reported that the tool was effective in volume training and helped users identify non-verbal behaviour or emotions of others.
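SayWAT’s threshold-based alerting strategy can be approximated as follows. This is an illustrative sketch only; the limits, units and per-frame logic are assumptions, not SayWAT’s actual parameters:

```python
def prosody_alerts(frames, vol_limit_db: float = 70.0, pitch_limit_hz: float = 300.0):
    """Given per-frame (volume_db, pitch_hz) measurements, return a list of
    (frame_index, cue) alerts whenever a measurement exceeds its limit.
    Each frame yields at most one cue, with volume checked first."""
    alerts = []
    for i, (vol, pitch) in enumerate(frames):
        if vol > vol_limit_db:
            alerts.append((i, "volume"))
        elif pitch > pitch_limit_hz:
            alerts.append((i, "pitch"))
    return alerts

# Frame 1 is too loud, frame 2 too high-pitched:
print(prosody_alerts([(60.0, 200.0), (75.0, 200.0), (60.0, 350.0)]))
```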
5.6 Commercial efforts
The capabilities afforded by AR have also spawned commercial efforts seeking to deliver AR-based accessibility tools. IrisVision (IrisVision 2020) is a video see-through headset that comprises a magnifier, screen reader and other image adjustment functions designed to help people with low sight regain some vision. Relumino (Samsung 2017), a product developed by Samsung, employs a similar strategy to IrisVision, but leverages a smartphone to deliver a lower cost video see-through HMD. Zolyomi et al. (2017) explore the various aspects of using head-worn sight aids.
The HoloLens 2 is an OST AR headset produced by Microsoft. The settings menu contains an “Ease of Access” settings group; however, at the time of writing, this sub-menu provides only limited configurable options. Users can currently adjust menu text size; show/hide scroll bars; configure notification timings; set light/dark mode; apply colour filters (to accommodate colour blindness); and combine left and right audio channels into one to “make your device easier to hear”. The Universal Windows Application specification refers to a wider range of accessibility features (Microsoft 2019), but these are not currently available on the HoloLens 2.
6 The metaverse
The term “metaverse” has recently gained popularity as companies including Meta, Epic Games and Microsoft seek to develop immersive virtual worlds in which users can socialise and co-work (Xu et al. 2022). The metaverse describes a broadly encompassing form of immersive content including aspects of entertainment and socialisation (Goldberg and Tetreault 2021), as well as work, education, health care and service delivery (Yang et al. 2022; Yue 2022). Current manifestations of the metaverse are largely limited to rudimentary social and co-working experiences, however, the long-term vision held by some proponents of the metaverse is a tightly integrated platform offering a complete virtual parallel of the physical world (Park and Kim 2022).
VR and AR devices provide the means to access and interact within the metaverse and there are features of the metaverse that also exist within conventional immersive experiences. This suggests that many existing accessibility efforts for general immersive content are relevant and may be readily applicable to the metaverse. However, there are other aspects of this new medium which have demanded, or are likely to demand, specific research attention in terms of promoting accessibility. Three key factors make user engagement in the metaverse distinct from conventional AR or VR applications: (i) the close interaction of multiple users and psychosocial issues ensuing from this interaction; (ii) the transactional quality of some interactions; and (iii) the exposure of users to end-user developed content.
The metaverse brings multiple users together into the same environment with the expectation that these users will interact with each other. Some of these interactions may be transactional, for example, in the context of a user seeking medical advice (Yang et al. 2022), in contrast to conventional social VR and AR experiences that are more commonly collaborative or casually competitive. The real-time and occasionally transactional nature of these interactions exposes unique considerations around both physical and verbal representation, timeliness of accessible interventions, and the difficulty of dynamically adapting content in a shared virtual experience. Briefly expanding on point (iii) above, experiences encountered in the metaverse may also be produced by other users as opposed to developers. If supporting developers to improve the accessibility of their content is already challenging, there are additional obstacles to providing end-users with the tools, understanding and motivation necessary to ensure their developed content is accessible.
The social aspect of the metaverse overlaps with the sub-genre of VR experiences or applications referred to as Social VR. Social VR (SVR) is defined as “a web-based social interaction paradigm, mediated by immersive technologies and taking place in predesigned three-dimensional virtual worlds where individuals, represented by an avatar, may engage in real-time interpersonal conversation and shared activities” (Dzardanova et al. 2018).
SVR platforms typically also involve some form of gameplay and/or gamification. This aspect gives rise to the potential exposure of users to content generated by other users: ranging from the self-creation of personal avatars to the self-building of completely new environments and user experiences. SVR platforms are conceptualised around the idea of user-creators, with user-generated content driving platform growth much as it does on established social networks. SVR presents numerous opportunities for social experiences and collaboration in a range of areas, including SVR cinema, live concerts as well as shared participation in religious worship or cultural events, such as museum and gallery exhibitions. SVR is also an increasingly popular means of socialising and Sykownik et al. (2022) describe how SVR can deliver authentic social experiences.
Since SVR and the metaverse are relatively new phenomena, prior work on inclusive design of such platforms is rather limited. Maloney et al. (2021) specifically highlight the need for focus on the research question “What are the considerations for designing a social environment inclusively for marginalized groups of all kinds?” Reflecting on the limitations of their study of self-representation in SVR, Freeman and Maloney (2021) call for future work focusing on “recruiting a more diverse sample (e.g. more disabled users, LGBTQ, minority and disabled users) to further investigate what type of behaviours and identity practices are more easily affected by their embodiment in social VR”. Seigneur and Choukou (2022) report how there is currently a dearth of digital assets on metaverse platforms for self-representation of users with a disability, e.g. wheelchairs or prosthetics for integration with avatars.
Baker et al. (2020) evaluated the use of VR in residential aged care facilities (RACF) and observed that “When asked about what sort of experiences they would like to have in VR in the future, many participants also spoke about the potential to have social experiences with people outside the RACF”. In another study by Baker et al. (2019), several participants commented on the potential benefits of SVR and the complete control it affords over self-representation in terms of mitigating the negative effects of loss of self-esteem among older adults and its potential impact on self-imposed social isolation. Examining the potential use of Avatar Mediated Communication (AMC) by older adults, Baker et al. (2021) identified four key themes in the user study data: (i) AMC reduces inhibitions; (ii) AMC allows welcome control over the degree to which one reveals their true appearance; (iii) AMC helped establish social connections over distance; and (iv) current AMC inadvertently obscures some emotions and non-verbal cues.
Shao and Lee (2020) conducted a study with older adults assessing their impressions and anticipated use of Social VR, and identified an expectation that Social VR may enhance social connectedness as well as be useful in addressing medical needs. Freeman and Acena (2021) note how attending immersive events such as concerts and talk shows as well as professional workshops delivered in Social VR represent a particularly novel social activity of significant value to those with mobility restrictions whether due to financial constraints or physical disability.
These various studies highlight aspects of ensuring an accessible metaverse that are not necessarily covered by the accessibility needs of users engaged in isolated AR or VR experiences. Given the nascent stage of such platforms, there is still a relatively poor understanding of the major obstacles encountered by disabled users and even less research offering potential solutions.
7 Other relevant non-immersive efforts
VR and AR expose many of the same challenges to accessibility encountered in other human–computer interaction settings. In this section, we review research and commercial efforts in non-immersive contexts that are, nevertheless, informative to Inclusive Immersion. Since many of these application settings represent large research areas in their own right, the review presented here is scoped to be illustrative rather than exhaustive.
7.1 Pointing assistance for users with motor disabilities
Pointing and selecting is a common user interface task in conventional computing and is likely to also be core in VR and AR. Pointing can be particularly challenging for people with motor disabilities and so significant research effort has been invested in identifying suitable pointing and selection strategies for this user group. Although many of these strategies are designed for 2D user interfaces, there may be aspects of their design that can be transferred to immersive 3D interfaces.
Findlater et al. (2010) evaluated several alternative cursor designs referred to as “enhanced area cursors”. These cursors are designed to reduce the need for fine correction in pointing and selecting, particularly for small targets. PointAssist (Salivia and Hourcade 2013) is an adaptive technique designed to help people with motor impairments conduct pointing tasks using a mouse or touchpad. Salivia and Hourcade demonstrated that PointAssist could improve user accuracy in clicking, pressing and releasing the target. Adaptive Click-and-Cross (Li and Gajos 2014) is a similar assistance technique leveraging the ability to perform selections via crossing.
Reflecting on the design of PointAssist and assistive techniques more generally, Salivia and Hourcade (2013) highlight several aspects relevant to improving the accessibility of assistive software. First, assistive software should not interfere with the appearance or the regular operation of the original application. Second, assistive software should not require extra drivers, hardware or special training. Third, assistance should be target agnostic and activate only when needed. Finally, assistive software should be useful to a large population of users with motor disabilities and should accommodate individual abilities. These same principles are likely to also be relevant to supporting pointing and selection in immersive environments.
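The third principle, target-agnostic assistance that activates only when needed, can be sketched in the spirit of PointAssist, which infers that the user is homing in on a target from sustained slow pointer movement. The thresholds and window size below are illustrative assumptions, not the published parameters:

```python
def assisted_gain(recent_speeds_px_s, slow_threshold: float = 50.0,
                  window: int = 3, normal_gain: float = 1.0,
                  precise_gain: float = 0.5) -> float:
    """Return the cursor gain to apply on the next movement. When the last
    `window` speed samples are all below the slow-movement threshold, the
    user appears to be making fine corrections, so gain is reduced; no
    knowledge of target locations is required (target-agnostic)."""
    recent = recent_speeds_px_s[-window:]
    if len(recent) == window and all(s < slow_threshold for s in recent):
        return precise_gain
    return normal_gain

print(assisted_gain([200.0, 30.0, 20.0, 10.0]))  # homing in: reduced gain
print(assisted_gain([200.0, 300.0, 20.0]))       # still travelling: normal gain
```

Because the heuristic reads only pointer kinematics, it satisfies the other principles as well: it needs no extra hardware or drivers and leaves the host application’s appearance untouched.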
7.2 Accessibility in gaming
Gaming has grown over the past decades into a popular leisure activity and now captures large audiences. Some game developers have been particularly forward thinking in ensuring the broader audience is inclusive of disabled users. The 2020 gaming title, The Last of Us Part II, has been described as “the most accessible game ever” (Molloy and Carter 2020). It provides extensive accessibility features addressing a broad set of needs for users with different disabilities (PlayStation 2020). Visually impaired users can, for example, enable text to speech, adjust display contrast and increase the size of the heads-up display. Players with motor disabilities can enable auto-locking on to targets, modify actions requiring repeated button presses to use a button hold interaction instead and set an option to prevent the character from being able to fall off ledges. Other settings can be applied to adjust the difficulty such as skipping puzzles and remaining invisible while prone. The attention given to the accessibility needs of players of The Last of Us Part II is promising for future immersive gaming titles.
Several plugins are available on asset stores managed by Unity and Unreal offering functionality that streamlines the process of integrating accessibility features into games and applications. For example, Dislectek (Lowtek Games 2021) is a plugin developed by Lowtek Games providing simple to integrate text-to-speech functionality to support dyslexic users. UI Accessibility Plugin (UAP) (MetalPop Games 2021) by MetalPop Games is another plugin on the Unity asset store for easily integrating accessibility features into a UI for blind and visually impaired users.
In 2018, Xbox released the “Xbox Adaptive Controller” (Xbox 2020) targeted at users with limited mobility. The controller has large buttons and can be customised and extended (including with peripherals from Logitech (Logitech 2020)) to support game play using, for example, only one hand, one foot, the chin, the wrist or the elbow.
KINECTWheels (Gerling et al. 2015) is a toolkit extending the Microsoft Kinect sensor to detect movements of wheelchair players and make motion-based games accessible to these players. Gerling et al. applied the KINECTWheels toolkit in three motion-based games (Cupcake Heaven, Last Tank Rolling and Wheelchair Revolution) to explore the design space for wheelchair-based games. Last Tank Rolling enabled a wheelchair user to control a tank with the wheelchair and to collaborate with another non-disabled player who controls a virtual soldier such that they fight enemies together in the game (Gerling and Buttrick 2014). The game mechanics relate to the differing capabilities of the two users such that the tank has more defensive power but its maintenance relies on the soldier, and the soldier can move quickly but needs the protection of the tank. Cupcake Heaven was created for older adults who use a wheelchair regularly in daily life. In an evaluation study, older wheelchair users were found to be able to play the game, but the game controls induced high physical and cognitive loads. The games developed by Gerling et al. (2015) highlight the additional opportunities afforded by incorporating the player’s assistive device. Furthermore, Last Tank Rolling in particular illustrates how game mechanics can be designed to relate to the capabilities of the player in an engaging and positive way that fosters collaboration.
Second Life is a virtual social world offering various opportunities for users with disabilities (Smith 2012). For instance, the Virtual Ability Island is a virtual location in Second Life with wide ramps, bright and high-contrast symbols, and smooth footpaths and landscapes. As Smith describes, disabled users of Second Life can be classified into augmentationists and immersionists. Second Life allows augmentationists to present their authentic appearance using some unique elements, such as a wheelchair or dark glasses and a guide dog, to reflect their true appearance. Immersionists may instead choose an able-bodied avatar, allowing them to perform activities which are not possible in the real world. The diverging preference among disabled users suggests that applications involving avatars should facilitate self-representation when desired.
8 Synthesis of accessibility strategies
In this section, we synthesise the broad spectrum of efforts surveyed in Sects. 4, 5, 6 and 7 to compile a set of demonstrated strategies for improving accessibility in VR and AR. We categorise these strategies according to the dominant access barrier they address. This categorisation is based on core capabilities relevant to enjoying and interacting with immersive content. The four top-level capabilities include: (A) Perception, (B) Cognition, (C) Communication and (D) Movement.
Within (A) Perception, we provide further resolution with subcategories of: (A1) Seeing, (A2) Hearing and (A3) Touch. Similarly for (D) Movement, we provide subcategories of: (D1) Mobility Through Space, (D2) Use of Arms, (D3) Use of Hands and (D4) Movement of Head/Neck. This categorisation not only serves to give some structure to the list of strategies available but also helps to highlight where there are largely unaddressed gaps. This compiled set of strategies is summarised in Table 3.
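For readers building tooling around this categorisation, the taxonomy lends itself to a simple lookup structure. The sketch below is a convenience illustration only, not part of the survey’s method:

```python
# The capability taxonomy of Sect. 8 as a code-to-label lookup.
CAPABILITIES = {
    "A": ("Perception", {"A1": "Seeing", "A2": "Hearing", "A3": "Touch"}),
    "B": ("Cognition", {}),
    "C": ("Communication", {}),
    "D": ("Movement", {"D1": "Mobility Through Space", "D2": "Use of Arms",
                       "D3": "Use of Hands", "D4": "Movement of Head/Neck"}),
}

def label(code: str) -> str:
    """Resolve a category code such as 'A1' or 'C' to its label."""
    top_label, subcategories = CAPABILITIES[code[0]]
    return subcategories[code] if len(code) > 1 else top_label

print(label("A1"), "|", label("D4"), "|", label("C"))
```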
Table 3 illustrates that (A1) Seeing is one of the most widely explored and addressed access barriers in the context of VR and AR accessibility. The subcategories within (D) Movement are also generally well explored, aside from (D4) Movement of Head/Neck. By contrast, there were only two tangentially relevant papers related to (C) Communication and no relevant research associated with (A3) Touch. This deficiency may reflect the existence of effective workarounds for users with such access difficulties, or that any difficulties encountered are relatively minor. Nevertheless, targeted research should ascertain whether this is in fact the case.
In Sect. 8.1, we describe six common design principles that connect the various strategies listed in Table 3. These underlying principles are indicated in Table 3 with the tags DP1–6. These six principles (described in more detail in Sect. 8.1) are: (DP1) Output redundancy; (DP2) Input redundancy; (DP3) Integration of Assistive Technologies (AT); (DP4) Customisability; (DP5) Enhanced assistance; and (DP6) Inclusive design.
8.1 Common design principles
When viewed in combination, the accessibility strategies presented in Table 3 exhibit a degree of commonality in terms of the underlying design approach. We refer to these common approaches as design principles, that is, principles of design that are known to be effective in enhancing accessibility in VR and AR. We identified six design principles that generalise across the strategies listed. These principles are illustrated in Fig. 2 and described in Table 4. Several of these principles share common traits with the design strategies outlined by Mott et al. (2020) and the principles described by Dombrowski et al. (2019). We hope that these high-level design principles, viewed in combination with the specific strategies synthesised from the literature, may offer a useful reference for accessibility efforts in this emerging space.
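As a concrete illustration of how two of these principles might combine in practice, the sketch below dispatches a single semantic game event to whichever output modalities a user has enabled, pairing output redundancy (DP1) with user-level customisability (DP4). The channel names and the settings dictionary are hypothetical, not drawn from any surveyed system.

```python
# A minimal sketch of output redundancy (DP1) with customisability (DP4):
# one semantic event, several optional output channels. Channel names
# and default values are illustrative only.

def dispatch_cue(event: str, settings: dict) -> list:
    """Render one event on every modality the user has enabled."""
    outputs = []
    if settings.get("visual", True):
        outputs.append(f"[visual] flash icon for '{event}'")
    if settings.get("audio", True):
        outputs.append(f"[audio] play earcon for '{event}'")
    if settings.get("haptic", False):
        outputs.append(f"[haptic] pulse controller for '{event}'")
    if settings.get("captions", False):
        outputs.append(f"[caption] show text: '{event}'")
    return outputs

# A deaf or hard-of-hearing user might disable audio and enable captions
# and haptics instead; the semantic event itself is unchanged.
prefs = {"audio": False, "captions": True, "haptic": True}
cues = dispatch_cue("enemy approaching", prefs)
assert len(cues) == 3 and not any(c.startswith("[audio]") for c in cues)
```

The design choice worth noting is that redundancy is expressed at the level of the event, not the renderer: adding a new modality does not require changing any game logic that raises events.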
9 Major challenges for Inclusive Immersion
Our comprehensive survey of the literature highlighted three key outstanding challenges that hinder the realisation of Inclusive Immersion: i) diversity in user needs; ii) lack of guidance and tools for developers; and iii) difficulty in conducting empirical research. Each of these challenges is detailed in the following subsections, and where appropriate we also propose potential solutions and avenues for further research.
9.1 Diversity in user needs
The extensive range of capabilities among disabled users frustrates efforts to apply a “one size fits all” strategy in immersive application development. The principle of “equitable use”, adapted by Dombrowski et al. (2019) from Universal Design, is difficult to achieve in practice when disabilities in the user population vary so extensively in severity and type. Designing for diverse needs is further challenged by the fact that disabilities often co-occur: in the survey performed by Wong et al. (2017), 29% of respondents reported having two or more disabilities, while this proportion was higher again, at 58%, in the survey by Garaj et al. (2019). Many of the accessibility strategies summarised in Table 3 demand that the user possesses a specific alternative capability, e.g. audio descriptions can address sight-related difficulties but demand functional hearing. Such strategies may be rendered ineffective when users experience multiple forms of access difficulty. Further complicating the establishment of clear guidance is the fact that research is typically conducted with a specific user group or a panel of users with diverse needs, meaning that outcomes are difficult to generalise.
More detailed studies covering the spectrum of disability are required to more accurately inform accessibility strategies in VR and AR. Some disabilities and impairments have received more attention in the literature than others; for example, there are multiple studies examining the requirements of blind and low-vision users. By contrast, relatively little attention has been given to users with mobility impairments and how input control schemes or content might be modified to alleviate the barriers they face. Accessing a sufficient number of users with a given disability is a challenge for researchers. Research panels assembled by disability charities or organisations are a potential source of study participants. Online studies, leveraging web-based VR and AR frameworks, may increasingly be able to reach a broader sample of disabled users.
Immersive technologies can be made more inclusive of different user needs and abilities by supporting customisation. However, customisation introduces a trade-off with complexity. Although commenting more generally on information and communication technologies, Fuglerud (2014) notes that “adding flexibility and multimodality often means adding functionality, and this usually adds to the complexity of the interface”. Introducing redundant accessories, supplementary cues and multiple configurations may ultimately distract users’ attention, interfere with their experience and make immersive technologies more complicated and exclusionary. Even for users who need assistance in a virtual environment, the customisation process should be easy and the assistive function should be optional and easily toggled (Teófilo et al. 2016). As discussed earlier, there has been some success in supporting automatic personalisation. However, impairments of different types and degrees of severity make robust personalisation difficult to achieve.
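One way to reconcile customisation with simplicity, in the spirit of the “optional and easily toggled” recommendation of Teófilo et al. (2016), is to keep assistive features off by default and gather them into a single profile. The feature names and defaults below are hypothetical, offered only to make the trade-off concrete.

```python
# A sketch of optional, easily toggled assistive features: defaults keep
# the interface simple for users who never open the accessibility menu,
# while a single action enables any one feature. Feature names are
# illustrative, not drawn from any surveyed system.

DEFAULTS = {"high_contrast": False, "audio_description": False,
            "seated_mode": False}

class AccessibilityProfile:
    def __init__(self, overrides=None):
        # Copy defaults so toggling never mutates the shared baseline.
        self.settings = {**DEFAULTS, **(overrides or {})}

    def toggle(self, feature: str) -> bool:
        """Flip one assistive feature on or off with a single action."""
        self.settings[feature] = not self.settings[feature]
        return self.settings[feature]

profile = AccessibilityProfile({"seated_mode": True})
assert profile.settings["high_contrast"] is False   # unobtrusive default
assert profile.toggle("high_contrast") is True      # one-step opt-in
```

Because the profile is a plain mapping, it could also be saved and reapplied automatically, which is the minimal form of the automatic personalisation discussed above.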
9.2 Lack of guidance and tools for developers
Disseminating a general awareness of accessibility considerations and providing simple-to-use tools and guidelines remains a significant obstacle to promoting Inclusive Immersion. Developers of immersive applications may be largely unfamiliar with the benefits of and strategies for improving accessibility. This lack of awareness of the exclusionary effects of disability can likely be partly addressed with detailed and generalisable design guidelines. Web developers, for example, can obviate the need to understand how hearing or visually impaired users experience websites and instead refer to the W3C’s Web Content Accessibility Guidelines (W3C 2018). The currently disparate device and software space for VR and AR means there are few generalisable design guidelines for immersive content in general, let alone specific guidelines for addressing accessibility. Zhao et al. (2019a) found that the developers who participated in their study using the SeeingVR plugin were not aware of any accessibility guidelines for designing VR content. Interviews with developers conducted by Lukava et al. (2022) highlight a similar lack of understanding of accessibility needs and a desire for clear requirements and guidelines. As Lukava et al. note, there is currently a fundamental lack of awareness even at the most basic level, with developers complaining of “being confused by which questions should be asked to define accessibility needs, and understanding how accessibility modification should be incorporated into the flow of a project”. The surveys performed by Wong et al. (2017) and Garaj et al. (2019) represent proactive steps to first identify the shortcomings in accessibility given the status quo of VR and AR. Mott et al. (2019), Dombrowski et al. (2019) and this paper help to establish a conceptual structure and standard terminology upon which more extensive developer guidance might be founded.
An important element of consolidating useful guidance will be close engagement with the disabled community, in recognition of the community’s principle of “Nothing About Us Without Us”. A well-packaged set of guidelines that developers can refer to will remove some of the effort required in designing and implementing accessible features. Importantly, a generally accepted set of guidelines can also be used by organisations outsourcing development to specify to contracted parties which features should be supported. Nevertheless, it is important to be mindful of an inherent tension: practical tools and guidance for developers may enhance accessibility but can inadvertently encourage tokenistic efforts and provide an excuse for not directly including disabled users in the design process. We acknowledge also the potentially contradictory message conveyed in this paper by us, as authors, who work with the disabled community but are not ourselves disabled.
Guidance should also be complemented by developer tools that are simple to use and integrate. As demonstrated by the range of systems and solutions presented in the work covered by this survey, there are significant opportunities for improving the accessibility of immersive experiences. More commercial and research effort will hopefully continue to expand on this growing body of work. It would be particularly useful if proven systems and solutions were packaged into toolkits or downloadable assets that can be readily incorporated into projects with little developer effort. Interfaces and interaction methods that support customisation and adaptation are likely to be fruitful but require more research attention and are potentially difficult to generalise. Advanced tools that support hyperpersonalisation to accommodate the specific needs of an individual user are also a promising avenue for exploration.
A deep understanding of the condition and needs of a disabled user, as they relate to a particular application, can be hard to obtain, and developers currently receive little assistance in developing this understanding. One proven method for delivering a partial appreciation of the potential effects of an impairment is the use of capability simulators (Clarkson et al. 2007). In the context of the Inclusive Design Toolkit (Clarkson et al. 2007), these tools are used to help designers empathise with users experiencing capability loss. In the same way, it may be feasible to create a universal disability simulator that can at least partly elicit in developers an appreciation of the effects of capability loss in a VR or AR setting. Such a simulator may have pragmatic benefits by enabling rapid review at the outset and throughout all stages of designing immersive content. As Clarkson et al. note, however, such tools “should never be considered as a replacement for involving real people with such losses”.
Autismity (Autismity 2020) and SIMVIZ (Ates et al. 2015) are two VR simulation tools that help to deliver a better understanding of autistic individuals and people with vision impairments, respectively. Other types of VR content, such as the collaborative game created by Gerling et al. (2015) involving users in wheelchairs, may also be helpful in bridging gaps in understanding. More generally, there is potential value in building a comprehensive empathy tool based on simulation of the experience of a broad spectrum of impairments in VR. This universal impairment simulator could provide developers with a window into the experience of impaired conditions and a better appreciation of the barriers encountered in VR and AR environments.
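To make the simulator idea concrete, a vision-impairment module of such a tool might post-process each rendered frame before display. The toy sketch below degrades a small grayscale frame with a box blur (approximating reduced acuity) and a contrast compression; the parameter values are illustrative and not calibrated to any clinical condition, and a real system such as SIMVIZ operates on live see-through video rather than a pixel grid.

```python
# A toy sketch of a capability-simulator filter: approximate reduced
# visual acuity with a 3x3 box blur, then reduce contrast by pulling
# every value towards mid-grey. Frames are lists of rows of floats in
# [0, 1]; blur size and contrast factor are illustrative only.

def simulate_low_vision(frame, contrast=0.5):
    """Blur a grayscale frame, then compress its contrast."""
    h, w = len(frame), len(frame[0])
    blurred = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [frame[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            blurred[y][x] = sum(vals) / len(vals)
    # Contrast reduction: scale each value's distance from mid-grey.
    return [[0.5 + (v - 0.5) * contrast for v in row] for row in blurred]

# A hard black/white edge becomes a soft, low-contrast gradient.
frame = [[0.0, 0.0, 1.0, 1.0]] * 4
out = simulate_low_vision(frame)
assert max(map(max, out)) - min(map(min, out)) < 1.0  # range shrinks
```

Running such a filter over a developer's own scene offers a quick, if crude, review pass; as noted above, it cannot substitute for evaluation with disabled users.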
9.3 Difficulty in conducting empirical research
The conduct of user research with disabled users is challenged by both ethical and access constraints. Ethically, it is undesirable to perform user research that subjects potentially vulnerable users to features known to be ill-favoured or which might cause significant discomfort. This precludes some established research methodologies involving comparison with a baseline set-up. As Munteanu et al. (2015) observe, even just inviting disabled users to attend a laboratory study introduces potential ethical concerns related to exposure to risks during travel.
From an access standpoint, it can be difficult to gather a representative sample of users, even if constrained to a particular disability or impairment. Consider as an example the well-executed user study performed by Wedoff et al. (2019) involving 34 participants with visual impairments playing a VR ball game. Participants covered the spectrum from totally blind to vision in one eye only. Such participant group profiles are often inescapable in experimental work with disabled user groups but may potentially mask the requirements of a specific subset of users with particular needs. It can therefore be difficult to generalise some of the findings obtained due to small sample sizes or mixed capability participant groups. This presents a significant challenge in terms of coalescing specific accessibility strategies into detailed guidelines.
10 Conclusions
In this paper, we have presented an introduction to the concept of Inclusive Immersion and placed it within the context of the broader research and commercial landscape relevant to accessible Virtual and Augmented Reality and the emerging metaverse. The two objectives of this paper have been to: (i) capture and synthesise this landscape into a set of emerging strategies for inclusivity; and (ii) establish a helpful reference for developers and designers building immersive content. The core survey, providing the first major contribution of this paper, is presented in Sects. 4, 5, 6 and 7. An extensive collection of research and commercial efforts addressing both Virtual and Augmented Reality was reviewed, along with notable research on the emerging application of VR and AR in the metaverse. Finally, non-immersive efforts to enhance the accessibility of computer systems that can be generalised for application in VR and AR were reviewed in Sect. 7.
This survey formed the basis for the two subsequent major contributions of this paper: (i) a synthesis of demonstrated strategies for improving the accessibility of Virtual and Augmented Reality content (presented in Sect. 8); and (ii) an identification of the key barriers to more accessible engagement with immersive technologies (presented in Sect. 9). The list of demonstrated strategies presented in Sect. 8 represents the first effort, to our knowledge, to consolidate accessibility strategies across the broad spectrum of potential access difficulties. On this basis, we anticipate that this list, and the common design principles that underlie it, can provide a valuable reference for developers when they consider how their immersive experiences could be made more inclusive. The set of three challenges identified in Sect. 9 is a call to action for the research community. We identify the key shortcomings of the current state of research that emerge from examining the body of work in totality and propose several promising avenues for further research. It is hoped that this survey, with its synthesised challenges and accessibility strategies, ultimately provides a useful foundation for advancing work in the area of Inclusive Immersion.
Data availability
All data generated or analysed during this study are included in this published article and its supplementary information files.
Notes
The complete relevancy and categorisation assignments for the 177 papers captured by the Scopus search are summarised in the supplementary material.
References
Achanccaray D, Pacheco K, Carranza E, et al. (2018) Immersive virtual reality feedback in a brain computer interface for upper limb rehabilitation. In: 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp 1006–1010, https://doi.org/10.1109/SMC.2018.00179
Agulló B, Montagud M, Fraile I (2019) Making interaction with virtual reality accessible: rendering and guiding methods for subtitles. Artif Intell Eng Des Anal Manuf 33(4):416–428. https://doi.org/10.1017/S0890060419000362
Ahmetovic D, Bernareggi C, Keller K, et al. (2021) MusA: artwork accessibility through augmented reality for people with low vision. In: Proceedings of the 18th International Web for All Conference, W4A 2021, https://doi.org/10.1145/3430263.3452441
Albouys-Perrois J, Laviole J, Briant C, et al. (2018) Towards a multisensory augmented reality map for blind and low vision people: a participatory design Approach. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Montreal QC, Canada, CHI ’18, pp 1–14, https://doi.org/10.1145/3173574.3174203
Aljowaysir N, Ozdemir T, Kim T (2019) Differentiated learning patterns with mixed reality. In: 2019 IEEE Games, Entertainment, Media Conference, GEM 2019. https://doi.org/10.1109/GEM.2019.8811558
Arafat I, Ferdous S, Quarles J (2016) The effects of cybersickness on persons with multiple sclerosis. In: Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST, pp 51–59, https://doi.org/10.1145/2993369.2993383
Ates HC, Fiannaca A, Folmer E (2015) Immersive simulation of visual impairments using a wearable see-through display. In: Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction – TEI ’15. ACM Press, Stanford, California, USA, pp 225–228, https://doi.org/10.1145/2677199.2680551
Autismity (2020) Autismity–The Autism Simulator. https://theautismsimulator.com/
Bailey B, Bryant L, Hemsley B (2021) Virtual reality and augmented reality for children, adolescents, and adults with communication disability and neurodevelopmental disorders: a systematic review. Rev J Autism Dev Disorders. https://doi.org/10.1007/s40489-020-00230-x
Baker S, Kelly RM, Waycott J et al (2019) Interrogating social virtual reality as a communication medium for older adults. Proc ACM Hum Comput Interact. https://doi.org/10.1145/3359251
Baker S, Waycott J, Robertson E et al (2020) Evaluating the use of interactive virtual reality technology with older adults living in residential aged care. Inf Process Manag 57(3):102105. https://doi.org/10.1016/j.ipm.2019.102105
Baker S, Waycott J, Carrasco R, et al. (2021) Avatar-mediated communication in social VR: an in-depth exploration of older adult interaction in an emerging communication platform. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI ’21, https://doi.org/10.1145/3411764.3445752
BBC R&D (2020) 360 Video and Virtual Reality. https://www.bbc.co.uk/rd/projects/360-video-virtual-reality
Benham S, Kang M, Grampurohit N (2019) Immersive virtual reality for the management of pain in community-dwelling older adults. OTJR Occup Particip Health 39(2):90–96. https://doi.org/10.1177/1539449218817291
Bertiz A (2019) 5 VR games considerate of people with disabilities. https://www.keengamer.com/articles/news/5-vr-games-considerate-of-people-with-disabilities/
Bertram C, Evans M, Javaid M, et al. (2013) Sensory augmentation with distal touch: The tactile helmet project. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 8064 LNAI:24–35. https://doi.org/10.1007/978-3-642-39802-5_3
Bigham JP, Jayant C, Ji H, et al. (2010) VizWiz: nearly real-time answers to visual questions. In: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology. Association for Computing Machinery, New York, NY, USA, UIST ’10, pp 333–342, https://doi.org/10.1145/1866029.1866080
Blaha J, Gupta M (2014) Diplopia: a virtual reality game designed to help amblyopics. In: 2014 IEEE Virtual Reality (VR), pp 163–164, https://doi.org/10.1109/VR.2014.6802102
Boger J, Eisapour M, Cao S, et al. (2018) Participatory design of a virtual reality exercise for people with mild cognitive impairment. In: Conference on Human Factors in Computing Systems–Proceedings, https://doi.org/10.1145/3170427.3174362
Boyd L (2019) Designing sensory-inclusive virtual play spaces for children. In: Proceedings of the 18th ACM International Conference on Interaction Design and Children, IDC 2019, pp 446–451, https://doi.org/10.1145/3311927.3325315
Boyd L, Wasserman B, Hayes G, et al. (2019) Paper prototyping comfortable VR play for diverse sensory needs. In: Conference on Human Factors in Computing Systems–Proceedings, https://doi.org/10.1145/3290607.3313080
Boyd LE, Rangel A, Tomimbang H, et al. (2016) SayWAT: augmenting face-to-face conversations for adults with autism. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM, San Jose California USA, pp 4872–4883, https://doi.org/10.1145/2858036.2858215
Boyd LE, Gupta S, Vikmani SB, et al. (2018) vrSocial: toward immersive therapeutic VR systems for children with autism. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems–CHI ’18. ACM Press, Montreal QC, Canada, pp 1–12, https://doi.org/10.1145/3173574.3173778
Bozgeyikli L, Bozgeyikli E, Clevenger M, et al. (2014) VR4VR: Towards vocational rehabilitation of individuals with disabilities in immersive virtual reality environments. In: 2014 2nd Workshop on Virtual and Augmented Assistive Technology (VAAT), pp 29–34, https://doi.org/10.1109/VAAT.2014.6799466
Brown A, Patterson J (2017) Designing subtitles for 360° content. https://www.bbc.co.uk/rd/blog/2017-03-subtitles-360-video-virtual-reality
Brown A, Turner J, Patterson J, et al. (2018) Exploring subtitle behaviour for 360° video
Brulé E, Tomlinson BJ, Metatla O, et al. (2020) Review of quantitative empirical evaluations of technology for people with visual impairments. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Honolulu, HI, USA, CHI ’20, pp 1–14, https://doi.org/10.1145/3313831.3376749
Bryant L, Brunner M, Hemsley B (2020) A review of virtual reality technologies in the field of communication disability: implications for practice and research. Disabil Rehabil Assist Technol 15(4):365–372. https://doi.org/10.1080/17483107.2018.1549276
Cassidy B, Read J, MacKenzie I (2019) An evaluation of radar metaphors for providing directional stimuli using non-verbal sound. In: Conference on Human Factors in Computing Systems–Proceedings, https://doi.org/10.1145/3290605.3300289
Chen CC, Lee SH, Wang WJ, et al. (2017) The changes of improvement-related motor kinetics after virtual reality based rehabilitation. In: 2017 International Conference on Applied System Innovation (ICASI), pp 683–685, https://doi.org/10.1109/ICASI.2017.7988517
Ciccone B, Bailey S, Lewis J (2021) The next generation of virtual reality: recommendations for accessible and ergonomic design. Ergon Des. https://doi.org/10.1177/10648046211002578
Clarkson J, Coleman R, Hosking I et al (eds) (2007) Inclusive design toolkit. University Of Cambridge, Cambridge
Climent M, Soler-Vilageliu O, Vila I et al (2021) VR360 subtitling: requirements, technology and user experience. IEEE Access 9:2819–2838. https://doi.org/10.1109/ACCESS.2020.3047377
Cook D, Dissanayake D, Kaur K (2019) Virtual reality and older hands: dexterity and accessibility in hand-held VR control. In: ACM International Conference Proceeding Series, pp 147–151, https://doi.org/10.1145/3328243.3328262
Corriveau Lecavalier N, Ouellet E, Boller B et al (2020) Use of immersive virtual reality to assess episodic memory: a validation study in older adults. Neuropsychol Rehabil 30(3):462–480. https://doi.org/10.1080/09602011.2018.1477684
Coughlan JM, Miele J (2017) AR4VI: AR as an accessibility tool for people with visual impairments. In: 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), pp 288–292, https://doi.org/10.1109/ISMAR-Adjunct.2017.89
D’Agnano F, Balletti C, Guerra F, et al. (2015) Tooteko: a case study of augmented reality for an accessible cultural heritage. Digitization, 3D printing and sensors for an audio-tactile experience. In: International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences–ISPRS Archives, pp 207–213, https://doi.org/10.5194/isprsarchives-XL-5-W4-207-2015
Davison SMC, Deeprose C, Terbeck S (2018) A comparison of immersive virtual reality with traditional neuropsychological measures in the assessment of executive functions. Acta Neuropsychiatrica 30(2):79–89. https://doi.org/10.1017/neu.2017.14
Devigne L, Babel M, Nouviale F, et al. (2017) Design of an immersive simulator for assisted power wheelchair driving. In: 2017 International Conference on Rehabilitation Robotics (ICORR), pp 995–1000, https://doi.org/10.1109/ICORR.2017.8009379
Di Luca M, Seifi H, Egan S, et al. (2021) Locomotion vault: the extra mile in analyzing VR locomotion techniques. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 128, Association for Computing Machinery, New York, NY, USA, p 1–10, https://doi.org/10.1145/3411764.3445319
Dick E (2021) Principles and policies to unlock the potential of AR/VR for equity and inclusion. Technical report, information technology and innovation foundation, https://itif.org/publications/2021/06/01/principles-and-policies-unlock-potential-arvr-equity-and-inclusion
Didehbani N, Allen T, Kandalaft M et al (2016) Virtual reality social cognition training for children with high functioning autism. Comput Hum Behav 62:703–711. https://doi.org/10.1016/j.chb.2016.04.033
Do Amaral W, De Martino J (2010) Towards a transcription system of sign language for 3d virtual agents. In: Innovations in Computing Sciences and Software Engineering, pp 85–90, https://doi.org/10.1007/978-90-481-9112-3_15
Do Amaral W, De Martino J, Angare L (2011) Sign language 3D virtual agent. In: IMSCI 2011 - 5th International Multi-Conference on Society, Cybernetics and Informatics, Proceedings, pp 93–97, https://www.scopus.com/inward/record.uri?eid=2-s2.0-84896289883&partnerID=40&md5=caa30f732eca354cd7f3b6eb46154188
Dombrowski M, Smith PA, Manero A et al (2019) Designing inclusive virtual reality experiences. In: Chen JY, Fragomeni G (eds) Virtual, augmented and mixed reality multimodal interaction, vol 11574. Springer International Publishing, Cham, pp 33–43
Dzardanova E, Kasapakis V, Gavalas D (2018) Social virtual reality. In: Lee N (ed) Encyclopedia of computer graphics and games. Springer International Publishing, Cham, pp 1–3
Epelde G, Valencia X, Carrasco E et al (2013) Providing universally accessible interactive services through TV sets: implementation and validation with elderly users. Multimed Tools Appl 67(2):497–528. https://doi.org/10.1007/s11042-011-0949-0
Evangelista Belo JM, Feit AM, Feuchtner T, et al. (2021) XRgonomics: facilitating the creation of ergonomic 3d interfaces. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI ’21, pp 1–11, https://doi.org/10.1145/3411764.3445349
Ferdous SMS (2017) Improve accessibility of virtual and augmented reality for people with balance impairments. In: 2017 IEEE Virtual Reality (VR), pp 421–422, https://doi.org/10.1109/VR.2017.7892356
Ferdous SMS, Arafat IM, Quarles J (2016) Visual feedback to improve the accessibility of head-mounted displays for persons with balance impairments. In: 2016 IEEE Symposium on 3D User Interfaces (3DUI), pp 121–128, https://doi.org/10.1109/3DUI.2016.7460041
Ferdous SMS, Chowdhury T, Muhammad Arafat I, et al. (2018a) Investigating the reason for increased postural instability in virtual reality for persons with balance impairments. In: 25th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2018–Proceedings, pp 547–548, https://doi.org/10.1109/VR.2018.8446488
Ferdous SMS, Chowdhury TI, Arafat IM, et al. (2018b) Investigating the reason for increased postural instability in virtual reality for persons with balance impairments. In: Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology. Association for Computing Machinery, Tokyo, Japan, VRST ’18, pp 1–7, https://doi.org/10.1145/3281505.3281547
Fidyka A, Matamala A (2018) Audio description in 360° videos: results from focus groups in Barcelona and Kraków. Transl Spaces 7(2):285–303. https://doi.org/10.1075/ts.18018.fid
Fidyka A, Matamala A (2021) Retelling narrative in 360° videos: implications for audio description. Transl Stud 14(3):298–312. https://doi.org/10.1080/14781700.2021.1888783
Findlater L, Jansen A, Shinohara K, et al. (2010) Enhanced area cursors: reducing fine pointing demands for people with motor impairments. In: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology. Association for Computing Machinery, New York, NY, USA, UIST ’10, pp 153–162, https://doi.org/10.1145/1866029.1866055
Findlater L, Chinh B, Jain D, et al. (2019) Deaf and hard-of-hearing individuals’ preferences for wearable and mobile sound awareness technologies. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Glasgow, Scotland Uk, CHI ’19, pp 1–13, https://doi.org/10.1145/3290605.3300276
Flower A, Burns MK, Bottsford-Miller NA (2007) Meta-analysis of disability simulation research. Remedial Spec Educ 28(2):72–79. https://doi.org/10.1177/07419325070280020601
Fogli D, Sansoni D, Trivella E et al (2018) Advanced interaction technologies for accessible and engaging cultural heritage. Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST 233:364–373. https://doi.org/10.1007/978-3-319-76111-4_36
Foxman MH (2018) Playing with Virtual reality: early adopters of commercial immersive technology. PhD thesis, Columbia University, https://doi.org/10.7916/D8M05NH3
Franz R, Junuzovic S, Mott M (2021) Nearmi: a framework for designing point of interest techniques for VR users with limited mobility. In: ASSETS 2021 - 23rd International ACM SIGACCESS Conference on Computers and Accessibility, https://doi.org/10.1145/3441852.3471230
Freeman G, Acena D (2021) Hugging from a distance: building interpersonal relationships in social virtual reality. In: ACM International Conference on Interactive Media Experiences. Association for Computing Machinery, New York, NY, USA, IMX ’21, pp 84–95, https://doi.org/10.1145/3452918.3458805
Freeman G, Maloney D (2021) Body, Avatar, and Me. Proc ACM Hum Comput Interact. https://doi.org/10.1145/3432938
Fuglerud KS (2014) Inclusive design of ICT: the challenge of diversity. PhD thesis, University of Oslo, http://rgdoi.net/10.13140/2.1.4471.5844
Ganapathi P, Thiel F, Swapp D, et al. (2022) Head in the clouds-floating locomotion in virtual reality. In: Proceedings - 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022. Institute of Electrical and Electronics Engineers Inc., pp 668–669, https://doi.org/10.1109/VRW55335.2022.00185
Gang P, Hui J, Stirenko S et al (2019) User-driven intelligent interface on the basis of multimodal augmented reality and brain-computer interaction for people with functional disabilities. Adv Intell Syst Comput 886:612–631. https://doi.org/10.1007/978-3-030-03402-3_43
Garaj V, Pokinko T, Hemphill C, et al. (2019) Inclusive design of the immersive reality: eliciting user perspectives. In: The 7th International Conference for Universal Design (UD2019). International Association for Universal Design (IAUD), Bangkok, Thailand
Garzotto F, Gelsomini M, Matarazzo V et al (2017) XOOM: an end-user development tool for web-based wearable immersive virtual tours. In: Cabot J, De Virgilio R, Torlone R (eds) Web engineering. Springer International Publishing, Cham, pp 507–519
Garzotto F, Gelsomini M, Occhiuto D, et al. (2017b) Wearable immersive virtual reality for children with disability: a case study. In: Proceedings of the 2017 Conference on Interaction Design and Children. ACM, Stanford California USA, pp 478–483, https://doi.org/10.1145/3078072.3084312
Garzotto F, Matarazzo V, Messina N, et al. (2018) Improving museum accessibility through storytelling in wearable immersive virtual reality. In: Proceedings of the 2018 3rd Digital Heritage International Congress, Digital Heritage 2018 - Held jointly with the 2018 24th International Conference on Virtual Systems and Multimedia, VSMM 2018, https://doi.org/10.1109/DigitalHeritage.2018.8810097
Gerling K, Buttrick L (2014) Last tank rolling: exploring shared motion-based play to empower persons using wheelchairs. In: Proceedings of the first ACM SIGCHI annual symposium on Computer-human interaction in play. Association for Computing Machinery, Toronto, Ontario, Canada, CHI PLAY ’14, pp 415–416, https://doi.org/10.1145/2658537.2661303
Gerling K, Spiel K (2021) A critical examination of virtual reality technology in the context of the minority body. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI ’21, pp 1–14, https://doi.org/10.1145/3411764.3445196
Gerling K, Dickinson P, Hicks K, et al. (2020) Virtual reality games for people using wheelchairs. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI ’20, pp 1–11, https://doi.org/10.1145/3313831.3376265
Gerling KM, Mandryk RL, Miller M et al (2015) Designing wheelchair-based movement games. ACM Trans Access Comput 6(2):6:1-6:23. https://doi.org/10.1145/2724729
Giaconi C, Ascenzi A, Del Bianco N et al (2021) Virtual and augmented reality for the cultural accessibility of people with autism spectrum disorders: a pilot study. Int J Incl Mus 14(1):95–106
Glaser N, Schmidt M, Schmidt C (2022) Learner experience and evidence of cybersickness: design tensions in a virtual reality public transportation intervention for autistic adults. Virtual Real. https://doi.org/10.1007/s10055-022-00661-3
Goldberg A, Tetreault S (2021) Live from the Metaverse: virtual XR Performances for Audiences on-Stage, Online, in-Game, and in-VR. In: ACM SIGGRAPH 2021 Production Sessions. Association for Computing Machinery, New York, NY, USA, SIGGRAPH ’21, https://doi.org/10.1145/3446368.3452125
Guedes L, Marques L, Vitório G (2020) Enhancing interaction and accessibility in museums and exhibitions with augmented reality and screen readers. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 12376 LNCS:157–163. https://doi.org/10.1007/978-3-030-58796-3_20
Guerrero G, Gomez L, Achig J (2020) Holonote: Text editor of augmented reality oriented to people with motor disabilities. In: Iberian Conference on Information Systems and Technologies, CISTI, https://doi.org/10.23919/CISTI49556.2020.9140857
Guitton P, Sauzéon H, Cinquin PA (2019) Accessibility of immersive serious games for persons with cognitive disabilities. In: 2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp 443–447, https://doi.org/10.1109/ISMAR-Adjunct.2019.00051
Hansen J, Trudslev A, Harild S, et al. (2019) Providing access to VR through a wheelchair. In: Conference on Human Factors in Computing Systems–Proceedings, https://doi.org/10.1145/3290607.3299048
Hedvall P (2009) Towards the era of mixed reality: accessibility meets three waves of HCI. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 5889 LNCS:264–278. https://doi.org/10.1007/978-3-642-10308-7_18
Herskovitz J, Wu J, White S, et al. (2020) Making mobile augmented reality applications accessible. In: ASSETS 2020-22nd International ACM SIGACCESS Conference on Computers and Accessibility, https://doi.org/10.1145/3373625.3417006
Hoffman HG, Patterson DR, Magula J et al (2004) Water-friendly virtual reality pain control during wound care. J Clin Psychol 60(2):189–195. https://doi.org/10.1002/jclp.10244
Hong S, Shin H, Gil Y et al (2021) Analyzing visual attention of people with intellectual disabilities during virtual reality-based job training. Electronics. https://doi.org/10.3390/electronics10141652
Hoppe A, Anken J, Schwarz T, et al. (2020) CLEVR: A customizable interactive learning environment for users with low vision in virtual reality. In: ASSETS 2020–22nd International ACM SIGACCESS Conference on Computers and Accessibility, https://doi.org/10.1145/3373625.3418009
Hurd O, Kurniawan S (2019) Insights for more usable VR for people with amblyopia. In: The 21st International ACM SIGACCESS Conference on Computers and Accessibility. Association for Computing Machinery, Pittsburgh, PA, USA, ASSETS ’19, pp 706–708, https://doi.org/10.1145/3308561.3356110
IrisVision (2020) See clearly, live fully with IrisVision. https://irisvision.com/product/
Jaballah K (2012) Accessible 3D signing avatars: the Tunisian experience. In: W4A 2012–International Cross-Disciplinary Conference on Web Accessibility, https://doi.org/10.1145/2207016.2207033
Jain D, Chinh B, Findlater L, et al. (2018a) Exploring augmented reality approaches to real-time captioning: a preliminary autoethnographic study. In: Proceedings of the 2018 ACM Conference Companion Publication on Designing Interactive Systems. Association for Computing Machinery, Hong Kong, China, DIS ’18 Companion, pp 7–11, https://doi.org/10.1145/3197391.3205404
Jain D, Franz R, Findlater L, et al. (2018b) Towards accessible conversations in a mobile context for people who are deaf and hard of hearing. In: Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility. Association for Computing Machinery, Galway, Ireland, ASSETS ’18, pp 81–92, https://doi.org/10.1145/3234695.3236362
Jain D, Junuzovic S, Ofek E, et al. (2021a) A taxonomy of sounds in virtual reality. In: Designing Interactive Systems Conference 2021. Association for Computing Machinery, New York, NY, USA, DIS ’21, pp 160–170, https://doi.org/10.1145/3461778.3462106
Jain D, Junuzovic S, Ofek E, et al. (2021b) Towards sound accessibility in virtual reality. In: ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction, pp 80–91, https://doi.org/10.1145/3462244.3479946
Jimenez-Mixco V, De Las Heras R, Villalar JL, et al. (2009) A new approach for accessible interaction within smart homes through virtual reality. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 5615 LNCS(PART 2):75–81. https://doi.org/10.1007/978-3-642-02710-9_9
John NW, Pop SR, Day TW et al (2018) The implementation and validation of a virtual environment for training powered wheelchair manoeuvres. IEEE Trans Visual Comput Graph 24(5):1867–1878. https://doi.org/10.1109/TVCG.2017.2700273
Jones P, Somoskeöy T, Chow-Wing-Bom H et al (2020) Seeing other perspectives: evaluating the use of virtual and augmented reality to simulate visual impairments (OpenVisSim). NPJ Digit Med. https://doi.org/10.1038/s41746-020-0242-6
Kaminer C, LeBras K, McCall J, et al. (2014) An immersive physical therapy game for stroke survivors. In: Proceedings of the 16th international ACM SIGACCESS conference on Computers & accessibility - ASSETS ’14. ACM Press, Rochester, New York, USA, pp 299–300, https://doi.org/10.1145/2661334.2661340
Kandalaft MR, Didehbani N, Krawczyk DC et al (2013) Virtual reality social cognition training for young adults with high-functioning autism. J Autism Dev Disorders 43(1):34–44. https://doi.org/10.1007/s10803-012-1544-6
Kane S, Frey B, Wobbrock J (2013) Access lens: a gesture-based screen reader for real-world documents. In: Conference on Human Factors in Computing Systems–Proceedings, pp 347–350, https://doi.org/10.1145/2470654.2470704
Kartick P, Quevedo AJU, Gualdron DR (2020) Design of virtual reality reach and grasp modes factoring upper limb ergonomics. In: 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp 798–799, https://doi.org/10.1109/VRW50115.2020.00250
Kaul O, Behrens K, Rohs M (2021) Mobile recognition and tracking of objects in the environment through augmented reality and 3D audio cues for people with visual impairments. In: Conference on Human Factors in Computing Systems - Proceedings, https://doi.org/10.1145/3411763.3451611
Keselj A, Topolovac I, Kacic-Barisic M, et al. (2021) Design and evaluation of an accessible mobile AR application for learning about geometry. In: Proceedings of the 16th International Conference on Telecommunications, ConTEL 2021, pp 49–53, https://doi.org/10.23919/ConTEL52528.2021.9495975
Kim A, Darakjian N, Finley JM (2017) Walking in fully immersive virtual environments: an evaluation of potential adverse effects in older adults and individuals with Parkinson’s disease. J NeuroEng Rehabil. https://doi.org/10.1186/s12984-017-0225-2
Kim J (2020) VIVR: presence of immersive interaction for visual impairment virtual reality. IEEE Access 8:196151–196159. https://doi.org/10.1109/ACCESS.2020.3034363
Kim W, Choo K, Lee Y, et al. (2018) Empath-D: VR-based empathetic app design for accessibility. In: MobiSys 2018–Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services, pp 123–135, https://doi.org/10.1145/3210240.3210331
Kizony R, Katz N, Weiss PLT (2003) Adapting an immersive virtual reality system for rehabilitation. J Visual Comput Anim 14(5):261–268. https://doi.org/10.1002/vis.323
Kizony R, Korman M, Sinoff G, et al. (2012) Using a virtual supermarket as a tool for training executive functions in people with mild cognitive impairment. In: Virtual Reality, p 6
Klinger E, Chemin I, Lebreton S et al (2004) A virtual supermarket to assess cognitive planning. Cyberpsychol Behav 7(3):292–293
Kreimeier J, Karg P, Götzelmann T (2020a) BlindWalkVR: formative insights into blind and visually impaired people’s VR locomotion using commercially available approaches. In: ACM International Conference Proceeding Series, pp 213–220, https://doi.org/10.1145/3389189.3389193
Kreimeier J, Karg P, Götzelmann T (2020b) Tabletop virtual haptics: feasibility study for the exploration of 2.5D virtual objects by blind and visually impaired with consumer data gloves. In: ACM International Conference Proceeding Series, pp 221–230, https://doi.org/10.1145/3389189.3389194
Kunze K, Henze N, Kise K (2014) Wearable computing for older adults: initial insights into head-mounted display usage. In: Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication. Association for Computing Machinery, Seattle, Washington, UbiComp ’14 Adjunct, pp 83–86, https://doi.org/10.1145/2638728.2638747
Lamash L, Klinger E, Josman N (2017) Using a virtual supermarket to promote independent functioning among adolescents with autism spectrum disorder. In: 2017 International Conference on Virtual Rehabilitation (ICVR), pp 1–7, https://doi.org/10.1109/ICVR.2017.8007467
Lang F, Machulla T (2021) Pressing a button you cannot see: evaluating visual designs to assist persons with low vision through augmented reality. In: Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST, https://doi.org/10.1145/3489849.3489873
Levin MF (2011) Can virtual reality offer enriched environments for rehabilitation? Expert Rev Neurotherapeutics 11(2):153–155. https://doi.org/10.1586/ern.10.201
Li L, Gajos KZ (2014) Adaptive click-and-cross: adapting to both abilities and task improves performance of users with impaired dexterity. In: Proceedings of the 19th international conference on Intelligent User Interfaces–IUI ’14. ACM Press, Haifa, Israel, pp 299–304, https://doi.org/10.1145/2557500.2557511
Li N, Han T, Tian F, et al. (2020) Get a grip: evaluating grip gestures for VR input using a lightweight pen. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Honolulu, HI, USA, CHI ’20, pp 1–13, https://doi.org/10.1145/3313831.3376698
Li X, Yi W, Chi HL et al (2018) A critical review of virtual and augmented reality (VR/AR) applications in construction safety. Autom Constr 86:150–162. https://doi.org/10.1016/j.autcon.2017.11.003
Liu KH, Sasaki T, Kajihara H, et al. (2022) Designing WindCage-unpacking the thinking and prototyping a propeller-based haptic unit. In: Proceedings–2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022. Institute of Electrical and Electronics Engineers Inc., pp 131–135, https://doi.org/10.1109/VRW55335.2022.00038
Logitech (2020) Logitech G adaptive gaming kit for the Xbox adaptive controller. https://www.logitechg.com/en-gb/products/gamepads/adaptive-gaming-kit-accessories.html
Lowtek Games (2021) Dislectek|integration|Unity asset store. https://assetstore.unity.com/packages/tools/integration/dislectek-192133
Lukava T, Morgado Ramirez DZ, Barbareschi G (2022) Two sides of the same coin: accessibility practices and neurodivergent users’ experience of extended reality. J Enabling Technol. https://doi.org/10.1108/JET-03-2022-0025
Mahmud M, Stewart M, Cordova A, et al. (2022) Auditory feedback for standing balance improvement in virtual reality. In: Proceedings–2022 IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2022. Institute of Electrical and Electronics Engineers Inc., pp 782–791, https://doi.org/10.1109/VR51125.2022.00100
Maidenbaum S, Amedi A (2015) Non-visual virtual interaction: can sensory substitution generically increase the accessibility of graphical virtual reality to the blind? In: 2015 3rd IEEE VR International Workshop on Virtual and Augmented Assistive Technology (VAAT), pp 15–17, https://doi.org/10.1109/VAAT.2015.7155404
Maidenbaum S, Levy-Tzedek S, Chebat DR et al (2013) Increasing accessibility to the blind of virtual environments, using a virtual mobility aid based on the “EyeCane”: feasibility study. PLoS ONE. https://doi.org/10.1371/journal.pone.0072555
Maloney D, Freeman G, Robb A (2021) Social virtual reality: ethical considerations and future directions for an emerging research space. In: 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp 271–277, https://doi.org/10.1109/VRW52623.2021.00056
Malu M, Findlater L (2015) Personalized, wearable control of a head-mounted display for users with upper body motor impairments. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. Association for Computing Machinery, Seoul, Republic of Korea, CHI ’15, pp 221–230, https://doi.org/10.1145/2702123.2702188
Masood T, Egger J (2019) Augmented reality in support of Industry 4.0-implementation challenges and success factors. Robot Comput Integr Manuf 58:181–195. https://doi.org/10.1016/j.rcim.2019.02.003
Matamala A (2021) Accessibility in 360° videos: methodological aspects and main results of evaluation activities in the IMAC project. Sendebar 32:65–89
McCloskey R (2022) Irish sign language in a virtual reality environment. In: Proceedings - 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2022. Institute of Electrical and Electronics Engineers Inc., pp 866–867, https://doi.org/10.1109/VRW55335.2022.00284
McIntosh J, Marques B, Harkness R (2020) Simulating impairment through virtual reality. Int J Archit Comput 18(3):284–295. https://doi.org/10.1177/1478077120914020
McLaughlin N, Rogers J, D’Arcy J et al (2020) Sorry doctor... I didn’t hear that...: phenomenological analysis of medical students’ experiences of simulated hearing impairment through virtual reality. BMJ Simul Technol Enhanc Learn. https://doi.org/10.1136/bmjstel-2020-000683
McNaney R, Vines J, Roggen D, et al. (2014) Exploring the acceptability of google glass as an everyday assistive device for people with Parkinson’s. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Toronto, Ontario, Canada, CHI ’14, pp 2551–2554, https://doi.org/10.1145/2556288.2557092
Meijer J, Batalas N (2020) Exploring visuo-tactile embodiment in a social virtual reality setting with a physical wheelchair for training empathy towards social disability barriers. In: CEUR Workshop Proceedings
Mengoni M, Iualè M, Peruzzini M et al (2015) An adaptable AR user interface to face the challenge of ageing workers in manufacturing. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 9194:311–323. https://doi.org/10.1007/978-3-319-20913-5_29
MetalPop Games (2021) UI accessibility plugin (UAP)|GUI tools|Unity asset store. https://assetstore.unity.com/packages/tools/gui/ui-accessibility-plugin-uap-87935
Microsoft (2019) Launch the Windows Settings app–UWP applications. https://docs.microsoft.com/en-us/windows/uwp/launch-resume/launch-settings-app#ease-of-access
Microsoft (2020) Seeing AI app from Microsoft. https://www.microsoft.com/en-us/ai/seeing-ai
Milgram P, Kishino F (1994) A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst 77(12):1321–1329
Miller KJ, Adair BS, Pearce AJ et al (2013) Effectiveness and feasibility of virtual reality and gaming system use at home by older adults for enabling physical activity to improve health-related domains: a systematic review. Age Ageing 43(2):188–195. https://doi.org/10.1093/ageing/aft194
Minakata K, Hansen J, MacKenzie I, et al. (2019) Pointing by gaze, head, and foot in a head-mounted display. In: Eye Tracking Research and Applications Symposium (ETRA), https://doi.org/10.1145/3317956.3318150
Mirzaei M, Kán P, Kaufmann H (2020) EarVR: using ear haptics in virtual reality for deaf and hard-of-hearing people. IEEE Trans Visual Comput Graph 26(5):2084–2093. https://doi.org/10.1109/TVCG.2020.2973441
Misztal S, Carbonell G, Zander L, et al. (2020) Simulating illness: experiencing visual migraine impairments in virtual reality. In: 2020 IEEE 8th International Conference on Serious Games and Applications for Health, SeGAH 2020, https://doi.org/10.1109/SeGAH49190.2020.9201756
Molloy D, Carter P (2020) Last of us part II: is this the most accessible game ever? BBC News https://www.bbc.co.uk/news/technology-53093613
Montagud M, Fraile I, Nuñez J, et al. (2018) ImAc: enabling immersive, accessible and personalized media experiences. In: TVX 2018 - Proceedings of the 2018 ACM International Conference on Interactive Experiences for TV and Online Video, pp 245–250, https://doi.org/10.1145/3210825.3213570
Montagud M, Orero P, Fernández S (2020) Immersive media and accessibility: hand in hand to the future. ITU J 3:8
Montagud M, Orero P, Matamala A (2020) Culture 4 all: accessibility-enabled cultural experiences through immersive VR360 content. Pers Ubiquitous Comput 24(6):887–905. https://doi.org/10.1007/s00779-019-01357-3
Montagud M, Segura-Garcia J, De Rus J, et al. (2020c) Towards an immersive and accessible virtual reconstruction of theaters from the early modern: bringing back cultural heritage from the past. In: IMX 2020 - Proceedings of the 2020 ACM International Conference on Interactive Media Experiences, pp 143–147, https://doi.org/10.1145/3391614.3399390
Montagud M, Soler O, Fraile I et al (2020) VR360 subtitling: requirements, technology and user experience. IEEE Access. https://doi.org/10.1109/ACCESS.2020.3047377
Montagud M, Hurtado C, De Rus J et al (2021) Subtitling 3D VR content with limited 6DoF: presentation modes and guiding methods. Appl Sci. https://doi.org/10.3390/app11167472
Morański M, Materka A (2014) Haptic presentation of 3D objects in virtual reality for the visually disabled. In: Virtual Reality: People with Special Needs, pp 137–148
Morita K, Hiraki T, Matsukura H, et al. (2020) Extension of projection area using head orientation in projected virtual hand interface for wheelchair users. In: 2020 59th Annual Conference of the Society of Instrument and Control Engineers of Japan, SICE 2020, pp 421–426, https://www.scopus.com/inward/record.uri?eid=2-s2.0-85096361918&partnerID=40&md5=ddadef1bc87b3ba01fd9e9fdb1ec6778
Morris RG, Parslow D, Recce MD (2000) Using immersive virtual reality to test allocentric spatial memory impairment following unilateral temporal lobectomy. In: Virtual Reality, p 8
Mott M, Cutrell E, Gonzalez Franco M, et al. (2019) Accessible by design: an opportunity for virtual reality. In: 2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp 451–454, https://doi.org/10.1109/ISMAR-Adjunct.2019.00122
Mott M, Tang J, Kane S, et al. (2020) I just went into it assuming that I wouldn’t be able to have the full experience: understanding the accessibility of virtual reality for people with limited mobility. In: The 22nd International ACM SIGACCESS Conference on Computers and Accessibility. Association for Computing Machinery, New York, NY, USA, ASSETS ’20, pp 1–13, https://doi.org/10.1145/3373625.3416998
Munteanu C, Molyneaux H, Moncur W, et al. (2015) Situational ethics: re-thinking approaches to formal ethics requirements for human-computer interaction. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI ’15, pp 105–114, https://doi.org/10.1145/2702123.2702481
Murray CD, Pettifer S, Howard T et al (2007) The treatment of phantom limb pain using immersive virtual reality: three case studies. Disabil Rehabil 29(18):1465–1469. https://doi.org/10.1080/09638280601107385
Nabors L, Monnin J, Jimenez S (2020) A scoping review of studies on virtual reality for individuals with intellectual disabilities. Adv Neurodev Disorders 4(4):344–356. https://doi.org/10.1007/s41252-020-00177-4
Oculus (2021) Quest virtual reality check (VRC) guidelines|oculus developers. https://developer.oculus.com/resources/publish-quest-req/
Oncins E, Bernabé R, Montagud M et al (2020) Accessible scenic arts and virtual reality: a pilot study with aged people about user preferences when reading subtitles in immersive environments. Monografías de Traducción e Interpretación (MonTI) 12:214–241. https://doi.org/10.6035/MonTI.2020.12.07
Oña ED, García JA, Raffe W et al (2019) Assessment of manual dexterity in VR: towards a fully automated version of the box and blocks test. Stud Health Technol Inf 266:57–62. https://doi.org/10.3233/SHTI190773
Paciulli G, De Lima Y, Faccin M, et al. (2020) Accessible augmented reality to support chemistry teaching. In: Proceedings of the 15th Latin American Conference on Learning Technologies, LACLO 2020, https://doi.org/10.1109/LACLO50806.2020.9381132
Palaniappan S, Zhang T, Duerstock B (2019) Identifying comfort areas in 3D space for persons with upper extremity mobility impairments using virtual reality. In: ASSETS 2019 - 21st International ACM SIGACCESS Conference on Computers and Accessibility, pp 495–499, https://doi.org/10.1145/3308561.3353810
Palos XB (2017) Daydream Labs: Accessibility in VR. https://blog.google/products/daydream/daydream-labs-accessibility-vr/
Park SM, Kim YG (2022) A metaverse: taxonomy, components, applications, and open challenges. IEEE Access 10:4209–4251. https://doi.org/10.1109/ACCESS.2021.3140175
Paudyal P, Banerjee A, Hu Y, et al. (2019) DAVEE: a deaf accessible virtual environment for education. In: C&C 2019 - Proceedings of the 2019 Creativity and Cognition, pp 522–526, https://doi.org/10.1145/3325480.3326546
Peng YH, Hsu MW, Taele P, et al. (2018) SpeechBubbles: enhancing captioning experiences for Deaf and hard-of-hearing people in group conversations. In: Conference on Human Factors in Computing Systems–Proceedings, https://doi.org/10.1145/3173574.3173867
Piovesan SD, Wagner R, Medina RD, et al. (2013) Virtual reality system to the inclusion of people with disabilities in the work market. In: 2013 12th International Conference on Information Technology Based Higher Education and Training (ITHET), pp 1–7, https://doi.org/10.1109/ITHET.2013.6671021
PlayStation (2020) The last of us part II: accessibility features. https://www.playstation.com/en-us/games/the-last-of-us-part-ii-ps4/accessibility/
Powell W, Powell V, Cook M (2020) The accessibility of commercial off-the-shelf virtual reality for low vision users: a macular degeneration case study. Cyberpsychol Behav Social Network 23(3):185–191. https://doi.org/10.1089/cyber.2019.0409
Poyade M, Morris G, Taylor I, et al. (2017) Using mobile virtual reality to empower people with hidden disabilities to overcome their barriers. In: Proceedings of the 19th ACM International Conference on Multimodal Interaction–ICMI 2017. ACM Press, Glasgow, UK, pp 504–505, https://doi.org/10.1145/3136755.3143025
Preiser WFE, Smith KH (2011) Universal design handbook. McGraw-Hill, Maidenhead
Richard E, Billaudeau V, Richard P, et al. (2007) Augmented reality for rehabilitation of cognitive disabled children: a preliminary study. In: 2007 Virtual Rehabilitation, pp 102–108, https://doi.org/10.1109/ICVR.2007.4362148
Rizzo AS (2002) Virtual reality and disability: emergence and challenge. Disabil Rehabil 24(11–12):567–569. https://doi.org/10.1080/09638280110111315
Roberts J, Slattery O, Swope B et al (2002) Small-scale tactile graphics for virtual reality systems. Proc SPIE Int Soc Optic Eng 4660:422–429. https://doi.org/10.1117/12.468059
Rose FD, Brooks BM, Rizzo AA (2005) Virtual reality in brain damage rehabilitation: review. Cyberpsychol Behav 8(3):241–262. https://doi.org/10.1089/cpb.2005.8.241 (discussion 263–271)
Salamon N, Jacques J, Musse S (2014) Seeing the movement through sound: giving trajectory information to visually impaired people. In: Brazilian Symposium on Games and Digital Entertainment, SBGAMES, pp 165–172, https://doi.org/10.1109/SBGAMES.2014.12
Salivia G, Hourcade JP (2013) PointAssist: assisting individuals with motor impairments. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Paris, France, CHI ’13, pp 1213–1222, https://doi.org/10.1145/2470654.2466157
Samsung (2017) Samsung relumino. https://www.samsungrelumino.com/home
Saposnik G, Levin M (2011) Virtual reality in stroke rehabilitation. Stroke 42(5):1380–1386. https://doi.org/10.1161/STROKEAHA.110.605451
Schneider O, Shigeyama J, Kovacs R, et al. (2018) DualPanto: A haptic device that enables blind users to continuously interact with virtual worlds. In: Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology. Association for Computing Machinery, Berlin, Germany, UIST ’18, pp 877–887, https://doi.org/10.1145/3242587.3242604
Seaborn K, Edey J, Dolinar G et al (2016) Accessible play in everyday spaces: mixed reality gaming for adult powered chair users. ACM Trans Comput Hum Interact. https://doi.org/10.1145/2893182
Sears A, Young M (2002) Physical disabilities and computing technologies: an analysis of impairments. In: Jacko JA, Sears A (eds) The human-computer interaction handbook: fundamentals, evolving technologies and emerging applications. Taylor and Francis, London, pp 482–503
Seigneur JM, Choukou MA (2022) How should metaverse augment humans with disabilities? In: 13th Augmented Human International Conference. Association for Computing Machinery, New York, NY, USA, AH2022, https://doi.org/10.1145/3532525.3532534
Shaker A, Lin X, Kim D et al (2020) Design of a virtual reality tour system for people with intellectual and developmental disabilities: a case study. Comput Sci Eng 22(3):7–17. https://doi.org/10.1109/MCSE.2019.2961352
Shakespeare T (2006) The social model of disability. Disabil Stud Read 2:197–204
Shao D, Lee IJ (2020) Acceptance and influencing factors of social virtual reality in the urban elderly. Sustainability 12(22):9345. https://doi.org/10.3390/su12229345
Sharar SR, Miller W, Teeley A et al (2008) Applications of virtual reality for pain management in burn-injured patients. Expert Rev Neurotherapeutics 8(11):1667–1674. https://doi.org/10.1586/14737175.8.11.1667
Sky (2020) Sky immersive. https://www.sky.com/pages/sky-immersive
Smith K (2012) Universal life: the use of virtual worlds among people with disabilities. Univers Access Inf Soc 11(4):387–398. https://doi.org/10.1007/s10209-011-0254-8
Standen PJ, Brown DJ (2006) Virtual reality and its role in removing the barriers that turn cognitive impairments into intellectual disability. Virtual Real 10(3):241–252. https://doi.org/10.1007/s10055-006-0042-6
Stearns L, Findlater L, Froehlich JE (2018) Design of an augmented reality magnification aid for low vision users. In: Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility. Association for Computing Machinery, Galway, Ireland, ASSETS ’18, pp 28–39, https://doi.org/10.1145/3234695.3236361
Sykownik P, Maloney D, Freeman G, et al. (2022) Something personal from the metaverse: goals, topics, and contextual factors of self-disclosure in commercial social VR. In: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI ’22, https://doi.org/10.1145/3491102.3502008
Sánchez-Cabrero R, Costa-Román O, Pericacho-Gómez FJ et al (2019) Early virtual reality adopters in Spain: sociodemographic profile and interest in the use of virtual reality as a learning tool. Heliyon 5(3):e01338. https://doi.org/10.1016/j.heliyon.2019.e01338
Teófilo M, Nascimento J, Santos J, et al. (2016) Bringing basic accessibility features to virtual reality context. In: 2016 IEEE Virtual Reality (VR), pp 293–294, https://doi.org/10.1109/VR.2016.7504769
Teófilo M, Lourenço A, Postal J, et al. (2018a) Exploring virtual reality to enable deaf or hard of hearing accessibility in live theaters: a case study. In: Antona M, Stephanidis C (eds) Universal access in human-computer interaction. Virtual, augmented, and intelligent environments. Springer International Publishing, Cham, Lecture Notes in Computer Science, pp 132–148, https://doi.org/10.1007/978-3-319-92052-8_11
Teófilo M, Lucena VF, Nascimento J, et al. (2018b) Evaluating accessibility features designed for virtual reality context. In: 2018 IEEE International Conference on Consumer Electronics (ICCE), pp 1–6, https://doi.org/10.1109/ICCE.2018.8326167
The University of Melbourne (2019) Accessibility of virtual reality environments. https://www.unimelb.edu.au/accessibility/virtual-reality
Thevin L, Brock A (2018) Augmented reality for people with visual impairments: designing and creating audio-tactile content from existing objects. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 10897 LNCS:193–200. https://doi.org/10.1007/978-3-319-94274-2_26
Thevin L, Jouffrais C, Rodier N, et al. (2019) Creating accessible interactive audio-tactile drawings using spatial augmented reality. In: Proceedings of the 2019 ACM International Conference on Interactive Surfaces and Spaces. Association for Computing Machinery, Daejeon, Republic of Korea, ISS ’19, pp 17–28, https://doi.org/10.1145/3343055.3359711
Thevin L, Rodier N, Oriola B et al (2021) Inclusive adaptation of existing board games for gamers with and without visual impairments using a spatial augmented reality framework for touch detection and audio feedback. Proc ACM Hum Comput Interact 5(ISS):505:1-505:33. https://doi.org/10.1145/3488550
Tianwu Y, Changjiu Z, Jiayao S (2016) Virtual reality based independent travel training system for children with intellectual disability. In: 2016 European Modelling Symposium (EMS), pp 143–148, https://doi.org/10.1109/EMS.2016.034
Trost Z, Parsons TD (2014) Beyond distraction: virtual reality graded exposure therapy as treatment for pain-related fear and disability in chronic pain. J Appl Biobehav Res 19(2):106–126. https://doi.org/10.1111/jabr.12021
Tsoupikova D, Stoykov NS, Corrigan M et al (2015) Virtual immersion for post-stroke hand rehabilitation therapy. Ann Biomed Eng 43(2):467–477. https://doi.org/10.1007/s10439-014-1218-y
Vedamurthy I, Knill DC, Huang SJ et al (2016) Recovering stereo vision by squashing virtual bugs in a virtual reality environment. Philos Trans Royal Soc B Biol Sci 371(1697):20150264. https://doi.org/10.1098/rstb.2015.0264
Vinayagamoorthy V, Glancy M, Ziegler C, et al. (2019) Personalising the TV experience using augmented reality: an exploratory study on delivering synchronised sign language interpretation. In: Conference on Human Factors in Computing Systems–Proceedings, https://doi.org/10.1145/3290605.3300762
Vivid Vision (2020) Vivid Vision for lazy eye, crossed eye, and convergence insufficiency. https://www.seevividly.com/
Vona F, Torelli E, Beccaluva E, et al. (2020) Exploring the potential of speech-based virtual assistants in mixed reality applications for people with cognitive disabilities. In: ACM International Conference Proceeding Series, https://doi.org/10.1145/3399715.3399845
Voss C, Washington P, Haber N, et al. (2016) Superpower glass: delivering unobtrusive real-time social cues in wearable systems. In: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct. ACM, Heidelberg Germany, pp 1218–1226, https://doi.org/10.1145/2968219.2968310
W3C (2017) Accessibility of virtual reality. https://www.w3.org/WAI/APA/task-forces/research-questions/wiki/Accessibility_of_Virtual_Reality
W3C (2018) Web content accessibility guidelines (WCAG) 2.1. https://www.w3.org/TR/WCAG21/
W3C (2021) XR accessibility user requirements. https://www.w3.org/TR/2021/NOTE-xaur-20210825/
Waddingham P, Cobb S, Eastgate R, et al. (2006) Virtual reality for interactive binocular treatment of amblyopia. In: The Sixth International Conference on Disability, Virtual Reality and Associated Technologies
WalkinVR (2020) WalkinVR–Virtual reality for people with disabilities. https://www.walkinvrdriver.com/
Waller S, Langdon P, Clarkson J (2010) Designing a more inclusive world. J Integr Care 18(4):19–25
Wang KJ, Liu Q, Zhao Y, et al. (2019) Intelligent wearable virtual reality (VR) gaming controller for people with motor disabilities. In: 2018 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), pp 161–164, https://doi.org/10.1109/AIVR.2018.00034
Wang X, Liang X, Yao J et al (2021) A study of the use of virtual reality headsets in Chinese adolescents with intellectual disability. Int J Dev Disabil. https://doi.org/10.1080/20473869.2021.1970938
Wasserman B, Prate D, Purnell B, et al. (2019) VRsensory: designing inclusive virtual games with neurodiverse children. In: Extended Abstracts of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '19), pp 755–761, https://doi.org/10.1145/3341215.3356277
Wedel M, Bigné E, Zhang J (2020) Virtual and augmented reality: advancing research in consumer marketing. Int J Res Market. https://doi.org/10.1016/j.ijresmar.2020.04.004
Wedoff R, Ball L, Wang A, et al. (2019) Virtual showdown: an accessible virtual reality game with scaffolds for youth with visual impairments. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Glasgow, Scotland UK, CHI ’19, pp 1–15, https://doi.org/10.1145/3290605.3300371
Weir K, Loizides F, Nahar V, et al. (2020) Creating a bespoke virtual reality personal library space for persons with severe visual disabilities. In: Proceedings of the ACM/IEEE Joint Conference on Digital Libraries, pp 393–396, https://doi.org/10.1145/3383583.3398610
Werner P, Rabinowitz S, Klinger E et al (2009) Use of the virtual action planning supermarket for the diagnosis of mild cognitive impairment: a preliminary study. Dement Geriatr Cogn Disord 27(4):301–309. https://doi.org/10.1159/000204915
WHO (2013) How to use the ICF: a practical manual for using the International Classification of Functioning, Disability and Health (ICF). World Health Organization, Geneva
Williams K, Moffatt K, McCall D, et al. (2015) Designing conversation cues on a head-worn display to support persons with aphasia. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. Association for Computing Machinery, Seoul, Republic of Korea, CHI ’15, pp 231–240, https://doi.org/10.1145/2702123.2702484
Wobbrock JO, Kane SK, Gajos KZ et al (2011) Ability-based design: concept, principles and examples. ACM Trans Access Comput 3(3):9:1–9:27. https://doi.org/10.1145/1952383.1952384
Wong A, Gillis H, Peck B (2017) VR Accessibility: Survey for People with Disabilities. Technical report, https://drive.google.com/file/d/0B0VwTVwReMqLMFIzdzVVaVdaTFk/view
Wu HY, Calabrèse A, Kornprobst P (2021) Towards accessible news reading design in virtual reality for low vision. Multimed Tools Appl. https://doi.org/10.1007/s11042-021-10899-9
Xbox (2020) Xbox adaptive controller. https://www.xbox.com/en-GB/accessories/controllers/xbox-adaptive-controller
XR Access (2020) About XR Access. https://xraccess.org/about/
XR Association (2020) Chapter Three: Accessibility & Inclusive Design in Immersive Experience. In: XRA Developers Guide: An Industry-Wide Collaboration for Better XR. XR Association, https://xra.org/wp-content/uploads/2020/10/XRA_Developers-Guide_Chapter-3_Web_v3.pdf
Xu J, Papangelis K, Dunham J, et al. (2022) Metaverse: The vision for the future. In: Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, CHI EA ’22, https://doi.org/10.1145/3491101.3516399
Yang D, Zhou J, Chen R et al (2022) Expert consensus on the metaverse in medicine. Clin eHealth 5:1–9. https://doi.org/10.1016/j.ceh.2022.02.001
Yoon C, Louie R, Ryan J, et al. (2019) Leveraging augmented reality to create apps for people with visual disabilities: a case study in indoor navigation. In: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '19), pp 210–221, https://doi.org/10.1145/3308561.3353788
Yue K (2022) Breaking down the barrier between teachers and students by using metaverse technology in education: based on a survey and analysis of Shenzhen city, China. In: 2022 13th International Conference on E-Education, E-Business, E-Management, and E-Learning (IC4E). Association for Computing Machinery, New York, NY, USA, IC4E 2022, pp 40–44, https://doi.org/10.1145/3514262.3514345
Zhang S, Banerjee PP, Luciano C (2010) Virtual exercise environment for promoting active lifestyle for people with lower body disabilities. In: 2010 International Conference on Networking, Sensing and Control (ICNSC), pp 80–84, https://doi.org/10.1109/ICNSC.2010.5461539
Zhao Y, Hu M, Hashash S, et al. (2017) Understanding low vision people’s visual perception on commercial augmented reality glasses. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp 4170–4181, https://doi.org/10.1145/3025453.3025949
Zhao Y, Bennett CL, Benko H, et al. (2018) Enabling people with visual impairments to navigate virtual reality with a haptic and auditory cane simulation. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Montreal QC, Canada, CHI ’18, pp 1–14, https://doi.org/10.1145/3173574.3173690
Zhao Y, Cutrell E, Holz C, et al. (2019a) SeeingVR: A set of tools to make virtual reality more accessible to people with low vision. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Glasgow, Scotland UK, CHI ’19, pp 1–14, https://doi.org/10.1145/3290605.3300341
Zhao Y, Kupferstein E, Castro B, et al. (2019b) Designing AR visualizations to facilitate stair navigation for people with low vision. In: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST '19), pp 387–402, https://doi.org/10.1145/3332165.3347906
Zhao Y, Kupferstein E, Rojnirun H, et al. (2020) The effectiveness of visual and audio wayfinding guidance on smartglasses for people with low vision. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, Honolulu, HI, USA, CHI ’20, pp 1–14, https://doi.org/10.1145/3313831.3376516
Zhu S, Sui Y, Shen Y et al (2021) Effects of virtual reality intervention on cognition and motor function in older adults with mild cognitive impairment or dementia: a systematic review and meta-analysis. Front Aging Neurosci. https://doi.org/10.3389/fnagi.2021.586999
Zolyomi A, Shukla A, Snyder J (2017) Technology-mediated sight: a case study of early adopters of a low vision assistive technology. In: Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility. Association for Computing Machinery, Baltimore, Maryland, USA, ASSETS ’17, pp 220–229, https://doi.org/10.1145/3132525.3132552
Funding
This work was funded by the EPSRC (Grants EP/S027432/1 and EP/S027637/1).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Dudley, J., Yin, L., Garaj, V. et al. Inclusive Immersion: a review of efforts to improve accessibility in virtual reality, augmented reality and the metaverse. Virtual Reality 27, 2989–3020 (2023). https://doi.org/10.1007/s10055-023-00850-8