Abstract
Testing is important to improve accessibility. However, within the serious games area, testing can rely on minimal evaluation with heuristics and external assistive devices, with limited input from impaired users. Efficiency would be improved if designers could readily evaluate their designs with the assistance of virtual users. The VERITAS framework simulates and presents data on the impact of a virtual user’s impairments, thus facilitating a more efficient approach to inclusive design. This article reports insights into the use of the framework by 31 evaluators from the serious games field. A log-file analysis highlights key areas of concern, which are then further explored through a questionnaire. The findings suggest that the background knowledge of designers should be considered in order to improve acceptance and usability, specifically by addressing challenges in comprehending interface elements, following the simulation workflow, and reacting to feedback.
Keywords
- Accessibility
- Universal design
- Inclusion
- Games
- Simulations
- VERITAS framework
- Designers
1 Introduction
It is important for game designers to consider the accessibility of their products. According to census data, almost 11 % of the population in the United States [1] and 15 % of the population in the European Union [2] have some form of cognitive, motor, or sensory impairment. Many such individuals want to play games, but cannot [1]. Yet, enabling them to do so would likely improve their quality of life [3] and need not be difficult to achieve [4, 5].
However, designers can encounter several challenges, such as: understanding the constraints associated with specific impairments; evaluating designs in terms of those constraints; and selecting designs [6–9]. The Virtual and Augmented Environments and Realistic User Interactions to Achieve Embedded Accessibility Designs (VERITAS) project helps to overcome these challenges by simulating impairments and using data to help designers assess their designs. A previous study demonstrates adequate acceptance and usability [10], but the participants were recruited from many different industries and so the specific challenges encountered by game designers were not explored. A particular concern is the diversity of the games industry’s workforce: as no domain-specific qualification is needed for a career in games design, designers come from diverse backgrounds and any individual designer may therefore possess a wide range of technical competence. Thus, this article aims to address the following question:
- What are the key challenges that game designers encounter while using the VERITAS framework to design GUI-based games?
In answer, this article will illustrate three themes and their associated implications for the design of future accessibility testing tools that are better suited for the serious games industry.
2 The VERITAS Simulation Framework
The VERITAS approach to accessibility is driven by simulations and metrics. The process incorporates three phases: (i) virtual user modeling; (ii) simulation scenario definition; and (iii) the simulation of an impaired virtual user. Three tools are used to achieve this: the User Model Generator (VerGen), to specify the nature of the virtual user’s impairments; the Simulation Editor (VerSEd-GUI), to define the actions to test; and the Simulation Viewer (VerSim-GUI), to simulate the experience of the impaired virtual user. These form a workflow comprising the tasks listed in Table 1.
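To make the hand-over between these phases concrete, the sketch below models the artefacts produced at each phase as simple data structures. This is a minimal illustration under assumed names and fields; it does not reflect the actual file formats or interfaces of VerGen, VerSEd-GUI, or VerSim-GUI.

```python
# Minimal sketch of the artefacts passed between the three VERITAS phases.
# All class names and fields are illustrative assumptions, not the tools' real formats.
from dataclasses import dataclass, field

@dataclass
class VirtualUserModel:
    """Phase (i): a virtual user with impairment parameters (VerGen-like output)."""
    name: str
    impairments: dict = field(default_factory=dict)    # e.g. {"hand_tremor_hz": 4.0}

@dataclass
class SimulationScenario:
    """Phase (ii): the GUI under test and the actions to simulate (VerSEd-GUI-like output)."""
    screenshot_path: str
    hot_areas: list = field(default_factory=list)       # (x, y, width, height) rectangles
    task_sequence: list = field(default_factory=list)   # ordered interface actions

@dataclass
class SimulationResult:
    """Phase (iii): metrics produced after the virtual user is simulated (VerSim-GUI-like output)."""
    task_durations_s: dict = field(default_factory=dict)
    failed_tasks: list = field(default_factory=list)

# Example: a low-vision user model and a single-screen scenario to be simulated.
user = VirtualUserModel("low_vision_user", {"visual_acuity_logmar": 0.8})
scenario = SimulationScenario("menu_screen.png",
                              hot_areas=[(40, 120, 200, 60)],
                              task_sequence=["click_start_button"])
```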
Figure 1 illustrates this process. It is important to note that designers have two sources of information on the accessibility of their design: the post-simulation metrics, which support criterion-based assessment and comparison; and the experience within the simulation itself, which gives the designer insight into the impact of a proposed design on a particular user.
Fig. 1. Workflow of the VERITAS framework for evaluating the accessibility of GUI-based digital games (from [10])
3 Methodology
To assess the usability of the framework, each tool in the VERITAS framework was assessed using an empirical user testing approach in which expert users used the tool under observation in a lab setting (based on [11, 12]). A group of 31 evaluators from the serious games community, with a high level of task-related design expertise, was recruited. They used each tool to assess the accessibility of a sample game provided by the research team. A mixed-methods approach to data collection was adopted in order to identify areas of key concern while providing rich insights. As such, log files were generated while the participants used the tools. Descriptive statistics, such as the number of click events and the total duration required to complete each task, were compared to a benchmark set by an experienced user in order to identify problematic tasks. The designers then commented on each tool using an open-ended questionnaire (derived from well-known heuristics, e.g., [13]), thereby enabling a thematic analysis to elicit insight into each challenge that emerged.
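As a rough sketch of this kind of log-file comparison, the example below summarises per-task click counts and durations and flags tasks where an evaluator falls well behind the benchmark. The CSV layout, column names, file names, and 15-second threshold are assumptions made for illustration and do not correspond to the actual VERITAS log format.

```python
# Illustrative log-file summary: per-task click counts and durations versus a benchmark user.
# Assumed log columns: task_id, event, timestamp_s. File names are placeholders.
import csv
from collections import defaultdict

def summarise_log(path: str) -> dict:
    """Return per-task click count and duration (s) from a simple event log."""
    clicks = defaultdict(int)
    start, end = {}, {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            task, t = row["task_id"], float(row["timestamp_s"])
            if row["event"] == "click":
                clicks[task] += 1
            start[task] = min(start.get(task, t), t)
            end[task] = max(end.get(task, t), t)
    return {task: {"clicks": clicks[task], "duration_s": end[task] - start[task]}
            for task in start}

# Flag tasks where an evaluator took noticeably longer than the benchmark user.
benchmark = summarise_log("expert_user.csv")
evaluator = summarise_log("evaluator_01.csv")
for task, stats in evaluator.items():
    if task in benchmark and stats["duration_s"] > benchmark[task]["duration_s"] + 15:
        lag = stats["duration_s"] - benchmark[task]["duration_s"]
        print(f"Task {task}: {lag:.1f} s slower than benchmark")
```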
4 Findings
4.1 Log File Analysis
Figure 2 shows the mean duration, in seconds, that the evaluators needed to perform each task listed in Table 1, compared with the experienced user.
As expected, the evaluators were consistently slower than the experienced user. Among those who successfully completed the tasks, the evaluators were only marginally slower in eight (~62 %) of the tasks. Areas of concern include the definition of the user model (1.3–1.5), where the evaluators typically needed more than 15 s longer per task than the experienced user to set up the user model. They were also approximately 20 s slower at setting the hot areas (2.3).
Figure 3 shows the mean number of click events needed by the evaluators to complete each task, compared with the experienced user.
In many cases, the evaluators required a greater number of clicks to complete tasks than the experienced user, suggesting lower efficiency. It is interesting to note, however, that fewer clicks were made during the initialization of the user model (1.2–1.3), which hints at less sophisticated models being defined. The evaluators were also less efficient at setting hot areas (2.3), and there was notably less interaction during the analysis of the simulation results (3.3), suggesting challenges in setting up the test scenario and then reviewing the results.
4.2 Thematic Analysis
Qualitative data procured through the questionnaires were analyzed using two types of thematic analysis, following the initial stages proposed in [14] and [15]. Figure 4 shows an example of a VOSviewer visualization of the frequency and relatedness of terms used by evaluators. Figure 5 shows an example of a thematic map constructed through the inductive and deductive coding (see [16] for more details) of the questionnaire responses in NVivo and Microsoft Excel.
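For illustration, the snippet below shows the kind of term-frequency and co-occurrence counting that underlies a VOSviewer-style term map. The sample responses and stop-word list are invented, and the actual analysis was performed with VOSviewer rather than custom code.

```python
# Sketch of term-frequency and co-occurrence counting over questionnaire responses.
# The responses and stop words below are invented examples.
from collections import Counter
from itertools import combinations
import re

responses = [
    "The units for the user model parameters were unclear.",
    "Assigning hot areas to the image took several attempts.",
]

STOPWORDS = {"the", "for", "were", "to", "took", "several"}

def terms(text: str) -> list:
    """Lower-case word tokens with stop words removed."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]

term_counts = Counter()
pair_counts = Counter()
for response in responses:
    ts = terms(response)
    term_counts.update(ts)
    pair_counts.update(combinations(sorted(set(ts)), 2))  # co-occurrence within a response

print(term_counts.most_common(5))
print(pair_counts.most_common(5))
```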
As can be seen in Fig. 4, the meanings of the values required to set up the virtual user’s parameters appear quite prominently on the left. Likewise, on the right, the setup of events and the assignment of hot areas to images were raised frequently. To a lesser extent, the workflow of the tools and finding buttons also appeared frequently.
Figure 5 expands on these issues, providing greater insight into specific problems such as unclear units of measurement. Here, potential reasons behind some of these issues, such as unclear feedback about whether a model’s parameter values have changed, begin to appear more prominently.
5 Conclusion
In general, the feedback provided by the evaluators was positive. Furthermore, evaluator performance was comparable to that of the experienced user for most of the tasks. However, based on a triangulation of the analyses presented in the previous sections, four themes are proposed as key challenges that game designers can encounter while using the VERITAS framework:
- Comprehending Model Parameters and Interface Features;
- Understanding the Workflow of the Simulation Tools;
- Efficiently Setting Up the Simulation Scenario;
- Responding to Feedback Provided by the Tools.
In order to overcome these challenges, it is recommended that additional features be incorporated to better accommodate the background knowledge of designers as well as the demands of their work environment. In particular, support features should: guide designers through the terminology and interface used in each tool, to improve comprehension; address low familiarity with simulation tools, to improve ease of use; streamline the workflow with as much automation as possible, to reduce the complexity and time required to complete tasks; and present clearer feedback, to facilitate the setup of realistic virtual users while better supporting decisions between different design features.
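As one hypothetical illustration of such a support feature, the sketch below attaches explicit units, a plain-language description, and a valid range to each model parameter and returns explicit confirmation when a value is changed. The parameter, its range, and the function names are invented for this example and are not part of the VERITAS tools.

```python
# Hypothetical sketch of clearer parameter feedback for a future tool.
# The parameter name, unit, and range below are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class ParameterSpec:
    name: str
    unit: str
    description: str      # plain-language meaning shown next to the input field
    minimum: float
    maximum: float

GRIP_STRENGTH = ParameterSpec(
    name="grip_strength",
    unit="N",
    description="Maximum force the virtual user can apply when gripping a device.",
    minimum=0.0,
    maximum=500.0,
)

def set_parameter(spec: ParameterSpec, value: float) -> str:
    """Validate a value and return explicit feedback confirming the change."""
    if not spec.minimum <= value <= spec.maximum:
        return (f"'{spec.name}' must be between {spec.minimum} and "
                f"{spec.maximum} {spec.unit}.")
    return f"'{spec.name}' set to {value} {spec.unit} ({spec.description})"

print(set_parameter(GRIP_STRENGTH, 120.0))
```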
Nevertheless, the VERITAS framework has received an encouraging evaluation, paving the way for a radical change in how accessibility concerns are addressed in serious games. With further improvements, in line with these recommendations, it is hoped that adoption of the framework will increase and subsequently enable improved access to games, thereby enhancing the quality of life of those with impairments.
References
Yuan, B., Folmer, E., Harris Jr., F.C.: Game accessibility: a survey. Univ. Access Inf. Soc. 10(1), 81–100 (2011)
European Union: Report of the Inclusive Communications (INCOM) Subgroup of the Communications Committee (COCOM) (2004)
Barbotte, E., Guillemin, F., Chau, N.: Prevalence of impairments, disabilities, handicaps and quality of life in the general population: a review of recent literature. Bull. World Health Organ. 79(11), 1047–1055 (2001)
Scott, M.J., Ghinea, G., Hamilton, I.: Promoting inclusive design practice at the global game jam: a pilot evaluation. In: Proceedings of IEEE Frontiers in Education, pp. 1076–1079. IEEE Press, New York (2014)
Scott, M.J., Ghinea, G.: Promoting game accessibility: experiencing an induction on inclusive design practice at the global games jam. In: Proceedings of the Inaugural Workshop on the Global Games Jam, pp. 17–20. SASDG, Santa Cruz (2013)
Keates, S., Clarkson, P.J., Harrison, L.-A., Robinson, P.: Towards a practical inclusive design approach. In: Proceedings of the Conference on Universal Usability, pp. 42–52. ACM, New York (2000)
Choi, Y.S., Yi, J.S., Law, C.M., Jacko, J.A.: Are universal design resources designed for designers? In: Proceedings of the 8th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 87–94. ACM, New York (2006)
Law, C.M., Yi, J.S., Choi, Y.S., Jacko, J.A.: Are disability access guidelines designed for designers? Do they need to be? In: Proceedings of the 18th Australian Conference on Computer-Human Interaction, pp. 357–360. ACM, New York (2006)
Stephanidis, C., Akoumianakis, D.: Universal design: towards universal access in the information society. In: Extended Abstracts on Human Factors in Computing Systems, pp. 499–500. ACM, New York (2001)
Spyridonis, F., Moschonas, P., Touliou, K., Tsakiris, A., Ghinea, G.: Designing accessible ICT products and services: the VERITAS accessibility testing platform. In: Proceedings of the International Working Conference on Advanced Visual Interfaces, pp. 113–116. ACM, New York (2014)
Nielsen, J.: Usability inspection methods. In: Proceedings of the International ACM Conference on Human Factors in Computing Systems, pp. 413–414. ACM, New York (1994)
Gould, J.D., Lewis, C.: Designing for usability: key principles and what designers think. Commun. ACM 28(3), 300–311 (1985)
Nielsen, J.: Finding usability problems through heuristic evaluation. In: Proceedings of the International ACM Conference on Human Factors in Computing Systems, pp. 373–380. ACM, New York (1992)
Van Eck, N., Waltman, L.: Text mining and visualisation using VOSviewer. ISSI Newsletter 7(3), 50–54 (2011)
Braun, V., Clarke, V.: Using thematic analysis in psychology. Qual. Res. Psychol. 3(2), 77–101 (2006)
Fereday, J., Muir-Cochrane, E.: Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int. J. Qual. Methods 5(1), 80–92 (2006)
Acknowledgements
The work presented in this paper forms part of the VERITAS Project which was funded by the European Commission’s 7th Framework Programme (FP7) (Grant Agreement # 247765 FP7-ICT-2009.7.2). All sites involved in the study received ethical approval from both their regional ethics committee as well as the EU VERITAS ethics committee.