Abstract
Web accessibility has been the subject of much discussion regarding the need to make Web content accessible to all people, regardless of their abilities or disabilities. While some testing techniques require human intervention, accessibility can also be evaluated by automated tools: software programs that examine the code of Web pages to determine whether it conforms to a set of accessibility guidelines, often based on the Web Content Accessibility Guidelines Version 2.0 (WCAG 2.0) developed by the World Wide Web Consortium (W3C). In this context, the purpose of this study is to analyze an automated tool for evaluating authenticated environments and to verify its usability, since automated systems require precision and reliability both in their results and in their use in any type of environment. To this end, the ASES software was assessed by means of a heuristic evaluation carried out by three experts. The analysis revealed major accessibility problems, improper functioning of the available tools, and inconsistency of results. Furthermore, ASES was found to have problems of efficiency, interaction, validity, and reliability in the results presented. Given that this open-source accessibility testing tool is distributed on a government Web site, and that little software is available for evaluating authenticated environments, correcting the deficiencies identified in this study is highly recommended.
© 2014 Springer International Publishing Switzerland
Pivetta, E.M., Saito, D.S., da Silva Flor, C., Ulbricht, V.R., Vanzin, T. (2014). Automated Accessibility Evaluation Software for Authenticated Environments. In: Stephanidis, C., Antona, M. (eds) Universal Access in Human-Computer Interaction. Design for All and Accessibility Practice. UAHCI 2014. Lecture Notes in Computer Science, vol 8516. Springer, Cham. https://doi.org/10.1007/978-3-319-07509-9_8
Print ISBN: 978-3-319-07508-2
Online ISBN: 978-3-319-07509-9