Evaluation of the Android Accessibility API Recognition Rate Towards a Better User Experience

  • Mauro C. Pichiliani
  • Celso M. Hirata
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9175)

Abstract

Mobile applications are built from common interactive UI elements that represent pointing targets visible on the screen. Using mobile applications in eyes-free scenarios, or by individuals with vision impairments, requires effective alternative access to these visual elements, i.e. accessibility features. Previous work evaluated the accuracy of UI element identification by accessibility APIs in desktop applications and reported that only 74 % of targets were correctly identified, but no recent research has evaluated the accuracy of similar mobile APIs. We present an empirical evaluation, based on the Android accessibility API, that computes the UI recognition accuracy rate on ten popular mobile applications. Our findings indicate an average accessibility recognition rate of 97 %.
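The evaluation relies on the Android accessibility API, which exposes an app's on-screen UI as a tree of AccessibilityNodeInfo objects that an AccessibilityService can traverse. The sketch below is only an illustration of that traversal, not the authors' measurement tool: a hypothetical service (names such as RecognitionAuditService and audit are invented here) counts how many visible nodes in the foreground window expose a text label or content description, the kind of per-element metadata a recognition-rate evaluation would inspect. A real service would additionally need the usual manifest declaration and accessibility-service configuration.

// Hypothetical sketch: walk the accessibility node tree of the foreground
// window and count nodes that expose usable metadata. Illustrative only.
import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class RecognitionAuditService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Re-audit whenever a new window (screen) comes to the foreground.
        if (event.getEventType() != AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED) {
            return;
        }
        AccessibilityNodeInfo root = getRootInActiveWindow();
        if (root == null) {
            return;
        }
        int[] counts = new int[2]; // counts[0] = total nodes, counts[1] = labeled, visible nodes
        audit(root, counts);
        Log.i("RecognitionAudit", "nodes=" + counts[0] + " recognizable=" + counts[1]);
    }

    // Depth-first traversal over the AccessibilityNodeInfo tree.
    private void audit(AccessibilityNodeInfo node, int[] counts) {
        if (node == null) {
            return;
        }
        counts[0]++;
        boolean hasLabel = node.getText() != null || node.getContentDescription() != null;
        if (hasLabel && node.isVisibleToUser()) {
            counts[1]++;
        }
        for (int i = 0; i < node.getChildCount(); i++) {
            audit(node.getChild(i), counts);
        }
    }

    @Override
    public void onInterrupt() {
        // Required override; nothing to clean up in this sketch.
    }
}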

Keywords

Accessibility · Android · Mobile · API · Evaluation · User interface · User experience

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Department of Computer Science, Instituto Tecnológico de Aeronáutica, São José dos Campos, Brazil