CrowdAE: A Crowdsourcing System with Human Inspection Quality Enhancement for Web Accessibility Evaluation

  • Liangcheng Li
  • Jiajun Bu
  • Can Wang
  • Zhi Yu
  • Wei Wang
  • Yue Wu
  • Chunbin Gu
  • Qin Zhou
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10896)

Abstract

Crowdsourcing can support manual accessibility testing by soliciting contributions from volunteer evaluators, but crowd evaluators may give inaccurate or invalid evaluation results. This paper proposes CrowdAE, an advanced crowdsourcing-based web accessibility evaluation system that enhances the crowdsourcing-based manual testing module of the previous version. Through three main processes, namely a learning system, task assignment, and task review, the system improves the quality of evaluation results from the crowd. A comparison across two years of evaluations of Chinese government websites shows that CrowdAE outperforms the previous version and improves the accuracy of the evaluation results.
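The paper itself details these three processes; as a minimal sketch of how such a quality-enhancement pipeline can fit together, the Python example below models the flow under assumptions of our own. The names (`Evaluator`, `qualify_evaluators`, `assign_tasks`, `review_results`), the qualification threshold, the redundancy factor, and the majority-vote review rule are all illustrative and do not come from the CrowdAE paper.

```python
from collections import Counter
from dataclasses import dataclass, field
from itertools import cycle

# Hypothetical sketch of a three-stage crowd quality pipeline
# (learning/qualification, redundant task assignment, majority review).
# Names and thresholds are illustrative, not taken from the CrowdAE paper.

@dataclass
class Evaluator:
    name: str
    quiz_score: float  # score on the learning-system qualification quiz
    verdicts: dict = field(default_factory=dict)  # page -> "pass"/"fail"

def qualify_evaluators(evaluators, min_score=0.8):
    """Stage 1 (learning system): keep evaluators who pass the training quiz."""
    return [e for e in evaluators if e.quiz_score >= min_score]

def assign_tasks(pages, evaluators, redundancy=3):
    """Stage 2 (task assignment): give each page to `redundancy` evaluators,
    rotating through the qualified pool so the load stays balanced."""
    pool = cycle(evaluators)
    return {page: [next(pool) for _ in range(redundancy)] for page in pages}

def review_results(assignments):
    """Stage 3 (task review): accept the majority verdict per page;
    flag ties or missing verdicts for expert re-inspection."""
    accepted, flagged = {}, []
    for page, workers in assignments.items():
        votes = Counter(w.verdicts.get(page, "missing") for w in workers)
        verdict, count = votes.most_common(1)[0]
        if verdict != "missing" and count > len(workers) // 2:
            accepted[page] = verdict
        else:
            flagged.append(page)
    return accepted, flagged

if __name__ == "__main__":
    crowd = [Evaluator("a", 0.9), Evaluator("b", 0.85),
             Evaluator("c", 0.6), Evaluator("d", 0.95)]
    qualified = qualify_evaluators(crowd)
    tasks = assign_tasks(["page1", "page2"], qualified)
    for page, workers in tasks.items():
        for w in workers:
            w.verdicts[page] = "pass" if w.name != "d" else "fail"
    print(review_results(tasks))
```

In this sketch, a page whose verdicts tie or go missing is flagged for re-inspection rather than accepted, mirroring the idea that a review stage catches unreliable crowd output before it reaches the final report.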

Keywords

Web accessibility evaluation · Crowdsourcing

Notes

Acknowledgement

This work is supported by the National Natural Science Foundation of China (Nos. 61173185 and 61173186), the National Key Technology R&D Program of China (Nos. 2012BAI34B01 and 2014BAK15B02), and the Hangzhou S&T Development Plan (No. 20150834M22).

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Liangcheng Li (1)
  • Jiajun Bu (1)
  • Can Wang (1)
  • Zhi Yu (1)
  • Wei Wang (1)
  • Yue Wu (1)
  • Chunbin Gu (1)
  • Qin Zhou (2)
  1. Alibaba-Zhejiang University Joint Institute of Frontier Technologies, Zhejiang Provincial Key Laboratory of Service Robot, College of Computer Science, Zhejiang University, Hangzhou, China
  2. Information Center, China Disabled Persons' Federation, Beijing, China