A Framework for Remote User Evaluation of Accessibility and Usability of Websites

  • Christopher Power
  • Helen Petrie
  • Richard Mitchell
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5614)


The inclusion of participants who are representative of the diverse populations of web users is essential for meaningful and useful evaluations of usability and accessibility on the web. This paper proposes the requirements and architecture for an automated tool suite to help manage the design and deployment of evaluations to these participants. A prototype implementation of this architecture, currently in preparation, is also discussed.
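The abstract's central idea, a tool suite that manages the design of an evaluation and its deployment to remote participants, could be sketched as a simple data model. The names below (`Participant`, `Task`, `Evaluation`, `deploy`) are illustrative assumptions for this sketch, not the authors' actual architecture or API.

```python
from dataclasses import dataclass, field

@dataclass
class Participant:
    """A remote participant; assistive technologies are recorded so that
    evaluations can be matched to representative user groups."""
    participant_id: str
    assistive_technologies: list[str] = field(default_factory=list)

@dataclass
class Task:
    """A single evaluation task pointing at a page under test."""
    description: str
    target_url: str

@dataclass
class Evaluation:
    """A designed evaluation: a set of tasks deployed to participants."""
    title: str
    tasks: list[Task] = field(default_factory=list)
    participants: list[Participant] = field(default_factory=list)

    def deploy(self) -> dict[str, list[str]]:
        # Assign every task URL to every enrolled participant.
        return {p.participant_id: [t.target_url for t in self.tasks]
                for p in self.participants}

ev = Evaluation("Homepage accessibility check")
ev.tasks.append(Task("Find contact details", "https://example.org/"))
ev.participants.append(Participant("p01", ["screen reader"]))
print(ev.deploy())  # {'p01': ['https://example.org/']}
```

In a real remote-evaluation suite, `deploy` would also schedule sessions and collect responses; here it only illustrates the design-then-deploy split the abstract describes.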




Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Christopher Power¹
  • Helen Petrie¹
  • Richard Mitchell¹

  1. Department of Computer Science, University of York, York, UK
