Abstract
Web usability testing and research present challenges for accurate data collection. An instrumented-browser solution, Uzilla, is compared with existing solutions, and its contributions to usability testing practice are noted. Uzilla implements a client-server architecture based on the open source Mozilla browser. Instrumentation of the browser facilitates the evaluation of Web sites and applications both inside and outside the laboratory. An integrated data collection and analysis server application reduces the effort required to interpret test results and facilitates iterative testing.
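To illustrate the client-server pattern the abstract describes, the sketch below shows one minimal way an instrumented browser might buffer interaction events and flush them in batches to a collection server. This is a hypothetical illustration, not Uzilla's actual code: the names (`EventLogger`, `log`, `flush`) and the stubbed transport are assumptions introduced here for clarity.

```javascript
// Hypothetical sketch of client-side instrumentation in the spirit of a
// client-server usability logger: the browser side buffers interaction
// events with timestamps and periodically ships them to an analysis server.
class EventLogger {
  constructor(send) {
    this.send = send;   // transport callback, e.g. an HTTP POST to the server
    this.buffer = [];
  }
  log(type, detail) {
    // Record one interaction event (click, pageview, keypress, ...)
    this.buffer.push({ type, detail, time: Date.now() });
  }
  flush() {
    // Ship all buffered events as one batch; returns the batch size.
    if (this.buffer.length === 0) return 0;
    const batch = this.buffer.splice(0);
    this.send(batch);
    return batch.length;
  }
}

// Usage with a stub transport standing in for the collection server.
const received = [];
const logger = new EventLogger(batch => received.push(...batch));
logger.log("click", { target: "a#nav-home" });
logger.log("pageview", { url: "http://example.com/" });
logger.flush();
```

Batching rather than sending each event individually keeps instrumentation overhead low during a test session, which matters when the same tool is used both inside and outside the laboratory.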
Additional information
The case study mentioned in this paper was conducted by Kranti Dugiraala, Pallavi Dharwada, Andy Edmonds, Jonathan Johnson, Deepthi Nalanagula, and Sajay Sadasivan, under the supervision of Andrew Duchowski.
Cite this article
Edmonds, A. Uzilla: A new tool for Web usability testing. Behavior Research Methods, Instruments, & Computers 35, 194–201 (2003). https://doi.org/10.3758/BF03202542