Motivating Change and Reducing Cost with the Discount Video Data Analysis Technique

  • Jody Wynn
  • Jeremiah D. Still
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6770)

Abstract

Testing the usability of an interface is a critical phase of product development. However, it is often reported that analyzing the data from such testing consumes too many of a team's limited resources. We attempted to reduce this cost by proposing a new technique, Discount Video Data Analysis (DVDA), and comparing it with another popular accelerated analysis technique, Instant Data Analysis (IDA). Using IDA, evaluators analyze data after a series of usability tests, whereas DVDA calls for analyzing the data after every test in the series. Immediate analysis decreases the chance that data from subsequent tests will interfere with evaluators' recall of earlier sessions. Additionally, DVDA produces a video of the testing, allowing users' emotional responses (e.g., frustration) to be shared with developers who may be resistant to interface modifications. We found that evaluators using DVDA identified more usability issues, and provided more supporting evidence for each issue, than evaluators using IDA.

Keywords

Data Analysis, Usability Evaluation, Discount Usability Testing


References

  1. John, B.E., Packer, H.: Learning and using the cognitive walkthrough method: A case study approach. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 429–436 (1995)
  2. Frøkjær, E., Hornbæk, K.: Two psychology-based usability inspection techniques studied in a diary experiment. In: Proceedings of the Third Nordic Conference on Human-Computer Interaction, pp. 3–12 (2004)
  3. Muller, M.J., Matheson, L., Page, C., Gallup, R.: Methods & tools: Participatory heuristic evaluation. Interactions 5(5), 13–18 (1998)
  4. Nielsen, J.: Usability inspection methods. In: Conference Companion on Human Factors in Computing Systems, pp. 413–414 (1994)
  5. Dumas, J.S., Redish, J.C.: A Practical Guide to Usability Testing (Rev Sub.). Intellect Ltd., Bristol (1999)
  6. Dumas, J.S.: Stimulating change through usability testing. SIGCHI Bulletin 21(1), 37–44 (1989)
  7. Frøkjær, E., Hornbæk, K.: Cooperative usability testing: Complementing usability tests with user-supported interpretation sessions. In: CHI 2005 Extended Abstracts on Human Factors in Computing Systems, pp. 1383–1386 (2005)
  8. Rubin, J.: Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. John Wiley & Sons, Inc., Chichester (1994)
  9. Jeffries, R., Desurvire, H.: Usability testing vs. heuristic evaluation: Was there a contest? SIGCHI Bulletin 24(4), 39–41 (1992)
  10. Karat, C., Campbell, R., Fiegel, T.: Comparison of empirical testing and walkthrough methods in user interface evaluation. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 397–404 (1992)
  11. Thyvalikakath, T., Monaco, V., Thambuganipalle, H., Schleyer, T.: Comparative study of heuristic evaluation and usability testing methods. Studies in Health Technology and Informatics 143, 322–327 (2009)
  12. Kjeldskov, J., Skov, M.B., Stage, J.: Instant data analysis: Conducting usability evaluations in a day. In: Proceedings of the Third Nordic Conference on Human-Computer Interaction, pp. 233–240 (2004)
  13. Dewar, M.T., Cowan, N., Della Sala, S.: Forgetting due to retroactive interference: A fusion of Müller and Pilzecker's (1900) early insights into everyday forgetting and recent research on anterograde amnesia. Cortex 43(5), 616–634 (2007)
  14. TechSmith Corporation: Morae, ©1995–2010, http://www.techsmith.com/morae.asp
  15. Woolrych, A., Cockton, G.: Why and when five test users aren't enough. In: Proceedings of IHM-HCI 2001, vol. 2, pp. 105–108 (2001)
  16. Nielsen, J.: Corporate usability maturity (2006), http://www.useit.com/alertbox/maturity.html
  17. Bak, J.O., Nguyen, K., Risgaard, P., Stage, J.: Obstacles to usability evaluation in practice: A survey of software development organizations. In: Proceedings of the 5th Nordic Conference on Human-Computer Interaction: Building Bridges, pp. 23–32 (2008)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Jody Wynn (1)
  • Jeremiah D. Still (1)

  1. Department of Psychology, Missouri Western State University, St. Joseph, USA
