
Inuit: The Interface Usability Instrument

  • Maximilian Speicher
  • Andreas Both
  • Martin Gaedke
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9186)

Abstract

Explicit user testing tends to be costly and time-consuming from a company’s point of view. It would therefore be desirable to infer a quantitative usability score directly from implicit feedback, i.e., from users’ interactions with a web interface. As a basis for this, we require an adequate usability instrument whose items form a usability score and can be meaningfully correlated with such interactions. Thus, we present Inuit, the first instrument consisting of only seven items that are at the right level of abstraction to directly reflect user behavior on the client. It was designed in a two-step process involving usability guideline reviews and expert interviews. A confirmatory factor analysis shows that our model reflects real-world perceptions of usability reasonably well.
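The abstract describes a usability score formed from seven instrument items. The sketch below is purely illustrative and not taken from the paper: it assumes the seven items are rated on a five-point scale and combined by an equal-weight average into a single score in [0, 1]. The item identifiers, the rating scale, and the weighting are hypothetical assumptions for this example only.

# Illustrative sketch only (not the instrument's published scoring rule):
# aggregate seven Likert-style item ratings into one usability score in [0, 1].
# Item identifiers, the 5-point scale, and the equal-weight average are
# assumptions made for this example.

from statistics import mean

# Placeholder identifiers for the seven items (hypothetical names).
ITEMS = [f"item_{i}" for i in range(1, 8)]

def usability_score(ratings: dict, scale_max: int = 5) -> float:
    """Normalise each 1..scale_max rating to [0, 1] and average across items."""
    if set(ratings) != set(ITEMS):
        raise ValueError("expected exactly one rating per item")
    normalised = [(ratings[item] - 1) / (scale_max - 1) for item in ITEMS]
    return mean(normalised)

# Example: one participant rating every item with 4 on a 5-point scale.
example_ratings = {item: 4 for item in ITEMS}
print(f"usability score: {usability_score(example_ratings):.2f}")  # -> 0.75

Whether equal weighting is appropriate would depend on the instrument's factor structure; the paper's confirmatory factor analysis addresses that question, and a weighted aggregation could be substituted without changing the shape of this sketch.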

Keywords

Instrument · Metrics · Questionnaire · Usability · Interfaces

Notes

Acknowledgements.

We thank our interviewees and all participants of the Unister Friday PhD Symposia. This work has been supported by the ESF and the Free State of Saxony.


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Maximilian Speicher (1)
  • Andreas Both (2)
  • Martin Gaedke (1)
  1. Technische Universität Chemnitz, Chemnitz, Germany
  2. Research and Development, Unister GmbH, Leipzig, Germany
