Assessing Usability of a Post-Mission Reporting Technology

A Novel Usability Questionnaire in Practice
  • Mitchell J. Tindall
  • Beth F. Wheeler Atkinson
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 528)

Abstract

Usability evaluation has received extensive attention in both academic and applied arenas. Despite this, there have been few formal attempts to integrate past research and best practices into an updated and adaptable approach. This poster provides an overview of the types of results yielded by a novel usability assessment approach (i.e., the Experienced-based Questionnaire for Usability Assessments Targeting Elaborations [EQUATE]) when applied to a post-mission reporting tool. The goal of this study was to develop software to automate performance tracking for anti-submarine aircraft, digitize performance and training information, and automate the display of post-mission summaries. Although some of these technologies exist, the prototype tested during this research was the first, of which the authors are aware, to provide a single point of access for data entry, analysis, and reporting. Given the potential benefits across a variety of naval aviation platforms, the program’s usability goals focused on identifying means to optimize the tool by gathering novice user feedback. Traditional methods for gathering end-user feedback have tended to focus on user performance and satisfaction rather than on providing prescriptive input for identifying and rectifying issues. The results of this study provided usability input for post-mission reporting and identified and narrowed the heuristic dimensions used for final validation.

Keywords

Usability · Heuristic evaluation · GUIs

Notes

Acknowledgements

This research was sponsored by the NAVAIR Section 219 and PMA-205 Air Warfare Training Development programs. We wish to thank interns who facilitated data collection and analysis, and colleagues who provided input throughout this process. The views expressed in this paper are those of the authors and do not represent the official views of the organizations with which they are affiliated.

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Mitchell J. Tindall
    StraCon Services Group, LLC, Orlando, FL, USA
  • Beth F. Wheeler Atkinson
    Naval Air Warfare Center Training Systems Division, Orlando, FL, USA