The foundational design philosophy of user-centered design (UCD) offers an ideal approach for systems engineers, programmers, designers, and any other stakeholder involved in the design of high-stakes systems with human operators. UCD, as presented here, is tailored to the unique needs of critical human–machine systems such as air traffic control towers, 911 call centers, and NASA's Mission Control Center. Whenever the operator is a mission-critical component of the system, stakeholders must be able to make informed decisions during the design process, and this book provides the tools necessary to make those decisions.
This book summarizes a process for designing and implementing op centers like the Water Detection System introduced in Chap. 1. As the work is performed, risks are assessed using a spiral development model that checks with stakeholders at each major phase and adjusts the process based on the risks that can be perceived at that stage. The intermediate and final systems can be assessed using simple usability tests as well as cognitive walkthroughs.
The process uses shared representations of the operators, their tasks, and the context of the work; these representations are then used to design and create an op center. Appendix 1, with its subsections, provides an example set of documents for describing your users and their tasks in a way that is useful for design. Larger systems will need correspondingly larger and more complex descriptions, while smaller systems will typically need less. Systems used only by their developers might not need anything, but systems designed without these documents are designed informally and solely for their designer's use, not for the operators. Just as architects discuss blueprints before building a project, op center designers should expect to prepare and discuss these documents during design with other stakeholders, such as managers, future operators, and funders. These discussions can reduce misunderstandings, help ensure that all tasks for all stakeholders are supported, defend designs, and keep the relevant goals, missions, and tasks in mind when designing a system. Using these documents reduces risks (Pew & Mavor, 2007).
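To make this concrete, a lightweight shared representation of operators and tasks can be captured as structured data that every stakeholder can read and review. The sketch below is our own illustration of the idea, not the format used in Appendix 1; all field names and the example entry are assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: one way to record a shared representation of
# operators and their tasks as reviewable, structured data.
# Field names are illustrative, not taken from Appendix 1.

@dataclass
class Task:
    name: str
    frequency: str                      # e.g., "hourly", "per shift", "rare"
    criticality: str                    # e.g., "routine", "mission-critical"
    knowledge_required: list[str] = field(default_factory=list)

@dataclass
class OperatorProfile:
    role: str
    background: str
    tasks: list[Task] = field(default_factory=list)

# Example entry a design team might circulate for stakeholder review.
monitor = OperatorProfile(
    role="Console operator",
    background="Trained technician; not a developer",
    tasks=[Task(name="Acknowledge alarms",
                frequency="per shift",
                criticality="mission-critical",
                knowledge_required=["alarm taxonomy",
                                    "escalation procedure"])],
)

print(monitor.role, "-", monitor.tasks[0].name)
```

Because the description is data rather than prose, it can be versioned, diffed between design iterations, and checked for coverage (e.g., that every mission-critical task is assigned to some operator role).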
Chapters 2 and 3 provide design principles that can inform managers, designers, and implementers. These stakeholders can also benefit from greater knowledge of the operators as a type of system component. Chapter 3 provides a short overview of the kinds of knowledge about operators that can help inform system design and implementation. Further sources for learning more are noted in each chapter.
This book should also be seen as an initial review. There is more to know about how to support operators than is covered here. Appendix 2 provides pointers to further information on how to support operators in control rooms and to support the designers who create them. Appendix 3 aggregates the most important design principles that we have described in this book. The rest of this chapter briefly summarizes the book, offers areas of future work, and responds to the set of design questions presented at the end of Chap. 1.
4.2 The Need for User-Centered Design
One of the difficulties with this approach is that it requires investing what is perceived as additional time and effort to avoid the very risks it helps mitigate or avoid. The approach typically does take additional effort, and organizations do not always see the risks until they arrive. There is evidence, however, that a mindful approach can reduce costs overall (Booher & Minninger, 2005).
A remaining problem, then, is to provide evidence that these risks are real and that this approach reduces both the risks and their impact. Pew and Mavor (2007) call for examples to help the different team members appreciate how usability can influence system performance. Table 4.1 notes a few examples. Support from management for this more engineering-based approach, as well as further local examples, could help motivate implementers and technology designers to take operator tasks and operators' knowledge, skills, and abilities more seriously.
Keeping a list of known risks and accidents related to the design domain could also be helpful in several ways. The particular risks to an op center's success may be difficult to quantify and will often arise from unexpected events. It may be worthwhile for an organization to keep track of accidents and near misses, as NASA does for aviation in the Aviation Safety Reporting System (asrs.arc.nasa.gov).
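Such a record need not be elaborate. A minimal sketch of an append-only near-miss log that can later be queried by design domain might look like the following; the field names and example records are our own assumptions, loosely in the spirit of ASRS-style reporting, not an ASRS format.

```python
import json
from datetime import date

# Hypothetical sketch of an append-only incident/near-miss log.
# Fields and categories are illustrative assumptions.

log = []

def report(event_date, domain, severity, summary):
    """Append one accident or near-miss record to the log."""
    log.append({"date": event_date.isoformat(),
                "domain": domain,
                "severity": severity,
                "summary": summary})

def by_domain(domain):
    """Return all records for one design domain, e.g., for a design review."""
    return [r for r in log if r["domain"] == domain]

report(date(2020, 3, 14), "alarm handling", "near miss",
       "Duplicate alarms masked a new fault for several minutes")
report(date(2020, 6, 2), "shift handover", "accident",
       "Open work order not mentioned at handover")

print(json.dumps(by_domain("alarm handling"), indent=2))
```

Even a simple store like this lets a design team ask, at each spiral iteration, which domains are accumulating near misses and thus deserve design attention.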
4.3 The Need for Better Shared Representations
Another problem is the usability of the shared representations of users, tasks, and technology themselves. Shared representations are documents about the design (e.g., types of users and tasks) that are shared across groups of stakeholders. Managers, designers, and implementers can come from different intellectual backgrounds and hold different assumptions, so some representations need to be translated into "engineer speak," and perhaps in the other direction as well. There is a young literature on how to prepare knowledge about design aspects for sharing with other team members. Pew and Mavor (2007) note this problem under the name of shared representations, and work remains to make these representations as usable as they can be.
4.4 Open Problems
We can now revisit the questions in Table 1.4, presented here as Table 4.2. The responses are included in the table for ease of reading.
As the material in Table 4.2 notes, there remain open problems with applying this approach. The degree of detail required for the documents will vary across particular op centers and across different technologies, and should be adjusted to the needs of the proposed system. The risks that arise in the use of a particular op center will vary with the domain it supports. This approach does not guarantee a perfect or even a better system, but it reduces overall risk and the probability of system failures.
4.5 Ways to Learn More
Designers of control rooms will need to know more about design and about operators than this book covers. They will need more theory about design and human users, and more details about the situations, operators, and tasks that they are designing for. This section notes a few ways to learn more, including reading, discussion, and formal and informal education. An hour a week of learning is not much, but over a year it can change how you think.
4.5.1 Readings to Learn More
The easiest way for designers to learn more about design and operators is to read more. There are numerous books on how operators (as people) think and learn; a good book of this type is Anderson's Cognitive Psychology and Its Implications (2020). There are similar books for learning about perception (Sekuler & Blake, 2005). Norman's (1988/2013) book helped start the area of human–computer interaction; it does not provide a unified theory of how to support design, but it makes the case for paying attention to users and provides food for thought. As design moves in different directions, related books and textbooks can be found on broader topics, such as the effects of emotions on our interactions with systems (Norman, 2004).
There are also books describing operators in terms that support design. Our favorite is Foundations for Designing User-Centered Systems: What System Designers Need to Know about People (Ritter et al., 2014), but textbooks by Wickens (e.g., Wickens et al., 2012) and Lewis and Rieman (1994) are also useful. If detailed knowledge about users is required, one can try to find the information in Boff and Lincoln's (1988) large compendium, but often the designer will be driven to reading more specialized papers, asking experts, running a study, or making an educated guess based on similar circumstances. Finally, Designing for Situation Awareness (Endsley et al., 2003) provides further useful advice; it will be familiar because we use it extensively in this book.
We also recommend Sommerville's (2015) Software Engineering (10th ed.), particularly the chapters on reliability engineering (Chap. 11), systems engineering (Chap. 19), and systems of systems (Chap. 20). While not directly addressed in this book, Baxter and Sommerville's (2011) work on socio-technical systems brings a new perspective on holistic design by integrating organizational change and system development into a unified framework.
There are also two final topics that we did not broach in this book: automation and the related topic of how operators use automation. Automation generally refers to the machine execution of a task that was formerly performed by a human. Eventually, some tasks become fully automated with no further human interaction, at which point they are simply machine tasks (Parasuraman & Riley, 1997).
Designers should be careful not to rush into automating tasks, particularly complex tasks that will continue to rely on human input. Under perfect conditions, automation seems like an easy way to reduce operators' workload; faced with the complications that reality brings, however, you can quickly run into issues.
Operators use their trust in automation to decide how to use it and then to perform their tasks successfully with the automation doing part of the work. Working with automation whose trustworthiness is hard to calibrate can end up requiring more effort, because the operator must monitor the automation to ensure success. Optimal performance can only be achieved when designers instill the proper amount of reliance on, and trust in, the automated systems (Lee & See, 2004). The operator's mental model of how and when the automation works should be accessible, easy to learn, and easy to use. The process for automating tasks in complex systems is difficult and outside the scope of this book, but we recommend reading Lee and See's (2004) article Trust in Automation: Designing for Appropriate Reliance and Parasuraman and Riley's (1997) article Humans and Automation: Use, Misuse, Disuse, Abuse if you wish to learn more. We also recommend reviewing NASA's Automation Interface Design Development project (https://techport.nasa.gov/view/23597).
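The monitoring cost described above can be made concrete with a toy expected-cost calculation. This is our own illustration with made-up effort units, not a model from Lee and See or Parasuraman and Riley: when the automation's failure probability is low, blind reliance is cheap, but past some reliability threshold the operator is better off paying the monitoring cost, which consumes part of the workload the automation was meant to save.

```python
# Toy illustration (our own assumed numbers): when does monitoring
# the automation pay off? Costs are arbitrary "effort" units per episode.

def expected_cost(p_fail, monitor):
    MANUAL = 10.0          # effort to do the task by hand
    MONITORING = 3.0       # effort to watch the automation work
    UNCAUGHT_FAIL = 50.0   # effort to recover from an unnoticed failure
    if monitor:
        # Operator pays the monitoring cost; failures are caught
        # and the task is redone manually.
        return MONITORING + p_fail * MANUAL
    # Operator relies blindly; failures go unnoticed until they are costly.
    return p_fail * UNCAUGHT_FAIL

for p in (0.01, 0.05, 0.20):
    rely = expected_cost(p, monitor=False)
    check = expected_cost(p, monitor=True)
    better = "monitor" if check < rely else "rely"
    print(f"p_fail={p:.2f}: blind reliance={rely:.1f}, "
          f"monitoring={check:.1f} -> {better}")
```

With these assumed costs, blind reliance wins at 1% and 5% failure rates, but at a 20% failure rate monitoring is cheaper, illustrating why reliance must be calibrated to the automation's actual reliability rather than set once.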
4.5.2 Reading Groups
One way to solidify knowledge from reading and to learn information not completely codified is to participate in a reading group. Sometimes these groups appear as graduate courses. They can also be organized around a work group or, better, across work groups. They take time, but a group can help digest a book, and even the social loafers who do not read the material can learn something. It is also a way to build a shared theory of design in a workplace.
4.5.3 Continuing Education
Finally, the most solid but expensive way to learn more is to take courses. Some will be available at local universities, and some are available online. Coursera and Lynda offer various courses that are related to these topics.
Anderson, J. R. (2020). Cognitive psychology and its implications (9th ed.). New York: Worth Publishers.
Baxter, G. D., & Sommerville, I. (2011). Socio-technical systems: From design methods to systems engineering. Interacting with Computers, 23(1), 4–17. https://doi.org/10.1016/j.intcom.2010.07.003.
Boff, K. R., & Lincoln, J. E. (1988). Engineering data compendium: Human perception and performance. Wright-Patterson Air Force Base, OH: AFRL.
Booher, H. R., & Minninger, J. (2005). Human systems integration in Army systems acquisition. In H. R. Booher (Ed.), Handbook of human systems integration (pp. 663–698). https://doi.org/10.1002/0471721174.ch18.
Card, S. K., Moran, T. P., & Newell, A. (1980). The keystroke-level model for user performance time with interactive systems. Communications of the ACM, 23(7), 396–410.
Card, S. K., Moran, T. P., & Newell, A. (1983). The psychology of human-computer interaction. Hillsdale: Erlbaum.
Casey, S. M. (1998). Set phasers to stun: And other true tales of design, technology, and human error. Santa Barbara: Aegean.
Chipman, S. F., & Kieras, D. E. (2004). Operator centered design of ship systems. In Engineering the total ship symposium. American Society of Naval Engineers, NIST, Gaithersburg, MD.
Endsley, M. R., Bolte, B., & Jones, D. G. (2003). Designing for situation awareness: An approach to user-centered design (1st ed.). London: Taylor & Francis.
Hursh, S. R., Redmond, D. P., Johnson, M. L., Thorne, D. R., Belenky, G., & Balkin, T. J. (2004). Fatigue models for applied research in warfighting. Aviation, Space, and Environmental Medicine, 73(3), A44–A53.
Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50–80.
Lewis, C., & Rieman, J. (1994). Task-centered user interface design: A practical introduction. Available at: http://www.hcibib.org/tcuid/
Norman, D. A. (1988/2013). The design of everyday things (Revised). New York: Basic Books .
Norman, D. A. (2004). Emotional design: Why we love (or hate) everyday things. New York: Basic Books.
Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230–253.
Pew, R. W., & Mavor, A. S. (2007). Human-system integration in the system development process. Washington, DC: The National Academies Press. https://doi.org/10.17226/11893.
Ritter, F. E., Baxter, G. D., & Churchill, E. F. (2014). Foundations for designing user-centered systems. London: Springer. https://doi.org/10.1007/978-1-4471-5134-0.
Sekuler, R., & Blake, R. (2005). Perception (2nd ed.). New York: McGraw-Hill.
Sommerville, I. (2015). Software engineering (10th ed.). Harlow: Pearson.
Wickens, C. D., Hollands, J. G., Banbury, S., & Parasuraman, R. (2012). Engineering psychology and human performance (4th ed.). New York: Psychology Press.
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
© 2021 The Author(s)
Oury, J. D., & Ritter, F. E. (2021). Conclusion and final comments. In Building better interfaces for remote autonomous systems (Human–Computer Interaction Series). Cham: Springer. https://doi.org/10.1007/978-3-030-47775-2_4