
Cooperative Advocacy: An Approach for Integrating Diverse Perspectives in Anomaly Response

Abstract

This paper contrasts cooperative work in two cases of distributed anomaly response, both from space shuttle mission control, to identify the factors that make anomaly response robust. In the first case (STS-76), flight controllers in mission control recognized an anomaly that began during the ascent phase of a space shuttle mission, analyzed the implications of the failure for mission plans, and adjusted those plans (the flight ended safely). In this case, a Cooperative Advocacy approach orchestrated diverse perspectives to provide broadening and cross-checks that reduced the risk of premature narrowing. In the second case (the Columbia space shuttle accident, STS-107), mission management treated a debris strike during launch as a side issue rather than a safety-of-flight concern and failed to recognize the dangers this event posed for the flight, which ended in tragedy. In this case, broadening and cross-checks were missing because of fragmentation across the groups involved in the anomaly response process. The comparison of these cases points to critical requirements for designing collaboration across multiple groups in anomaly response situations.



Author information

Correspondence to David D. Woods.



Cite this article

Watts-Perotti, J., Woods, D.D. Cooperative Advocacy: An Approach for Integrating Diverse Perspectives in Anomaly Response. Comput Supported Coop Work 18, 175–198 (2009). https://doi.org/10.1007/s10606-008-9085-4


Key words

  • anomaly response
  • space missions
  • mission control
  • accidents
  • replanning
  • premature narrowing
  • Columbia accident
  • cooperative work
  • cross-checks
  • crisis management