Part of the book series: NATO ASI Series F, Volume 110

Abstract

The term ‘complexity’ covers a wide variety of system attributes in everyday language. It may refer to the number of subsystems or components, the dynamics of component or subsystem behavior, the number of possible interactions, the presence of non-linear interactions, difficulties in identifying and understanding a system’s interactions with its environment, the impact of human judgment and actions, or even our lack of familiarity with the system. We will not give a rigorous definition of complexity here. A workable approximation may be to state that complexity is a function of the number and properties of the dependencies (intentional and unintentional) which exist between the items of a system and between a system and its environment (adapted from Mancini, 1988).
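To make that working definition concrete, here is a minimal sketch (not from the chapter) of a system modelled as a set of items plus labelled dependencies, with a crude numeric proxy for complexity. The item names, the dependency labels, and the weighting scheme are all illustrative assumptions.

```python
# Hypothetical sketch of the working definition above: complexity as a
# function of the number and properties of the dependencies between a
# system's items, and between the system and its environment.
# All names and the weighting scheme are illustrative assumptions.

items = {"sensor", "controller", "actuator"}   # the system's items
environment = {"operator"}                     # the system's environment

# Each dependency: (source, target, intentional), where intentional=False
# marks an unintended coupling, which is harder to identify and verify.
dependencies = [
    ("sensor", "controller", True),    # designed data flow
    ("controller", "actuator", True),  # designed command flow
    ("actuator", "sensor", False),     # unintended feedback path
    ("operator", "controller", True),  # system-environment dependency
]

def complexity_proxy(deps, unintentional_weight=2.0):
    """Crude proxy: count dependencies, weighting unintentional ones more
    heavily because they resist identification and analysis."""
    return sum(1.0 if intentional else unintentional_weight
               for _source, _target, intentional in deps)

print(complexity_proxy(dependencies))  # -> 5.0
```

Any real measure would also have to account for the properties of each dependency (direction, non-linearity, timing), which this simple counting scheme deliberately omits.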

References

  • Brehmer, B. (1987). Models of diagnostic judgements. In J. Rasmussen, K. Duncan, J. Leplat (Eds.), New Technology and Human Error. Chichester: Wiley.

  • Brunsson, N. (1989). The Organization of Hypocrisy: Talk, Decisions and Actions in Organizations. Chichester: Wiley.

  • Checkland, P. (1981). Systems Thinking, Systems Practice. Chichester: Wiley.

  • Embrey, D. E., Humphreys, P., Rosa, E. A., Kirwan, B., Rea, K. (1984). SLIM-MAUD: An Approach to Assessing Human Error Probabilities Using Structured Expert Judgment (Report No. NUREG/CR-3518). Nuclear Regulatory Commission; Brookhaven National Laboratory.

  • Foster, H. D. (1993). Resilience theory and system validation. In J. A. Wise, V. D. Hopkin, P. Stager (Eds.), Verification and Validation of Complex Systems: Human Factors Issues. NATO ASI Series F, Vol. 110. Berlin: Springer-Verlag, pp. 35–60 (this volume).

  • Funtowicz, S. O., Ravetz, J. R. (1990). Uncertainty and Quality in Science for Policy. Dordrecht: Kluwer.

  • Gleick, J. (1988). Chaos: Making a New Science. London: Cardinal.

  • Hollnagel, E. (1993). The reliability of interactive systems: Simulation based assessment. In J. A. Wise, V. D. Hopkin, P. Stager (Eds.), Verification and Validation of Complex Systems: Human Factors Issues. NATO ASI Series F, Vol. 110. Berlin: Springer-Verlag, pp. 205–221 (this volume).

  • Johnson, W. G. (1980). MORT Safety Assurance Systems. New York: Marcel Dekker.

  • Kahneman, D., Slovic, P., Tversky, A. (1982). Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.

  • Kolding, J. (1990). “Ennå vet vi ikke nok, vi må forske mer.” Om modellbegrepet i fiskeribiologi [“We still do not know enough, we must do more research.” On the concept of models in fisheries biology]. Ariadne, 4, 46–69.

  • Kuhlmann, A. (1986). Introduction to Safety Science. New York: Springer-Verlag.

  • March, J. G., Olsen, J. P. (1976). Ambiguity and Choice in Organizations. Bergen: Universitetsforlaget.

  • March, J. G., Simon, H.A. (1958). Organizations. New York: Wiley.

  • Mancini, G. (1988). Modelling humans and machines. In L. P. Goodstein, H. B. Andersen, S. E. Olsen (Eds.), Tasks, Errors and Mental Models. London: Taylor & Francis.

  • Perrow, C. (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.

  • Petroski, H. (1985). To Engineer is Human: The Role of Failure in Successful Design. New York: St. Martin’s Press.

  • Poucet, A. (1988). Survey of methods used to assess human reliability in the Human Factors Benchmark Exercise. Reliability Engineering and System Safety, 22, 257–268.

  • Rasmussen, J. (1982). Human reliability in risk analysis. In A. E. Green (Ed.), High Risk Safety Technology. Chichester: Wiley.

  • Reason, J. (1984). Little slips and big disasters. Interdisciplinary Science Reviews, 9 (2), 179–189.

  • Rosenhead, J. (1989a). Introduction: Old and new paradigms of analysis. In J. Rosenhead (Ed.), Rational Analysis for a Problematic World: Problem Structuring Methods for Complexity, Uncertainty and Conflict. Chichester: Wiley.

  • Rosenhead, J. (1989b). Robustness analysis: Keeping your options open. In J. Rosenhead (Ed.), Rational Analysis for a Problematic World: Problem Structuring Methods for Complexity, Uncertainty and Conflict. Chichester: Wiley.

  • Rosenhead, J. (Ed.) (1989c). Rational Analysis for a Problematic World: Problem Structuring Methods for Complexity, Uncertainty and Conflict. Chichester: Wiley.

  • Swain, A. D., Guttmann, H. E. (1983). Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (Report No. NUREG/CR-1278). Nuclear Regulatory Commission; Sandia National Laboratories.

  • Wagenaar, W. A., Groeneweg, J. (1987). Accidents at sea: Multiple causes and impossible consequences. International Journal of Man-Machine Studies, 27, 587–598.

  • Wagenaar, W. A., Hudson, P. T. W., Reason, J. (1990). Cognitive failures and accidents. Applied Cognitive Psychology, 4, 273–294.

  • Westrum, R. (1993). Cultures with requisite imagination. In J. A. Wise, V. D. Hopkin, P. Stager (Eds.), Verification and Validation of Complex Systems: Human Factors Issues. NATO ASI Series F, Vol. 110. Berlin: Springer-Verlag, pp. 401–416 (this volume).

  • Westrum, R. (1986). Vulnerable technologies: Accident, crime and terrorism. Interdisciplinary Science Reviews, 11 (4), 386–391.

  • Woods, D. D. (1990). Risk and human performance: Measuring the potential for disaster. Reliability Engineering and System Safety, 29, 387–405.

Copyright information

© 1993 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rosness, R. (1993). Limits to Analysis and Verification. In: Wise, J.A., Hopkin, V.D., Stager, P. (eds) Verification and Validation of Complex Systems: Human Factors Issues. NATO ASI Series, vol 110. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-02933-6_10

  • DOI: https://doi.org/10.1007/978-3-662-02933-6_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-08155-2

  • Online ISBN: 978-3-662-02933-6
