Humans and Their Relation to Ill-Defined Systems

  • Neville Moray
Part of the NATO Conference Series book series (NATOCS, volume 16)


Abstract

Using a taxonomy of systems proposed by Ashby, this paper examines the various species of ill-defined systems that may occur in man-machine interaction. A distinction is made between objectively ill-defined systems and effectively ill-defined systems. Various properties of humans as information processors are examined, and it is claimed that humans turn almost all systems, whether initially well-defined or objectively ill-defined, into effectively ill-defined systems. It is suggested that while conscious decision making is particularly ill-suited to controlling ill-defined systems, sheer practice frequently enables humans to control them through skills which are not understood even by their owners.






References

  1. Ashby, R., 1956, “An Introduction to Cybernetics,” Chapman and Hall, London.
  2. Conant, R., 1976, Laws of information which govern systems, IEEE Trans. Sys., Man & Cyb., SMC-6:240.
  3. Conant, R., 1980, Set-theoretic structure modelling, Int. J. General Systems, 6:1.
  4. Conant, R., and Ashby, R., 1970, Every good regulator of a system must be a model of that system, Int. J. Syst. Science, 1:89.
  5. Dale, H., 1968, Weighing evidence: an attempt to assess the efficiency of the human operator, Ergonomics, 11:215.
  6. Gaines, B., 1976, On the complexity of causal models, IEEE Trans. Sys., Man & Cyb., SMC-6:56.
  7. Hopf-Weichel, R., Lucciani, L., Saleh, J., and Freedy, A., 1979, Aircraft emergency decisions: cognitive and situational variables, Perceptronics PATR 1065-79-7.
  8. Kahneman, D., and Tversky, A., 1971, Belief in the law of small numbers, Psychol. Bull., 76:105.
  9. Kelley, C., 1968, “Manual and Automatic Control,” Wiley, N.Y.
  10. Kleinman, D. L., Baron, S., and Levison, W. H., 1970, An optimal control model of human response, part I: theory and validation, Automatica, 6:357.
  11. McRuer, D., Hoffman, L., Jex, H., Moore, G., Phatak, A., Weir, D., and Wolkovitch, J., 1968, New approaches to human-pilot/vehicle dynamic analysis, AFFDL-TR-67-150, Wright-Patterson A.F.B.
  12. Moray, N., 1980a, Human information processing and supervisory control, M.I.T. Man-Machine System Laboratory Report.
  13. Moray, N., 1980b, The use of information transmission as nonparametric correlation in the analysis of complex behaviour, M.I.T. Man-Machine System Laboratory Report.
  14. Peterson, C., and Beach, L., 1967, Man as an intuitive statistician, Psychol. Bull., 68:29.
  15. Rouse, W., 1980, “Systems Engineering Models of Human-Machine Interaction,” North-Holland-Elsevier, New York.
  16. Schrenck, L., 1969, Aiding the decision maker: a decision process model, Ergonomics, 12:543.
  17. Simon, H., 1962, The architecture of complexity, Proc. Amer. Philos. Soc., 106:467.
  18. Tversky, A., and Kahneman, D., 1974, Judgment under uncertainty: heuristics and biases, Science, 185:1124.
  19. Young, L., 1969, On adaptive manual control, Ergonomics, 12:635.

Copyright information

© Plenum Press, New York 1984

Authors and Affiliations

  • Neville Moray
  1. Department of Industrial Engineering, University of Toronto, Toronto, Canada
