Abstract
Recently, Norman (Living with complexity, 2011) wrote, “Machines have rules they follow. They are designed and programmed by people, mostly engineers and programmers, with logic and precision. As a result, they are often designed by technically trained people who are far more concentrated about the welfare of their machines than the welfare of the people who will use them. The logic of machines is imposed on people, human beings who do not work by the same rules of logic.” Isn’t it obvious? Nevertheless, this is what we observe every day, and very little is being done in engineering to solve this recurring problem effectively. Ergonomists have made this kind of observation for a long time, preaching the adaptation of machines to people and not the opposite. What is new is the treatment of this requirement not as a post-development validation of machines (i.e., human factors and ergonomics, or HFE), but as a pre-design process and a life-cycle iterative process (i.e., human-centered design, or HCD). Cognitive engineering is about understanding people’s needs and experience along the life cycle of a product and, most importantly, about influencing its high-level requirements definition.
Notes
1. http://en.wikipedia.org/wiki/Ammonium_nitrate_disasters.
2. I recently spoke with a young architect who told me that this term could be confusing, since in architecture, architects have to perceive, understand, and project themselves inside the house (interior) they are building in order to investigate human factors issues, reflecting them in the exterior and structure as well. In this book, “inside-out” refers to the inside of technology: the kernel, bricks, beams, engines and so on. It means that design proceeds from the kernel of technology to the periphery where users are (i.e., a technology-centered approach). In fact, when I talk about architects, I mean human-centered designers, the ones who try to find out how the houses they build will be used. In addition, some architecture schools currently tend to move toward engineering. This is why human-centered design should have its role as a discipline in its own right.
3. What are emerging cognitive functions? First, let us define a human cognitive function, which can be modeled by the role it plays in the organism in which it is involved. It should be noted that this role is defined in a limited context, which also needs to be defined. Finally, a cognitive function is necessarily supported by resources, which can be cognitive functions themselves. Consequently, a cognitive function can be defined by the triplet “role, context, resources” (Boy 1998, 2011). Note that both humans and information-based machines have cognitive functions. For example, people have situation awareness and decision-making cognitive functions; speed control systems installed in cars have regulation mechanisms that maintain a given speed; and collision avoidance systems inform pilots that their aircraft is approaching another flying vehicle and that they must take immediate action to avoid collision. When designing a new system, cognitive functions can be defined deliberately as prescriptions, but they most naturally emerge from human activity. Emerging cognitive functions cannot be easily anticipated at design time. Furthermore, it may take some time to discover them. This is why tests are so crucial (e.g., flight testing in aeronautics).
4. Artifact, User, Task, Organization and Situation. The AUTOS pyramid will be more extensively described in Chap. 7, devoted to Modeling and Simulation.
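The “role, context, resources” triplet in note 3 lends itself to a simple recursive data structure. The sketch below is a hypothetical illustration, not Boy’s formal model; the class name, field names, and the example functions are assumptions made for clarity.

```python
from dataclasses import dataclass, field

@dataclass
class CognitiveFunction:
    """Illustrative sketch of the (role, context, resources) triplet."""
    role: str        # the role the function plays for the agent
    context: str     # the limited context in which that role is defined
    # Resources supporting the function; they may be cognitive functions themselves.
    resources: list["CognitiveFunction"] = field(default_factory=list)

# Hypothetical example: a car's speed regulation function serving as a
# resource for the driver's situation-awareness function.
maintain_speed = CognitiveFunction(
    role="maintain a given speed",
    context="highway driving with cruise control engaged",
)
supervise_automation = CognitiveFunction(
    role="situation awareness",
    context="driver supervising the speed control system",
    resources=[maintain_speed],
)
```

The recursion in `resources` captures the point that a cognitive function is supported by resources that can themselves be cognitive functions, whether allocated to humans or to machines.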
References
Alexander, I. (2001). Visualizing requirements in UML. Telelogic Newsbyte, Issue 13, Sept–Oct, http://www.telelogic.com/newsbyte/article.cfm?id=0000201800032832. Accessed 15 April 2012.
Amalberti, R. (2001). The paradoxes of almost totally safe transportation systems. Safety Science, 37, 109–126.
Artman, H., & Garbis, C. (1998). Situation awareness as distributed cognition. In Proceedings of ECCE ’98, Limerick.
Barthelemy, F., Hornus, H., Roussot, J., Hufschmitt, J. P., & Raffoux, J.F. (2001). Accident on the 21st of September 2001 at a factory belonging to the Grande Paroisse Company in Toulouse. Report of the General Inspectorate for the Environment, October.
Beringer, D. B., & Hancock, P. A. (1989). Exploring situational awareness: A review and the effects of stress on rectilinear normalization. In Proceedings of the Fifth International Symposium on Aviation Psychology (Vol. 2, pp. 646–651). Columbus: Ohio State University.
Bernoulli, D. (1738). Exposition of a new theory of measurement of risk (L. Sommer, Trans., 1954). Econometrica, 22(1), 22–36.
Billings, C. E., & Cheaney, E. (1981). Information transfer problems in the aviation system. NASA TP-1875. Moffett Field: NASA Ames Research Center.
Billings, C. E. (1995). Situation awareness measurement and analysis: A commentary. Proceedings of the International Conference on Experimental Analysis and Measurement of Situation Awareness. Florida: Embry-Riddle Aeronautical University Press.
Billings, C. E. (1996). Aviation automation: The search for a human-centered approach. Mahwah: Erlbaum.
Bødker, S. (1996). Creating conditions for participation: Conflicts and resources in systems design. Human Computer Interaction, 11(3), 215–236.
Boy, G. A. (1986). An expert system for fault diagnosis in orbital refueling operations. AIAA 24th Aerospace Sciences Meeting, Reno.
Boy, G. A. (1987). Operator assistant systems. International Journal of Man-Machine Studies, 27, 541–554. Also in G. Mancini, D. D. Woods, & E. Hollnagel (Eds.), Cognitive engineering in dynamic worlds. London: Academic.
Boy, G. A. (1998). Cognitive function analysis. Ablex: Greenwood. ISBN 9781567503777.
Boy, G. A. (2002). Theories of human cognition: To better understand the co-adaptation of people and technology, in knowledge management, organizational intelligence and learning, and complexity. In L. D. Kiel (Ed.), Encyclopedia of life support systems (EOLSS), developed under the auspices of the UNESCO. Oxford: Eolss. http://www.eolss.net.
Boy, G. A. (2002). Procedural interfaces (in French). Proceedings of the National Conference on Human-Computer Interaction (AFIHM). New York: ACM.
Boy, G. A. (2011). Cognitive function analysis in the design of human and machine multi-agent systems. In G. A. Boy (Ed.), Handbook of human-machine interaction: A Human-centered design approach. Aldershot: Ashgate.
Boy, G. A., & Brachet, G. (2008). Risk taking. Dossier of the air and space academy. Toulouse: ASA.
Cockburn, A. (2001). Writing effective use cases. Addison-Wesley.
Davenport, T. H. (2005). Thinking for a living: How to get better performance and results from knowledge workers. Boston: Harvard Business School Press. ISBN 1591394236.
Disasters caused by ammonium nitrate (2010). http://en.wikipedia.org/wiki/Ammonium_nitrate_disasters. Accessed 16 Dec 2011.
Dreyfus, S. E., & Dreyfus, H. L. (1980). A five-stage model of the mental activities involved in directed skill acquisition. Operations Research Center, ORC-80–2. Berkeley: University of California.
Endsley, M. R. (1988). Situation awareness global assessment technique (SAGAT). Paper presented at the National Aerospace and Electronic Conference (NAECON), Dayton.
Endsley, M. R. (1995). Measurement of situation awareness in dynamic systems. Human Factors, 37, 65–84.
Endsley, M. R. (1996). Automation and situation awareness. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and applications (pp. 163–181). Mahwah: Laurence Erlbaum.
Evans, J. (1990). Bias in human reasoning: Causes and consequences. London: Lawrence Erlbaum Associates.
FABIG (2011). Major accident listing: AZF (Azote de France) fertilizer factory. Toulouse. March. http://www.fabig.com/Accidents/AZF+Toulouse.htm. Accessed 17 Dec 2011.
Fitts, P. M. (1951). Human engineering for an effective air navigation and traffic control system. Washington DC: National Research Council, Committee on Aviation Psychology.
Grudin, J. (1993). Obstacles to participatory design in large product development organizations. In A. Namioka & D. Schuler (Eds.), Participatory design. Principles and practices (pp. 99–122). Hillsdale: Lawrence Erlbaum Associates.
Hollnagel, E. (1993). Human reliability analysis: Context and control. London: Academic.
Hollnagel, E. (1998). Cognitive reliability and error analysis method: CREAM. New York: Elsevier.
Hollnagel, E., & Amalberti, R. (2001). The emperor’s new clothes, or whatever happened to “human error”? Invited keynote presentation at 4th International Workshop on Human Error, Safety and System Development. Linköping.
Jacobson, I. (1992). Object oriented software engineering: A use case driven approach. Addison-Wesley.
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–292.
Kellert, S. H. (1993). In the wake of chaos: Unpredictable order in dynamical systems. Chicago: University of Chicago Press. ISBN 0-226-42976-8.
Klein, G. A. (1997). The recognition-primed decision (RPD) model: Looking back, looking forward. In C. E. Zsambok & G. A. Klein (Eds.), Naturalistic decision-making. Mahwah: Lawrence Erlbaum Associates.
Klein, G. A., & Klinger, D. (1991). Naturalistic decision-making. Human Systems IAC Gateway, XI(3), 16–19, Winter.
Kulak, D., & Guiney, E. (2000). Use cases: Requirements in context. Addison-Wesley.
Lilly, S. (1999). Use case pitfalls: Top 10 problems from real projects using use cases. In Proceedings of the Technology of Object-Oriented Languages and Systems. IEEE.
Llewellyn, D. J. (2003). The psychology of risk taking behavior. PhD Thesis, The University of Strathclyde.
Loukopoulos, L. D., Dismukes, R. K., & Barshi, I. (2009). The multitasking myth—Handling complexity in real-world operations. Aldershot: Ashgate. ISBN: 978-0-7546-7382-8.
Maturana, H. R., & Varela, F. J. (1980). Autopoiesis: The organization of the living. In H.R. Maturana & F.J. Varela (Eds.), Autopoiesis and cognition: The realization of the living (pp. 59–138). Dordrecht: D. Reidel.
Meister, D. (1999). The history of human factors and ergonomics. Mahwah: Lawrence Erlbaum Associates. ISBN 0805827692.
Mosco, V., & McKercher, C. (2007). Introduction: Theorizing knowledge labor and the information society. In C. McKercher & V. Mosco (Eds.), Knowledge workers in the information society (pp. vii–xxiv). Lanham: Lexington Books.
Mosier-O’Neill, K. L. (1989). A contextual analysis of pilot decision-making. In R. S. Jensen (Ed.), Proceedings of the Fifth International Symposium of Aviation Psychology. Columbus: Ohio State University.
Muller, M. J. (2007). Participatory design: The third space in HCI (revised). In J. Jacko & A. Sears (Eds.), Handbook of HCI (2nd ed.). Mahwah: Erlbaum.
Nielsen, J. (1993). Usability engineering. Boston: Academic Press.
Norman, D. (1988). The psychology of everyday things. New York: Basic Books.
Norman, D. A. (2011). Living with complexity. Cambridge: MIT.
Peirce, C. S. (1958). Science and philosophy: Collected papers of Charles S. Peirce (Vol. 7). Cambridge: Harvard University Press.
Rasmussen, J. (1983). Skills, rules, knowledge; signals, signs and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man and Cybernetics, 13, 257–266.
Reason, J. (1990). Human error. Cambridge: Cambridge University Press.
Rosenbloom, P. S., & Newell, A. (1987). Learning by chunking, a production system model of practice. In D. Klahr, P. Langley, & R. Neches (Eds.), Production system models of learning and development (pp. 221–286). Cambridge: MIT.
Salmon, P. M., Stanton, N. A., Walker, G. H., & Jenkins, D. P. (2009). Distributed situation awareness: Theory, measurement and application to teamwork. Aldershot: Ashgate. ISBN: 978-0-7546-7058-2.
Sarter, N. B., & Woods, D. D. (1995). How in the world did we ever get into that mode? Mode error and awareness in supervisory control. Human Factors, 37(1), 5–19.
Sheridan, T. B., & Verplank, W. (1978). Human and computer control of undersea teleoperators. Cambridge: Man-Machine Systems Laboratory, Department of Mechanical Engineering, MIT.
Sperber, D. (2005). Modularity and relevance: How can a massively modular mind be flexible and context-sensitive? In P. Carruthers, S. Laurence & S. Stich (Eds.), The innate mind: Structure and content. Oxford: Oxford University Press.
Spirkovska, L. (2010). Intelligent automation approach for improving pilot situational awareness. NASA Ames Research Center.
Suchman, L. (1987). Plans and situated actions: The problem of human-machine communication. Cambridge: Cambridge University Press.
Von Neumann, J., & Morgenstern, O. (1944). Theory of games and economic behavior (2nd ed. 1947, 3rd ed. 1953). Princeton: Princeton University Press.
Wickens, C. D. (1987). Information processing, decision-making and cognition. In G. Salvendy (Ed.), Handbook of human factors. New York: Wiley.
Copyright information
© 2013 Springer-Verlag London
Cite this chapter
Boy, G. (2013). Cognitive Engineering. In: Orchestrating Human-Centered Design. Springer, London. https://doi.org/10.1007/978-1-4471-4339-0_3
DOI: https://doi.org/10.1007/978-1-4471-4339-0_3
Publisher Name: Springer, London
Print ISBN: 978-1-4471-4338-3
Online ISBN: 978-1-4471-4339-0