Abstract
This chapter presents an analytical framework that addresses and moves beyond the "classical" debate between direct manipulation and interface agents. Direct manipulation is appropriate when the system to be controlled is simple. When users must interact with complex systems, however, direct manipulation becomes complex in turn and demands a substantial level of expertise: users need training, in some cases extensive training, and they need assistance to satisfy overall criteria such as safety, comfort, and high performance. Artificial agents are developed to assist users in controlling complex systems. Although they are usually intended to simplify work, in practice they tend to change the nature of work, and they do not eliminate the need for training. Artificial agents are evolving very rapidly and incrementally create new practices. Each artificial agent is associated with a cognitive function. Cognitive function analysis enables human-centered design of artificial agents by answering questions such as: artificial agents for what? Why are artificial agents not accepted, or not usable, by users? An example is provided, analyzed, and evaluated, and current critical issues are discussed.
© 2004 Springer Science + Business Media, Inc.
Boy, G. (2004). Human-Centered Automation: A Matter of Agent Design and Cognitive Function Allocation. In: Reis, R. (eds) Information Technology. IFIP International Federation for Information Processing, vol 157. Springer, Boston, MA. https://doi.org/10.1007/1-4020-8159-6_11
Print ISBN: 978-1-4020-8158-3
Online ISBN: 978-1-4020-8159-0