
1 Introduction

Advances in digital technology have changed the shape and size of the instruments used for navigation and communication, and with them the actions of pilots, especially in relation to emergency procedures. Few studies correlate the reduction of accidents with these cognitive and technological changes. The increased cognitive load that accompanies them requires assessment. The benefits presented by new technologies do not erase the mental models that career pilots built, with hard work, during their initial training in flying schools. Public opinion must be heeded when an aircraft incident or accident becomes part of the news: in the search for someone or something to blame, the pilot is immediately pronounced guilty, while the underlying factors supported by real evidence are neglected. Readings of the black boxes indicate that 70% to 80% of accidents happen due to human error, or to a string of failures related to the human factor [3]. Among these we can mention stress and the failure to fully understand the new procedures related to technological innovations linked to automation. Complex automation interfaces promote wide differences in philosophy and in operating procedures across aircraft types, including aircraft built by the same manufacturer. In such cases we can frequently identify inadequate training, which contributes to the crews' difficulty in understanding procedures. Accident investigators concluded that the ideal pilot training should include a psychological phase [4], giving the pilot the opportunity for self-knowledge and for identifying possible "psychological breakdowns" that his or her biological machine can present and that endanger the safety of flight. More humane and scientific support would thus be given to the crew and to everyone else involved in aerial activity, reducing the factors that cause incidents and accidents. Accidents do not just happen. They have complex causes that can take days, weeks or even years to develop [5]. When lack of attention and/or neglect results in a crash, we can be almost certain there was a series of interactions between the user and the system that created the conditions for it to happen [6]. We understand that human variability and system failures are among the main sources of human error causing incidents and accidents. The great human effort required to manage and act through the interface, including the task of monitoring, precision in the application of commands and the maintenance of a mental model consistent with the innovations in automation, makes the pilot vulnerable to many situations in which errors can occur.

Human variability in aviation is a possible component of human error, and we can see the consequences of these errors in serious damage to aircraft and harm to people. In modern aviation it is not easy to convey the ability to read the instrument displays. This can lead to deficiencies and misunderstandings in monitoring and in performing control tasks: lack of motivation, stress and fatigue, failures in control (scope, format and activation), poor training, and instructions that are wrong or ambiguous. The pilot's mind is influenced by cognitive and communication components during flight, especially when we observe all the information being processed and consider that the pilot is constantly receiving this information through the instruments. There is information about the altitude, speed and position of the aircraft and the operation of its hydraulic power systems. If any problem occurs, several lights come on and warning sounds emerge, increasing the volume and variety of man-machine communication, which can diminish the perception of detail in the information that must be processed and managed by the pilot. All this information must be processed by the brain at the same time as it decides on the necessary action within a very limited time. There is a limit to the information the brain can deal with, which is part of natural human limitation. This can lead to the unusual situation in which, although the mind is operating normally, the volume of data makes it operate in overload, which may lead to failures and mistakes if we consider the human being as a biological machine [6, 7].

Today there are only the pilot and co-pilot in the modern automated cockpit: just two people to control a Boeing 777, a large, modern and much faster aircraft carrying hundreds of passengers. The tasks of the pilots have multiplied as the weight of aircraft, the number of passengers, and takeoff and landing speeds have grown more significant, while the number of crew members in the cockpit has decreased. However, the biological machine called the human being has not changed structurally in the last thousands of years to support this increased cognitive and emotional overload. How is the pilot to know his or her limits? The profession of flight engineer (the third crew member in the cockpit) was extinguished when computers arrived; until the 1970s there was a dedicated flight engineer's work station. Several procedures that used to be executed by the now-extinct flight engineer were loaded onto the pilots (op. cit.).

2 Theoretical Foundation

The following factors are an integral part of the pilot's cognitive activity: fatigue, body rhythm and rest, sleep and its disorders, the circadian cycle and its changes, G-force and the acceleration of gravity, the physiological demands of high altitude, and night-time take-offs with the problem of the false illusion of climbing. Other physiological demands are also placed on aviators. It is suggested that specific studies be made for each type of aircraft and workplace, with the aim of contributing to the reduction of incidents arising from causes that are so predictable, yet so little studied. We must also give priority to the airmen-scientists who have produced studies in physiology and occupational medicine, since the scarce literature indicates the need for further work in this direction. Human cognition refers to the mental processes involved in thinking and their use. It is a multidisciplinary area of interest that includes cognitive psychology, psychobiology, philosophy, anthropology, linguistics and artificial intelligence as means to better understand how people perceive, learn, remember and think, because this will lead to a much broader understanding of human behavior. Cognition is not an isolated entity; it is composed of a number of components, such as mental imagery, attention, consciousness, perception, memory, language, problem solving, creativity, decision making, reasoning, cognitive changes during development throughout life, human intelligence, artificial intelligence and various other aspects of human thought [8].

The procedures for flying an aircraft involve observation of, and reaction to, events that take place inside the flight cabin and in the environment outside the aircraft [4]. The pilot is required to use the information perceived in order to take the decisions and actions that ensure the safe path of the aircraft at all times. Thus, full use of the cognitive processes becomes dominant if a pilot is to succeed at the task of flying the "heavier than air".

The automated artifacts introduced into the flight cabin to assist the pilot in controlling the aircraft deliver a great load of information that must be processed in a very short space of time, considering the rapidity with which changes occur. An approach that treats the human being merely as an isolated individual is therefore insufficient; the approach should include the pilot's cognition in relation to all these artifacts and to the other workers who share the workspace [9]. The unfolding of accidents is usually generated by badly planned tasks.

A strong component that creates stress and fatigue in pilots concerns the design of protection against, detection of, and effective handling of fire originating from electrical short circuits on board, as tragically encountered on Swissair Flight 111, near Nova Scotia on September 2, 1998. The staff of the Federal Aviation Administration (FAA) responsible for research into human factors and modern automated interfaces reports a situation exacerbated by the widespread use on aircraft of a potentially dangerous electrical wiring product called "Kapton" [4].

If a person had to deal with an outbreak of fire from an electrical source at home, the first thing he would do is disconnect the electrical power at the fuse panel. But this option is not available on aircraft like the Boeing B777 and the new Airbus models. The aviation industry is not adequately addressing the problem of electrical fire in flight and is dealing with it recklessly [10]. The high rate of procedural error associated with cognitive errors in the automation age suggests that aviation designs have ergonomic flaws. In addition, it has been reported that the current generation of jet transport aircraft used by airlines, such as the Airbus A320, A330, A340, Boeing B777, MD11 and the new A380, are virtually "not flyable" without electricity, unlike older-generation types such as the Douglas DC9 and the Boeing 737.

Another factor pressuring pilots and causing emotional fatigue and stress is the reduction of the cockpit crew to just two. The next generation of large four-engine transport airplanes (600 passengers) presents a relatively complex operation and has only two humans in the cockpit. The entire flight operation is performed by these two pilots, including emergency procedures, which should be monitored or re-checked; this is only possible with a three-crew cockpit or a cockpit of very simple operation. According to the FAA, the only two-pilot cockpits that meet these criteria are those of the old DC9-30 and the MD11 series. The current generation of aircraft from Boeing and Airbus does not fit these criteria, particularly with respect to engine fire and electrical fire during flight. The science of combining humans with machines requires close attention to the interfaces that will put these components (human and machine) to work properly together.

The deep study of humans shows their ability to instinctively assess and treat a situation in a dynamic scenario. A good ergonomic design project recognizes that humans are fallible and not very suitable for monitoring tasks, whereas a properly designed machine (such as a computer) can be excellent at them. This monitoring work, together with the increasing amount of information, invariably creates a cognitive and emotional overload and can result in fatigue and stress.

According to a group of ergonomic studies from the FAA [11] in the United States, this scenario is hardly considered by the management of aviation companies and, more seriously, the manufacturers gradually introduce further information on the displays of glass cockpits. These new designs always have some physiological, emotional and cognitive impact on the pilots. The accident records of official institutes such as the NTSB (National Transportation Safety Board, USA) and CENIPA (Aeronautical Accident Investigation and Prevention Center, Brazil) show that difficulties in the operation, maintenance or training for an aircraft which could affect flight safety are not being rapidly and systematically passed on to crews worldwide. These aviation professionals may thus remain unaware of the particular circumstances involved in relevant accidents and incidents, which makes the dissemination of experience very precarious.

One of the myths about the impact of automation on human performance holds that "while investment in automation increases, less investment is needed in human skill". In fact, many experiments have shown that progressive automation creates new and greater demands for knowledge and skill in humans. Investigations by the FAA [11] reported that aviation companies have identified institutional problems inherent in the nature and complexity of automated flight decks. This results in additional knowledge requirements for pilots on how the subsystems and automated methods work differently. Studies showed that the aviation industry introduced the complexities of automated flight decks in ways that induce pilots to develop overly simplified or erroneous mental models of system operation. This applies particularly to the logic of the transition from manual operation mode to automatic mode. Normal training teaches only how to control the automated systems in normal situations but does not fully teach how to manage the different situations that pilots will eventually encounter.
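To make the mode-transition problem concrete, consider the minimal sketch below (Python). It models a drastically simplified, hypothetical autoflight mode logic in which an envelope-protection rule silently changes the active mode; the mode names, the protection threshold and the transition rule are illustrative assumptions, not any manufacturer's actual logic.

```python
# Hypothetical, drastically simplified autoflight mode logic.
# Mode names, threshold and transition rule are illustrative only.

class AutoflightSketch:
    def __init__(self):
        self.mode = "MANUAL"

    def engage_vnav(self):
        # Pilot's mental model: "the autopilot will now hold the path".
        self.mode = "VNAV_PATH"

    def update(self, airspeed_kt, min_safe_kt=210):
        # Envelope protection: if speed decays, the system silently
        # reverts to a mode the pilot never selected.
        if self.mode == "VNAV_PATH" and airspeed_kt < min_safe_kt:
            self.mode = "SPEED_PROTECT"
        return self.mode

ap = AutoflightSketch()
ap.engage_vnav()
print(ap.update(airspeed_kt=230))  # VNAV_PATH: matches the pilot's model
print(ap.update(airspeed_kt=200))  # SPEED_PROTECT: "Why did it do that?"
```

Even in this toy model, the pilot's simplified mental model ("VNAV holds the path") diverges from the system's actual behavior the moment an unannounced transition fires; real autoflight systems multiply this effect across dozens of interacting modes.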

The seriousness of this situation can be proven through the many aviation investigation reports that registered pilots not knowing what to do, in emergency situations, after certain computer decisions had been taken [10]. VARIG (a Brazilian airline), for example, until recently had no Boeing 777 simulators where pilots could practice the emergency loss of automated systems, which should be done at least twice a month, following the example of Singapore Airlines. According to the FAA [11], investigations showed incidents where pilots had trouble successfully operating at a particular level of automation. In some of these situations the pilots incurred long delays by trying to accomplish the task through the automation, rather than trying to find alternative means to accomplish their flight management objectives. Under these circumstances the new system is more vulnerable in sustaining performance and confidence, shaking the human-automation binomial with a progression of confusion and misunderstanding. The qualification program presumes it is important for crews to be prepared to deal with normal situations, with success and with the probable. The history of aviation shows and teaches that a specific emergency situation, if it has not yet happened, will certainly happen.

Future work should assess the systemic performance of pilots. Evaluating performance errors, crew training and qualifications, procedures, operations and regulations allows us to understand the components that contribute to errors. At first sight the errors of pilots can easily be identified, and it can be postulated that many of these errors are predictable and are induced by one or more factors related to the design, training, procedures, policies, or the job. The most difficult task is centering on these errors and promoting corrective action before the occurrence of a potentially dangerous situation. The FAA team that deals with human factors [12] believes it is necessary to improve the ability of aircraft manufacturers and aviation companies to detect and eliminate the features of a design that create predictable errors. The regulations and approval criteria of today do not include a detailed design evaluation of the flight deck aimed at reducing the pilot errors and performance problems that lead to human error and accidents. Neither appropriate criteria nor methods or tools exist for designers, or for those responsible for regulations, to conduct such assessments. Changes must be made in the criteria, standards, methods, processes and tools used in design and certification. Accidents like the crash of the Air Inter (a French airline) Airbus A320 near Strasbourg provide evidence of design deficiencies. This accident highlights weaknesses in several areas, particularly the potential for seemingly minor features to play a significant role in an accident. In this example, inadvertently setting an improper vertical speed may have been an important factor in the accident because of the similarity between the way the flight path angle and the vertical speed are displayed on the FCU (Flight Control Unit).

This issue was raised during the certification approval process, and it was believed that the flight mode annunciations and the PFD (Primary Flight Display, which presents basic flight information) would compensate for any confusion caused by the FCU presentation, and that pilots would use appropriate procedures to monitor the vertical flight path, terrain clearance and energy state. This assessment was incorrect. Under current standards, the cognitive load placed on pilots, the potential errors it breeds and their consequences are not evaluated. The FAA therefore seeks to analyze pilot errors as a means of preventively identifying and removing future design errors that lead to problems and their consequences. This posture is essential for future evaluations of crew workstations in aircraft. Identifying designs that could lead to pilot error early, in the manufacturing stages and the certification process, will allow corrective actions at stages where the cost of correction or modification is viable and the impact on the production schedule is lower. Additionally, looking at the human side, this reduces unnecessary loss of life.
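To make the Strasbourg display similarity concrete, the sketch below (Python) compares the descent rate implied by a 3.3 degree flight path angle with the 3300 ft/min commanded when the same digits are entered in vertical speed mode; the 145 kt ground speed is an assumed, illustrative value, not a figure from the accident report.

```python
import math

# Descent rate implied by a flight path angle, at an assumed ground speed.
FT_PER_NM = 6076.12

def descent_rate_from_fpa(fpa_deg, ground_speed_kt):
    """Vertical speed (ft/min) resulting from holding a flight path angle."""
    along_track_ft_min = (ground_speed_kt / 60.0) * FT_PER_NM
    return along_track_ft_min * math.tan(math.radians(fpa_deg))

intended = descent_rate_from_fpa(3.3, ground_speed_kt=145)
commanded = 3300.0  # "-33" read as a vertical speed, in hundreds of ft/min

print(f"intended:  {intended:.0f} ft/min")   # ~850 ft/min
print(f"commanded: {commanded:.0f} ft/min")  # roughly four times steeper
```

The same two digits thus yield descent rates differing by roughly a factor of four, which is precisely the kind of design-induced error that current certification criteria do not systematically evaluate.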

3 Contextualization

On April 26, 1994, an Airbus A300-600 operated by China Airlines crashed at Nagoya, Japan, killing 264 passengers and flightcrew members. Contributing to the accident were conflicting actions taken by the flightcrew and the airplane's autopilot. The crash provided a stark example of how a breakdown in the flightcrew/automation interface can affect flight safety. Although this particular accident involved an A300-600, other accidents, incidents, and safety indicators demonstrate that this problem is not confined to any one airplane type, airplane manufacturer, operator, or geographical region. This point was tragically demonstrated by the crash of a Boeing 757 operated by American Airlines near Cali, Colombia on December 20, 1995, and by a November 12, 1995 incident (very nearly a fatal accident) in which an American Airlines Douglas MD-80 descended below the minimum descent altitude on approach to Bradley International Airport, CT, clipped the tops of trees, and landed short of the runway.

As a result of the Nagoya accident, as well as other incidents and accidents that appear to highlight difficulties flightcrews experience in interacting with increasing flight deck automation, the Federal Aviation Administration's (FAA) Transport Airplane Directorate, under the approval of the Director, Aircraft Certification Service, launched a study to evaluate the flightcrew/flight deck automation interfaces of current generation transport category airplanes. The following airplane types were included in the evaluation: Boeing: Models 737/757/767/747-400/777, Airbus: Models A300-600/A310/A320/A330/A340, McDonnell Douglas: Models MD-80/MD-90/MD-11, Fokker: Model F28-0100/-0070 [5].

The Federal Aviation Administration chartered a human factors (HUMAN FACTOR) team to address these human factors issues, with representatives from the FAA Aircraft Certification and Flight Standards Services, the National Aeronautics and Space Administration, and the Joint Aviation Authorities (JAA), assisted by technical advisors from the Ohio State University, the University of Illinois, and the University of Texas. The HUMAN FACTOR Team [11] was asked to identify specific or generic problems in design, training, flightcrew qualifications, and operations, and to recommend appropriate means to address these problems. In addition, the HUMAN FACTOR Team was specifically directed to identify those concerns that should be the subject of new or revised Federal Aviation Regulations (FAR), Advisory Circulars (AC), or policies. The HUMAN FACTOR Team relied on readily available information sources, including accident/incident reports, Aviation Safety Reporting System reports, research reports, and trade and scientific journals. In addition, meetings were held with operators, manufacturers, pilots' associations, researchers, and industry organizations to solicit their input. Additional inputs were received from various individuals and organizations interested in the HUMAN FACTOR Team's efforts [11].

When examining the evidence, the HUMAN FACTOR Team found that traditional methods of assessing safety are often insufficient to pinpoint vulnerabilities that may lead to an accident. Consequently, the HUMAN FACTOR Team examined accident precursors, such as incidents, errors, and difficulties encountered in operations and training. The HUMAN FACTOR Team also examined research studies that were intended to identify issues and improve understanding of difficulties with flightcrew/automation interaction. In examining flightcrew error, the HUMAN FACTOR Team recognized that it was necessary to look beyond the label of flightcrew error to understand why the errors occurred [10].

We looked for contributing factors from design, training and flightcrew qualification, operations, and regulatory processes. While the HUMAN FACTOR Team was chartered primarily to examine the flightcrew interface to the flight deck systems, we quickly recognized that considering only the interface would be insufficient to address all of the relevant safety concerns. Therefore, we considered issues more broadly, including issues concerning the functionality of the underlying systems. From the evidence, the HUMAN FACTOR Team identified issues that show vulnerabilities in flightcrew management of automation and situation awareness, including concerns about:

  • Pilot understanding of the automation’s capabilities, limitations, modes, and operating principles and techniques. The HUMAN FACTOR Team frequently heard about automation “surprises,” where the automation behaved in ways the flightcrew did not expect. “Why did it do that?” “What is it doing now?” and “What will it do next?” were common questions expressed by flightcrews from operational experience.

  • Differing pilot decisions about the appropriate automation level to use or whether to turn the automation on or off when they get into unusual or non-normal situations (e.g., attempted engagement of the autopilot during the moments preceding the A310 crash at Bucharest). This may also lead to potential mismatches with the manufacturers’ assumptions about how the flightcrew will use the automation.

Flightcrew situation awareness issues included vulnerabilities in, for example:

  • Automation/mode awareness. This was an area where we heard a universal message of concern about each of the aircraft in our charter.

  • Flight path awareness, including insufficient terrain awareness (sometimes involving loss of control or controlled flight into terrain) and energy awareness (especially low energy state); a worked example of the energy relation appears after this list.
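Energy awareness lends itself to a simple worked relation: an aircraft's specific energy height combines altitude with airspeed, h_e = h + V²/2g, so a low-energy state can hide behind a normal-looking altitude. The sketch below (Python) illustrates this with assumed sample states; the numbers are for illustration only.

```python
# Energy height: h_e = h + V^2 / (2 g). Sample states are illustrative.
G_FT_S2 = 32.174          # gravitational acceleration, ft/s^2
KT_TO_FT_S = 1.68781      # knots to feet per second

def energy_height_ft(altitude_ft, airspeed_kt):
    v = airspeed_kt * KT_TO_FT_S
    return altitude_ft + v * v / (2.0 * G_FT_S2)

# Two aircraft at the same altitude; the slower one holds far less
# recoverable energy and is in a low-energy state:
print(f"{energy_height_ft(1500, 180):.0f} ft")  # ~2934 ft
print(f"{energy_height_ft(1500, 120):.0f} ft")  # ~2137 ft
```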

These vulnerabilities appear to exist to varying degrees across the current fleet of transport category airplanes in our study, regardless of the manufacturer, the operator, or whether accidents have occurred in a particular airplane type. Although the Team found specific issues associated with particular design, operating, and training philosophies, we consider the generic issues and vulnerabilities to be a larger threat to safety, and the most important and most difficult to address. It is this larger pattern that serves as a barrier to needed improvements to the current level of safety, or could threaten the current safety record in the future aviation environment. It is this larger pattern that needs to be characterized, understood, and addressed. In trying to understand this larger pattern, the Team considered it important to examine why these vulnerabilities exist [4]. The Team concluded that the vulnerabilities are there because of a number of interrelated deficiencies in the current aviation system:

  • Insufficient communication and coordination. Examples include lack of communication about in-service experience within and between organizations; incompatibilities between the air traffic system and airplane capabilities; poor interfaces between organizations; and lack of coordination of research needs and results between the research community, designers, regulators, and operators.

  • Processes used for design, training, and regulatory functions inadequately address human performance issues. As a result, users can be surprised by subtle behavior or overwhelmed by the complexity embedded in current systems operated within the current operating environment. Process improvements are needed to provide the framework for consistent application of principles and methods for eliminating vulnerabilities in design, training, and operations.

  • Insufficient criteria, methods, and tools for design, training, and evaluation. Existing methods, data, and tools are inadequate to evaluate and resolve many of the important human performance issues. It is relatively easy to get agreement that automation should be human-centered, or that potentially hazardous situations should be avoided; it is much more difficult to get agreement on how to achieve these objectives.

  • Insufficient knowledge and skills. Designers, pilots, operators, regulators, and researchers do not always possess adequate knowledge and skills in certain areas related to human performance. It is of great concern to this team that investments in necessary levels of human expertise are being reduced in response to economic pressures when two-thirds to three-quarters of all accidents have flightcrew error cited as a major factor.

  • Insufficient understanding and consideration of cultural differences in design, training, operations, and evaluation. The aviation community has an inadequate understanding of the influence of culture and language on flightcrew/automation interaction. Cultural differences may reflect differences in the country of origin, philosophy of regulators, organizational philosophy, or other factors. There is a need to improve the aviation community’s understanding and consideration of the implications of cultural influences on human performance.

4 Conclusion

A few decades ago, in my early life entering the airlines, we were taught to fly in the Automatic Control in the Throttle Quadrant (TQ) course, with SOPs (Standard Operating Procedures) attached. Line operations were refined during line training. The initial emphasis was on knowing how the new automated system worked and how to fly it. Line training refined these skills and expanded into how to operate within the airways system and a multitude of busy airports and small visual airfields. Understanding the complexities of the systems came with the 'apprenticeship' that had begun. When automation became readily available we used it to reduce workload when we felt like it. We did not really trust it, but we used it knowing we could easily disconnect it when it did not do what we wanted. Now some airlines want everything done on autopilot because it can fly better than any pilot. Airlines hire young pilots with little experience and show them that they no longer need to hand-fly because of automation. Labor is cheap. In preparing our thesis we developed a study focusing on the culpability of pilots in accidents. In fact, the official records of aircraft accidents cite the participation of the pilots as a large contributing factor in these events. Modifying this scenario is very difficult in the short term, but the results of our study on the root causes of human participation show the possibility of changing this situation. The cognitive factor has a high participation in the origins of the problems (42% of all accidents found in our search). If we consider other factors, such as lack of usability applied to the ergonomics of products, the choice of inappropriate materials and poor design, for example, this percentage is even higher. Time is a factor to consider. This generates a substantial change in the statistical findings on contributing factors and culpability in accidents. The last consideration in this process is that solutions, however relevant and true, become visible only somewhat later. In aviation these processes come very slowly, because everything is widely tested and involves many people and institutions. The criteria adopted by the official organizations responsible for investigating aviation accidents do not provide alternatives that allow a clearer view of the problems that are consequences of cognitive factors or of other problems originating in ergonomic factors. We must also consider that some of these criteria mask the possibility that the pilot was powerless to act in certain circumstances. The immediate result is a streamlining of culpability in the accident that invariably falls on the human factor as a single cause or a contributing factor. Many errors are classified only as "pilot incapacitation" or "navigational error". Our research shows that there is a misunderstanding, and a need to distinguish between disability and pilot incapacitation (because of inadequate training), or even navigational error.

Our thesis produced a comprehensive list of accidents and a database that allows extraction of the ergonomic, systemic and emotional factors that contribute to aircraft accidents. These records are not forced into pre-existing correlations, stereotypes or patterns; the patterns are structured by the system itself as the accident records are entered. We developed a computer system to manage this database, called the Aviation Accident Database. The data collected for implementing the database came from the main international entities for the registration and prevention of aircraft accidents, such as the NTSB (USA), CAA (Canada), ZAA (New Zealand) and CENIPA (Brazil). The system analyses each accident and determines the direction and convergence of its focus group, instantly classifying it according to its characteristics and assigning it to an existing pattern if matching conditions already exist in a prior grouping. Otherwise, the system starts formatting a new accident profile.
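A minimal sketch of this grouping behavior, under stated assumptions, is shown below in Python: each incoming record either joins the most similar existing profile or seeds a new one. The factor names, the similarity measure and the 0.6 threshold are hypothetical illustrations, not the actual logic of the Aviation Accident Database.

```python
# Hypothetical sketch of the profile-grouping behavior described above.
# Factor names, similarity measure and threshold are illustrative only.

def similarity(a: dict, b: dict) -> float:
    """Fraction of factor flags on which two accident records agree."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def assign(record: dict, profiles: list, threshold: float = 0.6) -> None:
    """Join the most similar existing profile, or start a new one."""
    best, best_score = None, 0.0
    for group in profiles:
        score = similarity(record, group[0])  # compare with the profile seed
        if score > best_score:
            best, best_score = group, score
    if best is not None and best_score >= threshold:
        best.append(record)
    else:
        profiles.append([record])

profiles = []
assign({"cognitive": True, "training": True, "design": False}, profiles)
assign({"cognitive": True, "training": True, "design": True}, profiles)
assign({"cognitive": False, "training": False, "design": True}, profiles)
print(len(profiles))  # 2: one cognitive/training profile, one design profile
```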

This grouping feature allows the system to determine a second type of grouping, reporting details of the accident that could help point to evidence of the origin of the errors, especially for those accidents related to a cognitive vector. Our study showed different scenarios when the accidents were correlated with multiple variables. This possibility is due to the capability of the Aviation Accident Database system [6, 7], which allows this type of analysis. It is necessary to identify accurately the problems or errors that make it impossible for pilots to act properly. These problems could point, eventually, to a temporary incapacity of the pilot due to limited capability or to training inadequate to the automation of the aircraft. We must also consider many other reasons that can mitigate the effective participation or culpability of the pilot. Addressing these problems from a systemic view expands the frontiers of research into, and prevention of, aircraft accidents.

The system's purpose is to correlate a large number of variables. The data collected converge on the casualties of accidents involving aircraft and can thus greatly aid scientific cognitive studies and their application in aviation training schools or even in aviation companies. This large database could be used in the prevention of aircraft accidents, allowing other conclusions to be reached that would result in equally important ways to improve air safety and save lives.
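As an illustration of the kind of multi-variable correlation such a database enables, the short sketch below uses pandas to cross-tabulate a factor column against outcome severity; the column names and the five toy records are invented for the example and do not come from the actual database.

```python
import pandas as pd

# Invented toy records; column names are hypothetical, not the real schema.
df = pd.DataFrame({
    "cognitive_factor": [True, True, False, True, False],
    "design_flaw":      [True, False, False, True, True],
    "fatal":            [True, True, False, True, False],
})

# Cross-tabulate a contributing factor against outcome severity.
print(pd.crosstab(df["cognitive_factor"], df["fatal"]))

# Share of accidents in which a cognitive factor was flagged:
print(f"{df['cognitive_factor'].mean():.0%}")  # 60% in this toy sample
```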