Pilot error

1994 Fairchild Air Force Base B-52 crash, caused by flying the aircraft beyond its operational limits. Here the aircraft is seen in an unrecoverable bank, moments before the crash. This incident is now used in military and civilian aviation environments as a case study in teaching crew resource management.

Actual flight path (red) of TWA Flight 3 from departure to crash point (controlled flight into terrain). The blue line shows the nominal Las Vegas course, while green is a typical course from Boulder. The pilot inadvertently flew the Boulder outbound course instead of the appropriate Las Vegas course.

Location of the crash landing of Varig Flight 254 after it ran out of fuel (navigational error), with the departure airport (Marabá) and destination airport (Belém) shown on a map of Brazil.

Runway collision caused by taking the wrong taxiing route (red instead of green), as the control tower had not given clear instructions. The accident occurred in thick fog.

The Tenerife disaster now serves as a textbook example.[1] Due to several misunderstandings, the KLM flight tried to take off while the Pan Am flight was still on the runway. The airport was accommodating an unusually large number of large aircraft, resulting in disruption of the normal use of taxiways.

The three-pointer altimeter is one of the designs most prone to misreading by pilots (a cause of the UA 389 and G-AOVD crashes).

Pilot error (sometimes called cockpit error) is a decision, action or inaction by a pilot or crew of an aircraft determined to be a cause or contributing factor in an accident or incident. Pilot error can be a mistake, oversight, lapse in judgment, or failure to exercise due diligence by pilots during the performance of their duties.

Pilot error stems from psychological and physiological human limitations. Various forms of threat and error management have been incorporated into pilot training programs to teach crew members how to deal with the threats and errors that arise throughout the course of a flight.

A broader view of how human factors fits into a system is now considered standard practice by accident investigators when examining the chain of events that led to an accident.[2][3]

Description

In an accident caused by pilot error, it is usually assumed that the pilot in command (captain) made the error unintentionally. However, an intentional disregard for a standard operating procedure (or warning) is still considered pilot error, even if the pilot's actions would justify criminal charges.[citation needed]

Pilot error is a mistaken decision or action by the pilot, particularly under emergency conditions. As the commander of the aircraft, the pilot is regarded as one of the most important factors in an accident, because the pilot's decisions determine everything that happens on board, yet those decisions can be affected by countless external elements. Despite these limitations, the findings demonstrate the value that analyses of accident patterns can have for improving passenger safety.[4]

The pilot may be a factor even during adverse weather conditions if the investigating body deems that the pilot did not exercise due diligence.[citation needed] The responsibility for the accident in such a case would depend upon whether the pilot could reasonably know of the danger and whether he or she took reasonable steps to avoid the weather problem.[citation needed] Flying into a hurricane (for other than legitimate research purposes) would be considered pilot error;[original research?] flying into a microburst would not be considered pilot error if it was not detectable by the pilot, or in the time before this hazard was understood.[original research?] Some weather phenomena (such as clear-air turbulence or mountain waves) are difficult to avoid, especially if the aircraft involved is the first aircraft to encounter the phenomenon in a certain area at a certain time.[original research?]

Placing pilot error as the cause of an aviation accident has often been controversial. For example, the NTSB ruled[citation needed] that the crash of American Airlines Flight 587 resulted from the failure of the vertical stabilizer, caused by "unnecessary and excessive rudder pedal inputs" on the part of the co-pilot who was flying the aircraft at the time. Attorneys for the co-pilot, who was killed in the crash, argued that American Airlines' pilots had never been properly trained concerning extreme rudder inputs.[citation needed] The attorneys also claimed that the failure was actually caused by a flaw in the design of the Airbus A300 and that the co-pilot's rudder inputs should not have caused the catastrophic structural failure that led to the accident, which killed 265 people.[citation needed]

Modern accident investigators avoid the words "pilot error", as the scope of their work is to determine the cause of an accident rather than to apportion blame. Furthermore, any attempt to blame pilots ignores that they are part of a broader system, which in turn may be at fault for their fatigue, work pressure or lack of training.[3] ICAO and its member states therefore adopted the Reason Model in 1993 in an effort to better understand the role of human factors in aviation accidents.[5]

Nevertheless, pilot error remains a major cause of air accidents. During 2004, pilot error was cited as the primary cause of 78.6% of fatal general aviation (GA) accidents, and as the primary cause of 75.5% of GA accidents overall in the US.[6] Pilot errors have multiple causes: decision errors can arise from tendencies and biases, as well as from breakdowns in the processing of incoming information. For pilots, such errors are particularly likely to produce not just mistakes but fatalities.[7]

Causes of Pilot Error

Pilots work in complex environments and are routinely exposed to high levels of situational stress in the workplace, which can induce pilot error and threaten flight safety. While aircraft accidents are infrequent, they are highly visible and often involve massive loss of life. For this reason, research on the causal factors of pilot error and on methodologies for mitigating the associated risk is exhaustive. Pilot error results from physiological and psychological limitations inherent in humans: "Causes of error include fatigue, workload, and fear as well as cognitive overload, poor interpersonal communications, imperfect information processing, and flawed decision making."[8] Throughout the course of every flight, crews are intrinsically subjected to a variety of external threats and commit a range of errors that have the potential to negatively impact the safety of the aircraft.[9]

Threats

The term "threat" is defined as any event "external to flight crew's influence which can increase the operational complexity of a flight." [10] Threats may further be broken down into environmental threats and airline threats. Environmental threats are ultimately out of the hands of crew members and the airline, as they hold no influence on "adverse weather conditions, hazardous , air traffic control shortcomings, bird strikes, and high terrain." [10] Conversely, airline threats are not manageable by the flight crew, but may be controlled by the airline's management. These threats include "aircraft malfunctions, cabin interruptions, operational pressure, ground/ramp errors/events, cabin events and interruptions, ground maintenance errors, and inadequacies of manuals and charts."[10]

Errors

The term "error" is defined as any action or inaction leading to deviation from team or organizational intentions.[8] Error stems from physiological and psychological human limitations such as illness, medication, stress, alcohol/drug abuse, fatigue, emotion etc. Error is inevitable in humans and is primarily related to operational and behavioural mishaps.[11] Errors can vary from incorrect altimeter setting and deviations from flight course to more severe errors such as exceeding maximum structural speeds or forgetting to put down landing or takeoff flaps.

Decision Making

Reasons for the poor reporting of accidents include staff being too busy, confusing data-entry forms, lack of training and education, lack of feedback to staff on reported data, and punitive organizational cultures.[12] Wiegmann and Shappell applied three cognitive models to analyze approximately 4,000 pilot factors associated with more than 2,000 U.S. Naval aviation mishaps. Although the three models differ slightly in the types of errors they describe, all three lead to the same conclusion: errors of judgment.[13] Three error types emerged: decision-making, goal-setting, and strategy-selection errors, all of which were strongly associated with primary accidents.[13] For example, on 28 December 2014, AirAsia Flight 8501, carrying seven crew members and 155 passengers, crashed into the Java Sea after the captain made several fatal mistakes in poor weather conditions; in this case, the captain chose to climb at an unacceptably high rate.[14]

Psychological Illness

Psychological fitness requirements for pilots are set out in aviation law and enforced by individual airlines. Facing multiple special challenges, pilots must exercise control in complicated environments. Psychological illness is typically defined as an unintended physical, mental, or social injury, harm or complication that results in disability, death, or increased use of health care resources.[15] Due to physiological problems such as jet lag, pilots often feel unwell after long flights. Psychological illness is regarded as a serious problem for pilots and has caused several fatal accidents in the past.[16]

  • SilkAir Flight 185 – On 19 December 1997, Flight 185 crashed into the Musi River near Palembang in southern Sumatra, killing all 97 passengers and seven crew on board. The investigation concluded that the evidence pointed to the captain and that the crash was a deliberate act of suicide.
  • EgyptAir Flight 990 – On 31 October 1999, the Boeing 767 crashed into the Atlantic Ocean south of Nantucket Island, Massachusetts, killing all 217 people on board. Although never conclusively proven, the crash was considered a deliberate act by the relief first officer.[17]
  • Germanwings Flight 9525 – On 24 March 2015, the aircraft crashed in the French Alps, killing all 144 passengers and six crew members. The co-pilot, Andreas Lubitz, had been treated for suicidal tendencies and declared unfit to work by a doctor, but hid this information from his employer. During the flight, Lubitz locked the captain out of the cockpit and deliberately flew the aircraft into a mountain.

Threat and Error Management (TEM)

TEM involves the effective detection of, and response to, internal or external factors that have the potential to degrade the safety of an aircraft's operations.[9] Methods of teaching TEM stress replicability, or reliability of performance across recurring situations.[18] TEM aims to prepare crews with the "coordinative and cognitive ability to handle both routine and unforeseen surprises and anomalies."[18] The desired outcome of TEM training is the development of 'resiliency': the ability to recognize and act adaptively to disruptions which may be encountered during flight operations. TEM training occurs in various forms, with varying levels of success. These training methods include data collection using the Line Operations Safety Audit (LOSA), implementation of crew resource management (CRM), cockpit task management (CTM), and the integrated use of checklists in both commercial and general aviation. Other resources built into most modern aircraft that help minimize risk and manage threat and error are airborne collision avoidance systems (ACAS) and ground proximity warning systems (GPWS).[19] With the consolidation of onboard computer systems and the implementation of proper pilot training, airlines and crew members look to mitigate the inherent risks associated with human factors.

Line Operations Safety Audit (LOSA)

LOSA is a structured observational program designed to collect data for the development and improvement of countermeasures to operational errors.[20] Through the audit process, trained observers collect information regarding the normal procedures, protocols, and decision-making processes flight crews undertake when faced with threats and errors during normal operation. This data-driven analysis of threat and error management is useful for examining pilot behavior in relation to situational analysis, and it provides a basis for the further implementation of safety procedures or training to help mitigate errors and risks.[10] Observers on audited flights typically record the following:[20]

  • Potential threats to safety
  • How the threats are addressed by the crew members
  • The errors the threats generate
  • How crew members manage these errors (action or inaction)
  • Specific behaviors known to be associated with aviation accidents and incidents

LOSA was developed to assist crew resource management practices in reducing human error in complex flight operations.[10] LOSA produces beneficial data that reveals how many errors or threats are encountered per flight, the number of errors which could have resulted in a serious threat to safety, and correctness of crew action or inaction. This data has proven to be useful in the development of CRM techniques and identification of what issues need to be addressed in training.[10]
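
As an illustration of the per-flight tallies described above (threats encountered, errors committed, and the correctness of crew action or inaction), the following Python sketch shows one possible way to aggregate audit observations. The record layout and field names are assumptions for illustration, not the actual LOSA instrument.

    # Hypothetical aggregation of LOSA-style observations for a single flight.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Observation:
        kind: str            # "threat" or "error"
        description: str
        managed: bool        # did the crew detect and resolve it correctly?
        consequential: bool  # could it have escalated into a serious safety threat?

    @dataclass
    class FlightAudit:
        flight_id: str
        observations: List[Observation] = field(default_factory=list)

        def summary(self) -> dict:
            threats = [o for o in self.observations if o.kind == "threat"]
            errors = [o for o in self.observations if o.kind == "error"]
            return {
                "threats": len(threats),
                "errors": len(errors),
                "consequential_errors": sum(o.consequential for o in errors),
                "mismanaged_events": sum(not o.managed for o in self.observations),
            }

    audit = FlightAudit("example-001")
    audit.observations.append(Observation("threat", "late runway change", True, False))
    audit.observations.append(Observation("error", "wrong altimeter setting", False, True))
    print(audit.summary())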

Crew Resource Management (CRM)

CRM is the "effective use of all available resources by individuals and crews to safely and effectively accomplish a mission or task, as well as identifying and managing the conditions that lead to error."[21] CRM training has been integrated into, and made mandatory for, most pilot training programs, and it has become the accepted standard for developing human factors skills for air crews and airlines. Although there is no universal CRM program, airlines usually customize their training to best suit the needs of the organization; the principles of each program are nonetheless usually closely aligned. According to the U.S. Navy, there are seven critical CRM skills:[21]

  • Decision Making - the use of logic and judgement to make decisions based on available information
  • Assertiveness - willingness to participate and state a given position until convinced by facts that another option is more correct
  • Mission Analysis - ability to develop short and long term contingency plans
  • Communication - clear and accurate sending and receiving of information, instructions, commands and useful feedback
  • Leadership - ability to direct and coordinate activities of pilots & crew members
  • Adaptability/Flexibility - ability to alter course of action due to changing situations or availability of new information
  • Situational Awareness - ability to perceive the environment within time and space, and comprehend its meaning

These seven skills comprise the critical foundation for effective aircrew coordination. With the development and use of these core skills, flight crews "highlight the importance of identifying human factors and team dynamics to reduce human errors that lead to aviation mishaps."[21]

Application and Effectiveness of CRM

Since the implementation of CRM circa 1979, following NASA research that identified the need for improved resource management, the aviation industry has seen a tremendous evolution in the application of CRM training procedures.[22] CRM has developed through a series of generations:

  • First generation: emphasized individual psychology and testing, where corrections could be made to behaviour.
  • Second generation: featured a shift in focus to cockpit group dynamics.
  • Third generation: diversification of scope and an emphasis on training crews in how they must function both in and out of the cockpit.
  • Fourth generation: CRM integrated procedure into training, allowing organizations to tailor training to their needs.
  • Fifth generation (current): acknowledges that human error is inevitable and provides information to improve safety standards.[23]

Today, CRM is implemented through pilot and crew training sessions, simulations, and interactions with senior-ranked personnel and flight instructors, such as flight briefings and debriefings. Although it is difficult to measure the success of CRM programs, studies have shown a correlation between CRM programs and better risk management.[23]

Cockpit Task Management (CTM)

Multiple sources of information can be read from a single interface: in one scan, a pilot can take in attitude, altitude and airspeed.

Cockpit task management (CTM) is the "management level activity pilots perform as they initiate, monitor, prioritize, and terminate cockpit tasks."[24] A 'task' is defined as a process performed to achieve a goal (e.g. fly to a waypoint, descend to a desired altitude).[24] CTM training focuses on teaching crew members how to handle concurrent tasks which compete for their attention. This includes the following processes:

  • Task Initiation - when appropriate conditions exist
  • Task Monitoring - assessment of task progress and status
  • Task Prioritization - relative to the importance and urgency for safety
  • Resource Allocation - assignment of human and machine resources to tasks which need completion
  • Task Interruption - suspension of lower priority tasks for resources to be allocated to higher priority tasks
  • Task Resumption - continuing previously interrupted tasks
  • Task Termination - the completion or incompletion of tasks

The need for CTM training is a result of the limited capacity of human attention and working memory. Crew members may devote more mental or physical resources to a particular task which demands priority or affects the immediate safety of the aircraft.[24] CTM has been integrated into pilot training and goes hand in hand with CRM. Some aircraft operating systems have aided CTM by combining instrument gauges onto one screen. An example is a digital attitude indicator, which simultaneously shows the pilot the heading, airspeed, descent or ascent rate and a wealth of other pertinent information. Implementations such as these allow crews to gather multiple sources of information quickly and accurately, freeing up mental capacity for other, more pressing tasks.
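
The prioritization process listed above behaves much like a priority scheduler. The toy Python sketch below models that behaviour; the class name, the task names and the numeric priority scale (lower number = more urgent) are assumptions for illustration, not an avionics implementation.

    # Toy model of cockpit task management as a priority queue.
    import heapq

    class CockpitTaskManager:
        def __init__(self):
            self._pending = []  # min-heap of (priority, task); lower number = more urgent

        def initiate(self, priority: int, task: str):
            """Task initiation: queue a task when appropriate conditions exist."""
            heapq.heappush(self._pending, (priority, task))

        def run(self):
            """Task prioritization: always attend to the most urgent pending task."""
            while self._pending:
                priority, task = heapq.heappop(self._pending)
                print(f"attending to: {task} (priority {priority})")
                # Task termination: the task completes and attention moves on.

    ctm = CockpitTaskManager()
    ctm.initiate(2, "monitor descent rate")
    ctm.initiate(3, "radio company operations")
    ctm.initiate(1, "respond to GPWS warning")  # most urgent: handled before the others
    ctm.run()  # handles the warning first, then the descent, then the radio call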

Checklists like these ensure that pilots follow operational procedures, and they aid memory recall.

Checklists

The use of checklists before, during and after flights has established a strong presence in all types of aviation as a means of managing error and reducing the possibility of risk. Checklists are highly regulated and consist of protocols and procedures for the majority of the actions required during a flight.[25] The objectives of checklists include "memory recall, standardization and regulation of processes or methodologies."[25] The use of checklists in aviation has become an industry standard practice, and completing checklists from memory is considered a violation of protocol and pilot error. Studies have shown that increased errors in judgement, reduced cognitive function and changes in memory function are among the effects of stress and fatigue,[26] both of which are inevitable human factors in the commercial aviation industry. The use of checklists in emergency situations also contributes to troubleshooting and to reverse-examining the chain of events which may have led to a particular incident or crash.

Apart from checklists issued by regulatory bodies such as the FAA or ICAO, or checklists made by aircraft manufacturers, pilots also have personal qualitative checklists intended to ensure their fitness and ability to fly the aircraft. An example is the IM SAFE checklist (illness, medication, stress, alcohol, fatigue/food, emotion), one of a number of qualitative assessments which pilots may perform before or during a flight to ensure the safety of the aircraft and passengers.[25] These checklists, along with a number of other redundancies integrated into most modern aircraft operating systems, help ensure that the pilot remains vigilant and, in turn, reduce the risk of pilot error.
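
As an illustration of such a qualitative self-assessment, the following Python sketch walks through the IM SAFE items named above. The question wording is paraphrased for illustration and is not regulatory language.

    # Minimal sketch of an IM SAFE-style personal preflight self-check.
    IM_SAFE = {
        "Illness": "Am I free of symptoms that could impair my performance?",
        "Medication": "Am I free of medication with disqualifying side effects?",
        "Stress": "Am I free of stress that could distract me in flight?",
        "Alcohol": "Am I well clear of any recent alcohol consumption?",
        "Fatigue/Food": "Am I rested and adequately nourished?",
        "Emotion": "Am I emotionally fit to fly?",
    }

    def im_safe_check(answers: dict) -> bool:
        """Fit to fly only if every item is answered affirmatively."""
        return all(answers.get(item, False) for item in IM_SAFE)

    answers = {item: True for item in IM_SAFE}
    answers["Fatigue/Food"] = False      # a single 'no' should ground the pilot
    print(im_safe_check(answers))        # False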

Notable examples

One of the most famous incidents of an aircraft disaster attributed to pilot error was the nighttime crash of Eastern Air Lines Flight 401 near Miami, Florida on December 29, 1972. The captain, first officer, and flight engineer had become fixated on a faulty landing gear light and had failed to realize that the flight controls had been bumped by one of the crew, altering the autopilot settings from level flight to a slow descent. Told by ATC to hold over a sparsely populated area away from the airport while they dealt with the problem (with, as a result, very few lights on the ground visible to act as an external reference), the distracted flight crew did not notice the plane losing height and the aircraft eventually struck the ground in the Everglades, killing 101 out of 176 passengers and crew.

The subsequent National Transportation Safety Board (NTSB) report on the incident blamed the flight crew for failing to monitor the aircraft's instruments properly. Details of the incident are now frequently used as a case study in training exercises by aircrews and air traffic controllers.

During 2004 in the United States, pilot error was listed as the primary cause of 78.6% of fatal general aviation accidents, and as the primary cause of 75.5% of general aviation accidents overall.[27] For scheduled air transport, pilot error typically accounts for just over half of worldwide accidents with a known cause.[28]

  • 28 July 1945 – a United States Army Air Forces B-25 bomber bound for Newark Airport crashed into the 79th floor of the Empire State Building after the pilot became lost in a heavy fog bank situated over Manhattan. All three crewmen were killed as well as eleven office workers in the building.
  • 24 December 1958 – BOAC Bristol Britannia 312, registration G-AOVD, crashed as a result of controlled flight into terrain (CFIT) near Winkton, England, while on a test flight. The crash was caused by a combination of bad weather and a failure on the part of both pilots to read the altimeter correctly. The first officer and two other people survived.
  • 3 January 1961 – Aero Flight 311 crashed near Kvevlax, Finland. All twenty-five occupants were killed in the crash, the worst in Finnish history. An investigation later determined that both pilots were intoxicated during the flight, and may have been interrupted by a passenger at the time of the crash.
  • 28 February 1966 – American astronauts Elliot See and Charles Bassett were killed when their T-38 Talon crashed into a building at Lambert-St. Louis International Airport during bad weather. A NASA investigation concluded that See had been flying too low on his landing approach.
  • 29 December 1972 – Eastern Air Lines Flight 401 crashed into the Florida Everglades after the flight crew failed to notice the deactivation of the plane's autopilot, having been distracted by their own attempts to solve a problem with the landing gear. Out of 176 occupants, 75 survived the crash.
  • 27 March 1977 – the Tenerife disaster; a senior KLM pilot failed to hear, understand or follow tower instructions, causing two Boeing 747s to collide on the runway at Tenerife; 583 people were killed in the worst-ever air disaster.
  • 28 December 1978 – United Airlines Flight 173; the captain, a flight simulator instructor, allowed his Douglas DC-8 to run out of fuel while investigating a landing gear problem. United Airlines subsequently changed its policy to disallow "simulator instructor time" in calculating a pilot's "total flight time". It was thought that a contributory factor to the accident was that an instructor can control the amount of fuel in simulator training so that it never runs out.
  • 13 January 1982 – Air Florida Flight 90, a Boeing 737-200 with 79 passengers and crew, crashed into the 14th Street Bridge and careened into the Potomac River shortly after taking off from Washington National Airport. Seventy-five passengers and crew, and four motorists on the bridge, were killed. The NTSB report blamed the flight crew for not properly employing the plane's de-icing system.
  • 19 February 1985 – above the Pacific Ocean the crew of China Airlines Flight 006 lost control of their Boeing 747SP after the No. 4 engine flamed out. The aircraft fell 10,000 feet in twenty seconds and lost a total of 30,000 feet in two-and-a-half minutes before control was regained. There were no fatalities but the aircraft was badly damaged.
  • 28 August 1988 – the Ramstein airshow disaster; a member of an Italian aerobatic team misjudged a manoeuvre, causing a mid-air collision. Three pilots and 67 spectators on the ground were killed.
  • 31 August 1988 – Delta Air Lines Flight 1141 crashed on takeoff after the crew forgot to deploy the flaps for increased lift. Of the 108 crew and passengers on board, fourteen were killed.
  • 8 January 1989 – in the Kegworth air disaster, a fan blade broke off in the left engine of a new Boeing 737-400, but the pilots mistakenly shut down the right engine. The left engine eventually failed completely and the crew could not restart the right engine before the aircraft crashed. Instrumentation on the 737-400 was different from earlier models, but no flight simulator for the new model was available in Britain.
  • 3 September 1989 – The crew of Varig Flight 254 made a series of mistakes so that their Boeing 737 ran out of fuel hundreds of miles off-course above the Amazon jungle. Thirteen died in the ensuing crash landing.
  • 21 October 1989 – Tan-Sahsa Flight 414 crashed into a hill near Toncontin International Airport in Tegucigalpa, Honduras, because of a poorly executed landing procedure by the pilot; 127 people died in the accident.
  • 24 November 1992 – China Southern Airlines Flight 3943 departed Guangzhou on a 55-minute flight to Guilin. During the descent towards Guilin, at an altitude of 7,000 feet (2,100 m), the captain attempted to level off by raising the nose while the auto-throttle was still engaged for descent; the crew did not notice that the number 2 power lever remained at idle, which led to an asymmetric power condition. The aircraft crashed on descent to Guilin Airport, killing all 141 aboard.
  • 23 March 1994 – Aeroflot Flight 593 crashed on its way to Hong Kong. The captain, Yaroslav Kudrinsky, had invited his two children into the cockpit and permitted them to sit at the controls, against airline regulations. His fifteen-year-old son, Eldar Kudrinsky, accidentally disconnected the autopilot, causing the plane to bank to the right before diving. The co-pilot pulled the nose up too far, causing the plane to stall and enter a flat spin. The pilots recovered from the spin, but the plane crashed into a forest, killing all 75 people on board.
  • 24 June 1994 – a B-52 crashed at Fairchild Air Force Base. The crash was largely attributed to the personality and behavior of Lt Col Arthur "Bud" Holland, the pilot in command, and to delayed reactions to earlier incidents involving him. Because of this history, Lt Col Mark McGeehan, a USAF squadron commander, had refused to allow any of his squadron members to fly with Holland unless he (McGeehan) was also on the aircraft. This crash is now used in military and civilian aviation environments as a case study in teaching crew resource management.
  • 30 June 1994 – Airbus Industrie Flight 129, a certification test flight of the Airbus A330-300, crashed at Toulouse-Blagnac Airport. While the crew were simulating an engine-out emergency just after takeoff with an extreme center of gravity location, the pilots chose improper manual settings which rendered the autopilot incapable of keeping the plane in the air, and by the time the pilot in command regained manual control it was too late. The aircraft was destroyed, killing the flight crew, a test engineer, and four passengers, who included Airbus and airline customer VIPs. The investigative board concluded that the pilot in command was overworked from earlier flight testing that day and was unable to devote sufficient time to the preflight briefing. As a result, Airbus had to revise the engine-out emergency procedures.
  • 2 July 1994 – USAir Flight 1016 crashed into a residential area and the airplane was destroyed, killing 37 of the 57 people on board; the 20 surviving passengers and crew were injured.
  • 20 December 1995 – American Airlines Flight 965, a Boeing 757-200 with 155 passengers and a crew of eight, departed Miami approximately two hours behind schedule at 18:35 Eastern Standard Time (EST) and later crashed into mountainous terrain while descending towards Cali, Colombia; only four passengers survived. Investigators believe that the pilots' unfamiliarity with the modern technology installed in the Boeing 757-200 may have played a role. The pilots did not know their position in relation to a radio beacon in Tulua; the aircraft was equipped to provide that information electronically, but according to sources familiar with the investigation, the pilot apparently did not know how to access it.
  • 6 August 1997 – Korean Air Flight 801 crashed into Nimitz Hill, three miles from Guam International Airport. The captain's failure to properly conduct the non-precision approach led to the accident; the National Transportation Safety Board identified fatigue as a likely factor in the captain's poor decision-making. Of the 254 passengers and crew, 228 were killed.
  • 12 October 1997 – Singer John Denver died when his newly bought Rutan Long-EZ home-built aircraft crashed into the Pacific Ocean off Pacific Grove, California. The NTSB indicated that Denver lost control of the aircraft while attempting to manipulate the fuel selector handle, which had been placed in a hard-to-reach position by the aircraft's builder. The NTSB cited Denver's unfamiliarity with the aircraft's design as a cause of the crash.
  • 16 July 1999 – John F. Kennedy, Jr. died when his plane, a Piper Saratoga, crashed into the Atlantic Ocean off the coast of Martha's Vineyard, Massachusetts. The NTSB officially declared that the crash was caused by "the pilot's failure to maintain control of his airplane during a descent over water at night, which was a result of spatial disorientation". Kennedy did not hold a certification for IFR flight, but did continue to fly after weather conditions obscured visual landmarks.
  • 31 August 1999 – 65 people died after Lineas Aéreas Privadas Argentinas (LAPA) flight 3142 crashed after an attempted take-off with the flaps retracted.
  • 31 October 2000 – Singapore Airlines Flight 006 was a Boeing 747-412 that took off from the wrong runway at the then Chiang Kai-Shek International Airport. It then collided with construction equipment on the runway, bursting into flames and killing 83 of its 179 occupants.
  • 12 November 2001 – American Airlines Flight 587 encountered heavy turbulence and the co-pilot over-applied the rudder pedals, swinging the Airbus A300 from side to side. The excessive stress caused the vertical stabilizer to fail. The A300 spun and hit a residential area, destroying five houses and killing 265 people. Contributing factors included wake turbulence and pilot training.
  • 24 November 2001 – Crossair Flight 3597 crashed into a forest on approach to runway 28 at Zurich Airport after Captain Lutz descended below the minimum safe altitude of 2,400 feet for the approach.
  • 15 April 2002 – Air China Flight 129, a Boeing 767-200, crashed near Pusan, South Korea killing 128 of the 166 people aboard. The co-pilot had been flying too low.
  • 25 October 2002 – eight people, including US Senator Paul Wellstone, were killed in a crash near Eveleth, Minnesota. The NTSB concluded that "the flight crew did not monitor and maintain minimum speed."
  • 3 January 2004 – Flash Airlines Flight 604 dived into the Red Sea shortly after takeoff, killing all 148 people on board. The captain had encountered vertigo; his control column was slanted to the right, he did not notice, and the 737 banked until it was unable to stay in the air. It is Egypt's worst air disaster.
  • 26 February 2004 – a Beech 200 carrying Macedonian President Boris Trajkovski crashed, killing Trajkovski and eight other passengers. The crash investigation ruled that the accident was caused by "procedural mistakes by the crew" during the landing approach.
  • 14 August 2005 – the pilots of Helios Airways Flight 522 lost consciousness, most likely due to hypoxia caused by failure to switch the cabin pressurization to "Auto" during the pre-flight preparations. The Boeing 737-300 crashed after running out of fuel, killing all on board.
  • 3 May 2006 – Armavia Flight 967 performed a CFIT, killing all on board, after the pilot lost spatial awareness during a simultaneous turn and climb.
  • 27 August 2006 – Comair Flight 191 failed to become airborne and crashed at Blue Grass Airport after the flight crew inadvertently attempted takeoff from a much shorter secondary runway rather than the intended takeoff runway. 49 of the 50 on board, including all 47 passengers, were killed.
  • 1 January 2007 – Adam Air Flight 574; The crew's preoccupation with a malfunction of the inertial reference system diverted their attention from the flight instruments and allowed the increasing descent and bank angle to go unnoticed. Appearing to have become spatially disoriented, the pilots did not detect and appropriately arrest the descent soon enough to prevent loss of control. This caused the aircraft to hit the water at high speed and a steep angle and disintegrate, killing all 102 people on board.[29]
  • 7 March 2007 – Garuda Indonesia Flight 200; poor Crew Resource Management and the failure to extend the flaps led the aircraft to run off the end of the runway after landing. Twenty-two of the 140 occupants were killed.
  • 12 February 2009 – Colgan Air Flight 3407, flying as Continental Connection, entered a stall and crashed into a house in Clarence Center, New York, due to the captain's and first officer's lack of awareness of the airspeed and the captain's improper reaction to the plane's stick-shaker stall warning system. All 49 people on the plane died, along with one person inside the house.
  • 1 June 2009 – Air France Flight 447 entered a stall and crashed into the Atlantic Ocean following pitot tube failure and improper control inputs by the first officer. All 216 passengers and 12 crew members died.
  • 10 April 2010 – 2010 Polish Air Force Tu-154 crash; during a descent towards Russia's Smolensk North Airport, the flight crew of the Polish presidential jet ignored automatic warnings and attempted a risky landing in heavy fog. The Tupolev Tu-154M descended too low and crashed into a nearby forest; all of the occupants were killed, including Polish president Lech Kaczynski, his wife Maria Kaczynska, and numerous government and military officials.
  • 22 May 2010 – Air India Express Flight 812 overshot the runway at Mangalore Airport, killing 158 people. The plane touched down 610 metres beyond the usual touchdown point after a steep descent. CVR recordings showed that the British-Serbian captain had been asleep and woke up just minutes before landing; his impaired alertness resulted in a fast, steep approach, and the plane ran off the end of the tabletop runway.
  • 28 July 2010 – Airblue Flight 202 crashed into the Margalla Hills after the pilot flew off the prescribed course, killing all 152 people aboard.
  • 20 June 2011 – RusAir Flight 9605 crashed onto a motorway while on final approach to Petrozavodsk Airport in western Russia, after the intoxicated navigator encouraged the captain to land in heavy fog. Forty-three people died in the crash; only five survived.
  • 6 July 2013 – Asiana Airlines Flight 214 struck its tail on the seawall short of runway 28L at San Francisco International Airport and slid down the runway. The crash caused three fatalities and 187 injuries, and was the first fatal crash of a Boeing 777. Investigators reported that the accident was caused by an approach flown below the target approach speed and glide path during final approach.
  • 6 February 2015 – one of the engines of TransAsia Flight 235 experienced a flameout. The aircraft was capable of flying on the remaining engine, but the pilot shut down the correctly functioning engine instead of the failed one, leaving the plane without power; his attempts to restart both engines were unsuccessful. The plane clipped a bridge and plummeted into a Taiwanese river as the pilot tried to avoid city terrain, killing 37 of the 53 on board.[citation needed]

References

  1. [unresolved citation]
  2. [unresolved citation]
  3. [unresolved citation]
  4. [unresolved citation]
  5. [unresolved citation]
  6. [unresolved citation]
  7. Foyle, D. C., & Hooey, B. L. (Eds.) (2007). Human Performance Modeling in Aviation. CRC Press.
  8. [unresolved citation]
  9. [unresolved citation]
  10. [unresolved citation]
  11. [unresolved citation]
  12. [unresolved citation]
  13. Wiegmann, D. A., & Shappell, S. A. (2001). Human error perspectives in aviation. The International Journal of Aviation Psychology, 11(4), 341–357.
  14. Stacey, Daniel (15 January 2015). "Indonesian Air-Traffic Control Is Unsophisticated, Pilots Say". The Wall Street Journal. Retrieved 26 January 2015.
  15. [unresolved citation]
  16. Wickens, C. D. (2002). Situation awareness and workload in aviation. Current Directions in Psychological Science, 11(4), 128–133.
  17. Paxson, P. (2002). Have you been injured? The current state of personal injury lawyers' advertising. The Journal of Popular Culture, 36(2), 191–199.
  18. [unresolved citation]
  19. [unresolved citation]
  20. [unresolved citation]
  21. [unresolved citation]
  22. [unresolved citation]
  23. [unresolved citation]
  24. [unresolved citation]
  25. [unresolved citation]
  26. [unresolved citation]
  27. 2005 Joseph T. Nall Report.
  28. PlaneCrashInfo.com accident statistics.
  29. Aircraft Accident Investigation Report, Indonesian National Transportation Safety Committee: http://www.dephub.go.id/knkt/ntsc_aviation/baru/Final_Report_PK-KKW_Release.pdf