Human factors in Aviation
Aeroflot Flight 593 Crash
The Russian airline flight crashed on March 23, 1994, while flying from Moscow Sheremetyevo Airport (SVO) to Hong Kong Kai Tak International Airport (HKG). All 75 people on board were killed, including 12 crew members. The plane crashed near Mezhdurechensk, close to Novokuznetsk in Russia, after suddenly descending from a cruising altitude of over 10,000 m. Investigations into the accident revealed the presence of three unauthorized persons in the cockpit. The three included two teenagers who were the children of Kudrinsky, the flight’s relief captain. He left the teenagers unsupervised in the cockpit while the co-pilot was also distracted, and the actions of one of the teenagers ultimately led to the accident. The aircraft involved in the crash was an Airbus A310 that logged its maiden flight on September 11, 1991 (Aviation Accidents, 2017). Investigators blamed the crash on the gross negligence of the pilots.
Chain of events and error sequence leading to the crash
The relief captain Kudrinsky took over the flight about 30 minutes after takeoff, when Captain Danilov went into the passenger cabin. During this period, the plane was placed in autopilot mode. The aircraft then flew smoothly for the next four hours, until the last 18 minutes, when Kudrinsky allowed Eldar, his 15-year-old son, and Yana, his 13-year-old daughter, together with Captain Makarov, who was flying as a passenger, into the cockpit. The first error was Kudrinsky allowing unauthorized persons into the cockpit, which flouted basic flight safety rules. He then left his two children, who were minors and untrained, in his seat on the left-hand side. He was in charge of the plane at the moment but grossly neglected being in charge of the cockpit. Further, Kudrinsky did not even formally hand over control to the co-pilot, Piskaryov. By leaving his station to unauthorized persons, Kudrinsky contravened crucial pilot safety regulations. The rules state that captains are not permitted to leave their station during the flight; it is allowed only for a short duration and only when the aircraft is in a stable flying state (Szilagyi & Davis, n.d.). In Kudrinsky’s case, however, he carelessly left his station in the hands of two untrained persons, let alone minors.
Kudrinsky then seated his daughter and tried to show her the controls of the airplane. In the process, he turned off the autopilot’s navigation mode. His actions caused the aircraft to deviate from its authorized flight path and placed it in danger of colliding with other planes in the air. Yana continued sitting in the captain’s seat for over seven minutes, during which the plane lacked pilot control. Kudrinsky was more concerned with impressing his daughter with the prestige of flying an aircraft. This left the aircraft in complete danger of contravening any new instructions from ATC. To make the situation worse, the co-pilot was also distracted and neglected either to operate the aircraft or to stop Kudrinsky’s careless actions. Kudrinsky then allowed his son to take the seat and permitted him to turn the control wheel. During this time, Makarov was filming Eldar in the captain’s seat. Eldar continued to exert more force on the control wheel, and his constant inputs interfered with the autopilot (AP) commands (Szilagyi & Davis, n.d.). This caused the AP to disengage and follow the control wheel’s direction.
No one in the cockpit was aware of what was unfolding. The plane still kept its pre-programmed speed and altitude, but Eldar’s inputs on the control wheel caused it to make a right turn without Kudrinsky or the co-pilot noticing. Kudrinsky was busy discussing his leisure pursuits for after he arrived in Hong Kong. He was not at the plane’s instruments, so he had no knowledge of the change in the flight’s conditions. This lack of attention led the pilots to believe Makarov’s claim that the plane had gone into a holding pattern. This was another error by the pilots, as a commercial aircraft never enters a holding pattern once its pre-programmed destination route has been set; the only exception is when it enters a restricted zone, which was not the case. These errors caused the plane to lose control and veer off its programmed speed and altitude. As a result, in the last two minutes, the pilots struggled to regain control of the flight, which was now banked almost vertically. Only Eldar was in a position to control the aircraft, but he had no knowledge of flight operations. Kudrinsky eventually did manage to get back into his captain’s seat; however, it was too late (Szilagyi & Davis, n.d.). Lacking sufficient altitude to stabilize the plane, the pilots could not prevent it from hitting the ground and killing everyone on board.
SHEL model
Using the SHEL model, this section identifies and evaluates the human factors that contributed to the Aeroflot Flight 593 crash. The human element (liveware) is at the center of the SHEL model, which comprises software, hardware, environment, and liveware components.
Liveware-Liveware
This encompasses the interpersonal interactions of the central human operator, which in this case was the pilot and the other flight crew members. In this incident, there was a breakdown in many of the factors involved in liveware-to-liveware interactions, including leadership, cooperation, coordination, and teamwork. The human operator is subject to limitations and variations in performance. Individual factors such as stress, attitude, skills, and communication can affect the performance level of the flight crew; even the best-trained flight crews are vulnerable to lapses and limitations (Kozuba, 2011). Kudrinsky allowed his interactions with his children to distract him from performing his tasks. There was also no cooperation with the co-pilot, who lost focus on the flight while Kudrinsky was engaging his children. This lack of cooperation and teamwork ultimately led to the pilots’ failure to prevent the crash. They allowed Makarov, an unauthorized person, to offer an incorrect suggestion regarding the holding pattern of the aircraft. This fallacy misguided the pilots while the plane was in trouble. Therefore, the mismatch at liveware-liveware occurred through errors in communication and coordination by flight personnel, resulting in the accident.
Liveware-Software
These are the non-physical aspects that determine the operation of an aviation system and the organization of its information. They include aviation safety procedures and rules, manuals, standard operating procedures, and norms or practices that crew members are expected to adhere to. Aviation systems must ensure that pilots operate under strict flight procedures to prevent costly errors (Kozuba, 2011). In this incident, a liveware-software mismatch occurred, leading to the crash. The flight’s relief captain, Kudrinsky, failed to follow basic safety procedures while in the cockpit. First, flight rules mandate that pilots must remain at their station during the entire flight; Kudrinsky broke this rule by allowing his two children to sit in his captain’s seat. Second, he transgressed basic flight safety rules, such as the prohibition on allowing unauthorized persons into the cockpit, by admitting three passengers to the cabin. As a result, he got distracted and lost control of the flight. Further, Kudrinsky did not follow the regulations on handing over the flight to the co-pilot, which would have saved the flight from crashing while he was distracted (Kozuba, 2011). So the liveware-software mismatch happened through procedural safety errors that led to the loss of flight control.
Liveware-Hardware
This is the interaction between aviation machinery and the human component, and it is the first thing that must be matched in the SHEL model. It includes aircraft features such as the design of cockpit controls and displays, information processing systems, and warning systems. These hardware features must be matched to human characteristics to prevent poor designs (Kozuba, 2011). In the Aeroflot Flight 593 crash, Eldar’s actions led to the autopilot disengaging for a couple of seconds. A mismatch occurred when the pilots failed to notice the silent indicator light that was meant to alert them to this change; because the warning was visual only, with no audible signal, it did not catch their attention. The pilots became confused when the screens showed a 180-degree turn similar to a holding pattern. The steep turn made at the prevailing altitude was beyond the aircraft’s design limits, which caused the plane to descend quickly and made it difficult for the pilots to regain control. When Piskaryov later tried to correct the stall, it was too late, as the low altitude made recovery difficult. This liveware-hardware mismatch occurred when the pilots failed to heed the warning signals, leading to the plane hitting the ground.
Liveware-Environment
This entails how the internal and external environments interact with the human operator, helping pilots operate in an optimal environment. It includes internal aspects such as noise, lighting, and temperature, and external elements such as weather conditions and visibility (Kozuba, 2011). In this incident, investigations revealed no mismatch, as the pilots operated in a suitable environment both inside and outside the aircraft. Weather conditions were favorable before the crash.
Classification of errors
Skill-based errors
Skill-based errors occur when a pilot suffers attention or memory lapses that cause them to ignore vital warning signs. Such errors can cause pilots to omit key procedures during the flight; a distracted pilot may also fail to prioritize attention on key flight controls. This can lead a pilot to over-control an aircraft or use flight controls inadvertently (Shappell et al., 2017). In this incident, distractions caused the pilots to fail to recognize the disengagement of the autopilot mode; they did not notice the indicator light. These lapses in concentration led to the errors committed in their attempts to regain flight control.
Decision errors
These are intentional decisions made by pilots with the right motive of saving a situation; however, the choices turn out poorly because of insufficient knowledge. They can include exceeding one’s ability or making inappropriate maneuvers (Shappell et al., 2017). This was evident when Piskaryov pulled the plane out of the dive but overcorrected, which led to the plane stalling and descending. The pilots also did not know that the autopilot could have prevented the stall if they had released the controls.
Perceptual errors
This is when pilots’ perception of reality causes errors, typically due to spatial disorientation or visual illusions. Degraded sensory input causes pilots to make erroneous decisions, misjudging vital aspects such as terrain, altitude, or aircraft speed (Shappell et al., 2017). In Flight 593, for example, the pilots were not aware of their altitude before the crash; as a result, they were unable to recover from the low altitude to which they had descended. The night conditions also caused a temporary loss of spatial orientation.
Significance of findings to understanding human factors
Investigation findings help aviation experts establish the probable cause of a crash. They reveal whether a pilot transgressed any regulation that ultimately compromised flight safety. In the Flight 593 crash, investigators determined that even the best and most experienced pilots are capable of contravening standard operating procedures. These insights help aviation authorities enforce even stricter rules to prevent violations of regulations. For example, in this incident, violating the rule on unauthorized persons led to critical errors in flight control, which brought into sharp focus the importance of limiting pilot interaction to the flight crew only. The findings also helped investigators determine human reactions in the final minutes before the crash, giving insight into pilot actions during emergency situations. By analyzing cockpit data and voice recordings, investigators can identify the common errors pilots make in emergencies. Such knowledge helps technicians develop new equipment, such as warning signals, to aid in an emergency. For instance, the findings revealed that during the final minutes the pilots failed to notice warning signals because they were inaudible. This helped engineers design warning systems that are audible to humans in this aircraft model and prevent a future occurrence of such an incident (Latipulhayat, 2015). Therefore, accident investigation findings will continue to be crucial in improving in-flight human behavior and enhancing safety.
Official investigation report
The official investigation report on Flight 593 revealed that the crash was caused by a stall and spin, which resulted in the plane impacting the ground. It blamed the crash on a combination of factors. One was the hasty decision by Kudrinsky to allow an untrained and unauthorized individual to operate the flight. The report also faulted Kudrinsky for vacating his seat while the aircraft was in autopilot mode. He and his co-pilot failed to detect that the autopilot had been disengaged by the outsider. The report identified weaknesses in the flight deck, such as the lack of an appropriate warning system to alert the pilots that the autopilot mode had cut out. It cited inadequate information in the flight manual regarding the system, making it hard for the crew to detect the disengagement by their senses. It also found that the pilots were late in taking control of the aircraft, as they were focused on correcting the plane’s bank to the right. It stated that a stronger alert signal would have prevented the plane from exceeding the allowable operating bank angle. The investigation report also identified the seating posture of the co-pilot as less than ideal, which limited his ability to enter the control loop earlier. The findings attributed the flight crew’s unpreparedness to act in such a crisis to inadequate drills during their flight training program. The report concluded by highlighting some shortcomings that preceded the flight. One was inadequate provisions regulating the entrance of foreign aircraft models into Russian airspace. Another was inadequate training of Russian civil pilots in recovering from unusual attitudes or maintaining spatial orientation. Also mentioned was insufficient monitoring of flights by the Russian Aviation Authority (Mashkivsky, 1994). One thing the official investigation report failed to mention was that the plane manufacturer and the airline also ought to take responsibility for the crash.
The airline bore responsibility because one of its best pilots failed to follow basic safety rules while in flight, which highlighted weaknesses in its pilot training program. The manufacturer, Airbus, was to blame for failing to adequately train flight operators on the controls and warning systems installed in the aircraft. The investigation report may have avoided this issue because there were also lapses on the part of the Russian aviation authorities, so some of the responsibility rested with them as well.
Lessons from the Aviation accident
The accident showed that giving pilots absolute authority can lead to critical situations. When Kudrinsky was given control of the flight, he abused his authority. Crew members should be given more authority to pinpoint any pilot errors or misgivings; if flight crew members fear raising objections about the conduct of the pilot, the chances of critical incidents increase. This incident led airlines to give pilots ultimate authority to control an aircraft, but within certain limits, while other flight crew members also had a voice in the operation of the flight. The incident also drew attention to the risk of placing too high a level of trust in pilots. Despite their expertise and experience, they are human and fallible and subject to making errors, and crews should not become complacent about their pilots (Anderson, 2011). The crash led airlines to institute flight crew training programs that emphasized human relationships and encouraged proper social interaction among crew members. Increased social interaction has been shown to prevent critical situations in flight.
Another crucial lesson is the provision of adequate information support to pilots. When there is insufficient information to perform a flight control action, the pilot may not be able to complete it. This was evident in the Flight 593 crash, where the pilots had inadequate information regarding the allowable operating bank angle. They also lacked sufficient information and indicators regarding the autopilot system of that particular aircraft model. This lack of correct information made it difficult to save the flight from a crash. This incident has since helped airlines provide sufficient information support in the required format, ensuring pilots have the correct information about critical tasks during flight operations (Holden, Ezer, & Vos, 2013). These indicators and safeguards have helped prevent several crashes.
Policy recommendations
A policy shift is essential to improving aviation safety. One recommendation is changing the organizational approach to safety. Airlines need to institute stronger safety cultures within their structures. This ensures that their employees, such as flight crew members, have a greater awareness of flight hazards. It involves managers and supervisors regularly conducting training programs meant to update crew members on better in-flight safety practices. Flight crew members need to be briefed on any new standard operating procedures or rules and regulations regarding flight operations (Chen et al., 2013). Such information support ensures they uphold strict safety practices while in flight.
These interventions are necessary since any crew member can prevent a colleague from engaging in costly flight errors. For example, in the Flight 593 crash, if the co-pilot had had a more robust safety culture, he could have prevented Kudrinsky from engaging in those careless actions. The co-pilot would have stopped the captain’s actions and ensured he focused on his duty of controlling the flight. However, complacency about flight safety allowed the situation to escalate into a crash. This policy will also encourage increased interaction between crew members and other flight personnel, and hence boost situational awareness (Chen et al., 2013). The Flight 593 incident highlighted the importance of situational awareness among other crew members, who could have intervened.
Teamwork is necessary to boost flight safety. An organization that ensures the double-checking of crucial safety considerations will help minimize pilot errors. Periodic training of staff on air safety behaviors will provide the highest levels of aviation safety within an organization. These training sessions should include an incentive-based system that rewards safe practice. Such a system will reinforce safety behavior and motivate the crew to stick to the organization’s safety principles. Another critical issue in flight safety is redesigning safety procedures and checklists to make them more precise and less ambiguous (Hawkins, 2017). When airline authorities encourage stricter implementation of flight safety procedures, the influence of adverse human factors in aviation will be minimized.
References
Szilagyi, G., & Davis, P. (n.d.). Flight Safety in Recent Russian History.
Kozuba, J. (2011). Impact of human factors on the likelihood of aircraft accidents. Archives of Transport System Telematics, 4, 29-36.
Shappell, S., Detwiler, C., Holcomb, K., Hackworth, C., Boquet, A., & Wiegmann, D. A. (2017). Human error and commercial aviation accidents: an analysis using the human factors analysis and classification system. In Human error in aviation (pp. 73-88). Routledge.
Latipulhayat, A. (2015). The Function and Purpose of Aircraft Accident Investigation According to the International Air Law. Mimbar Hukum-Fakultas Hukum Universitas Gadjah Mada, 27(2), 312-324.
Anderson, B. L. (2011). The Psychology of Safety.
Aviation Accidents. (2017, June 17). Russian Airlines – Airbus A310-308 (F-OGQS) flight AFL593. Retrieved from https://www.aviation-accidents.net/russian-airlines-airbus-a310-308-f-ogqs-flight-afl593/
Mashkivsky, I. (1994). The investigation into the crash of A310-308, registration F-OGQS, on March 22, 1994, near the city of Mezhdurechensk. Aircraft Accident Investigation Commission. Retrieved from https://reports.aviation-safety.net/1994/19940323-0_A310_F-OGQS.pdf
Holden, K., Ezer, N., & Vos, G. (2013). Evidence report: risk of inadequate human-computer interaction.
Chen, J. C., Chi, C. F., & Li, W. C. (2013, July). The analysis of safety recommendation and human error prevention strategies in flight operations. In International Conference on Engineering Psychology and Cognitive Ergonomics (pp. 75-84). Springer, Berlin, Heidelberg.
Hawkins, F. H. (2017). Human factors in flight. Routledge.