Accidents are not merely incidents or random happenings; they are the result of a series of mistakes that may or may not have been made intentionally. Okoh and Haugen (2013) define major accidents as adverse events such as major leaks/releases, fires, explosions, or loss of structural integrity, leading to serious danger or damage, multiple deaths, and/or major damage to the environment or property. All major industries are prone to accidents and their consequences, and the factors leading to them are broadly similar.
The contributing factors for accidents in the oil & gas, nuclear, healthcare, and aviation industries are briefly discussed below. The oil & gas industry has suffered several calamitous accidents that have resulted in numerous deaths and injuries. Investigations and statistics have shown that most of these are attributable to human factors (Theophilus, Esenowo, Arewa, Ifelebuegu, Nnadi, & Mbanaso, 2017). Every major oil & gas accident has been traced to a direct or indirect human-factor failing. Yang & Haugen (2018) explain the man-made disaster theory, which states that accidents arise from the interaction between humans and the organizational arrangements in a socio-technical system.
It further points out that accidents develop through a long incubation period characterized by the accumulation of an unnoticed set of events that are at odds with accepted beliefs about hazards and the norms for their avoidance. Furthermore, information flow, communication, misunderstandings, plan quality, and overview of activity have been highlighted as factors that might lead to accidents. The Bhopal gas tragedy is one of the worst disasters in the gas industry, resulting in over 6,000 deaths and more than 200,000 injuries. Over the years, several potential causes for the tragedy have been identified, particularly human errors: the supervisor ignoring workers' complaints about the leak in the tank, the warning system not being used after the initial detection of the leak, and the lack of proper safety training for production workers in the event of a leak (Singh & Rehlia, 2016).
The nuclear industry, like any other, is subject to potential accidents due to human error. To better understand accidents at nuclear plants, the International Atomic Energy Agency and the OECD Nuclear Energy Agency developed an international event rating scale in the 1990s to rate nuclear and radiological events. Events are classified on the scale at seven levels: levels 4–7 are termed accidents, while levels 1–3 are termed incidents. Events without any safety significance are classified as Below Scale/Level 0 (Högberg, 2013). Over the years, many contributing causes of nuclear disasters have been put forward. For example, the accident at Chernobyl highlighted the importance of safety culture, management, and organizational factors in the nuclear industry, and how negligence in any of these areas can lead to a catastrophic accident.
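As a reading aid, the seven-level event scale described above can be sketched as a small classifier. This is a minimal sketch of the level bands only; the function name and band labels are this sketch's own wording, not official IAEA terminology.

```python
def classify_ines(level: int) -> str:
    """Map an event level to its band on the seven-level scale.

    Bands follow the description above: 0 is below scale,
    1-3 are incidents, 4-7 are accidents. Labels are illustrative.
    """
    if not 0 <= level <= 7:
        raise ValueError("event levels run from 0 to 7")
    if level == 0:
        return "below scale (no safety significance)"
    if level <= 3:
        return "incident"
    return "accident"
```

Under this sketch, Chernobyl's rating of level 7 falls in the "accident" band.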
Several contributing factors can potentially pave the way for a disaster, for example inadequate knowledge and training, communication failure, over-reliance on machines, and poor decision-making (International Atomic Energy Agency, 2014). In the Chernobyl disaster, human error played a key role: the operator's insufficient understanding of operational safety procedures led to the reactor being brought into an unstable state, violating the prescribed operating limits (Högberg, 2013). The healthcare industry is another area where medical errors can lead to unfortunate fatalities. Makary & Daniel (2016) define medical error as an unintended act (of omission or commission) or one that does not achieve its intended outcome, the failure of a planned action to be completed as intended, the use of a wrong plan to achieve an aim, or a deviation from the process of care that may or may not cause harm to the patient.
A study was carried out to understand the incident reporting systems of two hospital divisions. The study found several differences in the reporting systems that could be related to human error: how reports were investigated, the content of a report (outcome-based versus communication- and near-miss-based), and the feedback staff received from the report (Hewitt, Chreim, & Forster, 2016).
Any misunderstanding or miscommunication in any of the above factors could seriously affect a patient's condition. Another factor that can lead to a serious medical error is the misreading of diagnostic results/tests. Lee, Nagy, Weaver, & Newman-Toker (2013) define diagnostic error as a diagnosis that has been missed, is wrong, or is delayed, as detected by some subsequent definitive test or finding.
Improper communication between two healthcare providers can also lead to a death. For example, in January 2001, Josie was admitted to Johns Hopkins after suffering first- and second-degree burns from climbing into a hot bath. She was treated at the hospital, healed well, and within weeks was about to be discharged; but two days before her discharge she died of dehydration and misused narcotics. Her death was primarily linked to a lack of communication between the different healthcare providers involved (Human Error: Systems and Human Factors in Patient Safety, n.d.).
In an industry where many variables are involved, the number of contributing factors that can lead to an accident or a crash increases greatly. It becomes difficult to assign a single cause to an accident when there can be several contributing factors.
Oster, Strong, and Zorn (n.d.) have defined several ways to identify the cause of a crash, one of which is to identify and select the cause that initiated the sequence of events that culminated in the crash or accident. Other contributing factors associated with plane crashes are equipment failure, pilot error, inadequate training, miscommunication, violations, judgment, and fatigue; the last three are closely related to human error.
Various studies have been carried out to understand what leads to air crashes; the percentages vary, but roughly 60–80% of accidents involve human error in some form (Shappell et al., 2006). HFACS describes human factors at four levels: unsafe acts (of operators such as aircrew and air traffic controllers), preconditions for unsafe acts, unsafe supervision, and organizational influences. In terms of errors in particular, HFACS categorizes error into three types: decision errors, skill-based errors, and perceptual errors. One example of a skill-based error that led to an air crash is the crash of American Airlines flight 587 in New York City in 2001.
In this accident, the tail section of the A300 came off the fuselage. The investigation found that the first officer had applied excessive pressure on the rudder pedals in order to stabilize the plane, which was experiencing heavy turbulence behind a 747. Skill-based errors usually occur with very little or no conscious thought. The pilot was trained to use the rudder pedals to stabilize the plane, without realizing what extreme rudder movements would do to the tail section, which eventually broke off under the extreme pressure.
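The HFACS taxonomy discussed above can be summarized as a small lookup structure. This is an illustrative sketch only: the variable names and one-line glosses are this sketch's own paraphrases of the framework (Shappell et al., 2006), not an official API or official definitions.

```python
# Four levels of the HFACS framework, as listed above.
HFACS_LEVELS = (
    "unsafe acts",
    "preconditions for unsafe acts",
    "unsafe supervision",
    "organizational influences",
)

# The three HFACS error types, with short illustrative glosses.
ERROR_TYPES = {
    "decision error": "a conscious choice or plan that proves inadequate",
    "skill-based error": "a slip or lapse in a practiced action, made with little conscious thought",
    "perceptual error": "action based on degraded or misjudged sensory input",
}

def describe_error(error_type: str) -> str:
    """Return the short gloss for one of the three HFACS error types."""
    return ERROR_TYPES[error_type.lower()]
```

On this sketch, the flight 587 rudder inputs would be looked up as a "skill-based error": a practiced action carried out with little conscious thought.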