I met Professor Dekker a couple of weeks ago during his visit to the Engineering Systems Division at MIT.
He is definitely one of the most intelligent and sharp-thinking people I have ever met. I was impressed with his clarity and speed of thought; I could catch up with his speed of thinking only after he made an iterative loop in his argument.
There are two major themes that run throughout his book. The first is that human error is not the cause but a symptom of trouble. Accepting this notion is the first step toward the New View of human error, in which error can never be the conclusion of an investigation, but rather its starting point. Understanding error means reconstructing the context in which the decision was made. "The human error problem is an organizational problem. This means that understanding human error hinges on understanding the organizational context in which people work."
People tend to react post factum and reconstruct the sequence of events backward in a linear manner. In the aftermath of an accident, one can easily list logical arguments for how and why people should have foreseen and prevented the unfolding events. We readily judge people who failed to take proper action, focusing on their personal shortcomings such as lack of proper training and experience, health conditions, hours of sleep, etc. We usually concentrate our attention on the people who happened to be closest to the accident in terms of time and space.
The second theme of the book is a forward-looking mental model, which focuses on preventing future accidents rather than analyzing past accidents with hindsight bias. Hindsight gives investigators better and more complete information than the people who made the decisions before the failure ever had. It supplies facts that become midpoints in the linear logic flow the investigator reconstructs. As we walk backward, each fact is perceived not as an "intersection point" offering a list of equally valid choices, but rather as the point in the process where the incorrect decision was recorded. Hindsight bias thus exaggerates the importance of the recorded facts over other events not directly related to the specific accident.
The author compares three accident models: (1) the sequence-of-events model, (2) the epidemiological model, and (3) the systemic model. He argues that the last is a holistic approach that views accidents as emerging from interactions between system components and processes, rather than from failures within them.
Dekker warns against "quick fixes" and the misuse of technical labels, which do not describe the gap between reality and our judgments, whereas "safety improvements come from organizations monitoring and understanding the gap between procedures and practice." He concludes that "a safety culture is a culture that allows the boss to hear bad news."