Approximately 5.7 million workers are injured annually in the U.S. In the healthcare field alone, the Institute of Medicine (IOM) reported in 1999 that medical error was accountable for between 44,000 and 98,000 deaths per year.
Human error is almost certainly a contributor to such undesirable outcomes, because human decision-making drives behavior. When we look at the impact of decision errors, we can easily see that poor decision-making results in preventable deaths, costly equipment downtime, and poor product quality, and hence reduced profitability.
In 2004, 91 percent of fatal work injuries occurred in private industry. Within private industry, 47 percent of those deaths occurred in service-providing industries and 44 percent in goods-producing industries. Published 2004 statistics from the Bureau of Labor Statistics also show that construction industry deaths rose 8 percent, to 1,224.
These statistics suggest an upward trend in fatal accidents. However, consider all of the accidents that never rise to the level of a fatality. Such incidents result in varying degrees of financial loss and expose the organization to varying degrees of risk.
In 1969, Frank E. Bird analyzed 1,753,498 "accidents" reported by 297 companies and found a ratio of 600:30:10:1. This means that for every 600 near misses, there will be 30 property damage incidents, 10 minor injuries, and one major injury.
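Bird's ratio can serve as a rough planning heuristic: scale it to an observed count at any tier to estimate the others. A minimal sketch (the function name and the example figures are illustrative, not from the article):

```python
# Bird's 1969 ratio: 600 near misses : 30 property damage incidents
#                  : 10 minor injuries : 1 major injury
BIRD_RATIO = {
    "near_miss": 600,
    "property_damage": 30,
    "minor_injury": 10,
    "major_injury": 1,
}

def project_incidents(observed_tier, observed_count):
    """Scale Bird's ratio from one observed tier to estimate
    the expected counts in the other tiers."""
    scale = observed_count / BIRD_RATIO[observed_tier]
    return {tier: scale * weight for tier, weight in BIRD_RATIO.items()}

# A hypothetical site logging 1,200 near misses per year would expect,
# by the ratio, about 60 property damage incidents, 20 minor injuries,
# and 2 major injuries over the same period.
print(project_incidents("near_miss", 1200))
```

The point of the heuristic is that near misses are the leading indicator: they occur in quantity long before the single major injury does.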
So when is good performance good enough? At what point do we rest on our laurels and relax our defenses? As you would imagine, never. Many would say this happened at NASA prior to Challenger (and again prior to Columbia).
When we start to think that there is not much room for improvement, we should remind ourselves of the following:
- If we were 99.99 percent accurate, we would still experience:
- Two unsafe plane landings per day at O'Hare Airport
- 500 incorrect surgical operations each week
- 50 newborn babies dropped at birth by doctors every day
- 22,000 checks deducted from the wrong bank account each hour
- 32,000 missed heartbeats per person, per year
- 114,500 mismatched pairs of shoes shipped each year
- 200,000 documents lost by the IRS this year
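The arithmetic behind figures like these is simple: the expected error count is the event volume times the error rate, i.e., one minus the accuracy. A minimal sketch, with an illustrative volume that is an assumption rather than a figure from the article:

```python
def expected_errors(volume, accuracy):
    """Expected number of errors for a given event volume
    and accuracy rate (accuracy expressed as a fraction)."""
    return volume * (1.0 - accuracy)

# At 99.99 percent accuracy, a facility handling an assumed
# 20,000 operations per day would still see about 2 errors daily.
daily_errors = expected_errors(20_000, 0.9999)
print(round(daily_errors, 2))

# Raising accuracy another order of magnitude, to 99.999 percent,
# cuts that to roughly 0.2 errors per day.
print(round(expected_errors(20_000, 0.99999), 2))
```

This is why "four nines" of accuracy, impressive as it sounds, still produces a steady stream of failures at high volumes.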
The truths of human error
A decision error triggers existing conditions in the work environment, setting off a series of physical consequences. If this sequence is permitted to continue, an undesirable event will ultimately occur that we have no choice but to address. This "undesirable event" will be deemed a failure of a certain severity and magnitude. Given this basic understanding of the error chain, we can see that many people hold preconceived beliefs about "failure" that affect their decision making (actions and inactions).
The facts are:
- Good people make honest mistakes
- A fast-paced, ever-changing world sometimes outsmarts us
- We can never work error free
- If we lower human error, we will lower failure rates
- Systems are not basically safe
- People design safety into systems, they do not come prepackaged that way
- Most people come to work prepared and have the relevant knowledge to be successful
Having the knowledge to be successful and applying that knowledge successfully are two different things. We must possess the knowledge in the first place, organize it in a manner that makes it usable, and then know when to apply it.
As human beings we assume that bad outcomes equal bad processes. Although this can be true under certain circumstances, it should not be generalized as "the way it is." Often, when there is a bad outcome, those involved were doing what had been successful and safe for them in the past. Their actions remained the same while the conditions they worked in changed.
Many believe that to ensure that people "get it right," we should force more rules, and people will follow them. Unfortunately, this is not always true.
Adding more rules and tightening procedures adds more complexity to our working environments. As a result, the gap between procedure and practice widens instead of narrows. When that happens, we increase our risk of safety incidents.
Practice should not be expected to equal procedure, but a narrow gap between the two is acceptable. No procedure can possibly encompass every conceivable event and condition that could occur. Therefore, a certain amount of judgment must be afforded in our procedures. Without such room for judgment, we handcuff our workers and prevent them from applying their knowledge to situations that arise and are not accounted for in the procedure.
For those of us with experience in union environments, we know of common tactics such as the "work slowdown," in which staff follow procedures exactly: nothing more and nothing less. This quickly drops overall productivity, for the reasons described above.
The views of human error
Most industries would agree that 70 percent to 80 percent of mishaps are due to human error. Is this a correct perception? It is a very important question because the answer will depend on how we react to such "mishaps."
The old school of human error research held that human error is a cause of accidents and incidents. Investigators would concentrate on the people involved and seek to explain the failure, which led to interrogations centered on the inaccurate assessments those people had made. As a result, the investigators made bad judgments and, in turn, wrong decisions.
The new school of human error has progressed to try to understand why people make the decisions they do, by examining the deeper problems that exist in organizational systems. When an undesirable outcome occurs, there is usually a poor decision somewhere in the error chain, and we must accept that the person who made that decision did not intend the outcome.
Human error is not random. We can trace decision-making patterns and trends to previous behavior. Human error is not the ending point of the analysis; it is the starting point.
Rule-based errors typically occur for three reasons:
- The rule itself was not correct and we followed it
- The rule was correct, but we applied it incorrectly
- The rule and the information regarding it were correct; we had a problem complying with it
Knowledge-based errors occur when situations arise that we have not been prepared to address (no rules exist). In these instances we must rely on our basic knowledge and apply it to the new situation. Many of the first responders to the Twin Towers on 9/11 experienced this type of error: there were many command-and-control problems because of the type, severity, and magnitude of the event and the lack of procedures covering the situations present.
To put the Generic Error Modeling System into proper perspective, we can draw the following generic conclusions:
- New hires are more prone to knowledge-based errors
- As we gain more years of experience we are more prone to become complacent with our jobs
- Highly experienced employees are typically less prone to skill-based errors; however, they can become complacent and overconfident and cause such errors
- When a major skill-based error occurs and results in an undesirable outcome, the system tends to "reset" back to zero
Robert Latino is executive vice president of the Reliability Center, Inc., in Hopewell, VA. For more information, visit www.proactforhealthcare.com.
This column first appeared in the December 2007 issue of the monthly newsletter Briefings on Patient Safety, published by HCPro, Inc., the parent company of HealthLeaders Media.