Researchers find multiple human factors pervasive in never events
A new study finds that approximately four to nine human factors contribute to each never event.
Although systems management plays a distinct role in preventing never events, a new study shows that multiple human factors are at play when a never event occurs.
Researchers at the Mayo Clinic in Rochester, Minnesota, recently published a study in Surgery that analyzed 69 never events occurring in 1.5 million invasive procedures and found that 628 human factors contributed to those errors, approximately four to nine per event.
The researchers used a method typically reserved for investigating military plane crashes, one that specifically evaluates factors tied to environmental, organizational, job, and individual characteristics. Based on their analysis, the researchers sorted the 628 human factors into four categories:
- Preconditions for action (including overconfidence, stress, mental fatigue, or an inability to see the bigger picture)
- Unsafe actions (including confirmation bias in which surgeons act on something they thought they saw)
- Oversight (inadequate supervision or staffing deficiencies)
- Organizational influences (cultural or systemic problems)
Lead author of the study, Juliane Bingener, MD, a gastroenterologic surgeon at the Mayo Clinic, spoke with Patient Safety Monitor Journal about the results of the study and how hospitals should address human factors that contribute to medical errors.
(Editor's note: The following has been edited for clarity.)
Q: Why did you and your colleagues decide to explore this issue, and why did you decide on this approach that is traditionally used to investigate military crashes?
A: These surgical never events are something we measure. We don't want to have any of them, but we do have some. People have looked at them before; we aren't the first ones to look at never events. People say we have to educate better, but we actually had the privilege of having someone who trained in aviation [co-author Joseph M. Nienow] work here. He had introduced this method, the human factors system, to many of the quality analysts who work here.
Joe had the foresight to collect this over five years. When we realized we had this data trove and were looking at what else we could do, we said, "Wow, this is something we should probably look into." We wanted to see what was in there, and whether it's true what we all think: that communication is the thing we all have to do better in order to prevent those events.
Q: So you had that data sitting there for your use?
A: Essentially someone had the foresight to understand why that was so important in another high-risk industry, and introduced it here.
Q: Can you explain what that method is and how that translates from the military to healthcare?
A: Many examples are very similar [from one industry to the other]. For example, you think you see something that's not really there, but you have convinced your mind that it is there, and you do something based on what you think you see. Then you do the wrong thing, and it doesn't matter whether you fail to land the airplane correctly on the aircraft carrier deck or you cut something that's not there; that's confirmation bias.
Overconfidence is one of the factors we looked at. You can imagine that if you have a pilot who says, "I can do this," or a surgeon who says, "I can do this," the same principles apply. Human factors are present in us all even if the work environment is different. The way the whole system was set up was based more on how humans work than on the work environment.
Q: Were you surprised at the number of human factors that impacted never events?
A: I think The Joint Commission has looked at closed case reviews and things like that, and communication was often cited as the most significant contributor to these events. People don't talk to one another, and no wonder something happens.
What I was most surprised with was the high proportion of cognitive work factors that were present.
Q: Your research showed that two-thirds of never events occurred during minor procedures. Why is that?
A: I think this has something to do with how concerned everybody is about the really difficult things. When everyone is aware that a case is very difficult, everyone tries to be in proper form. But when you think a case is minor, you might not have the same attitude. Just as most car accidents happen within five miles of home, you think you know how to get home, and then something happens.
Also, I think many of those minor cases have laterality, meaning they are on either the right or the left side; that's certainly one of those situations where something can happen.
Q: And that goes back to overconfidence?
A: Overconfidence could be one of them. Confirmation bias is another. Maybe someone didn't really understand that there was a difference between one part of an area and another. You thought the upper thigh was just the upper thigh, but one part of the upper thigh is a little different from another part, and you didn't really understand the distinction.
Q: If a hospital is looking at these kinds of issues associated with never events, what can they take away from this study?
A: I think making people aware of those things [is important]. Many hospitals have formalized how you communicate, because we've learned communication is an important thing, and there are established ways of looking at how you communicate. What we have shown here is that there are other things, like cognitive workflow: how do you match how many patients are scheduled with what staff is available and how many residents you teach? How do you understand the cognitive capacity of the team and match that with the cognitive load they carry because of frequency, time, and pressure? That would be my approach. Others may think differently about it. Maybe we should also look at the cases that went well, see how human factors were distributed there, and learn from those examples.
Q: In general, should more hospitals be looking at human factors when it comes to quality and patient safety?
A: This is something new. All hospitals review their sentinel events, but very few are lucky enough to have human factors analysis happening almost simultaneously with the event, or very shortly after. Others that have looked at human factors are gleaning, a year or two later, what might have happened. The unique opportunity we had was that someone trained in this method took down notes as the people involved in the events spoke about them. I think that is relatively novel.
It's almost real-time collection. You're not standing next to them while the error happens, but if you're reviewing it within a week or so, people still have a fresh memory of what happened, and that's when you can collect it.
I think overall we strive to get better and better, and this may be another tool in our toolbox. There are other approaches, like systems improvements, but human factors are one of the contributors, and we need to pay attention to them.