Just Culture Enables a Culture of Safety

Elizabeth Duthie, Director of Patient Safety at Montefiore Medical Center, discusses how important a just culture is to creating a culture of safety. While just culture has always had critical importance within safety culture, many are particularly aware of this concern today, given the recent criminal conviction of a former nurse for a medical error that occurred in 2017.
Featuring:
Elizabeth Duthie, RN, PhD, CPPS
Elizabeth Duthie is the Director of Patient Safety at Montefiore Medical Center. Her professional focus is on applying the science of patient safety to clinical practice with special interest in redesigning systems to prevent errors. She has published, researched, studied, and lectured extensively about patient safety concepts and human error theory.

Dr. Duthie received her PhD from New York University, with research grounded in James Reason's human error theory. She has a certificate in Human Factors Engineering from the University of Wisconsin–Madison. She graduated from the first class of the AHA Forum's Patient Safety Leadership program and is certified in patient safety through the Institute for Healthcare Improvement.
Transcription:

David Feldman MD, MBA, FACS (Host): Patient safety and high reliability continue to be elusive goals in healthcare. Creating a culture of safety that will allow us to achieve these goals requires a number of important elements. In this series, we'll address what we believe are the essential ingredients in developing such a culture: mutual respect, teamwork and collaboration, human factors engineering, and just culture.

I'm David Feldman, MD, Chief Medical Officer for The Doctors Company and Healthcare Risk Advisors, part of TDC Group. On today's episode, we're happy to welcome Elizabeth Duthie, Director of Patient Safety at Montefiore Medical Center. Her professional focus is on applying the science of patient safety to clinical practice, with special interest in redesigning systems to prevent errors. She's published, researched, studied, and lectured extensively about patient safety concepts and human error theory. Today, Beth will discuss how important a just culture is to creating a culture of safety. While just culture has always had critical importance within safety culture, many are particularly aware of this concern today, given the recent criminal conviction of a former nurse for a medical error that occurred in 2017. We'll touch on that later in this podcast.

This is the Leading Voices in Healthcare podcast.

So let's begin, Beth. Tell us, how would you explain what a just culture is to a new physician or nurse just starting to practice in your hospital?

Elizabeth Duthie, RN, PhD, CPPS (Guest): I always like to tell the clinicians who are joining us that we expect that, despite their best well-intentioned efforts, errors are going to occur. Because that's what happens to humans. We don't expect them to happen, and they're unintended.

However, when things go wrong, we need to understand what happened, because when we understand what happened that prevented them from doing what they wanted to, we are then able to understand the systems that failed them.

When we know the systems that failed them, we can then go ahead and fix them. And that's really the ultimate goal, after an error happens. Find out what went wrong that didn't support them and then fix it, so it won't happen again. So what I really want new clinicians to understand is that it should be safe for you to report when you make an error. If something bad happens, we want to learn from it. And this is what is known as psychological safety. And we're striving to achieve that at our organization.

Host: Right. Makes sense. And everybody's struggling with that now in particular. You know, I often think about healthcare having these two things that you have some kind of control over: the system and the people who work within it. And when something goes wrong, there's usually some combination of those things happening, right? There are some aspects of the system, and there may be aspects of what people are doing. So the way we review things tends to be around both peer review and this idea of a root cause analysis, or critical incident review, so we understand all these different elements interacting with each other.

But at the end of the day, trying to understand the people involved and what their role in all of this was is often difficult to do. And I know, and I'm sure you feel the same way, as clinicians we often internalize things, right? I mean, as a clinician, I guess I would worry if a physician or a nurse told me that they didn't take it personally. If a patient has a bad outcome that we didn't expect, you almost expect people to take it personally.

That's what makes it so difficult to have those conversations with people. Because as clinicians, we tend to take it personally, because they are people, right? Very difficult, you know, very difficult.

Elizabeth: And it is difficult. It is difficult, especially because the higher the performer, the higher their standards and the more they strive for excellence, the harder it is for them to accept that this failure occurred, much less that it's their fault, which is horrifying to them, although they do assume that it is. And you bring up another really excellent point.

And that is that it is not about the person or the system; it is about how the person interacted with the system. And I think that is one of the failures that may have led Vanderbilt astray, because I've seen this in other instances where institutions say, well, it was the person, therefore we don't have to look at the system, or they say, it's the system, therefore we don't need to worry about the person. And yet patient safety science tells us it is always both. You need to know and understand how the person interacted with the system to be able to effectively make changes and prevent harm.

Host: Totally agree. So Beth, you know, you've spent many years in patient safety. Why do you think the just culture concept is so critical in developing a safety culture? Why is it such an important part of this?

Elizabeth: So what happens is, and again, this comes from James Reason's work, he is the father of human error theory, he says that a patient safety culture is ever-changing and constantly learning.

The only way we can learn is if we understand where the errors are arising and how our systems failed to support the clinicians. They also need to feel safe if they report an error, but a just culture isn't a get-out-of-jail-free card.

You have to have standards and rules that are going to apply, and the staff need to know clearly where the line is drawn and to understand what their responsibility is. And at the same time, if their well-intentioned efforts do not result in the outcome that they wanted in achieving safe care, they have to report that. We can't fix what we can't see. And so when staff don't feel psychologically safe to report errors, those errors are going to fly under the radar, and we're not going to ever know about them until really serious harm occurs. And the goal in patient safety is to detect them before serious harm occurs.

Host: Yeah, I love the idea of a learning culture. You know, we do call it the practice of medicine, right? We have this obsession, or we should. What do they say about high reliability organizations? There's an obsession with failure, with things going wrong, and always looking for that.

And it's a learning environment, and we have to constantly be asking ourselves, are we doing the best that we can? So I think that's such a great point. Beth, in your five years at Montefiore, how have you tried to inculcate these concepts into hospital culture?

Elizabeth: Okay. So the first thing is you have to have psychological safety. The second thing is, if you're going to power a reporting culture, staff need to understand the value of their reporting. When they put in an event report and it goes into a black hole and they never hear back, it says to them that what they had to say doesn't matter.

And so one of our focuses, when we improve and change to make things safer, is to ensure that we include the clinicians in what we did, so that they understand that their voice was heard and they were valued. As a result, we have gotten people to believe that reporting is a valuable opportunity to make things better. I think we're a little weak on doing it at a larger level, going beyond just the local level. And after hearing about what happened at Vanderbilt, it underscores for me that we need to develop mechanisms where we can share events across hospitals and across services. If something happens on pediatrics, we have a tendency to make sure pediatrics knows about it, but we need to expand that learning. Our system changes will go across the system, but the communication to the staff that this occurred as a result of error reporting is, I think, something where we can improve.

Host: Right. And you know, it's interesting, you mentioned the idea of getting back to people. I think that's so critical, right? If people don't know... You know, I used to run the operating rooms at a hospital a number of years ago, and we'd say, well, at the end of the case, let us know what didn't work.

You know, even if the phone doesn't work. But if nobody's fixing these things or taking care of it, if people don't feel like you're doing something about it, then people say, well, why bother saying anything if nobody's taking care of it? And this topic is critical. People have to realize that if they're going to speak up and say something, somebody is acting on it. I just think that's so important.

Elizabeth: Yeah, I agree.

Host: So important, so important. So let's get to this RaDonda Vaught verdict, and we know now that, thankfully, she's not serving any prison time, which is, I guess, some kind of relief. The whole thing is very concerning. But given this verdict, what have you started to focus on to address these concerns? I mean, what are you doing about it now? Everybody's really concerned about what this is going to do to people's willingness to speak up.

Elizabeth: Right. So my greatest concern is that we've lost the trust of the staff, because RaDonda was honest and ethical and reported her mistake immediately, reached out to her nurse manager, and filed an event report. She did everything that we said she should do. And in return, she was fired, her license was revoked, and she was criminally prosecuted.

So now we have a message that what we said to you just isn't true. And so there is a tidal wave of fear that I am hearing, and the reactions are: build a wall of silence, stop reporting the errors, because no good will come of it. And so I really admire her courage, because despite being shamed, demeaned, and punished, at the end of the conviction she said, if I had to do it all over again, I would still have reported the error, because it might've made a difference in that patient's life.

She said, you can't not report it. And that's my biggest problem: if people don't report errors at the time they occur, will a life be lost because we delayed intervention? So I want to honor the courage that RaDonda showed and make it safe to do the right thing when bad things happen.

And I want to honor Charlene Murphey, who lost her life to this error, by making sure the event that occurred at Vanderbilt can never happen here. And that means looking at all of our processes, doing a gap analysis against what the ISMP has brought forward, and looking at all of the CMS findings to say, are we going to be meeting the highest standards that we possibly can as a result of this event?

So we're focusing right now very strongly on the Vanderbilt gaps and errors that occurred there, the ones they were cited for by the ISMP or CMS, to see if we can make everything as tight here as possible, so at least that one event we can make sure will never happen again.

Host: Great. Wonderful. Well, good luck with that. It's a life's work, as they say, right? But you've got the right people there.

Elizabeth: Yeah, it was a 105-page report from CMS. So you can imagine, I feel like I'm drinking water from a fire hose, given the number of things they have identified that we can learn from.

Host: A lot to do. So Beth, thank you so much. And just a final question for our audience: for those beginning their just culture journey, what tips would you give them, especially in light of this recent case? Where do you start if you're just beginning on this just culture journey?

Elizabeth: Yeah. So I'm a little skeptical that people haven't started the just culture journey; I think most of us are already on our way. But I think what I learned from the criminal conviction of this nurse, the real message, is that just culture isn't enough. Although, again, the termination and punishment, I don't think that was part of a just culture, even if it might've met legal standards. But even though we have psychological safety to report errors, if you don't do anything about them, you will be no safer. The whistleblower reported the event 10 months later, in October, and CMS came in in November. So 11 months after the event, they found no changes had been made at all.

And so I really think that what happened was a focus on the person: she was the problem, there were no systems issues to look at, because no one else would have committed this type of error. That is where our real opportunity comes. We have to say, again, it's never just the person. It's never just the system.

It's how they interact together. And so this has strengthened my resolve more than ever to go for zero harm. First of all, if you have zero harm, you have no prosecution. And looking again at this event, and at how I can make sure that it never happens again, is my real goal and how I'm responding to this. And when patients are safe, staff will also be safe.

Host: Yeah, it's interesting that David Marx, who we all know as another just culture guru, in his comments in the last few weeks, his number one answer to what we do about this was: reduce the rate of harm. Right. If nothing goes wrong, we won't have to worry about this anymore. We should all be able to achieve that over time, for the benefit of our patients and everyone else.

Right. So, thank you so much for joining us today, and thank you all for listening to our Leading Voices in Healthcare podcast. Visit our website at thetdcgroup.com to learn more about the services we provide to healthcare professionals.