The Other Worlds Shrine

Your place for discussion about RPGs, gaming, music, movies, anime, computers, sports, and any other stuff we care to talk about... 

  • NASA astronaut turned patient safety expert

  • Somehow, we still tolerate each other. Eventually this will be the only forum left.
 #149383  by Kupek
 Wed Oct 13, 2010 12:47 am
After posting the ICP silliness, I figured I should post one of the best things I've read all month: The Wrong Stuff: James Bagian—NASA astronaut turned patient safety expert—on Being Wrong

"The Wrong Stuff" is a blog maintained by a woman who wrote a book on being wrong. She's interviewed various people about different aspects of being wrong. This guy is a former test pilot, former NASA astronaut, former anesthesiologist, and a few other things. Now he's the head of patient safety for the VA hospital system. It's long, but worth it: everything this guy says is insightful.

Some choice quotes:
If at the end of the day all you can say is, "So-and-so made a mistake," you haven't solved anything. Take a very simple example: A nurse gives the patient in Bed A the medicine for the patient in Bed B. What do you say? "The nurse made a mistake"? That's true, but then what's the solution? "Nurse, please be more careful"? Telling people to be careful is not effective. Humans are not reliable that way. Some are better than others, but nobody's perfect. You need a solution that's not about making people perfect.

So we ask, "Why did the nurse make this mistake?" Maybe there were two drugs that looked almost the same. That's a packaging problem; we can solve that. Maybe the nurse was expected to administer drugs to ten patients in five minutes. That's a scheduling problem; we can solve that. And these solutions can have an enormous impact. Seven to 10 percent of all medicine administrations involve either the wrong drug, the wrong dose, the wrong patient, or the wrong route. Seven to 10 percent. But if you introduce bar coding for medication administration, the error rate drops to one tenth of one percent. That's huge.
You can't change the culture by saying, "Let's change the culture." It's not like we're telling people, "Oh, think in a systems way." That doesn't mean anything to them. You change the culture by giving people new tools that actually work. The old culture has tools, too, but they're foolish: "Be more careful," "Be more diligent," "Do a double-check," "Read all the medical literature." Those kinds of tools don't really work.
In theory, punishment sounds like a good idea, but in practice, it's a terrible one. All it does is create a system where it's not in people's interest to report a problem.
If a patient is harmed by something we've done, we tell them. We explain what happened, we tell them that they're eligible for monetary compensation, and we tell them they can sue us. I don't know any other place that says, "Here's how to bring a tort claim against us." We do. We figure that if we hurt you, whether through malfeasance or not, we should make restitution.
So we make it easy. We just tell them. And we end up getting more torts filed, but the aggregate payment is less, because people aren't trying to get revenge. Most people just want us to pay for something specific, to take care of the problem we created. And a lot of people say, "Thanks for telling me, I'm not glad it happened, but I understand that it wasn't intentional." And that's that.


And the Nerves of Steel Award, for his reaction to the Challenger explosion:

People who hadn't been around the high risk stuff themselves, it changed their whole appetite for it. Others looked at it much as I did: It's a shame but it happens, let's go on. I had worked at a test pilot school and some of my best friends were killed while I was there, so it was not an abstract concept to me that people I worked with would be killed doing the job I do.
 #149386  by SineSwiper
 Wed Oct 13, 2010 8:05 am
Kupek wrote:
If at the end of the day all you can say is, "So-and-so made a mistake," you haven't solved anything. Take a very simple example: A nurse gives the patient in Bed A the medicine for the patient in Bed B. What do you say? "The nurse made a mistake"? That's true, but then what's the solution? "Nurse, please be more careful"? Telling people to be careful is not effective. Humans are not reliable that way. Some are better than others, but nobody's perfect. You need a solution that's not about making people perfect.

So we ask, "Why did the nurse make this mistake?" Maybe there were two drugs that looked almost the same. That's a packaging problem; we can solve that. Maybe the nurse was expected to administer drugs to ten patients in five minutes. That's a scheduling problem; we can solve that. And these solutions can have an enormous impact. Seven to 10 percent of all medicine administrations involve either the wrong drug, the wrong dose, the wrong patient, or the wrong route. Seven to 10 percent. But if you introduce bar coding for medication administration, the error rate drops to one tenth of one percent. That's huge.

If only there were a set of business practices that identify failure points and fix the root causes...
 #149393  by Kupek
 Wed Oct 13, 2010 10:22 am
As he points out many times, it's not enough to say at a high level "We're doing X now." You have to change the culture in which people work. And that doesn't happen just by saying it will.

One of his main points is that other fields - aviation, nuclear power, chemical industry - don't have the same kinds of problems that the healthcare industry does. There are already known ways to handle this stuff. His contribution is applying those ways to healthcare, but it's not easy because of enormous cultural differences.

Also, did you even read what you linked?
Six Sigma originated as a set of practices designed to improve manufacturing processes and eliminate defects, but its application was subsequently extended to other types of business processes as well. In Six Sigma, a defect is defined as any process output that does not meet customer specifications, or that could lead to creating an output that does not meet customer specifications.
That doesn't really apply to the healthcare industry, nor would it apply to, say, nuclear power or the space program.
 #149400  by Imakeholesinu
 Wed Oct 13, 2010 2:34 pm
I used to work in the IT department on the corporate side of a major hospital, and all management talked about was Six Sigma this, Six Sigma that. Of course, it ended up confusing the hell out of us, since we had processes already in place that worked but had to change because of Six Sigma. I'm so glad I don't work for that company anymore. Not only that, but we had people in management who thought they were better than others because they were a "yellow belt" or "black belt".
 #149411  by SineSwiper
 Wed Oct 13, 2010 10:09 pm
Kupek wrote:Also, did you even read what you linked?
Six Sigma originated as a set of practices designed to improve manufacturing processes and eliminate defects, but its application was subsequently extended to other types of business processes as well. In Six Sigma, a defect is defined as any process output that does not meet customer specifications, or that could lead to creating an output that does not meet customer specifications.
That doesn't really apply to the healthcare industry, nor would it apply to, say, nuclear power or the space program.
It started out in the manufacturing business, but it's branched out everywhere. Hell, I think NASA uses Six Sigma right now. We have several people who are well trained in Six Sigma, Lean, Kaizen, FMEA, Poka-yoke, etc., including my boss and my PM. My boss came from the healthcare industry before he worked on building a nuclear power plant, so yes, it does apply. It basically boils down to fault prevention/reduction, root cause analysis, improving processes, etc. I see a paragraph like this:
If at the end of the day all you can say is, "So-and-so made a mistake," you haven't solved anything. Take a very simple example: A nurse gives the patient in Bed A the medicine for the patient in Bed B. What do you say? "The nurse made a mistake"? That's true, but then what's the solution? "Nurse, please be more careful"? Telling people to be careful is not effective. Humans are not reliable that way. Some are better than others, but nobody's perfect. You need a solution that's not about making people perfect.

So we ask, "Why did the nurse make this mistake?" Maybe there were two drugs that looked almost the same. That's a packaging problem; we can solve that. Maybe the nurse was expected to administer drugs to ten patients in five minutes. That's a scheduling problem; we can solve that. And these solutions can have an enormous impact. Seven to 10 percent of all medicine administrations involve either the wrong drug, the wrong dose, the wrong patient, or the wrong route. Seven to 10 percent. But if you introduce bar coding for medication administration, the error rate drops to one tenth of one percent. That's huge.
And I think, "Gee, that's the kind of problem solving we do every day." It's a basic root cause analysis, something a FMEA could solve for on a broader scale, which in turn could be discussed in a 5-day Kaizen session. I work with process engineers, metric analysts, and SS people everyday in my department, so these concepts are familiar to me.

I just find it funny that this chick thinks she's got this totally unique concept, when she's merely reinventing the wheel. They are more than just terms. They are tools that successful businesses use all the time.
Imakeholesinu wrote:I used to work in the IT department on the corporate side of a major hospital and all management talked about was Six Sigma this, Six Sigma that. Of course it ended up confusing the hell out of us since we had processes already in place that worked but had to change because of Six Sigma. I'm so glad I don't work for that company any more. Not only that but we had people in management who thought they were better than others because they were a "yellow belt" or "black belt".
That strikes me as management that didn't understand why they were implementing these standards, and put them in place just so they could tell their shareholders they're a "Six Sigma" company or something.

We actually NEED those sorts of practices where we work. We're just starting to implement these kinds of things, and the information gained is helpful. Right now, it can get hairy because mistakes happen that could have been prevented had we documented what could go wrong and made sure we have proper failovers, backout plans, expected SLAs, etc. Yes, it's more work, but this sort of thing (including proper PMI project management) requires more resources up front to ultimately get more done in the long term. When you fix it right the first time, you do less firefighting, and when you fix something even better, there are efficiency gains everywhere.
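
To give an idea of what "documenting what could go wrong" looks like as an actual tool rather than a slogan, here's a minimal sketch of a change record that blocks approval until the risk questions are answered. The field names and example values are invented for illustration, not anyone's real template.

Code:
# Minimal sketch of a change record that forces the "what could go wrong"
# questions up front. Field names and example values are invented for
# illustration; this is not any particular company's template.
from dataclasses import dataclass, field

@dataclass
class ChangeRecord:
    description: str
    failure_modes: list[str] = field(default_factory=list)  # what could go wrong
    backout_plan: str = ""    # how we undo the change if it goes bad
    failover: str = ""        # what takes over while we recover
    sla_hours: float = 0.0    # expected time to restore service

    def approval_gaps(self) -> list[str]:
        """Return the list of gaps that would block approval."""
        gaps = []
        if not self.failure_modes:
            gaps.append("no failure modes documented")
        if not self.backout_plan:
            gaps.append("no backout plan")
        if self.sla_hours <= 0:
            gaps.append("no expected SLA")
        return gaps

change = ChangeRecord(description="Upgrade the billing database cluster")
print(change.approval_gaps())
# ['no failure modes documented', 'no backout plan', 'no expected SLA']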
 #149423  by Kupek
 Thu Oct 14, 2010 10:48 am
Sigh.
I just find it funny that this chick thinks she's got this totally unique concept, when she's merely reinventing the wheel. They are more than just terms. They are tools that successful businesses use all the time.
The chick is the person conducting the interview. The interviewee is a man, and if you actually read the damn thing, you'd realize he's taking ideas from mission-critical industries that he's been a part of and applying them to healthcare. How you think that is "merely reinventing the wheel" is baffling; nuclear power and aviation have been dealing with these issues longer than Six Sigma has been popular.

You have a tendency to dismiss anything new as some form of previous knowledge you already had.
 #149434  by SineSwiper
 Thu Oct 14, 2010 8:53 pm
Kupek wrote:The chick is the person conducting the interview. The interviewee is a man, and if you actually read the damn thing, you'd realize he's taking ideas from mission-critical industries that he's been a part of and applying them to healthcare. How you think that is "merely reinventing the wheel" is baffling; nuclear power and aviation have been dealing with these issues longer than Six Sigma has been popular.
Yes, and then they formalized their processes. It's not just Six Sigma; it's the entire suite of tools and methodologies for fault management, reduction, and prevention, as well as proper project management.

Sure, Six Sigma was created in the '80s, but its precursors came well before that. Lean was developed by Henry Ford in the early years of Ford and later adopted and refined by Toyota and the aviation industry. NASA and the DoD invented the "work breakdown structure", which is the centerpiece of modern project management.
Kupek wrote:You have a tendency to dismiss anything new as some form of previous knowledge you already had.
What...this is old hat to me. Part of my job is fault reduction. We've been doing this for years.
 #149440  by Kupek
 Fri Oct 15, 2010 2:24 pm
Sigh. It's more than just a process. If you think it is, you're at risk of cargo-cult methodology.

Consider that this guy has worked in half a dozen high-risk industries. Also consider that his experience as a test pilot, anesthesiologist, astronaut and the head of patient care for a national hospital system has given him more insight into the nature of risk management than your seminars with Six Sigma. Everyone has "fault reduction" as a part of their job. Not everyone has a job where the consequence of a "fault" is death.
 #149447  by SineSwiper
 Fri Oct 15, 2010 6:52 pm
Kupek wrote:Consider that this guy has worked in half a dozen high-risk industries. Also consider that his experience as a test pilot, anesthesiologist, astronaut and the head of patient care for a national hospital system has given him more insight into the nature of risk management than your seminars with Six Sigma.
You're basically arguing that one guy has more experience than many different people and companies who have designed and developed these ideas for years and years. (And yes, it is more than a process. It's a way of thinking.)
Kupek wrote:Everyone has "fault reduction" as a part of their job. Not everyone has a job where the consequences of a "fault" is death.
It doesn't change the methodology, just the risk calculations and the goals. For example, FMEA accounts for severities that include risk of death. If you have a lot of those potentials, it means implementing many more failsafes (or smarter ones) to bring the potential risk way down.
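
Here's a toy example of the kind of scoring I mean. The 1-10 scales and RPN = severity x occurrence x detection are standard FMEA convention; every individual score below is made up for illustration.

Code:
# Toy FMEA-style scoring: risk priority number (RPN) = severity * occurrence * detection,
# each on a 1-10 scale (10 = worst). All of the scores below are invented for illustration.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    return severity * occurrence * detection

# Failure mode: wrong drug reaches the patient (severity stays at 10 -- potential death).
before = rpn(severity=10, occurrence=6, detection=7)  # manual check of look-alike vials
after = rpn(severity=10, occurrence=2, detection=2)   # barcode scan must match the order

print(f"RPN before the failsafe: {before}")  # 420
print(f"RPN after the failsafe:  {after}")   # 40
# The severity never changes -- a wrong drug can still kill -- but the added
# failsafe drives the occurrence and detection scores down, so the overall risk drops.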

Even in industries where "faults are death", the fault rate is never 0%. Zero faults is worth reaching for, but realistically, people die in hospitals (and other "deadly" industries) because of mistakes every year.
 #149465  by SineSwiper
 Sat Oct 16, 2010 8:18 pm
Kupek wrote:No, I'm arguing that you cannot absorb all of these insights from a few seminars. Just read the interview.
What seminars? What are you talking about?