Our Systems are Failing Us. Who’s Responsible?



Imagine a large plane, perhaps a 747, packed with passengers. When the pilot of that plane takes off, each of those 400 people has placed their trust in that pilot for safe travel. But it’s not just the pilot, is it? Those passengers are trusting the airline, and the entire airline industry, to keep them safe as they hurtle through the air. Because in reality, the pilot is only one part of a much larger system of people, processes, and technology that work together to ensure the plane gets from point A to point B safely.

Planes are designed to provide efficient, safe, and comfortable flights. Flight routes are planned and coordinated. Planes are continually assessed for safety and compliance. Weather conditions are monitored. The important thing is that there is a network of safety nets designed to make sure everyone stays safe, because flying is high stakes. I’m willing to bet that anyone reading this article can think of at least one time they were on a flight that was delayed at the last minute because a safety or maintenance check failed and needed to be addressed. That’s because a plane with even the smallest safety concern will not take off until that concern has been resolved.

In addition to these safety measures, pilots expect the airline to provide them with the best possible planes and technology to help make each flight as smooth as possible. With enough seniority, some pilots even choose the specific types of planes they fly.

If something does go wrong despite the pilot doing their best, do we expect the pilot to take all the blame? Or do we also look at other factors, such as mechanical issues, unexpected weather conditions, or air traffic control problems? Additionally, if a problem is discovered prior to takeoff, the pilot and crew have the authority to delay or cancel the flight altogether.

If a problem occurs during a flight, everyone knows about it and learns from it. Case in point – remember the plane that made the emergency landing on the Hudson River? That happened over 13 years ago, and everyone survived. But you have likely still heard about it. That event was not hidden. The passengers were not forced into non-disclosure agreements. Instead, the information is widely available and shared not only with other airlines, but with the public. In fact, every event – or even near-miss – in the airline industry undergoes a retrospective review to ensure it doesn’t happen again.

How Does Plane Safety Relate to Healthcare?

Now, why the long story about pilots? I want to tie this into the case of Ms. RaDonda Vaught, a nurse who was convicted of criminally negligent homicide and gross neglect of an impaired adult after the death of a patient. First, I think it is worth acknowledging that Ms. Vaught made a terrible mistake, that specific things could have been done differently, and that she should face appropriate consequences. However, holding her criminally responsible seems extreme for two reasons. First, this was a mistake. She was not practicing under the influence of substances, and there was no malicious intent. Yes, it was a grave mistake. But the key word there is mistake. Second, this error happened because of a failure of our health system, yet Ms. Vaught is the only one being held responsible in this way.

While we are privy to only parts of the situation, from the evidence I have been able to read through, it seems there were multiple safety nets in the system that failed her as a clinician. First, this error occurred in a clinical setting where Ms. Vaught (or any average nurse) would have had little experience working: the radiology department. Because medications are not routinely dispensed in radiology, some of the basic technology and tools used to ensure medication safety were not available there, such as a patient bracelet scanner to confirm a match between the patient’s orders and what is dispensed from the medication cabinet.

Additionally, from what I understand, the hospital’s new clinical information system may not have been properly communicating with the medication dispensing units. It is also unclear why a paralytic agent was available in the radiology department at all. Undoubtedly, Ms. Vaught was experiencing significant alert fatigue from these systems, as any clinician can attest and as is well documented in the medical literature. Like a pilot who expects that the aviation team and safety checks are all working properly to ensure the utmost safety of the plane, Ms. Vaught expected her environment to have adequate checks in place.

Now, this is where people point out that she chose to ignore and override alerts. Yes, that is true. I am not saying it was acceptable to ignore the alerts, or that she is blameless. However, if you aren’t in healthcare, it is hard to understand what alert fatigue feels like. Here is my best attempt to describe it.

Understanding Alert Fatigue

Imagine you need to get work done on your phone. As you work, you get a few hundred notifications – texts, emails, app alerts, and so on. Now, instead of the notifications appearing and then quickly leaving the screen like they normally do, these alerts stay on the screen and pile up on top of each other. To keep doing your work, you have to individually dismiss or answer each one. To make it even worse, each of these alerts looks exactly the same: the same color, text font, and format. In fact, they are all red, in capital letters. Now try to imagine what that must be like, day in and day out. Do you think you might grow numb to the alerts? Perhaps not the first time, or even the 50th, but what about after 100? 1,000? With countless alerts disrupting your workflow, it becomes hard to differentiate between an email about a clothing sale, a gif in your friend group chat, and a call from school saying that your child fell off the monkey bars.

This sounds stressful. Now imagine getting these alerts while making dozens of life-and-death decisions for numerous patients throughout the day, in addition to countless other clinical, administrative, and repetitive tasks. Again, I am not excusing the decision to override the alerts, but I can certainly imagine and understand the conditions that lead to something like this happening.

When it comes down to it, those alerts came from somewhere. Someone decided it was okay to implement them. Someone designed them. Someone thought it was okay for them all to look the same. So, is each of those people also responsible when errors like this occur? We must start looking at the system as a whole – the system that pushes things to a critical breaking point and lets them slip past it – instead of pointing fingers at the last person in the chain of events.

Another difference between Ms. Vaught’s scenario and the pilot’s is agency. Pilots are empowered, even expected, to delay or cancel a flight if there is a potential problem. Nurses and other clinicians do not have this choice. If the medication cabinet or the new tech system isn’t working, clinicians are told to keep working, be careful, and trust that “it will be fixed in a future release.” At the same time, they are expected to provide the highest quality care and to do it efficiently. Clinicians do the best they can in the environment they are put into, but they do not get a say in changing that environment to make their jobs easier – or safer.

Lasting Impacts

What really concerns me are the repercussions of this case. First of all, how will it impact error reporting? Ms. Vaught immediately self-reported the error once she realized what had happened. She never tried to hide the circumstances. If clinicians think they will be criminally punished, and potentially go to jail, do you think they will willingly report errors? Will nurses stop doing certain tasks, such as retrieving medications in an emergency, out of fear of repercussions? Will clinicians leave the clinical field altogether, in search of a career path with less severe consequences for things outside their control?

If we want to make healthcare safer for everyone, we must be more open about reporting errors. If we report errors, even those that fall below the threshold for reporting sentinel events, we can learn how to prevent them in the future – not just for that clinician, but in that department, that facility, that health system, and the broader medical community. However, to get more transparency and honesty in error reporting, we need to create a system where clinicians feel safe and do not fear criminal punishment for a true mistake.

This cannot just be something that health systems say they will do better at. We need less talk and more action, because words alone are not enough. We need to focus on and prioritize prevention, instead of simply responding to errors after they happen.

This case needs to be a wake-up call that forces us to take an honest look at how our systems are failing both patients and clinicians. We all want better healthcare. So let’s take a hard look at where this problem comes from and hold ourselves and our colleagues accountable for taking meaningful action.
