Designing Safer Systems
Your body is covered from head to toe in protective equipment, and it’s 115 degrees Fahrenheit inside your outfit. Your hands sweat under two pairs of gloves. An ill-fitting hood creeps down your forehead and nearly covers your eyes, but you cannot touch your head to shift it back up. To top it off, the exterior of your protective garb is partially covered with bodily fluids from a patient with Ebola.
The time comes to leave the patient’s room without leaving a trace of the virus on you or carrying it outside the room. You are anxious and tired but careful to follow infection-control protocols as you remove a disposable face shield, hood, mask, booties and other garb. Yet as you do, a piece of hair falls down over your eyes. In a split second and without thinking, you brush it back, and an infected glove glances across a misty eye. The virus is in you.
In a health care world where infection control methods are part of daily practice, Ebola requires a new level of vigilance. Even scrupulous health care providers could get infected without the proper training and support on how to put on and take off personal protective equipment, or PPE.
The new Ebola protection guidelines for PPE use issued this month by the U.S. Centers for Disease Control and Prevention are a critical step in protecting patients, health care professionals and populations in the United States and abroad. Highlights of these new guidelines include leaving no skin exposed and requiring a trained observer to actively help health care providers follow the protocols for donning and removing their protective equipment. Because there is very little that one can do to prevent infection after exposure to this unforgiving virus, it is essential that clinicians know this guidance inside and out, become competent in putting on and removing PPE, and have a “buddy” to coach them to ensure that they adhere to each and every step.
But these guidelines, like many clinical guidelines, can read like the directions for assembling an IKEA chair. They need to be translated into something that providers can easily absorb, that highlights the most important steps, and that provides guidance for implementing them in real time, under real-world constraints. We need to create systems that help health care workers follow these guidelines, and clinicians need to practice until they are competent. Health care workers also need to know how to respond in those instances when something doesn’t go according to plan.
With that in mind, the CDC asked the Johns Hopkins’ Armstrong Institute for Patient Safety and Quality to convene a team to produce a series of interactive, online training programs on following the new PPE guidelines. Released this afternoon, the videos provide guidance for putting on and removing PPE and allow users to select training specific to the type of respirator and body covering that they will be wearing. Another video module, which guides trained observers on how to be effective in their roles, will be available in the coming days. The free course will also be available on iTunes U.
Despite $800 billion spent on technology last year, health care productivity is flat, and preventable patient harm remains the third leading cause of death in the U.S.
One reason is that health care is grossly under-engineered: medical devices don't talk to each other, treatments are not specified and ensured, and outcomes are largely assumed rather than measured.
Other industries rely much less on heroism by individuals and more on designing safe systems and using technology to support work. Today a pilot’s cockpit is much simpler than it was 30 years ago; it is far more error-proof, and built-in defenses enhance safety. By comparison, hospital intensive care units, which contain anywhere from 50 to 100 separate pieces of electronic equipment, appear largely unchanged.
Changing this will require unprecedented collaboration between health care’s many stakeholders. That’s one reason why this fall the Armstrong Institute and the World Health Organization convened health care leaders, consumers, providers, regulators and private-industry partners to discuss such topics as how to design safer systems at the Forum on Emerging Topics in Patient Safety held in Baltimore.
One effort to design safer systems at Johns Hopkins is Project Emerge. Supported by a $9.4 million grant from the Gordon and Betty Moore Foundation, Emerge is tapping into the wisdom of a diverse team of engineers, nurses, doctors, bioethicists, and patients and family members — 18 disciplines in all from across Johns Hopkins University — to design safer care in ICUs.
Last week the Armstrong Institute, along with our partners at the World Health Organization, had the privilege of hosting more than 200 clinicians, patient advocates, health care leaders and policy makers for our inaugural Forum on Emerging Topics in Patient Safety in Baltimore.
The event featured presentations by international experts from a dozen different industries, including aviation safety expert Captain Chesley “Sully” Sullenberger, a former space shuttle commander, and the chief medical officer of the Centers for Medicare & Medicaid Services. Other speakers shared their expertise in education, sociology, engineering, nuclear power and hospitality to see what untapped lessons such fields may hold for health care.
Their collective expertise was breathtaking. What was even more impressive was the obvious enthusiasm and spirit of collaboration embodied by a group joined by a common and noble purpose: to overcome the complex challenges that allow preventable patient harm to persist.
At Johns Hopkins, we’ve already seen what’s possible when health care adopts best practices from other industries. Our work to reduce central line-associated bloodstream infections (CLABSI) presents a powerful example. By coupling an aviation-style checklist of best practices to prevent these infections with a culture change program that empowers front-line caregivers to take ownership of patient safety, the program, detailed recently on Health Affairs Blog, has reduced CLABSI in hospital intensive care units across the country by more than 40 percent. Similar results have been replicated in Spain, England, Peru and Pakistan.
That effort succeeded because we challenged and changed paradigms traditionally accepted by the health care community. We helped convince teams that patient harm is preventable, not inevitable. That health care is delivered by an expert team, not a team of experts. And, most importantly, that by working together, health care stakeholders can overcome barriers to improvement.
But if there are to be more national success stories in quality improvement, I believe the health care community will need to examine a few of its other beliefs.
For example, health care can learn much from the nuclear power industry, which has markedly improved its safety track record over the last two decades since peer-review programs were implemented. Created in the wake of two nuclear crises, these programs may provide a powerful model for health care organizations.
Following the famous Three Mile Island accident, a partial nuclear meltdown near Harrisburg, Pennsylvania in spring 1979, the Institute of Nuclear Power Operations (INPO) was formed by the CEOs of the nuclear companies. That organization established a peer-to-peer assessment program to share best practices, safety hazards, problems and actions that improved safety and operational performance. In the U.S., no serious nuclear accidents have occurred since then.
This week marks a step that holds tremendous promise for patients and clinicians. On Monday the Masimo Foundation hosted the Patient Safety Science & Technology Summit in Laguna Niguel, California, an inaugural event to convene hospital administrators, medical technology companies, patient advocates and clinicians to identify solutions to some of today’s most pressing patient safety issues. In response to a call made by keynote speaker former President Bill Clinton, the leaders of nine leading medical device companies pledged to open their systems and share their data.
Today, an intensive care unit patient room contains anywhere from 50 to 100 pieces of medical equipment made by dozens of manufacturers, and these products rarely, if ever, talk to one another. This means that clinicians must painstakingly review and piece together information from individual devices—for instance, to make a diagnosis of sepsis or to recognize that a patient’s condition is deteriorating. Such a system leaves too much room for error and requires clinicians to be heroes, rising above the flawed environment in which they work. We need a health care system that partners with patients, their families and others to eliminate all harms, optimize patient outcomes and experience, and reduce waste. Technology could do so much more if it started from those goals and worked backwards from there.
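To make "piecing together information" concrete, here is a minimal sketch of what an integrated system could do automatically: combine readings that today live on separate devices into a simple sepsis screen. The thresholds are the classic published SIRS criteria (two or more suggest a systemic inflammatory response), but the data structure and function names are invented for this illustration, not any vendor's actual interface.

```python
# Hypothetical sketch: if bedside monitors and the lab system shared data,
# a simple rules engine could flag possible sepsis instead of relying on a
# clinician to collate readings by hand. SIRS thresholds are standard;
# everything else here is invented for illustration.

def sirs_flags(temp_c, heart_rate, resp_rate, wbc_per_uL):
    """Return which of the four SIRS criteria are met for one set of readings."""
    return {
        "temperature": temp_c > 38.0 or temp_c < 36.0,
        "heart_rate": heart_rate > 90,
        "resp_rate": resp_rate > 20,
        "wbc": wbc_per_uL > 12_000 or wbc_per_uL < 4_000,
    }

def possible_sepsis(readings):
    """Flag possible sepsis when two or more SIRS criteria are met."""
    flags = sirs_flags(**readings)
    return sum(flags.values()) >= 2, flags

alarm, flags = possible_sepsis(
    {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc_per_uL": 13_500}
)
print(alarm)  # three criteria met -> True
```

The point is not the specific rule, which real early-warning systems refine considerably, but that the integration step itself is what today's disconnected devices make impossible.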
We’re pleased to announce that the Gordon and Betty Moore Foundation has awarded Johns Hopkins’ Armstrong Institute a grant of $8.9 million to design safer…
In the world of patient safety, we’re constantly reinforcing the importance of teamwork and communication, both among clinicians and with patients. That’s because we know that patient harm so often occurs when vital information about a patient’s care is omitted, miscommunicated or ignored.
Yet for all we do to improve how humans work together, clinicians contend with an environment in which there is very little teamwork or communication among the technologies that they need to care for patients. And there’s little that clinicians or hospitals alone can do about it.
Take, for example, the plethora of alarms from cardiac monitors and other devices that compete for clinicians’ attention. Vendors act as if we are in an alarm race, each making its devices’ beeps more insistent, with no clear prioritization of the most important alarms. A study on one 15-bed Hopkins Hospital unit a few years ago found that a critical alarm sounded every 92 seconds. As a result, nurses waste their precious time chasing an ever-growing number of false alarms—or become desensitized to false alarms and ignore them. Across the country, this has had tragic consequences, as patients have died while their alarms went unheeded. (Read a 2011 Boston Globe series about this issue.)
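What would "clear prioritization" look like in software? A minimal sketch, assuming hypothetical severity tiers and alarm labels (no real vendor's scheme): an integrated system ranks alarms by clinical severity and surfaces the most urgent one first, rather than letting every device compete on loudness.

```python
# Hypothetical sketch of severity-based alarm triage. The tier names and
# alarm labels are invented for illustration only.
from dataclasses import dataclass, field
import heapq

SEVERITY = {"crisis": 0, "warning": 1, "advisory": 2}  # lower number = more urgent

@dataclass(order=True)
class Alarm:
    priority: int
    label: str = field(compare=False)  # compared by priority only

class AlarmQueue:
    """Surface alarms in severity order, regardless of which device raised them."""
    def __init__(self):
        self._heap = []

    def raise_alarm(self, severity, label):
        heapq.heappush(self._heap, Alarm(SEVERITY[severity], label))

    def next_alarm(self):
        return heapq.heappop(self._heap).label if self._heap else None

q = AlarmQueue()
q.raise_alarm("advisory", "SpO2 probe off finger")
q.raise_alarm("crisis", "asystole")
q.raise_alarm("warning", "heart rate high")
print(q.next_alarm())  # the crisis-level alarm surfaces first: asystole
```

A priority queue is the obvious data structure here; the hard part in practice is not the code but getting dozens of manufacturers to agree on shared severity semantics at all.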
In most other high-risk industries, such as aviation and nuclear power, technologies are integrated. They talk to each other, and they automatically adjust based on feedback. Indeed, because of systems integration, pilots manually fly only a small portion of a flight, and even in some treacherous situations they hand over the reins to the autopilot. Although Southwest Airlines or the U.S. Air Force can buy a working plane, you cannot buy a working hospital or ICU. You must put it together yourself.
I recently gave a talk to the American Medical Student Association. The energy in the room was palpable. The students were excited, passionate and hopeful. We spoke about the urgent need to reduce preventable harm and to enhance value, and we discussed that they will need to be the ones to lead these efforts.
Yet, in speaking with them, I had to confront the sad reality that most of them will graduate ill-prepared to lead the improvements in quality and safety our health care system needs. They no doubt will know chemistry, biology and physiology, but they may not know about human factors, implementation science or performance measurement—the language of quality improvement. They will know orthopedics and genetics, but they won't know teamwork and systems engineering. They likely know about German scientist Rudolf Virchow, the father of cellular pathology, yet they do not know John Kotter, the father of change theory, whose model for leading change is highly effective and widely used. Without a doubt, these students will need to lead change.