My refrigerator gave me the idea for this column.
It's an expensive French door model from Sears, the kind with an alarm that goes "beep" after the door is left ajar for more than one minute.
Saturday morning, the refrigerator cried for a cleaning. I opened the doors, removed the food and sponged down all the shelves and drawers.
Beep-beep-beep. Beep-beep-beep. Beep-beep-beep.
Good thing I bought the model with the option to turn that annoying sound off, I thought.
By the next day, something didn't smell right. I had forgotten to turn the alarm back on. If I had, I would have known that the doors hadn't completely closed. Now the ice had melted. The milk was sour. The beer was warm. And the tuna salad would have to go. All because I erroneously deprogrammed my error detection system.
Maybe my refrigerator should have come with another sound—perhaps a bell—to remind me to turn the alarm back on? Or better still, maybe refrigerator designers should make any beep deactivation automatically expire after 30 minutes, enough time to clean those shelves.
It's that kind of "mistake-proof" thinking—of course on a much bigger scale—that's now on the minds of designers of healthcare systems, medical devices, and processes.
It certainly must be on the minds of those at Cedars-Sinai Medical Center, where 206 patients received CT brain scans with excess radiation exposure, as well as officials at GE Healthcare, which manufactured the scanners.
In a letter to the U.S. Food and Drug Administration, Cedars-Sinai CEO Thomas Priselac has suggested changes to the automatic default settings on the scanners, among other design modifications.
GE officials say there's nothing wrong with their machines, but they are undoubtedly thinking of ways to produce additional error-proof features on their next equipment models.
A decade after the Institute of Medicine's famous report, "To Err Is Human," the Agency for Healthcare Research and Quality continues to give mistake-proofing a tremendous amount of attention. This week, an AHRQ official pointed me to an illuminating 155-page catalogue of error-proofing solutions. It is titled "Mistake-Proofing the Design of Health Care Processes."
The document was compiled by John Grout of Berry College in Rome, GA, an associate professor of business administration who has spent the last 12 years thinking about mistakes, and about how to build safeguards into a process that stop a mistake before it can be made.
"The traditional approach within medicine has been to stress the responsibility of the individual and to encourage the belief that the way to eliminate adverse events is to get individual clinicians to perfect their practices," he writes. "This simplistic approach not only fails to address the important and complex system factors that contribute to the occurrence of adverse events, but also perpetuates a myth of infallibility that is a disservice to clinicians and their patients."
The AHRQ document treats the field of mistake-proofing as a scientific pursuit, a way of understanding the essential pathway to the mistake. When are mistakes made? How are mistakes made? And how can health providers lock in systems to prevent mistakes from occurring or from causing harm?