The federal government seems to want to keep safety data on eight hospital-acquired conditions (HACs) secret at the same time that it boasts how transparent it's being.
A favorite phrase from my college days is this: "How can you be in two places at once when you're not anywhere at all?"
That's what seems to be happening at the Centers for Medicare & Medicaid Services, which is trying to keep hospital-acquired condition safety data secret at the same time it releases it and boasts how transparent it's being.
The sensitive information under discussion involves the latest update on the rates of eight hospital-acquired conditions (HACs) per 1,000 patient discharges, for incidents that caused serious patient harm over the two-year period ending June 30, 2012. These never-supposed-to-happen events include surgical tools left inside patients, pressure ulcers, avoidable falls and trauma, incompatible transfusions, air embolisms, urinary tract and vascular infections, and poor glucose management.
Healthcare consumers, health plans, and members of any hospital's team, as well as its donors and trustees, should be able to see these numbers by hospital and compare rates with other hospitals in the region. And they can, sort of, but only if they know the multiple secret places to look.
The new data is tucked away in a dataset buried within data.cms.gov, where few would ever know to search. And unless you can decode the "Provider ID," a six-digit sequence known mainly to hospital insiders and coders, you won't be able to find your hospital; the facilities are not listed by name.
To pair the data with a hospital's name, you have to know how to insert that Provider ID into a Hospital Compare URL: pull up any hospital's profile, then substitute the six-digit sequence after "ID=" in the URL.
Or you can call up another memory-intensive website database from Medicare and crosswalk that information back and forth, back and forth. Are you dizzy yet?
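For readers who would rather script that step than edit URLs by hand, here is a minimal sketch in Python. The base URL and both IDs shown are hypothetical; the only detail taken from this article is that the six-digit Provider ID sits after "ID=" in a Hospital Compare address.

```python
# A rough sketch of the Provider ID substitution described above. The base
# URL is made up; the real Hospital Compare address and its other query
# parameters may differ.
import re

def hospital_compare_url(provider_id: str, template: str) -> str:
    """Swap a six-digit Provider ID into an existing Hospital Compare URL."""
    if not re.fullmatch(r"\d{6}", provider_id):
        raise ValueError("Provider IDs are six-digit sequences")
    # Replace whatever currently follows "ID=" with the new Provider ID.
    return re.sub(r"(ID=)\d+", rf"\g<1>{provider_id}", template)

# Start from any hospital's profile URL, then substitute the ID you decoded
# from the data.cms.gov download. Both values below are hypothetical.
template = "https://hospitalcompare.example.gov/profile.aspx?ID=123456"
print(hospital_compare_url("450890", template))
# -> https://hospitalcompare.example.gov/profile.aspx?ID=450890
```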
After doing all that, plus exporting the files into sortable spreadsheets, one can learn, for example, that Lakeway Regional Medical Center in Lakeway, TX, had the highest rate of catheter-associated urinary tract infections, 15.873 per 1,000 patient discharges. That was more than double the rate at Brigham City Community Hospital in Brigham City, UT, which had the next highest, 6.042.
For foreign objects left inside surgical patients, Pine Creek Medical Center LLP in Dallas topped the list with 2.577 items per 1,000 patient discharges. Next was Houston Orthopedic and Spine Hospital in Bellaire, TX, with 1.362 per 1,000 discharges.
And serious pressure ulcers were most common at Bibb Medical Center in Centreville, AL, with 3.546 per 1,000 patient discharges, followed by Motion Picture & Television Hospital in Woodland Hills, CA, which had 3.3 per 1,000 patient discharges.
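For the curious, here is roughly what that export-and-sort step looks like in practice, assuming the files come down as CSV. The file name and column labels below are assumptions, not the actual field names in the data.cms.gov export.

```python
# A sketch of sorting an exported HAC file to surface outliers like the
# catheter-associated UTI rates cited above. File name and columns assumed.
import pandas as pd

df = pd.read_csv("hac_rates.csv")  # assumed columns: Hospital, State, Measure, Rate

# Filter to one measure and sort highest-first.
cauti = df[df["Measure"] == "Catheter-Associated Urinary Tract Infection"]
print(
    cauti.sort_values("Rate", ascending=False)
         .head(10)[["Hospital", "State", "Rate"]]
)
```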
This time-intensive and brain-boggling process didn't have to be so complicated. The agency could have released this data on its consumer-friendly Hospital Compare site, as it used to do and as many were promised it would continue to do. In an interview with me this spring, agency officials said HAC data was no longer included under the agency's pay-for-performance reporting program but would be publicly available. That response didn't explain why the data CMS continues to collect couldn't continue to be posted on Hospital Compare.
CMS appears to have released this update without really releasing it, in part to straddle the fence in a long-standing and acrimonious disagreement between hospitals and consumer and employer groups that purchase insurance. Their debate is over whether these potentially embarrassing exhibitions of patient harm pass the statistical "small numbers" smell test, especially with some types of HACs that are extremely rare.
The American Hospital Association says the data isn't reliable, and points to a report commissioned by CMS that the industry group says indicated as much. That November 2011 report "show[s] these measures are unreliable and have not been validated as Medicare calculates them," Nancy Foster, the AHA's vice president for quality and patient safety policy, said in an email Wednesday in response to a request for comment.
"In fact, the multi-stakeholder Measure Application Partnership (MAP) urged CMS not to publish these data due to these problems with the measures. It is unfortunate that data of poor quality is being published and used to unfairly rate hospitals,” Foster says.
The report says three of the eight HACs—foreign object retention after surgery, air embolism, and blood incompatibility—"have very low reliability on the basis of their extreme rarity in reported data."
The fourth and fifth HACs, poor glycemic control and falls and trauma, "exhibit low reliability over any of the time periods presented," the report says.
The last three HACs, catheter-associated urinary tract infections, pressure ulcers, and vascular catheter-associated infections, "have moderate reliability thresholds" because denominators are usually large and "they exhibit moderate reliability for half or more of hospitals" when the period reported is at least 21 months, the report says.
The Leapfrog Group, which uses the HAC statistics to compile a twice-yearly "hospital safety score" that grades hospitals from A to F, and numerous chapters of national Business Groups on Health disagree vehemently with the AHA, says Leah Binder, Leapfrog's president and CEO.
"No measure is perfect," Binder says. "And there is not a single measure that's ever been endorsed or used for which there wasn't enormous amount of debate about the scientific merits, the perfection, or imperfection."
Binder, who in July authored an article headlined "Bone-Chilling Mistakes Hospitals Make and Why They Don't Want You to Know," accusing the AHA of trying to block data transparency, says the hospital advocacy group just doesn't get it.
"I would think that from the hospital association's view, the level of reliability has to be 99.99999%. But for a consumer, they just want to know whether there was any foreign surgical object left in a patient in the hospital down the street,” she told HealthLeaders Media.
"These 'never events' happened," she adds. "And the public deserves to know about them happening. We [Binder and members of the Leapfrog quality task force] believe these measures are reliable enough to give a fair assessment."
Binder adds that she's disappointed CMS has made the data so difficult to find. "Obviously the CMS site was not designed to be easily accessible to consumers." But she says Leapfrog will publish the numbers in next month's safety score.
Asked for a comment, a CMS spokeswoman wrote in an email that "the data on data.cms.gov are accurate. Hospitals were given time to review these calculations and notify CMS of any discrepancies with the data."
Here's the way I see it: Either the data are reliable and accurate, as CMS claims, in which case they should be clearly posted on Hospital Compare as they used to be. Or they are corrupted by flawed methodology and should be dumped altogether until smarter people figure this out.
Consumers and healthcare competitors deserve to have confidence in these data, but CMS is not exactly inspiring it. The agency seems to want to be in two places at once, leaving those of us who want more information confused. And, really, nowhere at all.