12 Things I'd Change On Hospital Compare

By cclark@healthleadersmedia.com
January 09, 2014

Hospital Compare is a valuable, yet deeply flawed tool for reporting on hospital quality measures. It doesn't have to be that way.

The most complete website for quality reports on the nation's 4,500 acute care facilities is Hospital Compare, which the Centers for Medicare & Medicaid Services manages with considerable taxpayer resources. The agency has gradually expanded the number of measures it reports on the site from a few dozen three years ago to almost 100, and many more are to come.

But Hospital Compare remains a troublesome and deeply flawed system in many ways. In many conversations with hospital and policy officials during the course of my reporting, the complaints come in sighs and groans. These are some of the words I've heard used to describe Hospital Compare: outdated, clumsy, misleading, confusing, overly complicated, and irrelevant.

So as the new year gets under way, it seems like a good time to ask: If I were in charge of Hospital Compare, what would I change, add, or take out to make the site more useful?

I gathered helpful suggestions from some of the smartest researchers and experts I know in public reporting. Of course, I asked CMS to comment, but was told its officials could not respond in time to meet my deadline. If I do hear from them, I may address their comments in an upcoming column.

For now, here's what I'd change on Hospital Compare:

1. Lake Wobegon Syndrome
Almost all hospitals appear to perform at the national average, with too few appearing to be "worse than" or "better than." For example, on the measure of hospitals' rates of mortality within 30 days of discharge for patients admitted with a heart attack, only 0.7% of the 2,634 hospitals with enough reported heart attack cases were worse than the U.S. national average. Only 2.9% were better than the national average. All the rest were average.

Likewise for the measure rating hospitals' rates of complications and death following hip and knee surgery: of the 2,750 hospitals that performed sufficient numbers of joint replacement surgeries, 95% were average, 2.5% were better, and 2.5% were worse.

One reason given for the wide swath of hospitals in the average portion of the curve is that, except for the few that are very, very good and the few that are very, very poor performers, everyone else is statistically the same. If that's true, then reporting the measure serves no one except the payers and patients of those few dozen hospitals at either end of the spectrum. CMS should improve the measure to give the public usable information.

And if those hospitals in the lowest decile or quartile are really that bad, maybe they should come under review that questions their ability to continue receiving federal reimbursement for care of those diagnostic groups of patients. As one physician suggested—and I don't think he was kidding—for those extremely poor performers, perhaps CMS should "post a stop sign, something that says, 'Watch Out.' "

2. Report by Bricks and Mortar
Healthcare systems with more than one building where acute care is provided are often allowed to report under a single provider number, even though the buildings are miles apart, with different staffing, different physicians, different equipment, and different patient populations. In San Diego, where I live, UCSD Medical Center has two locations, one in wealthy La Jolla and the other in Hillcrest, close to an area where many of the city's poor live.

Quality of care at many of these system-wide facilities varies dramatically. Not reporting that variation by individual facility defeats the purpose of having quality reporting. Payers and providers should want these performance measures delineated.

3. Military Hospitals and the VA
The public has a right to know about the quality of care in hospitals financed by the Department of Defense and the Department of Veterans Affairs. Yet more often than not, a quality measure for the nation's 153 VA hospitals shows up as "Not Available." Likewise for military hospitals such as the National Naval Medical Center in Bethesda, Md., Naval Medical Center San Diego, or Naval Hospital Camp Pendleton, Calif.

Military hospitals covered under TRICARE and other Department of Defense payment systems should be required to report quality measures for adult care, just as non-military hospitals do now.

4. Let Hospital Compare Compare
One of the most frequent complaints I hear about the site is that users are unable to compare more than three hospitals at a time. If both state and national data are sought, only one hospital can be seen at a time.

5. Follow the Money
For an increasing number of measures now posted on the site, performance is paired with a financial bonus or penalty. Take 30-day readmissions, which next year carries a penalty of up to 3% of a hospital's base Medicare DRG payments. For measures like that, CMS should explain whether a hospital's readmission rates avoided or incurred the penalty, and by how much.

Coming soon are penalties of 1% for hospital-acquired conditions and value-based purchasing incentive payments of 2%. CMS should explain whether a hospital earned a penalty or a reward in those categories, too.
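
To put rough numbers on what is at stake, here is a minimal sketch of the arithmetic, applying the maximum percentages cited above to a hypothetical hospital's base Medicare DRG payments. The payment figure and function are purely illustrative, not CMS's actual penalty formulas, which are more involved.

```python
# Illustrative only: the base payment figure is hypothetical, and CMS's real
# penalty formulas are more complex. This simply applies the maximum
# percentages cited above to show the potential dollar impact.

def max_penalty(base_drg_payments: float, rate: float) -> float:
    """Return the largest possible penalty at the given percentage rate."""
    return base_drg_payments * rate

base_payments = 50_000_000  # hypothetical annual base Medicare DRG payments

penalties = {
    "30-day readmissions (up to 3%)": max_penalty(base_payments, 0.03),
    "Hospital-acquired conditions (1%)": max_penalty(base_payments, 0.01),
    "Value-based purchasing at risk (2%)": max_penalty(base_payments, 0.02),
}

for label, dollars in penalties.items():
    print(f"{label}: ${dollars:,.0f}")
```

Under that hypothetical figure, the combined exposure runs into the millions of dollars, none of which is visible on the site today.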

6. Report Measures of Cancer Care
Care for patients with cancer eats up $1 of every $10 the federal government spends on healthcare, yet no measures of cancer care are now reported nationally. For hospitals accredited to treat cancer patients by reputable organizations such as the Commission on Cancer, the site should show those hospitals' accreditation status and the forms of cancer it covers.

R. Adams Dudley, MD, a quality measures expert at UCSF, suggests that the National Cancer Institute's Surveillance, Epidemiology, and End Results Program data can be used to show cancer hospitals' rates of patient survival.

"It's do-able," he says, "and it's much more an issue of political will than of technical capability" that prevents such information from being posted now.

7. Stop Avoiding Children
Except for three measures covering childhood asthma care, there are no measures for pediatric care. A search by name for several children's and infant hospitals, such as Rady Children's Hospital or Sharp Mary Birch Hospital for Women and Newborns, both in San Diego, or Cincinnati Children's Hospital, produced no results.

Because pediatric care is often covered under Medicaid, hospitals should be required to report accepted Medicaid quality measures as well, so parents can judge the quality of children's hospitals.

8. Update More Frequently
Hospital officials often wave away poor performance data by saying the numbers are old. They're doing much better now, they insist. Unfortunately, we won't find out if that's true for years.

Performance periods for many measures reported on Hospital Compare ended as much as 18 months ago and covered periods that began as long as 4.5 years ago. The most recent performance periods ended March 31, 2013, about 10 months ago.

Although CMS updates the site every three months, most measures are on a less frequent schedule. More recent rolling performance periods and shorter intervals for hospitals to check their data before public posting would help.

9. Trust but Verify
CMS now includes a Data Sources page on the site, which explains how the data gets from hospital patients' charts to the federal database. But there is no explanation of whether the data is checked or audited, how often, or what degree of sampling is used for verification.

10. Don't Obfuscate
For those interested in using the power of spreadsheets to compare hospitals, CMS offers links to portals where one can download dozens of databases for various measures. Often, however, those hospitals are identified only by their Medicare provider number, and not by name, requiring extra steps to link the two.
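
As an illustration of those extra steps, here is a minimal sketch, assuming two downloaded CSV files with hypothetical names and column headers: a measures file keyed only by Medicare provider number, and a general-information file that maps provider numbers to hospital names. The actual files on the CMS data portal may be laid out differently.

```python
# A minimal sketch of joining a measures file to hospital names by
# Medicare provider number. File names and column names are assumptions;
# the actual downloads from the CMS data portal may differ.
import pandas as pd

# Read provider numbers as strings so leading zeros are not lost.
measures = pd.read_csv("readmission_measures.csv", dtype={"Provider Number": str})
hospitals = pd.read_csv("hospital_general_info.csv", dtype={"Provider Number": str})

merged = measures.merge(
    hospitals[["Provider Number", "Hospital Name", "State"]],
    on="Provider Number",
    how="left",
)

merged.to_csv("measures_with_names.csv", index=False)
print(merged[["Provider Number", "Hospital Name"]].head())
```

It is only a few lines of code, but it is a step most consumers will never take; publishing the files with hospital names already attached would remove the hurdle.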

11. Organize By Symptom
R. Adams Dudley, MD, of UCSF says CMS should consider organizing Hospital Compare with the patient rather than the practitioner in mind.
"Get inside the consumer's head and organize our thinking by symptom" or by patient function, he says.

Perhaps (and this is just an idea) CMS should consider linking to a separate website that rates hospitals on their proficiency in treating the major reasons for hospitalization.

12. Advertise and Promote
Most of the people I know who aren't in healthcare have never heard of Hospital Compare. The few that have heard of it have never used it. Maybe part of the reason is that they don't understand why the data is so important.

A lot of money goes into compiling these statistics. Isn't it about time we spent some resources getting people to use the site, and to benefit from it?

I've never heard of a CMS campaign to get the word out about this site. I've never heard that CMS has done any research to find out how many people use Hospital Compare, who uses it, how often, for what purpose, what parts of the site they spend the most time on, or how the data may be expanded or improved.

I'm hoping that in the next week, CMS will weigh in on whether these items should be fixed, and whether they can be. I'll let you know as soon as I do.
