
Joint Commission Top Hospitals List Shuns Academic Medical Centers

By cclark@healthleadersmedia.com | September 15, 2011

Joint Commission president Mark Chassin on Wednesday issued the agency's first collective assessment of the best hospitals in the nation, naming 405 hospitals that achieved a 95% compliance score on evidence-based process measures, such as care for surgical patients and for patients with heart failure, heart attack, and childhood asthma.

Chassin also criticized other highly popular "Best Hospitals" lists published annually, saying they use "flawed" methodologies.

The performance statistics reported Wednesday by the commission have been available on the agency's website and on Medicare's Hospital Compare site for some time. But Chassin said the public "expects even greater transparency. They want to know how the hospitals where they receive care are performing," and the new report puts that information "all in one place."

The 405 hospitals make up 14% of the hospitals the Joint Commission accredits, and the report covers care processes related to heart attack, heart failure, pneumonia, surgical care, and children's asthma.

To the surprise of many, hospitals with some of the most prominent national reputations – including those held up as models for health reform, such as the Mayo Clinic, the Cleveland Clinic, Geisinger Medical Center, and Johns Hopkins University – are not on the Joint Commission's list.

Nor are Intermountain Medical Center, Stanford University Medical Center, New York Presbyterian Hospital, Sanford Health, or Massachusetts General. Academic medical centers, for the most part, are noticeably absent.

There are no hospitals listed in New York City, none in Baltimore and only one in Chicago.

Asked why well-known hospitals that usually top the lists published by others are missing, Chassin said they didn't measure up in the data collected for 2010. "I would suggest asking the hospital that thinks [it] should have been on [the Joint Commission's] list why they think they're not on the list," he said during a news briefing.

"It's certainly true that larger hospitals, particularly if they are reporting on more measures than other smaller hospitals...have a lot more work to do in order to make sure (that care for) all of their patients meet this very high level of performance. But on the other hand, they have more resources than smaller hospitals to do that work. It may be a question of priority setting, and they may have been very happy with 85%," as opposed to this report's 95% cutoff.

When those hospitals see that 14% of other organizations have been able to do better, "that may be a motivator" for them too, Chassin said.

Instead of the well-known hospital giants, the list includes a disproportionately large number of smaller hospitals.

Chassin asserted that popular reports ranking the best hospitals, such as those from U.S. News & World Report and HealthGrades, use flawed data.

"U.S. News & World Report has a very heavy dependence on the reputations of hospitals as determined in a survey as opposed to performance," he said. "And HealthGrades, for example, uses a methodology for risk adjustment in mortality that omits almost all important severity factors in the conditions that they are trying to measure. The Medicare mortality data has similar flaws as well."

What the Joint Commission's new series of reports does, he added, is "focus very sharply on a group of quality measures that we have the highest confidence [in], because they pass a very rigorous set of tests that when hospitals improve on these measures, outcomes for patients get better directly because of that work. We've weeded out measures that are faulty either in their design or their implementation."

The agency, which accredits 80% of the nation's hospitals, representing 96% of licensed beds, evaluated hospitals on the basis of 22 measures. But it avoided outcome measures because, Chassin said, it is hard to risk-adjust them appropriately. With process measures, which are tightly linked to good outcomes, "you don't need to do risk adjustment...It should be followed all of the time, or close to 100%."
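
To make the 95% cutoff concrete, here is a minimal sketch of how a composite compliance rate across process measures might be computed. The measure names and patient counts below are hypothetical, and the calculation illustrates only the general idea (opportunities for evidence-based care that were met, divided by total opportunities), not the Joint Commission's published algorithm.

```python
# Illustrative sketch (not the Joint Commission's published algorithm):
# treat every eligible patient on every process measure as one
# "opportunity" to deliver evidence-based care, then divide the
# opportunities met by the total opportunities.

# Hypothetical per-measure counts: (times care was delivered, eligible patients)
measures = {
    "aspirin_at_arrival": (412, 415),
    "fibrinolytic_within_30_min": (28, 30),
    "pci_within_90_min": (95, 101),
}

met = sum(delivered for delivered, eligible in measures.values())
opportunities = sum(eligible for _, eligible in measures.values())
composite = met / opportunities

print(f"Composite compliance: {composite:.1%}")   # 98.0% for these counts
print("Meets the 95% cutoff:", composite >= 0.95)  # True for this hypothetical hospital
```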

Outcome measures may be included at a future point, but only if they "pass a stringent test," he said.

Asked why the report includes so many smaller, non-academic hospitals, Chassin acknowledged that small and rural hospitals are overrepresented and academic medical centers are underrepresented.
"I think that should be, number one, a notification or announcement that you don't have to be a big hospital to do well, and if you're a big hospital, it doesn't mean you're doing well (if you're not) paying attention to these very important processes of care..."

He said it should be "a wake-up call to larger hospitals to put more resources into these programs, and a recognition that small, rural and community hospitals can do an excellent job."

Chassin said that since hospitals began reporting these process measures several years ago, performance has improved remarkably because of the improvements the measures stimulate. For example, compliance with heart attack care measures reached 98.4%, up from 86.9% in 2002; those measures include giving aspirin to each patient at arrival and discharge, delivering fibrinolytic therapy within 30 minutes, and performing percutaneous coronary intervention within 90 minutes.

The Joint Commission criteria for performance excellence will continue to expand, Chassin said. For example, it is now in the second year of measuring performance for inpatient psychiatric care and in the first year of reporting results for patients with stroke and the prevention of venous thromboembolism.

Next year, a new set of measures will focus on care of moms and babies before, during, and after childbirth.
