Joint Commission Top Hospitals List Shuns Academic Medical Centers

Cheryl Clark, for HealthLeaders Media, September 15, 2011

"It's certainly true that larger hospitals, particularly if they are reporting on more measures than other smaller hospitals...have a lot more work to do in order to make sure (that care for) all of their patients meet this very high level of performance. But on the other hand, they have more resources than smaller hospitals to do that work. It may be a question of priority setting, and they may have been very happy with 85%," as opposed to this report's 95% cutoff.

When those hospitals see that 14% of other organizations have been able to do better, "that may be a motivator" for them too, Chassin said.

Instead of the well-known hospital giants, the list includes a disproportionately large number of smaller hospitals.

Chassin asserts that popular best-hospital rankings, such as those from U.S. News & World Report and HealthGrades, rely on flawed data.

"U.S. News & World Report has a very heavy dependence on the reputations of hospitals as determined in a survey as opposed to performance," he said. "And HealthGrades, for example, uses a methodology for risk adjustment in mortality that omits almost all important severity factors in the conditions that they are trying to measure. The Medicare mortality data has similar flaws as well."

What the Joint Commission's new series of reports does, he added, is "focus very sharply on a group of quality measures that we have the highest confidence [in], because they pass a very rigorous set of tests that when hospitals improve on these measures, outcomes for patients get better directly because of that work. We've weeded out measures that are faulty either in their design or their implementation."

The agency, which accredits 80% of the nation's hospitals, accounting for 96% of licensed beds, evaluated hospitals on the basis of 22 measures. But it avoided outcome measures because, Chassin said, they are hard to risk adjust appropriately. With process measures, which are tightly linked to good outcomes, "you don't need to do risk adjustment...It should be followed all of the time, or close to 100%."



6 comments on "Joint Commission Top Hospitals List Shuns Academic Medical Centers"


Richard A. Robbins, M.D. (11/1/2011 at 2:30 PM)
My colleagues and I have examined the Joint Commission's performance measures in terms of outcomes and found no relationship. The manuscript was posted on 10-30-11 at the Southwest Journal of Pulmonary and Critical Care (http://www.swjpcc.com/). There is also an accompanying editorial posted on 11-1-11.

Todd (9/22/2011 at 6:44 PM)
I disagree with the comment that The Joint Commission has a conflict. I think it's more of a problem in the other direction. They have a TON of data about hospitals, both in the US and internationally, yet fail to disclose it. They don't want to bite the hand that feeds them. Perhaps now they're starting to release information since Press Ganey is now comparing hospital data.

chloe (9/15/2011 at 11:37 PM)
By the way, there is also a conflict of interest with HEALTHGRADES. Healthgrades sells its consultancy services to hospitals who, surprise! Always get top ratings. Also a huge percentage of Healthgrades information is inaccurate. Finally, the hospitals that earn high ratings have to pay Healthgrades for the privilege of advertising that fact to the public. It's all a crock.