
Joint Commission 'Best' List Draws Skepticism from C-Suite

Cheryl Clark, for HealthLeaders Media, September 22, 2011

To be clear, no one is saying process measures aren't important. They have one critical advantage: they're not affected by severity of illness or economic demographics, so they don't need a complex risk-adjustment algorithm. The question is simply whether the thing was done or not.

Also, if these process measures tracked with quality, we should see the same hospitals scoring well on Hospital Compare and on the new Joint Commission list. That's not generally the case.

I think hospital executives have another reason to be confused by The Joint Commission's report. The Centers for Medicare & Medicaid Services is signaling, both with reimbursement policies scheduled to take effect in 2014 and with the release of data on its Hospital Compare website, that it plans to aggressively emphasize outcomes, such as 30-day readmission rates for heart attack, pneumonia, and heart failure, and 30-day mortality rates for those same conditions.

Consumers, employers, payers, and providers alike are in the very early stages of understanding how to measure quality. Patients are just beginning to realize that their decisions can be based on real metrics rather than the beautiful artwork in the hospital lobby or the place where a neighbor volunteers.

The Robert Wood Johnson Foundation now lists 224 reputable state, federal, and consumer group lists that rank hospitals in a variety of ways. If healthcare executives are confused by all of this, imagine how confusing it is for the person who actually has to choose where to seek care: the patient.


Cheryl Clark is senior quality editor and California correspondent for HealthLeaders Media. She is a member of the Association of Health Care Journalists.


4 comments on "Joint Commission 'Best' List Draws Skepticism from C-Suite"


C.L.Jones (9/23/2011 at 9:46 AM)
There are many elements of disconnect here. First, it's comparing apples to oranges to stack these two very different rankings against each other and try to draw the same conclusion from both. The JC is trying to use basic and core measures to rank basic standards of care. USNWR is a publication, informative perhaps, but not a peer-reviewed, medical-science-based journal. There are some interesting survey and measurement techniques in the USNWR methodology, which fortunately have improved over the years, but it relies on a lot of opinion-based information from research companies owned by physicians at these major "report" headliners. Bottom line: consumer beware.

Daniel Fell (9/22/2011 at 11:40 PM)
Sadly, it's the patient attempting to make informed decisions about his or her healthcare who is faced with how best to interpret another set of conflicting quality measures. While the lack of standards surely helps some hospitals to compete in the marketplace, long-term it continues to erode consumer confidence and trust. The industry doesn't need more healthcare ratings, rankings and awards - it needs more consensus on which ones matter.

mila michaels (9/22/2011 at 3:29 PM)
Not surprising that C. Van Gorder is perplexed. Scripps boasts the highest number of fines in San Diego County levied by the state licensing board. TJC has again proven that a true and unbiased rating shouldn't be bought.