
TJC Names Top Performers, Notes 'Dramatic' Quality Improvements

By John Commins  |  November 18, 2015

Despite some differences based on size, academic medical centers, community hospitals, and for-profit and not-for-profit hospitals have all seen quality improvements since 2002, says the head of The Joint Commission.

Nearly one in three hospitals that submitted quality and performance data to The Joint Commission earned a coveted spot on the accreditor's Top Performers on Key Quality Measures list.

"Achieving top performer status is not easy and for many hospitals it took years of hard work," Joint Commission President Mark R. Chassin, MD, said on a Tuesday conference call. "More than ever, hospitals are focusing on what counts. This represents real progress. We clearly have a long way to go on these and other measures of healthcare quality."

The Joint Commission's 2015 report used 49 accountability measures to determine how hospitals perform on evidence-based care processes in areas such as heart attack, surgical care, inpatient psychiatric services, perinatal care, pneumonia, and stroke.

The report will not be published in 2016, Chassin said. "For 2016 we've decided to take a one-year hiatus and assist our accredited hospitals in managing the journey and evolution of electronic clinical quality measures. In January the Joint Commission will launch Pioneer in Quality, a program focused on helping customers reach top performer status in the electronic clinical quality measures world."

Of the more than 3,300 hospitals that submitted data for the 2015 report, 1,043 (31.5%) made the Top Performer list, which required them to achieve a cumulative performance of 95% or above across all reported accountability measures.

Chassin cautions that while the Top Performer program identifies hospitals that provide superior performance on specific measures, it is not a blanket endorsement of all services provided at any particular hospital.


"The evidence is crystal clear that quality varies quite a lot within individual hospitals from one service to another and from one measure to another," he says. "That is why we are very careful to specify in this report exactly which measures resulted in each top performer achieving their recognition."

This year's report added accountability measures for tobacco treatment and substance use, and inpatient psychiatric services.

Chassin says academic medical centers, community hospitals, and for-profit and not-for-profit hospitals all are seeing quality improvements.

"There are remaining some differences between hospitals depending upon how large or small they are. But the main trend is that everyone has improved dramatically, especially if you go back to 2002," he says.

Chassin said he doesn't think the financial incentives imposed by CMS have played a huge role in improving quality.

"The time during which the most dramatic improvement in performance occurred (was) well before the financial penalties were put in place by CMS," he says. "The only financial penalty that was in place in this period was the penalty for not reporting any data at all. Specific penalties attached to specific measures came long after hospitals had achieved the vast majority of this improvement. And that temporal observation is consistent with a bunch of research that shows that financial penalties tied to specific measures didn't really add much to the improvement that had already taken place."

Instead, Chassin credits The Joint Commission with providing the platform that empowers and incentivizes hospitals to measure and monitor their quality markers.

Standardized Measures
"The principle driver is that The Joint Commission going back to 2002 established the first standardized set of information that hospitals could collect across the nation," he says. "Indeed, we required them to as a condition of accreditation to measure quality in a selective number of conditions, and that number has gone up over the years. We were the first organization to publicly report the results of those data collection on quality measures. [Centers for Medicare & Medicaid Services] followed shortly after, and for a number of years in the 2000s, The JC and CMS were nearly perfectly aligned in the definition of measures, in the public reporting of those measures, and the public reporting really drove a huge amount of improvement."

Since then, he says, a lot of other organizations have jumped on the hospital ratings bandwagon.

"Now we have lots and lots of organizations reporting on quality," he says. "Leapfrog, Health Grades, Medicare's measures derived from their billing data, which we don't believe are valid measures of quality. So there are a lot more sources of information. We believe these accountability measures are still the best, but that public reporting period in the early and mid-2000s was the major driver that got hospitals' attention and they learned how to improve on these measures."

How TJC Criteria Differ
Chassin says The Joint Commission has a strict set of criteria that distinguish its ratings from those of other hospital raters.

"For example, we will not use measures like CMS uses that are derived from data from hospital bills because they don't pass our criteria for accurately identifying complications, or some patient subgroups," he says.

"We also will not use outcome measures like CMS and Healthgrades and others use because they rely on billing data to do risk adjustment. Those billing data don't have any information on the severity of the condition which is one of the most important things you have to adjust for in comparing different patient populations."

"Another player out there likes to label hospitals with one letter grade. That's demonstrably misleading. Quality varies enormously within hospitals, from one service to another, and from one measure to another," he says. "It just flies in the face of decades of hospital research which shows that variability does not allow that kind of measure to be accurate."

"There is a lot of noise out there in the hospital quality measurement field," he says. "We encourage hospitals to tune out the noise and to focus on measures that are most important to their own patient populations. The most important quality improvement that hospitals can do is to understand what risks their patients are facing, what improvements are necessary for their patients, and to act on those incentives."

John Commins is the news editor for HealthLeaders.
