
Stop Ignoring Low Quality Ratings

By cclark@healthleadersmedia.com | September 06, 2012

This article appears in the August 2012 issue of HealthLeaders magazine.

For many hospitals, it's bound to happen at some point, if it hasn't already. In an avalanche of public quality scorecards, there will be at least one that reveals—in vivid detail—how your hospital's quality or safety score is "worse than" those at other hospitals. 

Or a competitor across town will show up as "better than," leaving your doctors, staff, trustees, and donors, not to mention patients, all asking why you don't measure up. A reporter will seize the data, and the news will be on a front page.

Expect another data update on Hospital Compare this month, when the Centers for Medicare & Medicaid Services plans to launch a much more extensive, consumer-friendly comparative display, followed by another wave of media reports on who's the worst.

"I think few hospitals will not have something that doesn't look good on their report card," says John Lynch, MD, chief medical officer with the 1,270-licensed-bed Barnes-Jewish Hospital in St. Louis.

"There may be some straight A students out there, but because of the breadth of the type of things they're reporting on, probably everybody—many, many of your CEOs and other hospital leaders—are going to face this issue."

CMS now publicly posts hospital-specific results for 84 measures, with more expected in the next two years. Along with each measure, the public can download spreadsheets showing data for each hospital all in one file; one can see who's better or worse even within a region, state, county, or ZIP code.

These rating systems alert employers, community leaders, and health plans, for example, whether your patients got the right antibiotic at the right time, how long the hospital made patients wait in the ED, and the rate of central line bloodstream infections, or numbers of foreign objects left inside body cavities during surgery. Even the hospital's cost for an episode of care is held up for public view.

And Hospital Compare is only one set of data out there. The Leapfrog Group in June announced its Hospital Safety Score, in which it grades hospitals from A to F on patient safety efforts. In the first round, 132 hospitals didn't pass and 1,111 got a mere C. An update is due in November.

Charles Ornstein, president of the Association of Health Care Journalists and an investigative healthcare reporter with ProPublica, says hospital executives should get used to heightened attention. The AHCJ is beefing up efforts to educate reporters on how to find and interpret quality statistics about healthcare providers, where to see inspection reports, and how to compare patient experience, readmission, and mortality rates.

"More reporters are realizing the treasure trove of information they can find," he says. "For decades, hospitals fought to keep this information out of the public domain. But now that it is public, we as journalists have an obligation to make it relevant."

Patients don't usually look up quality data before choosing a hospital, acknowledges James Conway, senior vice president for the Institute for Healthcare Improvement through 2011, but they do look at newspapers. And there are significant efforts under way to educate the public on how to use these measures, and forces are aligning to do that, from Consumer Reports to the Agency for Healthcare Research and Quality to the Commonwealth Fund's  "Why Not The Best" website to the Informed Patient Institute.

So what should a hospital chief executive do when that dreaded call comes? Barnes-Jewish is one hospital that went through that experience, with two negative front-page headlines in 2010 and 2011 centering on its "worse than other hospitals" 30-day readmission rates in all three disease categories. In grappling with a headline in the St. Louis Post-Dispatch last August that shouted, "Barnes-Jewish faces cut in pay," the organization learned some lessons in the process.

"It was a pretty big exposé when it came out," Lynch says. "Front page."

Internally, Lynch says, the hospital didn't think Medicare's label of "worse than" was fair, because Barnes-Jewish is an academic medical center and a major safety-net hospital serving a low-income, difficult population. "We believe that most of the readmission risk is beyond our control. But we believe some of it is within our control, and that there is opportunity for improvement, and we've had a focused effort on trying to improve that over the past three or four years."

So the hospital worked with the reporter over a series of interviews to explain what goes on behind the scenes of the discharge process, and the problems inherent in keeping patients from coming back not just in 30 days, but 60 days or more. They allowed the reporter to follow a nurse manager into a discharged patient's home.

Practically speaking, Lynch advises, "it's actually quite difficult to try to explain away the data. We know that because of the complexity of these types of calculations, error bars, and the limitations of claims data. You could potentially come across sounding very defensive."

Barnes-Jewish took a different approach, conveying that it took the scoring seriously, and acknowledged it could do better. Since it had known for more than a year the readmissions were high, it had launched several programs to get them down, and it itemized each one.

Those programs include a pilot program that provides a 30-day supply of low-cost medications to discharged high-risk patients, referrals to low-cost medication suppliers, and a postdischarge clinic specifically for low-income patients that ensures they see a primary care physician within the first seven days.

"We turned this into, ‘We are concerned about our patients' outcomes, and we are in a mode of continuous improvement,' " Lynch says.

After the first story in 2010, a lengthy letter from Barnes-Jewish President Rich Liekweg was posted on the hospital's website and sent to clinicians highlighting the positive; for example, that the hospital has "significantly better" rates of 30-day mortality in two disease categories. The letter explained many of the initiatives being taken at the hospital to fix the problem.

"Our advice is not to run from it, and that has seemed to work well. To have your leadership out there and visible is really important for the community and our staff," Lynch says.

One lesson they learned is that it's wise to be proactive with the board of directors and employees before these disclosures become public. "Some of these stories are difficult to tell, and kind of complex. So I think it's important to have an early dialogue with community leaders, business partners, and the medical staff, and we're certainly having a much more open and timely dialogue with our board members around this."

Kenneth Sands, MD, senior vice president for quality at Beth Israel Deaconess Medical Center in Boston, which also shows up as "worse than" in 30-day readmissions for all three disease categories, agrees that just hoping the numbers will go away "is unlikely to be a good strategy."

"It's fair to say that there may be reasons why you think your hospital's performance is not fairly represented, but at the same time, it's best to acknowledge there's always the potential to do better," Sands says.

Beth Israel has chosen not to be put on the defensive about such things the moment the data goes up on someone else's website, but to be transparent about its scores on its own website all the time so there are no surprises.

In a way, he's suggesting that if there's negative news out there, the hospital should scoop the media. "Don't wait for the news story to start communicating with all your important constituencies," he says.

Under "Quality and Safety," Beth Israel's website shows the latest available rates of adverse events such as falls, ventilator-associated pneumonia, or pressure ulcers. The rates are measured against comparable hospitals and against the hospital's own targets. It's clear that the hospital does not just reveal the positive. For example, the hospital's ICU central line–associated bloodstream infection rates are shown as double what it has targeted, and it offers information on how the organization is addressing the issue.

Conway, former chief operating officer for Dana-Farber Cancer Institute, says that to be truly effective, hospitals should not hesitate to "welcome this level of scrutiny and public accountability. They should acknowledge that they're lagging and that they're early in their journey. But they should say particularly that they've dedicated resources, with specific action plans, to improve outcomes."

Hospital leaders can quibble with the data if they really, honestly have credible information that it's wrong. But they "must be prepared to talk honestly and in a way that can be understood about the gaps in their performance," he says.

"When someone picks a fight about a measure—say one on infections—there's not a lot of public sympathy, because the public wants these damn things eliminated, and they're frustrated," Conway says. Besides, if hospital executives are truly honest, even if they don't agree with their poor scores, they know they can improve. Some hospitals have even offered to send patients to another facility while they fix the problem.

Conway adds that hospital chiefs get frustrated with public reporting because it's based on data that is so old. If there's new data showing improvement, hospitals should let that be known.

Another institution that has chosen transparency is the Roswell Park Cancer Institute, a 133-licensed-bed hospital in Buffalo, N.Y., which as a prospective payment system cancer center is not yet required by CMS to report quality data.

"The driving force behind this was an increasing call from our board of directors that we should have a single place to report cancer quality data," says Stephen Edge, MD, chair of Roswell Park's health services and outcomes research. So in May 2011, the center published a 130-page booklet with details, including some comparative treatment and survival rates, on five types of cancer.

"Ultimately it was our CEO, Donald L. Trump, MD, who said he doesn't want to talk about it anymore, he just wants it to happen," Edge says.

The center's report, downloadable from its website, includes the bad with the good. For example, it acknowledges it has had long average wait times for chemotherapy, 90 minutes in 2009 and 54 minutes in late 2010, still too far off the goal of 30 minutes. Measures include how often patients' primary care doctors received reports on patient treatment, goals to reduce the number of times a patient must have blood drawn, and rates of use of breast-conserving surgical techniques in breast cancer patients.

Conway says that whatever they do, hospitals should not do what they used to when negative stories arose, "which was lay low, hope it vanishes, and take a ‘this too shall pass' attitude. Or, if there's data that says, for example, your coronary artery bypass graft profile is horrible, historically what hospitals have done is to discredit the data. It's sort of like a pigeon in a shooting game.

"But what we've learned is that the organization must ask a critical question: Could this data be right?"


Reprint HLR0812-8
