
How Making Hospital Quality Data Public Affects Providers

   May 08, 2014

Healthcare leaders are finding that posting quality scores and outcomes, both good and bad, can be beneficial for staff as well as patients.

This article appears in the May 2014 issue of HealthLeaders magazine.

The public should get to read what patients think about the quality of care they receive from their doctors, leaders of the University of Utah Health Care in Salt Lake City believe. It's important for the organization's transparency, and serves as a tool that pushes doctors to improve their care.

That's why the four-hospital, 597-bed system not only solicits patient reviews commenting on doctors' performance, it also has posted nearly 20,000 patient comments on those physicians' Web pages. It has deleted only those comments that are libelous, profane, or violate patient privacy—about 3%.

Some 14 months after the program began, comments that include scathing rebukes and complaints as well as praise are viewable for 376 of the 450 physicians whose patients were surveyed, and for whom there have been at least 30 comments apiece. Every week, another 250 comments are posted. Here are some examples:

  • "One of the finest doctors I have ever dealt with … thorough in her examinations and is very careful to explain …" wrote one patient about a nephrologist.
  • This doctor "is the bomb! Very professional, gives you a clear understanding of what is going on, and what is to come," a patient wrote about a gynecologic surgeon.
  • And about an orthopedic surgeon, "Horrible patient care." And, "Worst physician I'd ever seen. He did not seem to have any interest … I felt like I was a number."

Posted comments describe wait times that were too long, visits that were too short, physicians who had poor bedside manner or came across as insulting, and treatments that didn't help or made matters worse. All are on display, posted just beneath a one- to five-star scorecard that rates doctors on each of nine questions.

From Salt Lake City to Cleveland and Boston to Miami, hospitals are baring their data souls, showing the best and the worst about their quality of care. Instead of touting promotions and platitudes, their websites are now tools for education and transparency, and leaders say quality of care has only gone up as a result.

Thomas Miller, MD, chief medical officer for University of Utah Hospitals & Clinics, says the physician scorecard effort was a high-level policy decision. It emphasized that the website is now a tool for disclosure of quality through "truth in reporting."

The effort, admittedly easier because of the system's closed staff model, evolved after several physicians angrily complained about commercial, proprietary review sites in which a few anonymous people had written what the physicians considered to be some hateful and unjustified remarks.


Chrissy Daniels, director of the Utah system's strategic initiatives, says that before the project began, anyone who wanted to look up a Utah physician would go to Google. She recalls an instance when a "very critical" comment was posted about a university physician. That doctor asked the university to "fix it," which, of course, it could not.

"The problem was," Miller says, "you don't know who's putting up those comments, whether it's the physician in the practice writing positive comments or disgruntled employees and friends who are writing nasty comments.

"We felt the best defense was an offense," Miller says. "We have data, tons of it, through a patient satisfaction survey administered by Press Ganey showing our patients rate us very highly. Why not use that information? We believe that physicians should know what their patients think of them and how they're treated."

Miller sees third-party sites rating doctors as "a tidal wave that is coming at healthcare providers," who he says are among the "last bastions of professionals to be scored publicly online. You're going to be judged by the crowds whether you like it or not. It's out of our control, so the best thing we can do is take the great data that we have and use that to tell our story, and make that an open and honest story.

"We're surprised that other healthcare institutions haven't followed suit," Miller says. Improvement is the goal. Doctors who have work to do in their practices can see that in recurring themes in the comment thread; for example, if a physician spends only a few minutes with his or her patients or if treatment protocols fail to fix a problem, patients will comment about it.

Since the project launched in December 2012, physicians throughout the Utah system have worked to improve their practices. It shows in the results: overall numerical scores have gone up, and some physician practices have increased their numbers of stars, Daniels says.

Some physicians' pages have more than 100 comments, obviously increasing visibility for the system.

The Utah system does not yet post inpatient quality scores on its website or link to other sites that do, such as Hospital Compare. But it is publishing outcomes data from patients with brain or spinal cord injuries, or stroke, who were treated in the system's rehabilitation center.

Here, too, the organization is not averse to publishing data that is not overwhelmingly positive for the organization. For example, according to data collected from rehabilitation center patients 90 days after discharge, 57% "strongly agreed" with the statement that the rehabilitation program prepared them for going home, with 41% agreeing somewhat and 2% having some disagreement.

Miller and Daniels acknowledge that physicians and others within the organization were not initially ecstatic that physicians were going to be rated and critiqued on the hospital's own website, or that other quality data would be posted; some thought the new era of candor was ill-advised.

"There are people who think you shouldn't put negative things on a doctor's website," Daniels says. As many as a dozen were "vehemently opposed," Miller says.

But hospital leadership launched a convincing campaign, specifying certain elements the program had to include. One, Miller says, is that the highest levels of hospital and physician leadership had to give support, which they did. Second, "it's essential to have an office that's in charge of managing the information that comes back from the survey." And third, that there would be an appeals process for physicians to make a case that a comment is unfair.

The system is in the process of posting other quality measures on its site as well, such as links to Medicare data posted on Hospital Compare. Key to taking that action, says Brian Gresh, University of Utah's senior director of interactive marketing and Web, is to make a decision and stick to it, and not post measures just when scores are favorable, and dump them when they're not.

"We have said we're not going to cherry-pick the information, and I am adamant about that," Gresh says. "Consumers are smart; they have access to a ton of data, and if we are going to present it on our site we have to be as transparent as possible. Because if we aren't, they can go to another source and find data that contradicts that or puts it into question."

At 649-bed Beth Israel Deaconess Medical Center in Boston, the road to transparency began in 2003, says Chief Quality Officer Kenneth Sands, MD.

Commercial reputation seemed to suggest BIDMC's competitors were better hospitals, "but when we looked at the data, it didn't look that way to us; it looked like we were the same or better. So we felt there was nothing to lose by creating a more level playing field, by making the data available."

Not only are links to dozens of patient quality and safety measures posted on the hospital's home page, Beth Israel also links to The Joint Commission's survey report.

"We wrote down criteria that would qualify to be shared publicly," Sands says. The data "has to be relevant to patient decision-making, reliable, reproducible, understandable, and valid. And we have a 'no cherry-picking' rule, too."

Some physicians were eager to have more data posted, measures that are relevant to a small subspecialty, for example. Those would meet the criteria, Sands says.

BIDMC also focuses on harm prevention, including a list of patient harm incidents by quarter, where the organization has had problems, and with specifics, what it is doing to improve.

Why is online transparency by providers so important when many surveys suggest patients make decisions on where to get care based on their health plan, or recommendations from family, friends, and physicians, not based on what they see on a hospital website?

"The hits on these websites are not that high," Sands acknowledges. "But transparent reporting's strongest impact has been internal. There's the overall message that we're confident enough in our performance to share information publicly, and the accountability that it signals. It's generated a series of conversations about what we want to make sure we're doing well at, that we're tracking it."

"Payers as well as patients say they like going to a hospital that's prepared to show that data," Sands says. And it's even inspired a friendly spirit of competition among service lines to improve measures where the hospital may have fallen short in the last reporting period.

Sands knows that other hospitals will soon adopt transparency strategies for their websites, but they need to take special care on one important point: accuracy.

"If you're going to maintain credibility with your own clinicians, you have to be excruciatingly accurate, because they take this very seriously. If you're saying you did 47 carotid endarterectomies last quarter, you better be sure your vascular surgeons agree that it was not 46 or 48." That may not seem important, "but our learning shows that it is."

At the 4,450-bed Cleveland Clinic Health System, Chief Quality Officer J. Michael Henderson, MD, takes pride in the fact that the clinic's home page has dedicated real estate for five types of quality reporting, from data available on Hospital Compare to the organization's "Outcome Books," now prepared for 14 service lines.

When they were launched five years ago starting with heart and vascular care, Henderson says, the books "were really prepared for our referring doctors" and some of the employer groups the clinic contracts with directly for patient care. "But by putting them on the website, we're making them available to patients as well."

These PDF documents are posted online and show hundreds of data points, including surgical volume by campus, outcomes, use of new technologies, mortality rates by procedure, and process measures such as door-to-balloon times for cardiac care. It is a much deeper dive into quality measurement, more information than Medicare requires, Henderson says.

But the result has been improvement in processes and outcomes for most specialties, he says.

Henderson says that there was "a bit of pushback from a few of the groups that didn't have the best numbers initially. But that didn't come from our leadership, not from the board, and not from our marketing people."

The clinic also is one of 113 hospitals participating in an American College of Surgeons surgical improvement program that have agreed to let Hospital Compare post their rates of death or serious complications within 30 days for three procedures—even though for one of them, lower-extremity bypass, the Cleveland Clinic's rates put it among three hospitals that were "worse than average."

Cleveland Clinic did that, he says, "because our lower-extremity bypass surgeries need improvement."

To hospitals thinking of similar initiatives, Henderson says, "Do it. I have not seen a downside. It's valuable for patients and it's valuable to help push us toward improvement. As soon as you start measuring these things in detail, you find gaps and opportunities" to get better.

In mid-2013, 1,713-bed Baptist Health South Florida, a Coral Gables–based six-hospital system, revamped and expanded its website to dramatically change the way it presents four types of quality data for its six adult hospitals: outcomes, patient experience, safety, and accountability.

Now, it shows in bright green and yellow charts in what areas the hospital is best, where it's average, and what types of care fall "below average," and hospitals within the system compete with each other for better scores.

"It's a healthy sibling rivalry among our senior and performance improvement leadership teams," says Emily Ruwitch, assistant vice president for Baptist's Center for Performance Excellence. "Our CEO, Brian Keeley, told us a couple of years ago that we need to be bold and audacious and put our performance data out there."

But there was "trepidation on a number of levels," she says. "Some asked if people will be able to understand it, and if we're not performing as well as we could be, highlighting an opportunity where we need to improve—is that something we really want to put out there?" she says. "There were definitely conversations about posting mortality rates. That can be a scary thing to look at."

But the plan prevailed, in part because the same type of information was available elsewhere. "And it would look kind of funny if they can find it somewhere else but we don't have it on our website. We wanted our revamped website to be a single source for patients and their families to get a true sense of our performance in those four domains."

As hospital leaders say all the time, a culture change is underway, and it's apparent in quality data transparency. "Five years ago, like most places, this was not at the top of our radar," says the Cleveland Clinic's Henderson. Hospital organizations believed they were good, and they didn't need to say why or how.

Now, that attitude is changing. "The culture change is acknowledging and accepting that no matter how good you are, you have opportunities to improve if you know where they are."

Reprint HLR0514-8

