Are Hospital Rankings Popularity Contests or Measures of Quality?

Janice Simmons, for HealthLeaders Media, April 22, 2010

Since 1990, U.S. News & World Report has annually ranked more than 5,000 hospitals in 16 adult and 10 pediatric specialties—whittling them down to the top 50 in each category—to help consumers "find the one that's best for you and your family."

However, a study appearing in the April 20 Annals of Internal Medicine says the standings of those top hospitals reflect "subjective reputations" more than objective measures of quality. So what should we think?

The work of Ashwini Sehgal, MD, a Cleveland-based kidney specialist, often "focuses primarily on trying to improve the quality of care" for his patients. "One day I was just curious to know how U.S. News does their rankings of quality of care for kidney disease . . . studying not only how they rank hospitals for kidney disease care but in general and for other specialties," he says in an interview.

That curiosity led the physician, a kidney specialist at MetroHealth Medical Center and professor of medicine at Case Western Reserve University, to study the methodology U.S. News has used for the past 20 years to find "America's Best Hospitals."

His findings nearly made him fall "out of my seat when I completed the analyses and saw how much of the rankings were based on reputation and how little reputation correlated with objective quality," he says.

"I found that if you look at the top-ranked 50 hospitals, about 75% of the ranking comes from its reputation and not from objective quality of care. And if you look at the top five hospitals in each specialty, nearly 100% of its ranking comes from its reputation and not from objective quality of care," he says.

"I was thinking that if we did professional football this way, we would just say let's give the Super Bowl trophy to the Dallas Cowboys without them playing any games because they have a national reputation for being a good team," Sehgal says. "I think the main message . . . is that the rankings are not really good measures of quality of care. They are simply measures of national reputation."

It's not that U.S. News "purposefully tries to have reputation have a predominant role. It's a quirk of the way they do it," Sehgal says. In the study, Sehgal notes how the magazine's editors survey 250 physicians around the country in each specialty and then ask the doctors to name the five best hospitals in their specialty. "And the result of that is that only a handful of hospitals that have a national reputation get named with any frequency," he adds.

However, Avery Comarow, who edits U.S. News' health rankings, doesn't entirely agree with the study's assessment. The goal of the rankings is not to find out who provides the best routine care. Instead, the rankings try to identify the top referral centers: the hospitals that handle the most challenging patients and perform the most difficult procedures, he says.

"It's the reason that we look at specialties—and not procedures," Comarow says. The magazine actually contacts 600 board-certified specialists. "The selection is randomized, and it's done by region with weighting, so there's no one region that outweighs any other—so you don't get a 'Boston effect' or a 'Los Angeles/San Francisco effect.'"

In addition, writing earlier this week on "Running a Hospital," the blog of Boston's Beth Israel Deaconess CEO Paul Levy, Comarow agrees that reputation can be subjective, and says U.S. News is "taking steps" to reduce its impact. "But it's not as squishy as might be thought. The scores, based on the most recent three years of surveys, are remarkably stable over time. Nor are they volatile year to year," he writes.
