Do Hospital Rankings Really Matter?

Numerous organizations rate hospitals—but doing so is an imperfect science.

Mass General. Photo by Samantha Carey

Gary Young, director of the Northeastern University Center for Health Policy and Healthcare Research, recalls a classic joke at healthcare conferences: A patient in the throes of a heart attack drags himself across the room, sits at a computer, and frantically starts Googling the best cardiology hospitals in the area.

It’s ridiculous—a situation that would never happen. But it raises the question: If hospital rankings won’t help a patient at his or her most vulnerable hour, why do them in the first place?

“There’s been this big movement for the last 20, 25 years of improving transparency and developing quality measures and making them available through these rankings,” Young says. “A lot of money, a lot of time has been invested in building this kind of infrastructure.”

Hospital rankings are useful, in exactly the way they seem useful—they give patients information about the institutions they trust with their lives. But for all the time and money and effort invested in hospital rankings, they’re still an imperfect science. They attempt to classify something that’s exceedingly difficult, perhaps even impossible, to classify. Can an institution like Massachusetts General Hospital, which deals with everything from neurosurgery to nephrology, be summarized in any meaningful way?

Elizabeth Mort, Mass General’s senior vice president of safety and quality, says ratings have value, but she’s dubious of their simplicity.

“Is there really a difference between four and five [stars], three and four, two and three, one and two?” she asks. “When you artificially slice a distribution curve of measures into categories, you run the risk of misclassifying people.”
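Mort’s point is easy to see in miniature. Here is a minimal sketch, with invented scores and cut points that are not CMS’s actual methodology, of how two hospitals separated by a fraction of a point can land in different star categories:

```python
# A minimal sketch of how binning a continuous quality score into star
# categories can separate nearly identical hospitals. The scores and
# cut points below are invented; this is not CMS's actual methodology.

def stars(score, cuts=(20, 40, 60, 80)):
    """Map a 0-100 quality score to a 1-5 star rating."""
    return 1 + sum(score >= c for c in cuts)

hospital_a = 79.9  # a hair below the hypothetical top cut point
hospital_b = 80.1  # a hair above it

print(stars(hospital_a))  # 4 stars
print(stars(hospital_b))  # 5 stars, on a 0.2-point difference
```

A 0.2-point gap in the underlying measure, which could easily be statistical noise, becomes a full star’s worth of apparent difference once the distribution is sliced into categories.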

Still, institutions care how they do, and celebrate when they’re classified well. When U.S. News & World Report named Mass General best hospital in 2012, for example, the hospital famously baked 14,000 cookies for its staff, decorated with “#1 Thanks to You.”

The problem is that the ratings, past a certain point, become somewhat arbitrary. The difference between the first-ranked hospital and the 100th-ranked hospital is fairly significant. But what about the difference between Mass General’s first-place finish in 2015 and its third-place finish this year?

“That shouldn’t make any difference at all,” says Ben Harder, chief of health analysis at U.S. News. “People like to talk about the Honor Roll [U.S. News’ list of the top 20 hospitals], and it does have its use, but look past the Honor Roll as quickly as possible.”

What Harder means is that the sexy rankings—the ones that are easy to digest, and the ones that are easy to put in a headline—aren’t really the most useful. The Honor Roll highlights hospitals that are uniformly excellent, but a patient who needs a hip replacement shouldn’t care that a hospital ranks among the top 20 if it got there because it excels in 10 unrelated specialties.

“I would encourage readers to look at the results that are most relevant to the clinical need they have,” Harder says. “Look at the most granular set of evaluations that’s relevant.”

Mort adds that a numerical or star ranking often can’t adequately capture the many, many aspects of a hospital’s performance. A good report, she says, would analyze a hospital’s infrastructure, processes, and outcomes; not rely too heavily on measures like readmission rates, which are not always fully controllable; and avoid over-simplified classifications, like the star ratings used by the Centers for Medicare and Medicaid Services (CMS). That rating, for the record, gave Mass General four stars; the only Boston hospital to get five was New England Baptist Hospital.

“Any system that does try to put hospitals or providers into categories should do it using the very best scientifically sound methods as possible,” she says. “It’s a very complex set of concepts, quality and safety. So it’s a lot to ask to really, accurately characterize hospital quality and put it in a five star system.”

That doesn’t stop people from trying. U.S. News and CMS aren’t alone in the rankings game. Every year, companies including Consumer Reports, Leapfrog, and Healthgrades rank hospitals, filling the field with a cacophony of competing voices.

“The rankings aren’t always consistent,” Young explains. “Only a small percentage of hospitals rank at the high end of those rankings across those different rankers.” For example, only one of U.S. News’ 2016 top five hospitals (the Mayo Clinic) got five stars from CMS.

Slight differences in methodology can produce huge differences—both from one organization to another, and from one year to the next. This year, U.S. News analyzed nine new areas of care, and slightly rearranged the way specialty performance is weighted when determining the Honor Roll. As early as April, Harder warned readers that the result would be a reshuffled Honor Roll.
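A toy example shows why a weighting tweak can reshuffle a list. The hospitals, specialty scores, and weights below are invented for illustration and are not U.S. News’ actual formula; the point is only that shifting weight between two specialties is enough to flip which hospital ranks first:

```python
# A toy illustration of how sensitive a ranking can be to weighting.
# The hospitals, specialty scores, and weights are invented; this is
# not U.S. News' actual formula.

specialty_scores = {
    "Hospital A": {"cardiology": 95, "orthopedics": 70},
    "Hospital B": {"cardiology": 85, "orthopedics": 90},
}

def overall(scores, weights):
    # Weighted average of a hospital's specialty scores.
    return sum(scores[s] * w for s, w in weights.items())

for label, weights in [
    ("old weights", {"cardiology": 0.7, "orthopedics": 0.3}),
    ("new weights", {"cardiology": 0.5, "orthopedics": 0.5}),
]:
    ranking = sorted(specialty_scores,
                     key=lambda h: overall(specialty_scores[h], weights),
                     reverse=True)
    print(label, ranking)  # the top spot flips between the weightings
```

Neither hospital’s performance changed at all, yet the number-one spot changed hands purely because the formula did.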

U.S. News has been open about the changes, but ratings are still opaque. “[With the Honor Roll,] what you end up with is an impressionistic sense of how good a hospital might be,” Harder says, “but you’ve just obscured all the variation within the hospital that’s actually very important.”

“With some of the commercial rankings, there’s a little bit less transparency, so we don’t really know, in a very precise way, how these measures and the rankings are actually being computed,” Young adds. “A hospital that does well on one measure may not do well on another measure, and it becomes very confusing for consumers to try to sort all of that out.”

Patients can’t be blamed, then, for approaching hospital rankings with trepidation—if they approach them at all.

Unlike university rankings, which high school seniors read voraciously, Young says, “it’s not clear that consumers pay attention to” medical rankings; patients instead rely heavily on word of mouth, the opinions of friends and family, and physician referrals. Mort agrees, noting that the rankings can be prohibitively difficult to sift through, and that a more useful measure would prioritize patient experiences.

“I would like to see, over time, a consumer-facing vehicle to provide the best information possible, and it would be an amalgamation of these different measures,” she says. “There really is no agency serving the patients, and I think that’s really where we have an opportunity.”

Both Young and Mort guess that Internet-savvy Millennials may usher in that change. For now, though, medical consumers will have to do the best they can with what they’ve got—Googling best cardiology hospitals, but also listening to friends, family, and doctors.

“I would look for some consistency,” Young says. “If a hospital is doing well across multiple rankings and my physician recommends them, that’s going to make me feel more confident that this is the place to go.”