Just a few days after the Centers for Medicare & Medicaid Services (CMS) released its first quarterly hospital star ratings, U.S. News & World Report released its annual version. Of the 20 hospitals the magazine ranked as Best Hospitals, CMS agreed with only one, giving it a 5-star rating. Nine got 4 stars, nine got 3 or 2 stars, and CMS did not rate Johns Hopkins at all.
Two scoring systems came to very different conclusions. What does that mean for you?
As a vascular surgeon, I reflexively look at hospital rankings for procedures in my field of expertise. So I looked at the U.S. News report on aortic aneurysm surgery. Abington Memorial Hospital's rating caught my eye: it was listed as a high performer, yet it carried the worst survival rating.
High performers were in the top 18% of the 1,163 hospitals rated. How can you be high performing and have a "worst survival rating"? For what I will assume (rather than know, despite four separate courses in statistics) to be statistical reasons, mortality was not considered "because it was inappropriately correlated with the latent variable."
How accurate can a survival rating be when death is not a factor?
Readmissions were considered, but only for the first seven days after discharge. Multiple studies show that while many readmissions occur within 14 days, 50% occur between days 16 and 30, a window that CMS considers but U.S. News does not.
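To make the arithmetic concrete, here is a minimal sketch with made-up discharge and readmission dates (illustrative only, not U.S. News or CMS data) showing how much the counting window matters:

```python
from datetime import date

# Hypothetical events: three patients discharged the same day,
# readmitted on day 4, day 12, and day 23 after discharge.
events = [
    {"discharged": date(2016, 8, 1), "readmitted": date(2016, 8, 5)},
    {"discharged": date(2016, 8, 1), "readmitted": date(2016, 8, 13)},
    {"discharged": date(2016, 8, 1), "readmitted": date(2016, 8, 24)},
]

def readmissions_within(events, window_days):
    """Count readmissions occurring within `window_days` of discharge."""
    return sum(
        1 for e in events
        if (e["readmitted"] - e["discharged"]).days <= window_days
    )

print(readmissions_within(events, 7))   # 1 -- a 7-day window catches one event
print(readmissions_within(events, 30))  # 3 -- a 30-day window catches all three
```

In this toy example, the narrower window reports a readmission rate one-third the size of the broader one for exactly the same patients.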
U.S. News also weighted variables differently and placed them in different categories than CMS did. Why? It's unclear. But perhaps the greatest difference between the two was that U.S. News surveyed physicians, and their opinions made up 27.5% of the final score.
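For perspective on what a 27.5% weight can do, here is a minimal sketch of a weighted composite score. The component names, the other weights, and the hospital scores are hypothetical, not the actual U.S. News formula; only the 27.5% reputation share comes from their report.

```python
# Hypothetical composite score: physician reputation carries 27.5% of the weight.
# The remaining weights and the 0-100 component scores are made up for illustration.
WEIGHTS = {
    "outcomes": 0.375,     # assumed weight for outcome measures
    "process": 0.350,      # assumed weight for structure/process measures
    "reputation": 0.275,   # reputation share reported by U.S. News
}

def composite(scores: dict) -> float:
    """Weighted sum of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Two made-up hospitals with identical measured performance;
# only the reputational survey result differs.
hospital_a = {"outcomes": 70, "process": 70, "reputation": 90}
hospital_b = {"outcomes": 70, "process": 70, "reputation": 40}

print(f"Hospital A: {composite(hospital_a):.2f}")  # Hospital A: 75.50
print(f"Hospital B: {composite(hospital_b):.2f}")  # Hospital B: 61.75
```

With measured performance held constant, the survey alone moves this hypothetical composite by nearly 14 points, which in a tightly bunched field could easily change a hospital's rank.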
Doctors were asked to name the five best hospitals in their specialty. What bothers me, and now I am speaking as a physician, is the rationale for using physician opinion and for weighting it so heavily.
"An appropriately qualified physician who identifies a hospital as among the 'best' is, in essence, endorsing the process choices (who to admit, how to evaluate, courses of treatment, length of stay) made at that hospital, and we regard the nomination of hospitals by board-certified specialists as a reasonable process measure," they wrote.
I humbly disagree. Few physicians can describe those process measures for their own hospitals, let alone for high-volume academic centers.
It is all reminiscent of the quotation attributed to George Box:
"All models are wrong, but some are useful."
These kinds of ratings are supposed to help patients find appropriate care. But when mortality does not factor into a survival rating, and physician opinion stands in as a proxy for hospital process, how useful is the guidance?