For surgery, big and famous hospitals aren't always the best

NEW YORK Wed Jul 31, 2013 8:25am EDT

Doctors and nurses assist in the preparation of a patient during a breast implant and skin reduction surgery in Monmouth, New Jersey October 30, 2007. REUTERS/Lucas Jackson

NEW YORK (Reuters) - Patients going to a hospital for surgery care about many things, from how kind the nurses are to how good the food is, but Consumers Union (CU) figures what they care about most is whether they stay in the hospital longer than they should and whether they come out alive.

In the first effort of its kind, the nonprofit publisher of Consumer Reports magazine released ratings of 2,463 U.S. hospitals in all 50 states on Wednesday, based on the quality of surgical care. The group used two measures: the percentage of Medicare patients who died in the hospital during or after their surgery, and the percentage who stayed in the hospital longer than expected based on standards of care for their condition. Both are indicators of complications and overall quality of care, said Dr John Santa, medical director of Consumer Reports Health.
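
For illustration only, here is a minimal Python sketch of how the two measures might be computed from claims-like records; the field names (died_in_hospital, length_of_stay, expected_stay) are hypothetical assumptions for the example, not CU's actual data schema.

```python
# Illustrative sketch only: the record fields below are hypothetical, not CU's schema.

def surgical_measures(records):
    """Return (mortality_pct, extended_stay_pct) for one hospital's surgeries."""
    total = len(records)
    if total == 0:
        return None, None
    deaths = sum(1 for r in records if r["died_in_hospital"])
    long_stays = sum(1 for r in records
                     if r["length_of_stay"] > r["expected_stay"])
    return 100.0 * deaths / total, 100.0 * long_stays / total

# Example with made-up records for one hospital:
records = [
    {"died_in_hospital": False, "length_of_stay": 4, "expected_stay": 5},
    {"died_in_hospital": False, "length_of_stay": 9, "expected_stay": 5},
    {"died_in_hospital": True,  "length_of_stay": 3, "expected_stay": 5},
]
print(surgical_measures(records))  # (33.33..., 33.33...)
```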

The ratings will surely ignite debate, especially since many nationally renowned hospitals earned only mediocre ratings. The Cleveland Clinic, some Mayo Clinic hospitals in Minnesota, and Johns Hopkins Hospital in Baltimore, for instance, rated no better than midway between "better" and "worse" on the CU scale, worse than many small hospitals. Because CU had only limited access to data, the ratings also underline the difficulty patients have finding objective information on the quality of care at a given facility.

Nevertheless, "this is a step in the right direction," said Paul Levy, former president of Beth Israel Deaconess Medical Center in Boston, who was not involved in the project. "To whatever extent you can empower patients to get better care and become partners in pushing the healthcare system to make improvements is to the good."

CU's ratings are based on Medicare claims and clinical records data from 2009 to 2011 for 86 kinds of surgery, including back operations, knee and hip replacements, and angioplasty. The rates are adjusted to account for the fact that some hospitals treat older or sicker patients, and exclude data on patients who were transferred from other hospitals. These are often difficult cases that, CU felt, should not be counted against the receiving hospital.
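
CU's exact adjustment model is not described here; a common approach to adjusting for patient mix is indirect standardization, which compares a hospital's observed events with the number expected from its patients' risk profiles, after dropping transferred-in cases. The sketch below assumes hypothetical record fields (transferred_in, had_event) and a hypothetical risk model risk_of().

```python
# Sketch of indirect standardization, one common risk-adjustment method;
# CU's actual model is not specified. The fields and risk_of() are hypothetical.

def risk_adjusted_rate(records, risk_of, overall_rate):
    """Adjusted event rate for one hospital, excluding transferred-in patients."""
    eligible = [r for r in records if not r["transferred_in"]]
    if not eligible:
        return None
    observed = sum(1 for r in eligible if r["had_event"])
    expected = sum(risk_of(r) for r in eligible)  # model-predicted probabilities
    if expected == 0:
        return None
    # Scale the observed-to-expected ratio by the rate across all hospitals.
    return (observed / expected) * overall_rate

# Toy usage with a constant-risk model:
patients = [
    {"transferred_in": False, "had_event": True},
    {"transferred_in": False, "had_event": False},
    {"transferred_in": True,  "had_event": True},   # excluded from the calculation
]
print(risk_adjusted_rate(patients, lambda r: 0.25, overall_rate=0.20))  # 0.4
```

Under a scheme like this, a hospital whose observed events exceed what its patient mix predicts ends up with an adjusted rate above the overall average, and vice versa.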

Although the ratings do not explicitly incorporate complications such as infections, heart attacks, strokes, or other problems after surgery, the length-of-stay data captures those problems, said Santa.

Some of the findings are counterintuitive. Many teaching hospitals, widely regarded as pinnacles of excellence and usually found at the top of rankings like those of U.S. News & World Report, fell in the middle of the pack.

"This isn't the first time we've seen this sort of surprise," said Dr Marty Makary, a surgeon at Johns Hopkins Hospital and author of the 2012 book, "Unaccountable: What Hospitals Won't Tell You and How Transparency Can Revolutionize Health Care." "For a complex procedure you're probably better off at a well-known academic hospital, but for many common operations less-known, smaller hospitals have mastered the procedures and may do even better" with post-surgical care.

NOT 'A TRUE PICTURE'

The Cleveland Clinic's chief quality officer, Dr Michael Henderson, said CU's methodology, which gave his hospital a middle-of-the-scale rating below that of such Ohio hospitals as the Fulton County Health Center in Wauseon and the Institute for Orthopaedic Surgery in Lima, "doesn't give you a true picture" of the quality of surgical care. Much better, he said, is actual outcome data - how well patients undergoing any given procedure fare - which Cleveland is a pioneer in making public via its website.

Experts at other big-name hospitals whose CU ratings fell short of their reputations also questioned the methodology. "The accuracy of claims data," like that CU used, "is very low or unknown," said Dr Peter Pronovost of Hopkins.

CU also found that several urban hospitals, including Mount Sinai Hospital in New York and University Hospitals Case Medical Center in Cleveland, did well despite serving many poorer, sicker patients. Rural hospitals did better, on average, than other hospitals, and many hospitals practically unknown beyond their zip code outranked famous ones, among them Kenmore Mercy near Buffalo, New York; Arrowhead in Glendale, Arizona; Sacramento Medical Center in California; and Arkansas Heart in Little Rock.

Hospital choice matters more for some procedures than others. Length of stay for hip and knee replacements and back surgery varied widely, for instance, while hospitals' scores for colon surgery and hysterectomy were more similar to one another.

Like other experts pushing for greater "medical transparency" - that is, reporting data on how patients fare after treatments - CU's Santa said available data, including that used by CU, is far from perfect.

The American College of Surgeons collects data on surgical outcomes, such as the rate of infections at the surgical site and urinary tract infections, through its National Surgical Quality Improvement Program. The group will not release the data to the public because it promised confidentiality to hospitals providing the data, said Dr Clifford Ko, a cancer surgeon at UCLA Jonsson Comprehensive Cancer Center who is involved in the project. However, 102 of about 500 participating hospitals voluntarily report some of their data to the federal Centers for Medicare and Medicaid Services.

"I think the public would be surprised at all the data they're not allowed to see," said Santa. "One of the reasons we did this was to stimulate debate and irritate people" enough to force hospitals and others to be open about the quality of care they provide. Many critics of CU's methods agree with that goal. Hopkins' Provonost, for instance, has called for a medical version of the Securities and Exchange Commission to require hospitals to report patient outcomes, just as the SEC requires public companies to report financial data.

Until and unless that happens, the lack of transparency can be expensive, not only in lives but also in dollars. Last week the Leapfrog Group, whose employer-members provide health insurance to workers, released a calculator of "hidden hospital surcharges," the amount that errors, accidents, infections and injuries cost payers.

On average, Leapfrog president and chief executive Leah Binder calculates, a patient treated at a hospital with a grade of "C" or lower on the group's A-to-F safety scale incurs $7,780 in costs due to medical errors.

The CU report is available at www.ConsumerReports.org/cro/hospitalratings0913.

(Reporting by Sharon Begley; Editing by Prudence Crowther)

Comments (13)
Cascadia wrote:
Where is the actual report? The link is broken and you can only find links to the methodology on the Consumer Reports site.

Jul 31, 2013 7:22am EDT
carlo151 wrote:
I question this survey. When you are measuring healing and cure rates, or patients returning for follow-ups or complications, I do not think you can use the same methods as comparing repair rates on automobiles or complaints about a dishwasher.
People often choose the best-known hospital when they believe their condition is extremely complex and they need the best. These hospitals are sometimes far away, but many feel they offer the best chance of survival and are willing to take time out of their lives to travel to them. Other times they may feel a local hospital can deal with a more minor emergency.

This is a study that requires full access to the data and some serious number crunching. Hopefully our medical centers will some day be able to supply the full data needed.

Jul 31, 2013 7:53am EDT
epijay wrote:
A classical statistical error seems to exist in this analysis – the one of selection bias. It is a well known fact that more serious and complicated cases go to the more “established/reputed” institutions such as Johns Hopkins and Mayo et al. I don’t see any data that evaluates a patient’s condition before admission to show that these conditions are comparable across all hospitals surveyed.

Jul 31, 2013 8:01am EDT