When choosing a hospital or doctor, find sources you can trust
MAYWOOD, Ill. - The plethora of hospital ranking systems can be confusing when you need to select a physician or hospital. Tempting as it is to use one of the "best of" listings, how do you know which ratings are legitimate?
Loyola University Health System's Center for Clinical Effectiveness, which coordinates the institution's quality improvement programs and initiatives, says to be sure the rankings are open to peer review and based on scientifically valid measures because many are not.
"More than 20 companies and government agencies report, rank or rate physicians and hospitals, but the methodology and data sources vary considerably," said Dr. William M. Barron, center director and Loyola's vice president of quality and patient safety, Maywood, Ill. "As a result, different systems produce different rankings. That is why a hospital that achieves a stellar grade from one ranking system may not even appear on a different system's list."
So how can you tell which ratings measure up?
"First," said Barron, "choose a source that is open about its methodology. Find out exactly what they are measuring, the data sources and how old the data is."
If a system uses Medicare-only data, keep in mind that this covers a specific age group: people over the age of 65. That information may not be applicable to younger adults and children. Does the ranking system use billing data only, or is it supplemented with clinical data? In addition, results can vary widely unless a system appropriately accounts for how sick patients are and what other serious medical conditions are present - a very difficult and complex science.
Consumers should be extra cautious if the firm that does the rankings requires a hospital or physician to pay a fee in order to be ranked or if there is a charge to see the rankings.
Some entities rank hospitals based on nationally recognized measures of outcomes and key processes of care, and generally, that is good. Other companies base their listings on the reputation or popularity of hospitals and/or physicians.
Barron said that the public should not base healthcare decisions on physician popularity polls. "Reputation is not a scientific measure of quality," he said. "Therefore, such rankings have no scientific validity."
Castle Connolly, Chicago magazine, Consumers' CHECKBOOK Guide to Top Doctors, MDNationwide and Best Doctors base their rankings on reputation, according to Barron. U.S. News & World Report bases its rankings partially on reputation.
"Some companies, such as HealthGrades, will not fully reveal what their rankings are based on and how the final rankings are calculated," said Barron, professor of medicine, and obstetrics & gynecology, Loyola University Chicago Stritch School of Medicine, Maywood. "That's a concern."
Other surveys, including The Leapfrog Group, rely on self-reported data, said Barron, noting that this can be a problem because there is no validation of the information. "To the extent that providers are honest, you get an accurate reading. But some providers may, sometimes inadvertently, make themselves look better," he said. Among other entities that produce reports are WebMD Quality Services, J.D. Power and Associates and the Thomson 100 Top Hospitals (formerly The Solucient 100 Top Hospitals).
"Several insurance companies have jumped on the bandwagon, too, launching their own online hospital comparison information tools for employers, members and hospitals," said Barron. "Most such report cards do not restrict measurement to nationally validated measures of quality. Each insurance company uses a unique set of measures, organized in a unique way.
"Not only is this scientifically suspect, but it is extremely confusing to the public," he added.
"Measurement of quality is a brand new science, dramatically ramping up in the past five years as the public has demanded more information," said Barron. "Look for transparency, where information is complete and fully visible. That transparency provides the opportunity for the scientific community to review, edit, modify and even change the methodology."
The National Quality Forum (NQF) is a public/private partnership charged with the task of creating a national quality measurement and reporting system. "NQF collects and reviews measures and puts them through a rigorous evaluation process," said Barron. "It endorses those determined to be scientifically valid.
"Loyola is supportive of the idea that the public should focus on using report cards or measures that have been endorsed by the NQF," he said. "The Joint Commission and the U.S. Department of Health & Human Services' Centers for Medicare and Medicaid Services (CMS) use NQF measures."
Barron said that Loyola has been putting quality measures on its Web site www.LoyolaMedicine.org/quality for three years, including measures that are not available publicly elsewhere.
CMS ranks Loyola University Medical Center among the best hospitals in the nation for the treatment of heart failure. The report on quality and cost of healthcare shows that Loyola is among the top 38 hospitals nationwide and is one of only two hospitals in Illinois with the lowest death rates from heart failure.