If you have ever tried to choose a physician or hospital based on publicly available performance measures, you may have felt overwhelmed and confused by what you found online. The Centers for Medicare and Medicaid Services, the Agency for Healthcare Research and Quality, the Joint Commission, the Leapfrog Group, and the National Committee for Quality Assurance, as well as most states and for-profit companies such as Healthgrades and U.S. News and World Report, all offer various measures, ratings, rankings and report cards. Hospitals are even generating their own measures and posting their performance on their websites, typically without validation of their methodology or data.
The value and validity of these measures vary greatly, though their accuracy is rarely publicly reported. Even when methodologies are transparent, clinicians, insurers, government agencies and others frequently disagree on whether a measure accurately indicates the quality of care. Some companies’ methods are proprietary and, unlike many other publicly available measures, have not been reviewed by the National Quality Forum, a public-private organization that endorses quality measures.
Depending on where you look, you often get a different story about the quality of care at a given institution. For example, none of the 17 hospitals listed in U.S. News and World Report’s “Best Hospitals Honor Roll” was identified by the Joint Commission as a top performer in its 2010 list of institutions that received a composite score of at least 95 percent on key process measures. In a recent policy paper, Robert Berenson, a fellow at the Urban Institute, Harlan Krumholz, of the Yale-New Haven Hospital Center for Outcomes Research and Evaluation, and I called for dramatic change in measurement. (Thanks to The Health Care Blog for highlighting this analysis recently.)