You can’t make this stuff up. When CMS conducted a study to identify hospitals that game the quality data reporting system, it used an approach that made the data reporting look better than it actually was. That bottom line is indicated in the second half of the title of a May 4 study released by the Office of Inspector General of HHS: “CMS Validated Hospital Inpatient Quality Reporting Program Data, But Should Use Additional Tools to Identify Gaming.”
As required by law, CMS conducted a study to validate 2016 reporting of inpatient quality data—the data used to adjust payments to hospitals on the basis of quality. CMS selected 443 hospitals for the study, and only six of them failed. That’s a pass rate of nearly 99% (437 of 443, or about 98.6%).
So why wasn’t the OIG impressed by that spectacular pass rate? Because CMS used an approach for selecting the 443 sample hospitals that actually “made it less likely to identify gaming quality reporting.” For example, “CMS made limited use of analytics that can help identify suspected gaming” and “did not include any hospitals in its targeted sample on the basis of aberrant data patterns.”
No word yet on CMS’s response. Maybe it was training for future studies on the theory that it takes a gamer to spot a gamer.