  • MIPS 2019—Quality: Your Performance Rate Will Be Compared Against a Benchmark

    This content was excerpted from EyeNet’s MIPS 2019; also see the Academy’s MIPS hub page


    When you report a quality measure, CMS first determines whether you met the two data submission thresholds—the case minimum requirement (at least 20 patients) and the data completeness criteria (at least 60% of applicable patients). If you did, CMS will see how your performance rate stacks up against the measure’s benchmark as shown below. 
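    The two thresholds can be pictured as a simple pass/fail check. This is an illustrative sketch only; the function name and signature are assumptions for the example, not CMS code.

```python
def meets_submission_thresholds(reported_cases: int, applicable_patients: int) -> bool:
    """Check the two 2019 data submission thresholds: a case minimum
    of at least 20 patients, and data completeness of at least 60%
    of the patients to whom the measure applies."""
    if applicable_patients == 0:
        return False
    case_minimum_met = reported_cases >= 20          # at least 20 patients
    completeness_met = reported_cases / applicable_patients >= 0.60
    return case_minimum_met and completeness_met

# 45 of 60 applicable patients reported: 45 >= 20, and 75% >= 60%
print(meets_submission_thresholds(45, 60))   # True
# 15 of 20 reported: 75% complete, but below the 20-case minimum
print(meets_submission_thresholds(15, 20))   # False
```

    Note that both conditions must hold: a high completeness percentage does not help if the case minimum is missed, and vice versa.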

    A quality measure can have up to three benchmarks: one for claims-based reporting, one for manual data entry into a registry portal, and one for EHR-based reporting (whether via IRIS Registry integration or via your EHR vendor). However, some measures can't be reported via all three collection types and therefore have fewer than three benchmarks.

    Your achievement score (3-10 points) for a measure will depend on how you perform against the measure’s benchmark. Each benchmark is broken into deciles. If your performance rate falls within:

    • decile 1 or 2, you score 3 achievement points;
    • deciles 3 through 9, your score will typically depend on where you fall within that decile (e.g., in the third decile, you can earn between 3.0 and 3.9 achievement points);
    • decile 10, you typically score 10 achievement points.
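    The decile rules above can be sketched as a small scoring function. This is a simplified, illustrative reading only; it ignores the topped-out caps and "stalled" benchmarks discussed below, and the names are assumptions for the example.

```python
def achievement_points(decile: int, fraction_within: float) -> float:
    """Approximate achievement points for a benchmarked measure.
    `decile` is 1-10; `fraction_within` (0.0-1.0) says how far the
    performance rate sits within that decile's range."""
    if decile <= 2:
        return 3.0        # deciles 1-2: floor of 3 achievement points
    if decile == 10:
        return 10.0       # decile 10: typically the full 10 points
    # deciles 3-9: decile N typically earns between N.0 and N.9 points
    return round(decile + 0.9 * fraction_within, 1)

print(achievement_points(2, 0.5))    # 3.0
print(achievement_points(3, 1.0))    # 3.9
print(achievement_points(10, 0.0))   # 10.0
```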

    Some benchmarks are subject to scoring limitations. A benchmark becomes subject to a 7-point cap (see Tables 6A and 6C) once it is in its second year of being “topped out.” Scoring for some measures “stalls” before the 10th decile (see Tables 6B and 6C). 

    What is a topped out benchmark? CMS considers a benchmark to be topped out if the average performance rate is very high (e.g., when the average performance rate of a process measure is 95% or higher). CMS is concerned that such benchmarks provide very little room for improvement for most of the MIPS eligible clinicians who use those measures.

    The end of the line for some topped out benchmarks. If a benchmark is in its second year of being topped out, it will—as mentioned above—be subject to a 7-point cap. If a benchmark is topped out for three consecutive performance years, CMS will consider eliminating it in the fourth year. Furthermore, if CMS finds that a benchmark is extremely topped out (e.g., an average performance rate of 98% or higher), it may eliminate that benchmark the following year.

    Some measures don’t yet have benchmarks. CMS used 2017 performance data to try to establish 2019 benchmarks for quality measures. If there aren’t enough performance data from 2017 to establish a reliable benchmark for a measure, or if the measure didn’t exist in 2017, CMS will try to establish a benchmark retroactively using 2019 performance data. However, CMS won’t assign a benchmark to a measure unless at least 20 clinicians or groups submit performance data that meet the two data submission thresholds. If CMS is unable to establish a benchmark for a measure, you won’t be able to earn more than 3 achievement points for reporting that measure. 

    On Oct. 1, CMS updates the ICD-10 code set—and this could have repercussions for quality measures. The quality performance category relies on ICD-10 codes (the diagnosis codes) to determine which patients are eligible for each quality measure. However, CMS updates the ICD-10 code set annually on Oct. 1, which is 75% of the way through the MIPS performance year. In some cases, these changes to the ICD-10 code set may mean that it would no longer be fair to compare your performance on a measure to its historical benchmark—you would be comparing apples to oranges.

    Quality measures that are significantly impacted by ICD-10 changes will be subject to a nine-month assessment. After CMS has determined its changes to the ICD-10 code set, it will assess whether any quality measures are significantly impacted by those changes. It will publish a list of those measures on the CMS website at some point between Oct. 1, 2019, and Jan. 2, 2020. For the measures on that list, CMS will evaluate your performance based only on the first nine months of 2019, before the ICD-10 codes were changed.

    In rare cases, a quality measure may be “suppressed.” During the course of the year, changes in clinical guidelines may mean that continued adherence to a measure could result in patient harm and/or provide misleading results as to good quality care. In the unlikely event that this happens with one of ophthalmology’s measures, CMS could suppress that measure. This means that if you submitted data on the measure before it was suppressed—because, for example, you were reporting it by claims—1) you wouldn’t score points for that measure, and 2) when CMS calculates your quality score it would reduce your denominator by 10 points (so you wouldn’t be penalized for reporting the suppressed measure).
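    The denominator adjustment works in your favor: the 10 points the suppressed measure would have been worth are removed from the total you are scored against. The sketch below is illustrative only; it assumes the common case of six required measures worth up to 10 points each (a 60-point denominator), which is an assumption for the example, not stated in this excerpt.

```python
def quality_percentage(earned_points: float, num_measures: int = 6,
                       suppressed: int = 0) -> float:
    """A suppressed measure earns no points, but 10 points are also
    removed from the denominator, so it doesn't count against you.
    Assumes 10 possible points per required measure (illustrative)."""
    denominator = (num_measures - suppressed) * 10
    return 100 * earned_points / denominator if denominator else 0.0

# 45 points earned across five scored measures, one measure suppressed:
print(quality_percentage(45, 6, 1))   # 90.0 (45 of 50, not 45 of 60)
```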

    Previous: Quality: Meet the Data Submission Thresholds

    Next: Quality: Bonuses for High-Priority Measures and CEHRT

    DISCLAIMER AND LIMITATION OF LIABILITY: Meeting regulatory requirements is a complicated process involving continually changing rules and the application of judgment to factual situations. The Academy does not guarantee or warrant that regulators and public or private payers will agree with the Academy’s information or recommendations. The Academy shall not be liable to you or any other party to any extent whatsoever for errors in, or omissions from, any such information provided by the Academy, its employees, agents, or representatives.

    COPYRIGHT© 2019, American Academy of Ophthalmology, Inc.® All rights reserved. No part of this publication may be reproduced without written permission from the publisher. American Academy of Ophthalmic Executives® and IRIS® Registry, among other marks, are trademarks of the American Academy of Ophthalmology®.

    All of the American Academy of Ophthalmology (AAO)–developed quality measures are copyrighted by the AAO’s H. Dunbar Hoskins Jr., MD, Center for Quality Eye Care (see terms of use).