Data Registries and MIPS: The Easy Way, Someday


Mythreyi Bhargavan Chatfield, PhD, is a 15-year veteran of the ACR who has witnessed multiple permutations of government quality programs during that time. As the ACR’s director of quality and safety, she also oversees the ACR’s National Radiology Data Registry (NRDR).

“It feels like this is a time when everything is coming together,” she told an audience during “MACRA: Radiology Tools for Success,” on Monday, Nov. 28, 2016, during the annual meeting of the RSNA in Chicago. “The goal of the college is to use everything we’ve done so far to make MACRA easier for radiologists.” Registries, it seems, will have a central role.

Congress and CMS began recognizing the role registries could play in making quality reporting easier for physicians with the passage of the American Taxpayer Relief Act of 2012, acknowledging that some physicians—thoracic and other surgeons, for instance—had been monitoring quality improvement for decades, and doing it with data. The idea, Chatfield explained, was that they shouldn’t have to fill out a form to prove they are doing quality work; they should be able to submit the data they collect and get credit for it.

The ACR launched the NRDR in 2008 with the CT Colonography Registry, and it currently contains seven databases. A new registry dedicated to interventional radiology is under construction and slated to launch in 2017, with the primary mission of improving quality.


More recently, the college took steps to have the NRDR approved by CMS as a Qualified Clinical Data Registry (QCDR) for the Physician Quality Reporting System (PQRS), and it will be available for 2017 Merit-based Incentive Payment System (MIPS) reporting in the spring.

Currently, about 2,600 of 4,000 registered facilities are actively contributing data to the various registries, representing community and academic hospitals and freestanding imaging centers in metropolitan and rural environments.

The largest and fastest growing database is the Dose Index Registry, but the Lung Cancer Screening Registry is growing rapidly as well. The ACR provides participants with detailed, semi-annual comparison reports at the levels of facility, group, and physician.

“If you as a group are participating, you may opt to report those data to the CMS for PQRS credit currently, and moving forward it will be MIPS credit,” Chatfield explained. “In that event, the ACR provides a new set of dashboards and physicians can pick and choose the data they want to submit for PQRS and, moving forward, MIPS.”

To date, however, use of the QCDR for quality data submission has been limited. In 2014, some 243 NPIs used the ACR registries to submit data for CMS credit; in 2015, the total roughly doubled, Chatfield said.

“It’s not your only pathway, but it is intended to be the easy pathway,” Chatfield said. “We aren’t there yet, and I don’t know that any of the registries represent the easy way yet, but that is where we are headed.”

Pros and Cons

MIPS has four performance categories: quality, cost, advancing care information, and improvement activities. The ACR’s goal is for the QCDR to have a role in all of them. Currently, though, the greatest opportunities for registry versus claims reporting lie in the quality measures, although the registry can be used to submit data for all reporting categories.

“The benefit of using the QCDR is that radiologists can elect to report on measures that are relevant to the specialty of radiology that actually contribute to quality patient care,” Chatfield said.

Other benefits include:

  • More choice. By Chatfield’s estimation, the QCDR in 2016 offered more than 60 specialty-relevant measures to choose from, versus the 8 to 14 CMS measures applicable to radiology.
  • Frequent feedback. A QCDR is required to provide feedback to participants at least four times a year, enabling physicians to monitor and improve their performance. QCDRs also are required to provide individual physician reports for review prior to submission to CMS. When submitting via claims data, physicians must wait until the end of the reporting period to see how they did.
  • Plentiful outcome measures. CMS has recognized that outcomes mean something different in diagnostic specialties and the QCDRs reflect that, Chatfield said: “For outcomes, we are measuring anything that happens to a patient as a consequence of the radiologist’s action. For instance, if a radiologist calls an exam positive, was it a true positive? The ACR QCDR includes quite a few outcome measures, including turnaround times.”

On the other hand, there are two added burdens in submitting quality data via QCDR versus claims data:

“One of the differences with QCDR for PQRS participation is that CMS requires physicians to submit data on all of their patients, which is a little bit harder than submitting just on Medicare patients,” Chatfield said.

The other added burden of using a QCDR is that physicians must report at least nine measures across three domains, whereas claims-based reporters need only submit the measures relevant to their practice.

The MACRA Connection

While all QCDRs will be permitted to submit data for all of the MIPS performance categories, the ACR QCDR may not be able to support all aspects of MIPS. However, you can submit all of your data through the registry and you won’t need to submit on different platforms, Chatfield said.

Quality. In 2017, participants will report on six measures in total, including one outcome measure. They must report on 50% of their Medicare patients or 50% of all of their patients, depending on which reporting mechanism they are using (practices using a registry must report on 50% of all-payor patients).

Chatfield estimated that between 8 and 14 CMS quality measures are relevant to radiology practices, while the ACR QCDR offers more than 60 relevant measures.

“If you have end-to-end electronic reporting—which we are trying to do for some of our registries—you get an extra 10%,” she said.

“The measures that you choose to report are going to matter, and if you look through the measures that are available to you, you get to pick high priority or outcome measures for higher points,” Chatfield said. “Given those stakes, having a lot of options becomes even more important.”

Cost. Not much in the QCDR can contribute to a radiologist’s score on this front, but the ACR is working on other measures that may support this category through clinical decision support or other functions the registries perform.

Advancing care information. This category contains a number of base and bonus activities, Chatfield said, but, again, not much that relates directly to registry participation other than clinical data registry reporting. It’s not a required measure, but it qualifies for bonus points. If these are activities your practice is doing anyway, and your practice intends to participate in the “meaningful use” aspect of the program, the registry can help you report on them.

Improvement activities. This category was designed in response to statements from specialty societies about the many local quality improvement activities practices have been doing with no acknowledgement. “The improvement activities are a way to capture all of these local activities,” Chatfield said. “A number of these activities are directly supported by QCDRs, such as 24/7 coverage and communication of critical results. The college is working on examples of what you can do for each of the 19 areas— or as many as are relevant—and we are going to allow you to attest to them, and if there is any additional documentation, we are going to help you manage those.”

Stay tuned!