ABR Staff and Volunteers Contribute to OLA Relevance and Robustness
By Brent Wagner, MD, MBA, ABR Executive Director
2025;18(4):2
A major part of the ABR’s Continuing Certification program is Online Longitudinal Assessment (OLA), in which approximately 38,500 diplomates are enrolled. Collectively, they have answered 14 million questions. Since the platform’s introduction for diagnostic radiologists in 2019, we have received positive comments about most of its features. This is a testament to the efforts of approximately 200 talented and committed volunteers who develop the content in a committee setting; their careful deliberations aim to avoid minutiae while encouraging all of us to be better at what we do in our daily work.
Developed between 2017 and 2019 as an alternative to the “every 10-year MOC Exam,” OLA provides assessments that are not only summative (assessment of learning) but also formative (assessment for learning). The formative function allows individuals to identify and correct knowledge gaps: each question is accompanied by a rationale for the correct answer, and if a diplomate answers incorrectly, the subject matter is revisited after a few weeks with a related (but not identical) question.
The summative function satisfies Standard 14 of the American Board of Medical Specialties (ABMS) Continuing Certification Standards, specifically that “continuing certification assessments must meet psychometric and security standards to support making consequential, summative decisions regarding certification status.” This relates to the public-facing mission of the ABR and the value of the certificate for diplomates and those outside the profession who view certification as an indicator of high-quality care. The Standard also states, “In order for users to have confidence in the value of the certificate, sufficient psychometric standards must be met for reliable, fair, and valid assessments to make a consequential (summative) decision.”
The design features we considered nearly 10 years ago were intended to create a system that minimizes the burden on diplomates while providing content relevant to their practice. Relevance is challenging to achieve because of the wide range of practice patterns, especially among those who are highly specialized. The “decline” function allows diplomates to skip a question that falls outside their practice without penalty to their score.
In addition to the volunteers, dozens of ABR staff members are committed not only to keeping the process running but also to implementing refinements we have introduced along the way, including many enhancements suggested by diplomates. Committees are directly assisted by exam development staff, and editors, psychometricians, data analysts, and software developers contribute to optimizing the delivery of approximately 3,000 new questions each year. They also continuously monitor older items to ensure they remain accurate and relevant to practice. While this infrastructure represents a significant commitment of ABR resources, we believe the program’s success depends on careful maintenance of the question pool and a robust scoring model that provides space for learning and growth while remaining rigorous enough to support accurate, consequential summative decisions.