Through Assessment and Education, OLA Helps Diplomates Practice Safely and Effectively
By Brent Wagner, MD, MBA, ABR Executive Director; Robert M. Barr, MD, ABR President; and Matthew B. Podgorsak, PhD, ABR Board of Trustees Chair
Physicians and physicists certified by the ABR would agree that one of the challenges of optimal clinical practice is the increasing complexity and rapid expansion of medical knowledge. The cognitive assessment component of the ABR’s Continuing Certification program is an integral part of the program’s goal of reassuring the public that diplomates maintain proficiency in their specialty after satisfying the requirements for initial certification. The American Board of Medical Specialties (ABMS) standards state that “Member Boards (such as the ABR) must assess whether diplomates have the knowledge, clinical judgment, and skills to practice safely and effectively in the specialty. Member Boards must offer assessment options that have a formative emphasis and that assist diplomates in learning key clinical advances in the specialty.” More than 90% of ABR diplomates who are maintaining their certificates meet this requirement by participating in Online Longitudinal Assessment (OLA). As an alternative, diplomates may choose to take and pass a Continuing Certification Exam every five years to fulfill their Part 3 requirement and demonstrate their clinical knowledge and skill; this option is remotely administered as a computer-based exam.
The OLA platform provides flexibility. For example, for most diplomates, answering 52 questions in the first half of a calendar year allows them to take a break in the second half of the year without penalty. Additionally, most diplomates can align the content with their clinical practice and may “decline” to answer questions that fall outside their practice area. Further, as required by the ABMS standards, OLA has a formative function (feedback to reveal and address knowledge gaps) in addition to a summative function (assessment of performance relative to a standard of competence). The difficulty level and the corresponding passing standard are determined by the participating diplomates themselves; more than 30% of OLA participants have agreed to serve as item difficulty raters. More details about the OLA program may be found on the ABR website.
As of January 2024, over 10 million questions have been administered. Last year, 62% of diplomates chose to answer “extra” questions (i.e., more than the minimum annual number required). Among diplomates who answered at least the minimum number of questions (52 for most diplomates), more than 96% performed above their passing standard.
As part of the formative function, every OLA question has a “variant” – a question similar in scope and content. These paired questions are administered at random (within the structure of a blueprint of the knowledge domain). If the diplomate incorrectly answers the first version, they can review the rationale for the correct answer and, if desired, pursue more information in the provided reference. The variant question is administered three or more weeks later to directly address the knowledge gap and reinforce the teaching point. Approximately 900,000 “variants” have been administered, and the percentage correct for those questions is greater than 75%.
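The variant mechanism described above can be sketched in a few lines of code. This is a minimal illustration, not the ABR's actual implementation; the class and function names here are hypothetical, and the only rules taken from the text are that each question is paired with a variant and that the variant follows three or more weeks after an incorrect answer.

```python
from datetime import date, timedelta

# The text states the variant is administered "three or more weeks later";
# this constant encodes the minimum delay only.
MIN_VARIANT_DELAY = timedelta(weeks=3)


class QuestionPair:
    """A question and its variant (similar in scope and content).

    Hypothetical structure for illustration purposes.
    """

    def __init__(self, question_id: str, variant_id: str):
        self.question_id = question_id
        self.variant_id = variant_id


def schedule_variant(pair: QuestionPair, answered_on: date, was_correct: bool):
    """If the first version was answered incorrectly, queue its variant
    no earlier than three weeks later; otherwise no follow-up is needed."""
    if was_correct:
        return None
    return (pair.variant_id, answered_on + MIN_VARIANT_DELAY)


# Example: a question missed on June 1 queues its variant for June 22 at the earliest.
pair = QuestionPair("Q123", "Q123-variant")
print(schedule_variant(pair, date(2024, 6, 1), was_correct=False))
```

The formative loop in the article (review the rationale, then retest the same teaching point weeks later) is essentially a spaced-repetition pattern; the sketch captures only the scheduling rule, not question selection against the knowledge-domain blueprint.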
We thank the volunteer committees for their hard work and invaluable expertise in generating and refining OLA content, and we are very grateful for the constructive criticism offered by the diplomates (discussed elsewhere in this issue) in support of further improvements.