Item Writing for ABR Exams is Performed with Great Care
by N. Reed Dunnick, MD, ABR Associate Executive Director for Diagnostic Radiology
2020;13(5):4
Although the ABR is moving to remote exams and the platforms on which they are administered will change, the care that goes into creating each question will not.
ABR exam questions (“items”) are written by numerous committees comprising more than 600 subject matter experts who practice in the many subspecialties of radiology. These professionals donate their time and expertise to develop exam content.
The life cycle of an exam item begins when a subject matter expert prepares a case with associated assets and references and submits it through our proprietary exam development software. ABR staff members (exam developers) facilitate all volunteer committee activities, including the submission of questions and images, editorial review of material for clarity and accuracy, and preparation of multimedia assets to meet ABR needs.
When an item is submitted, it goes first to multimedia processing specialists, who check it for HIPAA compliance and adherence to multimedia quality standards. It then goes to the editors, who ensure it meets quality benchmarks set forth by our standards division, including psychometric integrity and adherence to both the American Medical Association Style Guide and the ABR Item-Writing Guide.
Volunteer committees meet throughout the year to review content and select items for the exams. Test assembly meetings are conducted to review each question in detail for appropriateness, accuracy, clarity, and clinical relevance. ABR staff members make the recommended revisions to produce a final version of each exam for administration. Exams go through an Angoff process to set the passing standard, a proofread by the editors and exam developers, and then a final review by the associate executive directors (AEDs).
With the computer-based diagnostic radiology (DR) Core and Certifying exams, feedback on items comes in the form of how the item performs on the exam (i.e., how many examinees get it right or wrong). With Online Longitudinal Assessment (OLA), we get immediate and more detailed feedback, as diplomates can rate each question and leave open-text comments.
Diplomate question ratings replace the Angoff process for setting the passing standard. They also ensure that we have the necessary data on each OLA question to validate that it is performing at an acceptable level to be considered a “scorable” question. A question is scorable when it has been answered by 50 or more diplomates, has been rated by 10 or more diplomates, and has acceptable psychometric statistics. Questions that are not “scorable” are removed from the OLA pool and sent back to the AED or appropriate committee for review.
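As an illustration only, the scorability rule described above could be expressed roughly as follows (a minimal Python sketch with hypothetical names, not ABR software; the response and rating thresholds come from the description above, while “acceptable psychometric statistics” is left as a placeholder because those criteria are not detailed here):

```python
# Illustrative sketch only: a hypothetical encoding of the OLA scorability rule.
# The numeric thresholds (50 responses, 10 ratings) come from the text above;
# "psychometrics_ok" stands in for "acceptable psychometric statistics,"
# which are not specified here.

from dataclasses import dataclass

@dataclass
class OlaItemStats:
    responses: int          # number of diplomates who have answered the item
    ratings: int            # number of diplomates who have rated the item
    psychometrics_ok: bool  # placeholder for acceptable psychometric statistics

def is_scorable(item: OlaItemStats) -> bool:
    """Return True when the item meets the scorability criteria; otherwise
    it would be pulled from the OLA pool and returned for review."""
    return item.responses >= 50 and item.ratings >= 10 and item.psychometrics_ok

# Example: enough responses but too few ratings, so not yet scorable.
print(is_scorable(OlaItemStats(responses=120, ratings=7, psychometrics_ok=True)))  # False
```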
The open-text comments from diplomates help us improve the questions. Diplomates are not limited in the feedback they can provide on questions they have answered. Since OLA began in January 2019, we have received more than 38,000 comments on questions in various DR categories. The AED for diagnostic radiology reviews these comments and prepares a report for each of the committees. Exam items are often revised in response to diplomate comments.
The ABR’s mission – to certify that our diplomates demonstrate the requisite knowledge, skill, and understanding of their disciplines to the benefit of patients – is at the core of our exam development processes. Volunteers and staff take great care in making certain that ABR exams are indicators of the quality and professionalism that the public expects from board-certified radiologists. Whether exams are given remotely or in person, that will never change.