Complex Question Development Process Limits Number of Exam Administrations Per Year
By Kalpana M. Kanal, PhD, ABR Trustee; Brian J. Davis, MD, PhD, ABR Trustee; and Brent Wagner, MD, MBA, ABR Executive Director
2024;17(2):3
Since 2021, the ABR has offered two exam administrations annually for the oral certifying exams in medical physics, radiation oncology, and interventional radiology. This became possible when the ABR transitioned, partly in response to the pandemic, from the previous in-person model to a videoconference platform. The primary reason for offering multiple administrations was to protect against unexpected life events, such as personal illness or family issues, interfering with a candidate’s ability to sit for an exam.
We are frequently asked why most of our computer-based exams (CBEs) are offered only once each year. The ABR has considered multiple offerings for the qualifying exams in medical physics and radiation oncology, for the certifying exam in diagnostic radiology, and for the subspecialty exams in neuroradiology, nuclear radiology, and pediatric radiology. However, practical considerations make this difficult.
CBEs require more content than oral exams. As shown in this figure depicting the lifecycle of an item, developing written exam questions that are psychometrically valid, high quality, and relevant to the discipline is a complex process. The process starts with an individual committee volunteer, who drafts the question along with corresponding high-quality anonymized images, if appropriate. The material is uploaded into ABR-developed software that allows for curation, indexing, and queuing of the questions. ABR staff, including specialists in imaging, editing, and exam development, prepare each question for subsequent online sharing with the item-writing committee. Iterative review and committee approval during several item review calls in the writing cycle are followed by tentative assignment to an exam.
A subsequent exam assembly process involves several steps. The exam must adhere to the specifications of a defined blueprint intended to cover the specific domain with a balanced distribution of important concepts. ABR staff who assist in exam assembly coordinate with volunteer subject matter experts who are ABR diplomates. This step often involves debate and discussion among the test assembly participants to mitigate potential ambiguity in the questions and to identify and remove misleading features that might be viewed as a “trick.” After the content is approved, several steps remain to upload the questions into the exam software for final review and distribution.
Valid content requires knowledgeable volunteers and safeguards against the exposure of content. Multiple administrations of the same exam each year, for example, would erode the credibility of ABR testing: widespread knowledge of the questions would undermine the validity of the instrument as a measure of the examinee’s knowledge and skill, especially for candidates on their second or third attempt at the same exam. Offering a different exam, even twice a year, would be very challenging because developing each new question requires extensive resources.
Notwithstanding the above, a minority of exam questions are deliberately reused for psychometric analysis. This small cohort of questions, which may be repeated one or several times over the course of many years, is reviewed by volunteer subject matter experts prior to each exam administration to confirm that the questions are not based on obsolete concepts or practice.
The ABR strives to deliver high-quality exam content that is relevant, fair, reliable, and unambiguous, while keeping costs at a reasonable level for candidates and diplomates. For computer-based exams, these constraints necessitate adhering to a one-exam-per-year schedule.