Maintaining Relevance of Radiation Oncology Clinical Exams
By Paul E. Wallner, DO, ABR Associate Executive Director for Radiation Oncology, and Anthony Gerdeman, PhD, ABR Director of Exam Services
2021;14(4):6
A critical element in maintaining the credibility and validity of clinical certification exams is that they are deemed to be relevant by those involved, including developers and examinees. In a continuously evolving clinical care delivery milieu, maintenance of relevance can be challenging.
Prior to implementation of the ABR Continuing Certification Online Longitudinal Assessment (OLA) instrument in 2019, radiation oncology (RO) relied primarily on triennial clinical practice analyses (CPAs) (1,2) and input from volunteers to inform decisions about the relevance of existing items, the timeliness of adding emerging interventions, and the elimination of items deemed outdated. The CPAs provided timely data on how frequently various tumor sites were seen in the clinical environment and how those disease sites were being treated; thus, questions of what and how were answered. Although the CPAs yielded useful and actionable information, the data were limited by the number of respondents to the optional surveys, the potential for respondent information bias, and the infrequency of the surveys.
The OLA instrument for RO was introduced in January 2020. Each week, diplomates answer more than 3,000 individual items (questions). The collated, disease-related information from these responses will be provided to stakeholder organizations to assist in continuing medical education program development, and it is important to the ABR in identifying knowledge gaps. In addition to answering the items themselves, an average of 30% of respondents each week answer optional questions about the relevance of the item to their practice, the importance of the topic, and the likelihood that a minimally competent radiation oncologist would answer the item correctly. The relevance and importance responses guide the development of annual certification exam blueprints. The responses estimating whether the item would be answered correctly establish an Angoff score for the item, against which all respondents are assessed (3,4,5,6).
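The Angoff approach referenced above derives a standard from judges' estimates of how a minimally competent practitioner would perform: each rater assigns a probability that such a candidate answers an item correctly, the item's rating is the mean of those estimates, and the cut score for a form is the sum of the item ratings. A minimal sketch, using hypothetical rating data (the function names and numbers are illustrative, not ABR's implementation):

```python
# Sketch of Angoff standard setting with hypothetical data.
# Each rater estimates the probability (0.0-1.0) that a minimally
# competent radiation oncologist answers a given item correctly.

def angoff_item_rating(estimates):
    """Item rating: mean of the raters' probability estimates."""
    return sum(estimates) / len(estimates)

def angoff_cut_score(items):
    """Cut score for a form: expected number-correct for a minimally
    competent candidate, i.e., the sum of per-item Angoff ratings."""
    return sum(angoff_item_rating(e) for e in items)

# Hypothetical ratings: a 3-item form, 4 raters per item
items = [
    [0.8, 0.7, 0.9, 0.8],  # item 1 -> rating 0.8
    [0.6, 0.5, 0.7, 0.6],  # item 2 -> rating 0.6
    [0.9, 0.9, 0.8, 1.0],  # item 3 -> rating 0.9
]
print(round(angoff_cut_score(items), 2))  # 2.3 correct out of 3
```

In OLA's case, the weekly optional question about whether a minimally competent radiation oncologist would get the item correct supplies the rater estimates, so the standard can be updated continuously rather than set in a single panel meeting.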
The ABR also relies on the residency training requirements established by the Accreditation Council for Graduate Medical Education (ACGME) Radiation Oncology Review Committee for guidance on which topics should be included in exam blueprints. Topics not considered essential to residency training, or still regarded as ongoing research, are generally excluded from candidate assessment (7).
- Wallner PE, Yang J, Gerdeman A. Clinical practice analysis and radiation oncology MOC examination development. Beam. Spring 2014;7(1).
- Wallner PE, McGeagh AM, Gerdeman AM, et al. Snapshot of a specialty: results of the ABR 2016 radiation oncology clinical practice analysis. J Am Coll Radiol. https://doi.org/10.1016/j.jacr.2018.11.002
- Wallner PE, Gerdeman AM, McGeagh AM. Data-driven initial certification and maintenance of certification examination development. ASTROnews. Annual Meeting Issue, Summer 2016:64-65.
- Wallner PE, Ng AK. OLA: Radiation oncology dips a toe in the water and likes the temperature. Beam. 2020;13(5):7.
- Wallner PE, Segal S, Laszakovits DJ, et al. The American Board of Radiology online longitudinal assessment Part 3 maintenance of certification instrument: rationale and summary of 6-month experience. Pract Radiat Oncol. 2020;10(6):386-388.
- Wallner PE, Gerdeman AM, McGeagh AM. Data-driven initial certification and maintenance of certification examination development. ASTROnews. Annual Meeting Issue, Summer 2016:64-65.
- Wallner PE, Giaccia A. Integration of “new science” into ABR certification examination process. ASTROnews. Winter 2016;19(4):36-37.