Overlapping Roles of the ABR and Residency Programs Support High-Quality Patient Care
By Donald J. Flemming, MD, ABR Governor; Matthew B. Podgorsak, PhD, ABR Board of Trustees Chair; and Brent Wagner, MD, MBA, ABR Executive Director
2024;17(4):4
The ABR and residency training programs pursue overlapping goals but use separate and distinct methods. Achieving certification requires completing rigorous training and passing a series of standardized exams. Together, these two components support high-quality practice by radiologic science professionals (in radiology, radiation oncology, and medical physics), which ultimately serves the interests of patients and the public.
The mission of the residency programs is to graduate the best-trained residents possible within the limits of available resources, and the program faculty aspire to develop medical professionals who exemplify excellence. Accreditation by the Accreditation Council for Graduate Medical Education (ACGME) for physicians or the Commission on Accreditation of Medical Physics Education Programs (CAMPEP) for medical physics programs requires compliance with standards that cover, among other elements, curricula, goals and objectives, faculty qualifications, assessment and feedback processes, and the training environment. Overall, the program requirements define parameters within which programs must perform to support a setting that is conducive to effective learning. In recent years, a program's efforts have increasingly been assessed by evaluating outcomes against defined objectives, including its residents' success rates in the certification process.
Assessment of individuals during residency is primarily formative ("assessment for learning"), which often includes informal questioning and intentional observation of a resident's knowledge and skill as applied to, for example, a diagnosis, a treatment plan, a procedure, or the safe and effective use of clinical technology. Secondarily, summative assessment ("assessment of learning") can be used to measure progress through graduated levels of responsibility, with the goal of competence for independent practice. For ACGME-accredited residencies, evaluation by the Clinical Competency Committee is in part directed toward progress relative to well-defined milestones. For CAMPEP-accredited residencies, the program director, like their ACGME counterpart, is required to meet periodically with each resident to evaluate the resident's progress.
Multiple years of residency training, combining longitudinal experiential learning with frequent in-depth faculty observations and feedback, are invaluable not only for a resident’s acquisition of skills and knowledge that form the basis for excellence in practice but also for the objective determination of their competence and suitability for independent practice.
The mission of the ABR is "To certify that our diplomates demonstrate the requisite knowledge, skill, and understanding of their disciplines to the benefit of patients." Complementing the extended training period of the residency, the ABR's point-in-time exams offer two distinct features that contribute to the validity and value of board certification. First, ABR exams represent a standard that normalizes the inherent variability of opportunities, education, and assessment across residency programs. Second, they offer a goal that encourages trainees (and faculty) to pursue excellence through self-study across a broad range of specialty skills and knowledge.
Development of a valid and comprehensive exam as a high-stakes assessment instrument requires the collective wisdom of subject matter experts combined with the generous efforts of volunteer practitioners from diverse academic, research, and clinical practice backgrounds. Most important, defining the relevance, reasonableness, and boundaries of the breadth and depth of the practice domain is best accomplished by committees whose members respectfully challenge each other and are motivated to produce a high-quality exam.
The ABR's team of psychometricians helps guide exam construction to promote fairness and reliability. Acknowledging that no testing instrument is perfect, rigorous statistical analysis after each exam administration seeks to confirm the reliability of the assessment for individuals and the cohort of candidates. For example, the occasional problematic question that may have been ambiguous or confusing is identified by systematic post-test analysis and, after review by subject matter experts, may be omitted from scoring to enhance fairness.
The complementary functions of structured residency training and the systematic development and administration of valid standardized exams together yield a fair and meaningful certification process that can be trusted and valued by the health care system and patients alike.