(Note: This blog was written before the start of the COVID-19 pandemic in the U.S.)
By Erin Cooke, MD
As a diagnostic radiology (DR) residency program director and member of one of the last cohorts to take the oral ABR exam in Louisville, I have been curious about how the standards are set for the DR Core and Certifying exams.
Earlier this year, I was fortunate to have the opportunity to learn more about the process during my first time volunteering for the ABR. One of my mentors from my residency days encouraged me to join the ABR Angoff Committee. She had been a regular ABR volunteer for many years, under both the current exam process and the former oral boards. I valued her perspective: she appreciated both what she had learned as an educator through participating and the importance of a valid exam standard-setting process, not only for examinees but also for the public.
What, you may ask (as I did), does “Angoff” mean? As my mentor explained to me, and as was detailed during our introductory session for the committee meeting, criterion-referenced exams have a passing standard set prior to administration. If a candidate or diplomate meets or exceeds this standard (the cut score), they pass the exam. Thus, it is possible that everyone who takes the exam could pass. The goal of each exam is to determine whether the candidate or diplomate has demonstrated mastery of a defined set of knowledge and skills.
For ABR exams, a subcommittee of several members for each radiology subspecialty evaluates every question in its section and considers whether a minimally competent radiologist would answer the item correctly. As many of us discussed during the meeting, the “would” here is key and is what makes the process challenging: the question is not whether the individual should answer correctly, but whether he or she would.
Notably, individuals vary in how they rate questions, and this variation is vital to the process. It is an expected attribute given the carefully selected diversity of Angoff members, who differ in years of practice, type of practice (private, community, or academic), geographic location, and gender, among other factors. The goal is to provide a broad representation of the knowledge base of subspecialty areas and of general diagnostic radiology across the country. The committee volunteers rate each question individually, and the ratings are then averaged for each question to generate a raw cut score.
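For readers curious about the arithmetic, the rate-then-average step can be sketched in a few lines of Python. This is purely illustrative: the ratings below are invented, and the ABR's actual rating scales, aggregation rules, and scoring adjustments are not described in this post.

```python
# Illustrative sketch of Angoff-style cut-score arithmetic.
# All numbers are hypothetical; this is not the ABR's actual scoring method.

# Each inner list holds one question's ratings: each rater's estimate of
# the probability that a minimally competent radiologist WOULD answer
# the item correctly.
ratings_per_question = [
    [0.70, 0.80, 0.60, 0.75],  # question 1
    [0.40, 0.55, 0.50, 0.45],  # question 2
    [0.90, 0.85, 0.95, 0.80],  # question 3
]

# Average the raters' estimates for each question...
question_cut_scores = [sum(r) / len(r) for r in ratings_per_question]

# ...then combine across questions for a raw exam-level cut score,
# here expressed as a proportion of items answered correctly.
raw_cut_score = sum(question_cut_scores) / len(question_cut_scores)

def passes(candidate_score: float, cut_score: float) -> bool:
    """Criterion-referenced: pass if the score meets or exceeds the cut."""
    return candidate_score >= cut_score

print(f"raw cut score: {raw_cut_score:.4f}")
print(f"candidate at 0.70 passes: {passes(0.70, raw_cut_score)}")
```

Because the standard is criterion-referenced rather than graded on a curve, every examinee is compared against the same fixed cut score, which is why, as noted above, it is possible for everyone to pass.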
As with any method of setting a standard for a criterion-referenced exam, the Angoff method is not perfect. But the experience of participating in the process was illuminating as to the degree of care taken to ensure a wide range of diplomates has a stake in standard-setting. I would encourage anyone who has an interest in the work of continuing improvement for our profession to consider volunteering in the standard-setting process. It is a critical contribution to our field and to maintaining high quality care for our patients.
Erin Cooke, MD, is a diagnostic radiologist and co-program director of the diagnostic radiology residency at Virginia Mason Medical Center in Seattle, where she graduated from residency in 2010. She completed an MRI fellowship in 2011 at Ochsner Medical Center in New Orleans and practiced with Ochsner as a general radiologist until she moved back to Seattle in 2014.