Qualifying Exam Survey Results Provide Suggestions for Improvement
By Robert M. Barr, MD, ABR President, and Brent Wagner, MD, MBA, ABR Executive Director
2023;16(4):2
Our June Qualifying (Core) Exam was administered to more than 1,400 candidates in diagnostic radiology and interventional radiology/diagnostic radiology. Each year, this is our largest combined cohort (approximately half testing in week one and half in week two) and our longest exam (more than 650 questions for each group).
Unfortunately, there was a widespread slowdown in system performance on the first day of the exam, soon after examinees began signing in. Minutes after the problem was identified, our new communication procedure allowed us to reassure candidates via text message that the problem was on our end, and we were able to readmit nearly everyone within 30 minutes. Exam end times were adjusted appropriately to allow for the lapse in connection. The remaining sessions in weeks one and two worked extremely well. Although the process reminded us that delivering a remote exam to a large group of candidates is a complicated endeavor, we hope that, despite the brief interruption, the experience was less stressful and more convenient than an in-person exam would have been in our old model, which required travel to either Chicago or Tucson.
We have conducted post-exam surveys for many years to gauge the appropriateness of content and the exam experience. For this administration, over 97% agreed that the registration process was easy to complete. More than 97% believed that completing the Exam Readiness Check and Sample Questions prior to exam day helped them prepare for their exam.
Approximately half of the candidates used Wi-Fi and half used a wired connection. Over 90% agreed or strongly agreed that the technical setup was easy to use (including the side-view camera and identity authentication).
A few themes showed up repeatedly in the free-form written comments. We learned that some respondents, comparing the exam with their clinical practice, found certain portions to be overemphasized relative to others. This has recently been a topic of detailed discussion, and modifications are being considered.
In addition, many commented that the exam was too difficult. We strive to create an exam that is manageable, relevant, and appropriately rigorous, both to maintain the credibility of the certificate and to allow our diplomates to distinguish themselves. They have committed to years of training and study, including a long residency, covering a large domain of knowledge and skill. It is important to recognize that our exams are paired with standard-setting using the Angoff method, which adjusts the passing threshold for difficulty: a more challenging set of questions requires fewer correct answers to pass.
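To make that adjustment concrete, here is a minimal sketch of how an Angoff cut score can be derived. The numbers, panel size, and function below are hypothetical illustrations of the general method, not the ABR's actual items, panel, or procedure: each panelist estimates the probability that a minimally qualified candidate would answer a given question correctly, and the sum of the per-item average estimates becomes the number of correct answers required to pass.

```python
# Minimal illustration of Angoff standard-setting.
# All numbers are hypothetical, not actual ABR data.

def angoff_cut_score(ratings_by_item):
    """Each inner list holds panelists' estimates of the probability that a
    minimally qualified candidate answers that item correctly. The cut score
    is the sum of the per-item mean estimates."""
    return sum(sum(ratings) / len(ratings) for ratings in ratings_by_item)

# An easier three-item form: panelists expect high success rates.
easier_form = [[0.80, 0.90], [0.75, 0.85], [0.70, 0.80]]
# A harder three-item form: lower expected success rates.
harder_form = [[0.55, 0.65], [0.50, 0.60], [0.45, 0.55]]

print(angoff_cut_score(easier_form))  # ~2.40 of 3 items needed to pass
print(angoff_cut_score(harder_form))  # ~1.65 of 3 -> fewer correct answers
```

In this toy example, the easier form requires about 2.4 of 3 items correct to pass, while the harder form requires about 1.65, which is why a more challenging question set lowers the number of correct answers needed.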
We were glad to hear that most respondents thought that the technical aspect of the exam went well. Over 95% agreed that “it was easy to figure out how to use the exam interface,” “it was clear how much exam time I had left,” and “it was clear how much break time I had left.” Although the overall ratings of the software platform are favorable, we continue to see variability in the reliability of the connection for a small minority of candidates; this appears to be based on factors we (and, sometimes, the examinees) can’t control.
Most of the survey respondents who needed the Exam Day Help Desk were appreciative of the assistance provided. Our Help Desk team members see themselves as partners in the examinees' success. We are not surprised that most candidates both understand the need for the security measures and comply with them to maintain the credibility of the exam. We use a two-stage monitoring process: (a) real-time monitoring, which allows us to coach examinees during the exam (adjusting camera position, for example), and (b) post hoc audio and video review to confirm compliance with the security requirements. The security elements benefit examinees by reinforcing the validity of a remote at-home or in-office testing environment.
We are extremely grateful to those examinees who provided input via the survey. We value the comments and genuinely view them as opportunities to improve. Speaking for the ABR trustees and 100-plus volunteers who create Qualifying (Core) Exam content, we commit to continued improvements to the exam mechanics, relevance, and appropriateness.