ABR Trustees Find Tremendous Value in Post-Exam Surveys
By Matthew B. Podgorsak, PhD, Kalpana M. Kanal, PhD, and Robert A. Pooley, PhD, ABR Trustees; and Geoffrey S. Ibbott, PhD, ABR Associate Executive Director for Medical Physics
2022;15(1):6
The ABR’s mission is to certify that our diplomates demonstrate the requisite knowledge, skill, and understanding of their discipline to the benefit of patients. Future medical physicist diplomates must first undergo a rigorous evaluation of their clinical skills and knowledge by taking a series of three qualifying exams (Part 1 general, Part 1 clinical, and Part 2) followed by a final certifying exam (Part 3 oral).
The exams include questions written by medical physicist volunteers who are board certified and maintaining their certificates in the Continuing Certification (MOC) program for their respective disciplines. Each question is reviewed for content and evaluated for clarity of presentation by committees of medical physicist volunteers and ABR editorial staff. Each exam is then assessed by ABR psychometrics staff and additional medical physicist volunteers before it is delivered. Finally, each exam is analyzed after delivery to ensure that participants who passed demonstrated appropriate knowledge, so that the ABR’s mission is achieved.
The ABR continually strives to improve the quality of the exam experience, in part by surveying participants after they complete an exam. Each participant is offered an opportunity to answer specific questions and provide free-form feedback about their experience. These surveys are reviewed extensively, and the responses help guide refinements and changes to the exam process.
Over the past year, all ABR computer-based and oral exams have undergone significant changes stemming from the transition to fully remote delivery. Valuable input gathered from medical physicist stakeholders during the nearly year-long hiatus in exam delivery in 2020 was incorporated into the new delivery formats. Given these changes, it was perhaps more important than ever to ask examinees about the new approaches. In 2021, the ABR administered two rounds of qualifying exams and a pilot certifying exam offered to a limited number of candidates, followed by two certifying exams open to all eligible candidates. Of the 662 participants in the August 2021 exams, which included one administration each of the Part 1, Part 2, and Part 3 exams, 405 responded to post-exam surveys. A review of the responses showed the following highlights:
- 85% of respondents indicated they understood the exam registration process, although some had difficulty scanning their IDs or performing a room scan.
- 96% of respondents successfully completed the technical self-check, and 94% found the process to be useful.
- Only 58% of respondents found the online practice exam to be useful. We are currently reconfiguring the practice exam into an Exam Readiness Check and Sample Questions, which will provide a broader range of question types.
- About 85% of respondents had a connection issue either before or during their exam. There was an interruption in access to the Part 1 Qualifying Exam question server caused by circumstances outside our control, and we worked with the affected participants to address their situations. Connection issues experienced by oral exam participants were mostly related to local power outages and home internet interruptions; we worked with these individuals as well, providing opportunities to complete their exams once their connections were reestablished.
- 88% of computer-based and oral exam participants liked the newly developed exam interfaces, although some indicated that exam content, particularly images, was blurry. The remote exam format continues to evolve, and the quality of images and text will be significantly improved.
- We received many comments about case-based questions in which sequential questions in a two- or three-question case were blocked, preventing candidates from returning to an earlier question after moving on. We will shortly discontinue the use of cases that require blocks between questions.
- 39% of respondents rated their exam as “too difficult” or “much too difficult.” This proportion is consistent with previous years.
- 38% of respondents said they were surprised by the exam content, with Part 1 respondents commenting that the exam contained too many imaging questions, not enough imaging questions, too many therapy questions, or not enough dosimetry questions (depending on their discipline interest).
- Many oral exam respondents expressed displeasure with the requirement to run both WebEx and a browser window simultaneously. This requirement will be modified in a future version of the oral exam software platform.
- Of oral exam respondents, 92% found their navigator to be helpful, 92% perceived their examiners to be fair and professional, and 85% said their examiners helped them express their knowledge.
- Finally, candidates were asked to rate the exam software by giving one to five stars. The overall average was 3.4 stars, with the following individual averages: Part 1 – 2.9 stars, Part 2 – 3.1 stars, and Part 3 – 3.8 stars.
We were pleased to find that most responses to survey questions about exam content, rather than delivery format, were similar to those from previous years, indicating that candidate opinions about content have not changed. This is not surprising, because the metrics used to assess each exam were largely consistent with those of previous exams.
Additionally, responses to survey questions continue to provide useful input that has already led to improvements in the exam delivery user interfaces and, for the computer-based qualifying exams, changes to the question types and operational format. Further evolution based on survey responses is already planned and will be implemented soon.
The Board of Trustees finds tremendous value in the feedback received from exam participants and looks forward to continuing the evolution of all exams informed by survey responses.