Earlier this year at the Texas Radiological Society annual meeting, I attended an ABR update given by the current ABR president, Bob Barr, where he announced the rapid progress of the ABR’s plan to revitalize the Certifying Exam to address widespread discontent. I wrote about it here. The plan was to announce the change no later than June but potentially as early as April. The takeaway summary at the end of that short post?
But the ABR did reiterate that their hope for the Certifying Exam is a better demonstration of the skills needed for general practice.
But no, I don’t think they’ll be bringing back the oral boards.
I was wrong.
The Status Quo
The current ABR Initial Certification process, composed of a multiple-choice Core Exam during residency followed by another multiple-choice Certifying Exam at least 12 months after residency, is flawed in many ways.
The ABR suggests that the initial certification pathway is meant to convey minimal general competence for independent practice, but it is unclear what a second multiple-choice exam after fellowship, especially one with choose-your-own-adventure content that is often more subspecialized, adds to that calculation beyond the Core Exam.
Short answer: it doesn’t.
The final paragraph of that post summarizes the situation:
Despite how challenging the Certifying Exam may feel, it’s an exam designed to “psychometrically” validate whether you suffered a traumatic brain injury in the two years since you passed the Core exam.
The bottom line is that everyone, the ABR included, knows the Certifying Exam is truly useless.
A few years ago, I wrote a summary of a Twitter chat with ABR Executive Director Brent Wagner. I called it “The ABR Defines the Intent of the Core Exam,” and I think it’s worth reading for our discussion here. One of his tweets:
#RGchat T1: the ABR certification exam is intended to test knowledge as it relates to competence, and critical thinking as it relates to image interpretation. Other elements of competence (procedures, professionalism, etc) are better tested (assessed) by the residency faculty.
— Brent Wagner (@brentwagner99) April 14, 2020
Per the ABR, its mission is “to certify that our diplomates demonstrate the requisite knowledge, skill, and understanding of their disciplines to the benefit of patients.”
The current certification process tests for knowledge. It does not test for skill.
This is its core flaw.
Radiology in the Post-Orals Era
Many old-guard radiologists bemoan what they perceive as decreased real-world readiness and confidence among recent graduates in the era of the Core Exam. It is felt, with some merit, that the loss of the high-stakes hot-seat oral boards format (and the style and volume of preparation it required) has reduced radiology trainees’ ability to speak cogently about radiology. Many radiologists have suggested that a multiple-choice exam simply can’t test for the soft skills that the oral boards could (which is undeniable).
From that same Twitter conversation, Dr. Wagner summarized the exam’s goal:
#RGchat T1: Much of the exam seeks to assess a candidate's ability to choose the most likely diagnosis based on a set of images. Similar to what I expect of myself and my colleagues in daily work.
— Brent Wagner (@brentwagner99) April 14, 2020
This comically obvious disconnect–between real-life radiology and selecting the best answer from a list of predefined possibilities–is a great source of consternation for many radiology stakeholders.
The reality is that, while oral examinations are also a contrived format, the radiology educators who have worked during both eras aren’t wrong. I began my diagnostic radiology residency in 2013 as the Core Exam was about to be administered for the first time. I remember the way seniors could take cases when I was an R1, and I know how subsequent classes took cases. They weren’t the same (we were clearly worse). If we believe casemanship is an important skill, then it’s a skill that has atrophied. You play like you practice.
So: the perception is that the certification pathway is flawed and that a second MCQ exam in particular isn’t adding value or testing meaningfully different skills. Its timing after residency is awkward and confusing, and a year of typically intense fellowship subspecialization makes general competence harder to test. And lastly, there’s the usual intergenerational conflict, with its grumbling about the good old days and bemoaning the foibles of millennials.
While the ABR doesn’t want to admit that the post-orals era ushered in under the leadership of Dr. Valerie Jackson was a failure, they are willing to admit it wasn’t exactly a success, at least insofar as they are now willing to change one half of the new process.
If you’d asked me to provide a differential for the ABR’s plan to revamp the Certifying Exam, a return to the oral boards would not have been high on the list.
I thought, perhaps foolishly, that the ABR was going to put in the hard work to offer real-world simulation as a key part of the initial certification process (something I’ve long advocated for).
If we want to see if radiology trainees can practice radiology, perhaps we should just have them practice radiology and see how they do. If it’s not a PACS, then it’s probably not a good test.
The Details of Back to the Future (so far)
Instead, the future is the past: In 2028, DR residents training from 2023-2027 will trade the current Certifying Exam for the new “DR Oral Exam.” The Core Exam will, despite its flaws, remain unchanged. The new oral boards will take place after the completion of radiology training, just like the current Certifying Exam. Instead of the awkward slog of parading through examiners’ hotel rooms in Louisville, Kentucky (i.e., the old oral boards), it will take place in the safe space of the ABR’s remote videoconferencing platform. Unlike with the current MCQ offerings, you probably shouldn’t wear pajamas.
But this is a “new” old exam for a new world.
One pitfall of the old oral exam was bias and subjectivity. The ABR says it will mitigate both with a standard set of cases and detailed scoring rubrics, facilitated by software developed specifically for this purpose and already used for oral exams in the ABR’s three other disciplines (interventional radiology, radiation oncology, and medical physics). A conditioned exam result will be possible.
Instead of a panel, candidates will meet virtually with one examiner at a time (technically much easier in a Zoom-style setting).
We anticipate there will be seven individual exam sessions. The clinical categories are breast, chest (includes cardiac and thoracic), abdominal (includes gastrointestinal, genitourinary, and ultrasound), musculoskeletal, nuclear radiology, pediatrics, and neuroradiology.
The content will include critical findings as well as common and important diagnoses routinely encountered in general practice.
It will take one day and be offered “winter/spring” and “fall” each year.
A candidate’s first oral exam opportunity will be the calendar year following the completion of residency training. Candidates are eligible to take the exam through the end of their board eligibility, which extends six full calendar years after the completion of residency training. We anticipate having two administrations of the new DR Oral Exam each year.
Back to mitigating scoring conflicts:
For each session and category, the examiner will use a standard set of cases for all candidates. Detailed rubrics will be used to score each candidate. This is an improvement over the prior oral exam model and is facilitated by current technology, including software developed specifically for this purpose by the ABR.
As in the past, examiner panels will meet after each session to discuss candidate results to ensure fairness and consistency. The panels will be balanced for geography, gender, and new vs experienced examiners.
How does a pow-wow ensure fairness and consistency instead of groupthink? Good question.
Note: the costs to trainees will be the same (trainees still pay much more than diplomates). We’re just swapping one exam you take at home that everyone passes for another exam taken at home that we currently know very little about.
One thing that’s also not clear is how the ABR will handle the manpower needs. While developing the test itself should be straightforward, recruiting enough volunteer examiners will be no small feat. If they intend to pay attending radiologists for their time, then we are planning for a deeply inefficient, perpetually high cost for the foreseeable future. Radiology already has one of the most expensive certification processes of any medical specialty. I, for one, would like to see those costs go down, especially for trainees.
Timing
There’s also the issue of timing: radiology is certainly not alone in pushing final board certification until after residency. Orthopedic surgery, for example, includes an oral Part II exam in which examiners grill you on a log of your cases and complications two years after finishing training.
But, in our field, there is absolutely no good reason to delay this assessment. What magical general diagnostic radiology content and experience do most radiologists have a year after residency that they don’t have at the end of residency? None.
Pushing a general competence assessment until after finishing subspecialty training only makes sense if the ABR intends to argue that subspecialization is making young radiologists worse and that somehow forcing people to spend their fellowship maintaining general skills and prepping for orals will prevent that atrophy (no easy feat without the daily hot-seat conferences that were once so common in residency). That, so far, does not appear to be the party line.
This exam needs to be at the end of residency like it used to be. If anything, it might help combat the post-Core senioritis that many fourth-years struggle with, particularly when rotating through services outside of their chosen subspecialty. I appreciate that many program directors don’t want this during residency because, in the past, seniors used to disappear from service (and especially the call pool) before Orals, just like they do now before the Core Exam. It’s easier to run a residency with only one class preparing for one big test at a time. But convenience shouldn’t be our primary metric.
The choice of returning to orals aside, the timing of this exam remains a mistake. We should be graduating board-certified radiologists.
What’s Old is New Again
In the end, we will again get to pretend that a high-stress, closed-book, hot-seat examination is the best way to assess the requisite, absolutely nonnegotiable, critical skill of a real-world diagnostic radiologist: the ability to sit in a dimly lit room and crank out mostly correct reports.
From the ABR’s announcement:
The new DR Oral Exam will include select critical findings as well as common and important diagnoses routinely encountered in general DR practice, focusing on examples that optimally assess observation skills, communication, judgment, and reasoning (application of knowledge learned during residency).
It is not meant to represent a comprehensive review of clinical content. The oral exam aims to assess higher-level skills that are needed to be an effective diagnostic radiologist and are valued by referring physicians and patients.
Now, for what it’s worth, I don’t want to suggest that this change can’t be an improvement in several ways. Yes, this will be a considerably more stressful examination, especially for the first few years. But the Certifying Exam is duplicative, and if nothing else, this is not. I suspect the real reason oral boards could potentially produce “better” radiologists is that, unlike the current exam, they will again require real longitudinal preparation for an additional extended period of time. Said improvements, if they occur and if they can be measured, will likely be a reflection of the increased prep time (and perhaps the increased focus on verbalized, explicit reasoning that programs will emphasize in future hot-seat conferences).
However, the soft skills that the oral boards presumably purport to test are unlikely to be evaluated particularly well or fairly during the examination itself, especially if we want to ensure it’s unbiased, fair, and less susceptible to the individual preferences of the examiners. To assume that the test itself will actually differentiate between competence and incompetence is analogous to believing that the now-defunct USMLE Step 2 CS differentiated between good and bad clinicians. It’s mostly anachronistic theatre.
Still, I’ll attempt to reserve judgment for now. I’m almost willing to believe it could be an improvement, even if it will inevitably be a source of stress for trainees and a burden on programs.
Losing the Plot
That said, we can acknowledge that a change may have some benefits and still ultimately be the wrong choice.
The DR Oral Exam potentially having some value over the current paradigm does not make it the optimal choice for certification reform.
I have little doubt that giving someone a case and making them perform reflects their understanding and skills differently than their ability to answer a multiple-choice question. I don’t even think that’s arguable. Oral exams test different abilities. Verbal articulation in a high-stakes situation is a unique skill.
But is it the right skill?
Is that *waves hands encompassingly* the sine qua non of a modern radiologist?
Forgive the tautology, but: the true core competency of a radiologist, whether we want to admit it or not, is the ability to practice radiology.
All things being equal, sure, we do want radiologists to interface well with ordering providers and have good bedside manner for direct patient care. We should want radiologists to dazzle crowds with their facile wit during multidisciplinary conferences. Don’t we love it when the clinicians add on a bunch of last-minute cases and the radiologist still hits a series of home runs on the spot without breaking a sweat? Excellence is a beautiful sight!
But it’s not hard to realize that we are doing something with vague potential benefits that nonetheless is still off-target from the true purpose of board certification.
We may produce better radiologists because of how programs and trainees will respond to this anxiogenic and more opaque exam format–but not because said format itself is particularly well-suited to doing much of anything. Every time we contrive a new evaluation that tests some subset of radiology, we are trying desperately to design a test that can serve as a proxy for real-world competence.
We are constantly assessing knowledge, reasoning, etc. “as they relate” to radiology ability. We are not, however, directly measuring a person’s skills: their ability to do this actual job day in and day out. We are fooling ourselves when we pretend that the radiology report is not our core work product. When we shake our heads at the quality of radiology from the “outside hospital,” we’re talking about misses and useless reports.
We can test for that directly. We can have trainees do real work under real conditions and see how they compare to a reference standard of practicing radiologists. We can see if they “demonstrate the requisite knowledge, skill, and understanding of their disciplines to the benefit of patients.” Imagine a test where a trainee simply works through several 8-hour shifts composed of curated cases.
We already have the internet at our fingertips, but we are also on the precipice of a world with increasingly powerful AI tools. Any radiology certification process that does not include high-fidelity simulation with clear metrics that reflect real-world performance is ultimately a waste of time.
4 Comments
The ACR completely missed any opportunity to differentiate themselves from this mess. The ABCR is one organization.
The late great surgeon William Silen was known for saying “Doctors are dangerous.” However, we know that some are more dangerous than others. The primary purpose of the Boards is to assure that the candidates they certify are not unduly dangerous to their patients.
Each section of my oral board exam in that Louisville hotel in 1977 seemed to have one key case to test if it would be unsafe to let me loose on the world. I clearly remember two of them.
On the GU exam I was shown films (!) from an IVP with prompt bilateral nephrograms at 1 minute persisting at 10 minutes with no pyelogram. I needed to recognize this as a possible sign of shock from a contrast reaction (more common in the days before non-ionic contrast), know to check on my patient, and know how to treat the reaction. Seemed like a fair case to me. A radiologist who would give contrast but not recognize a life-threatening reaction and know how to treat it is in fact unduly dangerous.
On the bone exam (that’s what we called it back then) I was shown an x-ray of a young man with an expansile metatarsal lesion with an appearance worrisome for a possible osteosarcoma. No fracture line was visible, but I needed to be aware that the differential diagnosis included a healing stress fracture and to elicit the history that the patient was indeed an army recruit who had participated in a 25-mile hike with a heavy backpack a few weeks earlier. I also needed to reject the suggestion to biopsy the abnormality, understanding that to the pathologist a healing fracture and an osteosarcoma look very similar and can be confused. As a radiologist I can make the right call, or I can get this kid’s foot amputated for a benign healing fracture. Fair exam case. I would have deserved to fail for dangerousness if I had missed this one.
The in-person oral boards I took simulated the radiologist’s actual practice at the time: a face-to-face consultation with a referring clinician. Sadly, in my view, the proposed Zoom-type exam does at least fairly simulate today’s practice, where radiologists remain chained to their workstations and are consulted remotely, if at all.