From the awesome and scathing “What Went Wrong With the ABR Examinations?” in JACR:
The new examination format also does a poor job of emulating how radiology needs to be practiced. Each candidate is alone in a cubicle, interacting strictly with a computer. There is no one to talk to and no opportunity to formulate a differential diagnosis, suggest additional imaging options, or provide suggestions for further patient management. The examination consists entirely of multiple-choice questions, a highly inauthentic form of assessment.
Only partially true. Questions can ask you for further management. Additionally, it’s possible to formulate questions (via checkbox) that allow you to select reasonable inclusions for a differential. This isn’t the same as having a list memorized, but it is in some ways more accurate in the world of Google, StatDx, etc. Of course, this kind of question isn’t meaningfully present, but the multiple-choice format itself doesn’t necessarily preclude all meaningful lines of testing.
Another rationale for the new examination regimen was integrity. Yet instead of reducing candidate reliance on recalled examination material, the new regimen has increased it, spawning at least six commercial online question bank sites. The fact that one of the most widely used print examination preparation resources is pseudonymously authored is a powerful indicator that the integrity of the examination process has been undermined, effectively institutionalizing mendacity.
Every board exam has qbank products. The pseudonymous authorship of Crack the Core presumably isn’t just about the recalls; it’s also related to the author’s amusing but completely unprofessional teaching style. I very much doubt the Core Exam is more “recalled” than anything available for the prior exams. What we should be doing is acknowledging that any standardized test will be prepared for this way via facsimile questions, and there is literally no way to avoid it. It’s not as though Step 1 is any different.
Many of the residents we speak with regard the core examination not as a legitimate assessment of their ability to practice radiology but as a test of arcana. When we recently asked a third-year resident hunkered down over a book what he was studying, he replied, “Not studying radiology, that’s for sure. I am studying multiple-choice tests.” The fact that this sentiment has become so widespread should give pause to anyone concerned about the future of the field.
Yes, this is true. But it also strikes me that the old-school boards wrapped a useful and worthwhile skill in a bunch of gamesmanship, BS, and pomp. Nonetheless, I can’t dispute that casemanship skills have real-world parallels and that the loss of them may have resulted in some young radiologists sounding like idiots when describing a novel case in front of a group of their peers.
In essence, the ABR jettisoned a highly effective oral board examination that did a superb job of preparing candidates for the real-world practice of radiology and replaced it with an examination that merely magnifies the defects of the old physics and written examinations. The emphasis is now on memorization rather than real-time interaction and problem solving. In our judgment, candidates are becoming less well prepared to practice radiology.
It seems increasingly true that anyone more than a couple years out of residency has now fully fetishized the oral boards. It’s definitely true that traditional case taking skills have rapidly atrophied; residency may feel long but institutional memory is short. Old school casemanship isn’t really the same thing as talking to clinicians, but it certainly has more in common with that than selecting the “best” answer from a list of choices.
It is an important skill to succinctly and correctly describe a finding and its associated diagnosis. Some residents now are still able to get the diagnosis but may struggle with describing the findings appropriately when on the spot. But I don’t know how much that matters in the long term, or whether this deficit self-corrects over time. I would be interested in seeing whether any of the old-vs-new debate would have any impact on the quality of written reports, the fundamental currency of our field in the 21st century. I’ve seen plenty of terrible reports and unclear language from older radiologists, so the oral boards barrier couldn’t have been that formative.
The fact is that neither exam is a good (or even reasonable) metric. Frankly, a closed-book exam in and of itself is inherently divorced from daily practice. And any exam that trades in antiquated “Aunt Minnies” or relies on demonstrating “common pathology in unusual ways” is dealing in academic mind games rather than testing baseline radiologic competence.