RadioGraphics hosted a Twitter chat this week to discuss a recent op-ed, “Have We Done Radiology Trainees a Disservice by Eliminating the Oral Board Examination?” I was asked to participate.
ABR President and soon-to-be highly compensated Executive Director Brent Wagner was also set to throw down, and he managed to fire off a couple of tweets at the beginning before disappearing: he answered the first of the five pre-shared discussion questions and responded to only one direct question. You can read the full thread collected here if you’re interested.
The first question: “Does the ABR Core Exam test radiology competence?”
#RGchat T1: the ABR certification exam is intended to test knowledge as it relates to competence, and critical thinking as it relates to image interpretation. Other elements of competence (procedures, professionalism, etc) are better tested (assessed) by the residency faculty.
— Brent Wagner (@brentwagner99) April 14, 2020
That sounds like doublespeak to me, but I think it’s probably as straightforward an answer as the ABR can provide. It also makes the unfortunate admission that we are essentially testing for a simile: knowledge “as it relates to” competence, not competence itself.
Per the ABR, its mission is “to certify that our diplomates demonstrate the requisite knowledge, skill, and understanding of their disciplines to the benefit of patients.” We are testing for the first (knowledge), but we can and should also be testing directly for the second (skill).
The Core Exam is a knowledge assessment, and knowledge assessments built on multiple-choice questions are reductive and intellectually lazy. Knowledge in isolation is probably one of the least significant measures of whether a radiology trainee is safe to practice for the “benefit of patients,” especially in a world with easy access to electronic resources. So despite purporting to assess skill in its mission statement, we are testing for knowledge and pretending that the correlation between knowledge and competence is so high that knowledge can stand alone as the sole determinant of minimal competence.
We could, however, test directly for competence and critical thinking by designing a test whose diagnostic portion simulates actual radiology practice rather than an artificial multiple-choice single-best-answer format.
#RGchat T1: Much of the exam seeks to assess a candidate's ability to choose the most likely diagnosis based on a set of images. Similar to what I expect of myself and my colleagues in daily work.
— Brent Wagner (@brentwagner99) April 14, 2020
Leaving aside the other parts of the exam that are irrelevant to practice, the issue Wagner sidesteps (and that made up a large fraction of the discussion on Twitter) is that choosing a single most-likely diagnosis from a list is an unnecessarily artificial construct used for psychometric convenience. An MCQ test is cheap to create, easy to administer, and easy to validate. The ABR once said it was creating “the test of the future,” but it really just replaced two smaller MCQ tests and an oral exam with one longer MCQ test. It was the “future” only in the sense that it was announced before it happened. In real life, I don’t get primed by answer choices, and that’s the difference between knowing the correct answer and merely recognizing it.
Today, an interview with Wagner was included in this Radiology Business article:
One of the fundamentals I’ve been encouraging is to take ownership of flawed or incomplete communications. In other words, if an ABR stakeholder doesn’t understand something, that’s our problem, not theirs, and we have the responsibility to do whatever it takes to fix it.
So, fix it.