2019 marked the initial offering of the ABR’s MOC of the future: Online Longitudinal Assessment (OLA). I wrote about it earlier this year, but to recap: all Diagnostic Radiology ABR diplomates, including those fresh off their Certifying Exam victory lap, were immediately thrust into the new paradigm. This amounts to answering a whopping 52 multiple-choice questions over the course of the calendar year in whatever subspecialty composition you prefer. Questions are released two per week and expire after a month.
It’s…fine? Sorta, I guess?
The website works (mostly), and the questions are questions (undeniable). Some are pretty good, some certainly less so. People on the internet grumble about content relevance more than I personally would, but then again the minute I got a lame low-yield Core-style GI fluoro question I switched to 100% neuroradiology.
The ABR hasn’t released the passing thresholds yet, which is the most interesting facet of the whole ordeal: recall that the Core and Certifying Exams are “criterion-referenced” by magical Angoff committees that can infallibly determine what a “minimally competent” radiologist can do. The ABR just doesn’t seem to have that same confidence when it comes to MOC, presumably because they have no idea how many people would fail if they logically employed that exact same Angoff method, and failing an unknown number of already dissenting practicing radiologists is a much bigger deal than embarrassing some more trainees.
Now, before you say that each diplomate needs to answer 200+ questions to reach the psychometric validity threshold, the ABR could still tell people if they were on track to pass or fail based on their current performance. There are apparently plans to release preliminary feedback soon (which may do just that now that there is some real-world data to calibrate with), but all of us will need to do another few years of OLA to learn if we’re truly maintaining the magic.
In case you were wondering, I did get one question wrong (the software buries additional images in tabs you have to click through; I kept forgetting, though it only burned me the one time).
Drip-Feeding
It’s no secret why the ABR chose to release two questions per week that subsequently expire a month later. I finished my required questions in August, less than a year from when I took (and presumably destroyed) the Certifying Exam (but we’ll never know, because they don’t release scores for that exam).
What I can tell you is that I spent approximately one hour satisfying the OLA requirements for the year. Without the forced drip-feeding, I could’ve accomplished the entire process during a single generous lunch break.
Some of you reading may be thinking, hey, that’s not so bad. And you’re right, the process is relatively painless. I didn’t learn anything, but at least it didn’t take a lot of my time.
Ultimately, that’s also what makes MOC a meaningless box-checking endeavor and blatant money grab.
The argument that something isn’t stupid, bad, useless, or wrongheaded as long as it doesn’t suck is spurious. Just because it could be worse doesn’t mean that it shouldn’t be better.
And the fact that many doctors are scared that these unelected, unaccountable pseudo-governing organizations will punish any dissent by making tests harder and MOC more arduous is toxic and should not be accepted. We shouldn’t treat the relative ease of a profit-seeking exercise as a thrown bone from the shadow lords that can be taken away at any time, or as a secret to keep quiet so the “public” doesn’t find out.
The Anti-MOC Wave
I am actually not part of the large and growing cohort of physicians adamantly opposed to any third-party validation of demonstrable skill, or to the mere idea of a certification-granting organization that can reliably establish minimal competence. In fact, if board certification weren’t a de facto requirement in many contexts (and thus akin to licensure itself), I wouldn’t even mind if the supposed threshold were greater than minimal competence.
The ABMS was founded in 1933. The ABR was founded in 1934. We’re still waiting on evidence that anything these people do means anything. If the intellectual underpinnings of initial certification are up for debate, then the “maintenance” of said certification is doubly so (hence the lawsuit).
The new system may be no worse than the 10-year exam it replaced; it would seem to me that it’s likely significantly less hassle. Less studying, less travel, less time, less effort, and greater relevance (in that you can exclude broad categories of radiology irrelevant to your practice). However, cumbersomeness (or lack thereof) is not a component of psychometric validity.
A lack of rigor may serve as a salve for diplomates injured by losing out on years of rightfully earned respite after recently passing a 10-year exam, but it doesn’t change the fact that gradually adding strata of multiple-choice questions on a foundation of more multiple-choice questions creates a weak structure that teeters in the winds of change.