In 2003, Charlie Munger gave a lecture titled ‘Academic Economics: Strengths and Weaknesses, after Considering Interdisciplinary Needs’ at the University of California, Santa Barbara.
It’s a pretty good read.
He mostly discusses problems with economics as a soft science that desperately wants to be a hard science.
Medicine is also surprisingly soft. To show how well his criticisms cross domains, I’ve swapped ‘medicine’ in for ‘economics’ in several of his paragraphs below:
The Man with a Hammer Syndrome
Yet medicine, like much else in academia, is too insular.
The nature of this failure is that it creates what I always call, “man with a hammer syndrome.” And that’s taken from the folk saying: To the man with only a hammer, every problem looks pretty much like a nail. And that works marvelously to gum up all professions, and all departments of academia, and indeed most practical life. The only antidote for being an absolute klutz due to the presence of a man with a hammer syndrome is to have a full kit of tools. You don’t have just a hammer. You’ve got all the tools. And you’ve got to have one more trick.
This is an argument for a broad foundation in medicine before specialization. The more siloed we are, the less we can draw on different toolsets to help patients.
This is also an argument against fee-for-service. If doctors and hospitals can generate the most money with a certain hammer, that hammer is likely to be used disproportionately.
Overweighing what can be counted
A special version of this “man with a hammer syndrome” is terrible, not only in economics but practically everywhere else, including medicine. It’s really terrible in medicine. You’ve got a complex system and it spews out a lot of wonderful numbers that enable you to measure some factors. But there are other factors that are terribly important, [yet] there’s no precise numbering you can put to these factors. You know they’re important, but you don’t have the numbers. Well practically everybody (1) overweighs the stuff that can be numbered, because it yields to the statistical techniques they’re taught in academia, and (2) doesn’t mix in the hard-to-measure stuff that may be more important. That is a mistake I’ve tried all my life to avoid, and I have no regrets for having done that.
This runs straight into the classic Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure.
We shouldn’t confuse measurability with importance. In many cases, the measure is a poor surrogate for what we really care about, or it can be gamed in ways that produce negative downstream effects. An example? Patient satisfaction.
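A minimal sketch of that dynamic, using patient satisfaction as the proxy. The coefficients and both functions here are invented for illustration, not fitted to any real data:

```python
# Toy illustration of Goodhart's Law (hypothetical numbers, not real data).
# A provider splits a fixed effort budget between genuine care and
# "gaming" the survey (e.g., prescribing whatever patients ask for).
# The satisfaction score responds to both; health outcomes respond only to care.

def satisfaction(care: float, gaming: float) -> float:
    """Proxy metric: rises with real care, but rises faster with gaming."""
    return 0.6 * care + 1.0 * gaming

def health_outcome(care: float, gaming: float) -> float:
    """What we actually care about: only genuine care helps; gaming can hurt."""
    return 1.0 * care - 0.3 * gaming

BUDGET = 10.0  # fixed hours of effort per patient

# Sweep the split between care and gaming and see what each metric rewards.
for gaming in [0.0, 2.5, 5.0, 7.5, 10.0]:
    care = BUDGET - gaming
    print(f"care={care:4.1f}  gaming={gaming:4.1f}  "
          f"satisfaction={satisfaction(care, gaming):5.1f}  "
          f"outcome={health_outcome(care, gaming):5.1f}")
```

Under these made-up coefficients, satisfaction is maximized by putting all effort into gaming, while the health outcome is maximized by putting all effort into care. Once the proxy becomes the target, the optimum moves away from the thing the proxy was supposed to stand in for.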
The first-order short-term focus
Too little attention in medicine to second-order and even higher-order effects. This defect is quite understandable, because the consequences have consequences, and the consequences of the consequences have consequences, and so on. It gets very complicated. When I was a meteorologist I found this stuff very irritating. And medicine makes meteorology look like a tea party.
…
Extreme economic ignorance was displayed when various experts, including Ph.D. economists, forecast the cost of the original Medicare law. They did simple extrapolations of past costs. Well the cost forecast was off by a factor of more than 1000%. The cost they projected was less than 10% of the cost that happened. Once they put in place all these new incentives, the behavior changed in response to the incentives, and the numbers became quite different from their projection. And medicine invented new and expensive remedies, as it was sure to do. How could a great group of experts make such a silly forecast? Answer: They oversimplified to get easy figures, like the rube rounding pi to 3.2! They chose not to consider effects of effects on effects, and so on.
Short-term thinking is bad at both micro and macro levels.
On the micro level, the patient’s care doesn’t end when they leave the clinic or hospital. It keeps going throughout their life. And each episode of care from each different provider doesn’t exist in a vacuum. It interfaces with every other bit of care they get. The combination of direct patient care, socioeconomic factors, and education is a complicated mess at baseline.
But decisions lead to decisions, and outcomes further affect outcomes. We treat at an n of 1, and often at the timepoint of right this second. Missing the forest for the trees is easy to do when your patient usually comes to you for a tree and you are also paid to look at the tree.
On the macro level, take Munger’s Medicare example above. Or, in more recent news, the approval of a multibillion-dollar-a-year Alzheimer’s drug with no evidence that it works: it won’t just cost billions upfront. It will push other companies to divert resources into a rush for me-too drugs, which also may not work, to grab a slice of a massive market, likely costing billions more while crowding out more novel drug development. We think in linear terms, but systems often behave exponentially.
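To make the shape of Munger’s Medicare point concrete, here is a minimal numerical sketch of how a first-order extrapolation diverges from a system that compounds in response to new incentives. All of the growth rates below are invented for illustration; they are not the actual Medicare figures.

```python
# Toy contrast between a first-order forecast and a system that responds
# to its own incentives. All numbers are invented for illustration.

base_cost = 1.0          # spending in year 0 (arbitrary units)
linear_growth = 0.05     # forecaster extrapolates past 5%/year growth
compound_growth = 0.20   # actual growth once new incentives kick in

years = 25
forecast = base_cost * (1 + linear_growth * years)   # simple extrapolation
actual = base_cost * (1 + compound_growth) ** years  # compounding response

print(f"forecast after {years} years: {forecast:6.1f}")
print(f"actual   after {years} years: {actual:6.1f}")
print(f"forecast as share of actual: {forecast / actual:.1%}")
```

With these made-up rates, the extrapolation lands at a few percent of the compounded outcome. That is the same shape of error Munger describes: the forecast wasn’t slightly wrong, it was wrong by an order of magnitude, because it ignored the effects of effects.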