From the excellent Alchemy: The Dark Art and Curious Science of Creating Magic in Brands, Business, and Life by Rory Sutherland:
In theory, you can’t be too logical, but in practice, you can. Yet we never seem to believe that it is possible for logical solutions to fail. After all, if it makes sense, how can it possibly be wrong?
[…]
If you are a technocrat, you’ll generally have achieved your status by explaining things in reverse; the plausible post-rationalisation is the stock-in-trade of the commentariat. Unfortunately, it is difficult for such people to avoid the trap of assuming that the same skills that can explain the past can be used to predict the future.
The world trades in stories, but compelling stories aren’t necessarily true. See also: hindsight bias. The post hoc sense that where we are now was inevitable is a mirage.
Adam Smith, the father of economics – but also, in a way, the father of behavioural economics – clearly spotted this fallacy over two centuries ago. He warned against the ‘man of system’, who: ‘is apt to be very wise in his own conceit; and is often so enamoured with the supposed beauty of his own ideal plan of government, that he cannot suffer the smallest deviation from any part of it. He goes on to establish it completely and in all its parts, without any regard either to the great interests, or to the strong prejudices which may oppose it . . . He seems to imagine that he can arrange the different members of a great society with as much ease as the hand arranges the different pieces upon a chess-board. He does not consider that the pieces upon the chess-board have no other principle of motion besides that which the hand impresses upon them; but that, in the great chess-board of human society, every single piece has a principle of motion of its own, altogether different from that which the legislature might chuse [sic] to impress upon it. If those two principles coincide and act in the same direction, the game of human society will go on easily and harmoniously, and is very likely to be happy and successful. If they are opposite or different, the game will go on miserably, and the society must be at all times in the highest degree of disorder.’
This chess-piece argument is a metaphor also favored by the conservative economist Thomas Sowell, who treats it as the key failing conceit of central planning.
The problem that bedevils organisations once they reach a certain size is that narrow, conventional logic is the natural mode of thinking for the risk-averse bureaucrat or executive. There is a simple reason for this: you can never be fired for being logical. If your reasoning is sound and unimaginative, even if you fail, it is unlikely you will attract much blame. It is much easier to be fired for being illogical than it is for being unimaginative.
[…]
The fatal issue is that logic always gets you to exactly the same place as your competitors.
The primary function of managers is to preserve their position within management. The second is to get promoted. A distant third is to actually manage people well or improve their organizations. (Further Reading: Academic Medicine and the Peter Principle)
The late David Ogilvy, one of the greats of the American advertising industry and the founder of the company I work for, apparently once said, ‘The trouble with market research is that people don’t think what they feel, they don’t say what they think, and they don’t do what they say.’
[…]
It is fine to provide up-to-date magazines in reception to show that you care, but when the urge to show commitment to patients involves performing unnecessary tests and invasive surgery, it probably needs to be reined back.
Yes. I am reminded of Patient Satisfaction: A Danger to be Avoided.
If you want to change people’s behaviour, listening to their rational explanation for their behaviour may be misleading, because it isn’t ‘the real why’. This means that attempting to change behaviour through rational argument may be ineffective, and even counterproductive. There are many spheres of human action in which reason plays a very small part. Understanding the unconscious obstacle to a new behaviour and then removing it, or else creating a new context for a decision, will generally work much more effectively.
Behavior change is hard. I can barely control a whole host of my own impulses, let alone guide others.
The self-regarding delusions of people in high-status professions lie behind much of this denial of unconscious motivation. Would you prefer to think of yourself as a medical scientist pushing the frontiers of human knowledge, or as a kind of modern-day fortune teller, doling out soothing remedies to worried patients? A modern doctor is both of these things, though is probably employed more for the latter than the former. Even if no one – patient or doctor – wants to believe this, it will be hard to understand and improve the provision of medical care unless we sometimes acknowledge it.
[…]
To put it crudely, when you multiply bullshit with bullshit, you don’t get a bit more bullshit – you get bullshit squared.
[…]
Nassim Nicholas Taleb applies this rule to choosing a doctor: you don’t want the smooth, silver-haired patrician who looks straight out of central casting – you want his slightly overweight, less patrician but equally senior colleague in the ill-fitting suit. The former has become successful partly as a result of his appearance, the latter despite it.
The Taleb example is widely cited, even if it may not always hold up in real life. It’s one of those lightbulb remarks that pulls off the magic trick of feeling intuitive the moment after it seemed counterintuitive.
There is an important corollary: proxy metrics (a patrician manner, various markers of training) don’t actually mean what we want them to mean. You want to see a Harvard-trained doctor because you assume the implied meritocratic scarcity of an elite institution, the genius colleagues, and the presumably fancy facilities made their training uniquely better, but there is no actual basis for this belief.
In making decisions, we should at times be wary of paying too much attention to numerical metrics. When buying a house, numbers (such as number of rooms, floor space or journey time to work) are easy to compare, and tend to monopolise our attention. Architectural quality does not have a numerical score, and tends to sink lower in our priorities as a result, but there is no reason to assume that something is more important just because it is numerically expressible.
Measurability does not equal importance. See “Overweighing what can be counted” in Munger’s Incorrect Approaches to Medicine.
The more data you have, the easier it is to find support for some spurious, self-serving narrative. The profusion of data in future will not settle arguments: it will make them worse.
…I naively thought Covid would bring people together. And it did, for maybe a week or two. Then the dueling data wars began.
We are flush with data. Absolutely awash in data. If the past few years of social media have taught us anything, it’s that information isn’t truth. It’s raw material for storytelling. Yuval Noah Harari does a nice, if depressing, job of discussing information networks in his recent book, Nexus.