10 Ways to Avoid Being Fooled
Mental models for discerning truth
The poet Emily Dickinson once wrote that “The Brain—is wider than the Sky— / For—put them side by side— / The one the other will contain / With ease—and you—beside.”
I’m rather ashamed to nitpick poetry, but, objectively speaking, the sky is wider than the brain. It’s vaster than we can see, deeper than we can fathom, and made of more particles than we can count. Even basic facts about the sky often escape us, such as the fact that there is more sky beneath us than above us.
Due to its limitations, the brain can only “contain” the sky by first flattening it into a bluish sheet. We shrink the world to fit our minds, and think we’ve expanded our minds to grasp the world.
Since our brains are made for small ideas rather than big ones, the best way to discern truth is not with elaborate, all-encompassing philosophies but with simple, easy-to-follow instructions called heuristics.
A heuristic is a mental shortcut, a way to shrink the sky so it fits the mind. Since a heuristic cuts corners and simplifies reality, it should be used as a rule of thumb and not a universal law.
Below I present to you 10 heuristics that I use to avoid being fooled.
1. Epistemic Humility
Instead of trying to be right, try to be less wrong.
The investor Charlie Munger said, “It is remarkable how much long-term advantage people like us have gotten by trying to be consistently not stupid, instead of trying to be very intelligent.”
Avoiding idiocy is much simpler than achieving genius, so it’s easier to turn into a habit. Furthermore, if we try to be right, we’ll often convince ourselves we’re right even when we’re not. But if we begin from the position that we’re wrong, and simply try to be less wrong, we gain more awareness of our blind spots and become less wedded to our beliefs, reducing our resistance to learning.
2. Munger's Iron Prescription
Another gem from Munger: If you can't state the opposing view at least as well as the people supporting it, then you shouldn’t feel entitled to your own view.
Take the time to learn how people you don’t agree with think. Seek out the best advocates of opposing views. Sometimes you’ll find that what you thought were your strongest counterarguments are in fact easily refuted. And sometimes you’ll find far better lines of attack. Either way, learning what the other side thinks (and not what people on your side tell you the other side thinks) will save you from much stupidity.
3. Survivorship Bias
During WWII, the statistician Abraham Wald examined planes riddled with bullet holes from recent battles. He concluded that the best areas of the planes to reinforce with armor were not the regions with the most bullet holes, but the fewest. The reason was that the planes he was examining were the ones that had safely returned; those that received damage to other areas didn’t survive to be examined.
This is relevant today because the information we consume online must also survive significant selection effects before we see it. Crucially, the info in your feed has been selected because it’s surprising. It is therefore an indication not of the ordinary but the extraordinary, a reflection not of reality but of that which is uncharacteristic of reality. Remember this whenever your feed convinces you the world is going crazy.
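The selection effect Wald spotted can be made concrete with a toy Monte Carlo simulation (my own illustration, not from the article): even if damage is spread uniformly across a plane, the planes you get to inspect show holes only where holes are survivable. The section names and the three-hit rule are arbitrary assumptions for the sketch.

```python
import random

random.seed(0)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]
FATAL = {"engine", "cockpit"}  # assumption: hits here down the plane

def fly_sortie():
    """One plane takes 3 hits, each landing uniformly at random."""
    hits = [random.choice(SECTIONS) for _ in range(3)]
    survived = not any(h in FATAL for h in hits)
    return survived, hits

observed = {s: 0 for s in SECTIONS}
for _ in range(10_000):
    survived, hits = fly_sortie()
    if survived:  # we can only inspect planes that made it back
        for h in hits:
            observed[h] += 1

# Damage was uniform, yet the surviving planes show holes only in
# the non-fatal sections: engine and cockpit counts are zero.
print(observed)
```

Reading the tally naively, you’d armor the fuselage and wings; accounting for the missing planes, you’d armor the engine and cockpit, exactly Wald’s point.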
4. Wittgenstein's Ruler
The less you know of the measurer compared to the thing being measured, the less the measurer’s measure measures the thing being measured and the more it measures the measurer.
As an analogy, if a man says everyone he meets is an asshole, the asshole is likely him.
You can apply this concept to any source of information. For example, if a news outlet’s stories frequently outrage you, instead of taking it as evidence the world is becoming more outrageous, consider the possibility that the news outlet is deliberately trying to outrage you, and temper your reactions accordingly.
Basically, never take information at face value, and always ask yourself, “What does the info from this source suggest about the source itself?”
5. Streetlight Effect
A police officer sees a drunkard searching under a streetlight and asks what the man has lost. The drunk says he lost his keys and the cop joins him in his search. After a few minutes the cop asks if the drunk is sure he lost the keys here, and the drunk says he’s not. The cop asks why he is searching here, and the drunk replies, “This is where the light is.”
The moral of the story is that people tend to search where it’s easiest to look. They get all their information from the first few search results, read the books that appear at the top of bestseller lists, and follow whatever topics are trending on social media.
The problem with such “streetlights” is that they reflect the behaviors, and cater to the desires, of the average human, and the average human isn’t very smart. If you want to avoid popular blind spots, avoid getting your info from popularity contests like top search results, “trending” algorithms, and bestseller lists.
6. Popper’s Falsifiability Principle
As rationalizing creatures who often believe things because they’re convenient rather than because they’re true, we excel at guarding our beliefs from evidence that threatens them.
An example of such a belief is “white fragility,” the idea that white people unconsciously resist recognizing “white privilege” because they don’t want to lose that privilege. The concept of white fragility can’t be realistically refuted, because anyone who challenges its existence is accused of exhibiting it (including nonwhite people, who are accused of “internalizing” white supremacy).
If a belief you hold can’t be falsified, then this is a sign that you’ve protected it from reality. As such, for each of your beliefs you should develop a clear idea of what would persuade you you're wrong. If you can’t, your belief is immune to reason and you should be highly suspicious of it.
7. Rote Reinforcement
Indoctrination typically occurs by rote: the constant reinforcement of an idea until it’s engraved into your mind. Creepily, news and commentary outlets often engage in this constant reinforcement, reiterating the same talking points again and again, making the news fit their views instead of their views fit the news.
So to avoid being brainwashed, regularly switch up your news sources so you alternate between outlets of clashing political stances. If you read the New York Times on Thursday, then read the Wall Street Journal on Friday. If you read The Prism on Monday, then read some bastard who disagrees with me on Tuesday. The more mercurial your news consumption, the greater your resistance to manipulation.
8. Opinion Lock
A great danger of the culture war is that it pressures people to take stances on issues they know little about. As such, many people form strong opinions without giving them due consideration. This wouldn’t be a big problem if people were constantly updating their views. Unfortunately, people don’t like to change their mind, especially not publicly, because they think it makes them seem weak or stupid.
Before social media, people could save face when changing their mind by pretending they'd never held the obsolete opinion. But now that there's a public record of utterances, changing beliefs can’t be easily hidden, so instead people double down on their old opinions, fortifying their follies.
Therefore, resist the instinct to state a half-baked opinion, as once you’ve declared your stance, your ego will do all it can to stop you from changing your mind.
9. The Never-Ending Now
Humans are neophiles; they’re attracted to whatever’s new. As such, they're always chasing the latest info, ignoring anything older than 24 hours. The problem with this approach is that the latest info is often shit. Its main selling point is not its quality but its novelty, and to ensure its novelty it is typically rushed out, so it’s mostly untested and often inaccurate.
This lack of quality control means the newest info is also the most perishable, and people who endlessly chase it spend their lives in a never-ending now of momentary seeing and forgetting, a trench in time that obscures the past and future.
If you want to improve the quality of the info that enters your head, end your addiction to the new. Instead of mindlessly scrolling through breaking news, status updates, and the latest gossip, seek out info that's stood the test of time: classic literature, replicated studies, proven theorems, and fulfilled predictions. Millennia of humanity’s accumulated wisdom await you.
10. Writing to Think
The best way to practise clear thinking is to practise clear writing.
We're forever lost in external stimuli: phones, computers, other people. They keep us distracted from our thoughts, estranging us from ourselves. As such, we often don’t realize what we believe, which often causes us to cobble together makeshift opinions to meet an immediate need.
You only become fully aware of your beliefs when you express them, so if you’re unsure about them, write them down. Laying your thoughts out on paper will reveal your reasoning and expose its errors. It will also generate new ideas, deepening your understanding.
When you see how often writing down your thoughts changes your view of them, you’ll stop trusting beliefs that have never escaped the confines of your head.
I hope you found these heuristics as useful as I have. If you wish to see how I tried to fit each of them into a 280-character tweet, you can do so here: