10 Reasons Your Beliefs Are Probably Bullshit
And 10 ways to de-bullshit your mind
I’ve had requests from y’all to do more listicles (list-type articles), as apparently this format is much easier to digest for those who don’t have big chunks of free time. I’ve also received requests to do more content on avoiding psychological pitfalls.
As such, I present this listicle of 10 mental phenomena that lead your mind astray, and how you can defeat them. Click on each entry’s heading to learn more.
1. Principle of Least Effort
We humans tend to do the bare minimum to get our information. We’ll typically accept the first idea that comes to mind, and get all our facts from the first relevant search result. This laziness often leads us to embrace suboptimal answers.
Solution: Whether you’re searching the web or trying to think of an idea, don’t just stop at the first answer you find. Instead, put the answer aside and then try to find a different answer. You’ll be surprised how many times you find something better.
2. Belief Bias
Arguments we'd normally reject for being idiotic suddenly seem perfectly logical if they lead to conclusions we agree with. In other words, we judge an argument’s strength not just by how strongly it supports the conclusion but also by how strongly we support the conclusion.
Solution: Ensure your explanations move forwards, not backwards. Begin with a set of premises and follow them to their logical conclusion. Avoid your natural inclination, which is to begin with a conclusion then look backwards for a way to get there.
3. Dunning-Kruger Effect
Our ignorance makes us ignorant of our own ignorance. We don’t know what we don’t know. This leads us to overestimate our understanding. Furthermore, learning more can make this effect worse due to the beginner’s bubble effect, which convinces us we’ve gained more knowledge than we actually have.
Solution: Humility. Assume you’re stupider than you think you are. Be slow to answer your own questions and quick to question your own answers.
4. Blind Men & An Elephant
We assume our experiences are a representative sample of the universe, and therefore base our assumptions about reality on a few meager impressions. We shrink the world to fit our minds and think we've expanded our minds to grasp the world.
A group of blind men heard that a thing called an “elephant” had been brought to town, but none of them knew what it was. So they sought it out, and when they found it they groped about it. The first person, whose hand landed on the trunk, said the elephant was a snake. Another person, whose hand reached its ear, said the elephant was a fan. Another, whose hand was on its leg, said the elephant was a pillar. The last blind man felt its tusk, and said the elephant was a spear.
Solution: Try to get your information from multiple sources, and be confident in your judgement in proportion to the representativeness of your sample.
5. Wishful Thinking
We often accuse others of wishful thinking, but we seldom consider its effects on ourselves. Like anyone else's, our beliefs are shaped not just by what we think is true but also by what we'd prefer to be true, and the amount of proof we need to be convinced of something is inversely proportional to how much we want to be convinced of it.
Solution: Always subtract your desire to believe from the available evidence.
6. Causal Reductionism
Things rarely happen for just one reason. Usually, outcomes result from many causes conspiring together. But our minds cannot process such a complex arrangement, so we ascribe outcomes to single causes, reducing the web of causality to a mere thread, and simplifying systems into stories.
Solution: Allow your mind the space to consider other causes. Make a habit of saying “a cause of…” instead of “the cause of…,” and “a reason for…” instead of “the reason for…”
7. Reactive Devaluation
We judge a message by the messenger. In 2002, researchers Maoz et al. showed Israelis an Israeli–Palestinian peace plan. When told it was authored by Palestinians, the participants rated it as less fair than when told it was authored by the Israeli government.
Solution: When you need to objectively evaluate a piece of info, appraise it as if it were authored by the person you respect most, then as if it were authored by the person you respect least, then keep whatever the two appraisals have in common.
8. Selective Laziness
We're critical of others' arguments but not our own. Trouche et al. (2016) found that showing people their own claims, slightly reworded and attributed to other people, often led them to reject those claims.
Solution: Subject your own views to the same scrutiny you subject others’. And to discover what you really think about your beliefs, imagine they're someone else's.
9. Gurwinder's Theory of Bespoke Bullshit
Many don’t have an opinion until they’re asked for it, at which point they cobble together a viewpoint from whim and half-remembered hearsay, before deciding that this two-minute-old makeshift opinion will be their new hill to die on.
Solution: Discard all the opinions you thought of instinctively. Resist the reflex to offer impromptu answers, and become comfortable with saying “I don’t yet have enough information.”
10. Belief Perseverance
Our opinions are like bricks in a building; each supports and is supported by others. Changing a belief means tearing down all the beliefs atop it. Such demolition is hard to bear, so we rarely let a single brick budge, opting instead to live with a belief system as crooked as the Tower of Pisa.
Solution: Don’t define your identity by your beliefs. Define your identity by your willingness to learn.
I hope this was helpful. If you’d like to see more listicles like this, let me know. Cheers.