Love this (been writing around similar themes lately). To share my personal pet theory, I think that most of these issues are downstream from one problem; a chronic need for certainty leading us to totalise and flatten.
Here is a trap that "experts in the field" usually fall for. The problem of model lock-in. - https://thescienceanalyst.substack.com/p/the-problem-of-model-lock-in
In short: an assumption is made, usually based on faulty evidence. After a while, this assumption becomes an important rule, one used everywhere in the field. This goes beyond mere bias: you cannot even question it.
Good stuff. Number 9 in particular is a danger for anyone scoring low on agreeableness. I've often noticed a dynamic in which I'll advance a tentative suggestion, which my brain then wants to make into a firm belief if it's challenged and I'm required to defend it. Much of the polarization of online discourse stems from this blowback effect, I think.
Regarding 10, changing beliefs, particularly long-standing beliefs, is an energetically expensive process as it requires physical restructuring of the brain. If other beliefs depend on the challenged belief, they too require reevaluation - thus even more energy. The brain interprets this as pain. It follows that increasing one's pain tolerance at a somatic level should improve one's ability to reevaluate belief structures. I went into more detail about that model here:
https://barsoom.substack.com/p/truth-hurts
I have these all down pat. Now I just wish other people did. Lol.
Seriously, great read. Love your listicle (and this word!). More please.
I love your long reads. Listicles are OK occasionally, but I feel your time+brain is worth a lot and should be deployed on more worthwhile pursuits (such as long reads).
This is great, thanks. Do you have any books or articles you can recommend which go deeper into some of the research behind these?
I very much appreciate this list. I would add one more.
It's not ideal, but we often must take an action or make a decision based on incomplete knowledge. Once we do, it seems to create an attachment to the idea that we took or made the right one, and we forget that we acted or decided on merely provisional knowledge. It seems worth trying to remember this, in the moment of action and thereafter, to stave off that attachment.
Excellent post, G! Much thanks.
I liked this short article because these cognitive glitches need constant reminders. And reflecting on them is like a quick refresher course. Long articles are good too.
Long sentences need to be broken up with short ones, too. Thanks for this short one, and for all of your helpful insights and writing. ☺️🙏
Very helpful. In my experience, knowing these pitfalls doesn’t prevent falling into them. One must renew one’s efforts along these lines.
Brilliant writing, and such well framed perspective! This is the stuff I wish would be in some of the leading magazines.
I don't know how I feel about all this rationalism. When you get down to it, it doesn't seem like top performers like athletes and fighters and politicians think in this way. I think they view the world very intuitively and care not one whit for cognitive biases. I wrote something semi-related to that here:
Triumphalism and Defeatism
https://squarecircle.substack.com/p/triumphalism-and-defeatism
Thank you.
The Dunning-Kruger effect is bullshit. It was just bad math. But it’s popular because it’s an emotionally satisfying idea to the highly educated. Not only are the uneducated ignorant, but they don’t even know they’re ignorant. It’s just not true, though.
Stick to the crazy long reads. Don’t get “brainwashed by your audience.” ;)