Why Smart People Believe Stupid Things
Intelligence is not rationality
What causes delusion?
The prevailing view is that people adopt false beliefs because they’re too stupid or ignorant to grasp the truth. This may be true in some cases, but just as often the opposite is true: many delusions prey not on dim minds but on bright ones. And this has serious implications for education, society, and you personally.
In 2013 the Yale law professor Dan Kahan conducted experiments testing the effect of intelligence on ideological bias. In one study he scored people on intelligence using the “cognitive reflection test,” a task designed to measure reasoning ability. He found that liberals and conservatives scored roughly equally on average, but the highest-scoring individuals in both groups were the most likely to display political bias when assessing the truth of various political statements.
In a further study (since replicated), Kahan and a team of researchers found that test subjects who scored highest in numeracy were better able to objectively evaluate statistical data when told it related to a skin rash treatment, but when the same data was presented as relating to a polarizing subject—gun control—those who scored highest in numeracy actually exhibited the greatest bias.
The correlation between intelligence and ideological bias is robust, having been found in many other studies, such as Taber & Lodge (2006), Stanovich et al. (2012), and Joslyn & Haider-Markel (2014). These studies found stronger biases in clever people on both sides of the aisle, and since such biases are mutually contradictory, they can’t be a result of greater understanding. So what is it about intelligent people that makes them so prone to bias? To understand, we must consider what intelligence actually is.
In AI research there’s a concept called the “orthogonality thesis.” This is the idea that an intelligent agent can’t just be intelligent; it must be intelligent at something, because intelligence is nothing more than the effectiveness with which an agent pursues a goal. Rationality is intelligence in pursuit of objective truth, but intelligence can be used to pursue any number of other goals. And since the means by which the goal is selected is distinct from the means by which the goal is pursued, the intelligence with which the agent pursues its goal is no guarantee that the goal itself is intelligent.
As a case in point, human intelligence evolved less as a tool for pursuing objective truth than as a tool for pursuing personal well-being, tribal belonging, social status, and sex, and this often required the adoption of what I call “Fashionably Irrational Beliefs” (FIBs), which the brain has come to excel at.
Since we’re a social species, it is intelligent for us to convince ourselves of irrational beliefs if holding those beliefs increases our status and well-being. Dan Kahan calls this behavior “identity-protective cognition” (IPC).
By engaging in IPC, people bind their intelligence to the service of evolutionary impulses, leveraging their logic and learning not to correct delusions but to justify them. Or as the novelist Saul Bellow put it, “a great deal of intelligence can be invested in ignorance when the need for illusion is deep.”
What this means is that, while unintelligent people are more easily misled by other people, intelligent people are more easily misled by themselves. They’re better at convincing themselves of things they want to believe rather than things that are actually true. This is why intelligent people tend to have stronger ideological biases; being better at reasoning makes them better at rationalizing.
This tendency is troublesome in individuals, but in groups it can prove disastrous, affecting the very structure and trajectory of society.
For centuries, elite academic institutions like Oxford and Harvard have trained their students to win arguments rather than to discern truth, and in so doing they’ve created a class of people highly skilled at motivated reasoning. The master-debaters who emerge from these institutions go on to become tomorrow’s elites—politicians, entertainers, and intellectuals.
Master-debaters are naturally drawn to areas where arguing well is more important than being correct—law, politics, media, and academia—and in these industries of pure theory, secluded from the real world, they use their powerful rhetorical skills to convince each other of FIBs. During their master-debatery circlejerks, the most fashionable delusions gradually spread from individuals to departments to institutions to societies.
Some of these FIBs can now be found everywhere. A particularly prominent example is wokeism, a popularized academic worldview that combines elements of conspiracy theory and moral panic. Wokeism seeks to portray racism, sexism, and transphobia as endemic to Western society, and to scapegoat these forms of discrimination on white people generally and straight white men specifically, who are believed to be secretly trying to enforce such bigotries to maintain their place at the top of a social hierarchy.
Naturally, woke intellectuals don’t consider themselves alarmists or conspiracy theorists; they believe their intelligence gives them the unique ability to glimpse a hidden world of prejudices. What they don’t know is that high IQ people and low IQ people display similar levels of prejudice, except toward different groups, and educated people actually display greater prejudice against those with different views.
Wokeism is mostly a prejudice of master-debaters and their followers because the idea that straight white men are bigots keeping down women and minorities requires a high degree of rationalization: one must uncritically accept the social disparities that favor men over women or whites over blacks as evidence of discrimination, while finding excuses to dismiss all the countervailing disparities.
For instance, if a wokeist wishes to use the overrepresentation of white men in STEM as evidence that women and minorities are being discriminated against, then the wokeist must either ignore or explain away the fact that Asian men are also overrepresented in STEM, or that women are overrepresented in the field of psychology, or that the biggest racial disparity of all is black men comprising less than 7% of the US population but holding over 70% of dream jobs playing in the NBA.
Not only does intelligence in the service of wokeism lead to one-sided readings of reality, it also leads to the production of pure fiction. The popular woke myth that sex is a spectrum is often justified on the basis that there’s no single thing that distinguishes all men from all women. Such an abstract explanation is seductive to an intellectual, but beneath the allure it’s just an instance of the univariate fallacy. (It’s true that no single thing distinguishes all men from all women, but no single thing distinguishes all cats from all monkeys either; does that make cats monkeys?)
Labyrinthine sophistry like “sex is a spectrum” prevails among cognitively sophisticated cultural elites, including those who should know better such as biologists, but it’s rarer among the common people, who lack the capacity for mental gymnastics required to justify such elaborate delusions.
Despite being irrational, wokeism is nevertheless an intelligent worldview. It’s intelligent but not rational because its goal is not objective truth but social signaling, and in pursuing this goal it’s a powerful strategy. People who engage in woke rituals, such as proclaiming their pronouns during introductions, or capitalizing the word “black” but not the word “white,” signal to others that they’re clued-up, cosmopolitan, and compassionate toward society’s designated downtrodden. This makes them seem trustworthy and likable, and explains why wokeism is most prevalent in industries where status games and image are most important: politics, media, academia, entertainment, and advertising.
Wokeism is what happens when identity-protective cognition is allowed to run rampant through cultural institutions like Harvard and Hollywood. But while wokeism is currently systemic in the West, in the 1800s the dominant racial ideology in America was white supremacy. As a result, the master-debaters of that age often used their reasoning to justify discrimination not against whites but against blacks. An example is the 19th-century American physician Samuel Cartwright. A strong believer in slavery, he used his learning to avoid the clear and simple realization that slaves who tried to escape didn’t want to be slaves, and instead diagnosed them as suffering from a mental disorder he called drapetomania, which could be remedied by “whipping the devil” out of them. It’s an explanation so idiotic only an intellectual could think of it.
Cartwright’s case shows that the problem of runaway rationalization is not just a disorder of today’s woke intellectuals, but of educated people of any persuasion and any time. And that includes you. Since you’re reading about intelligence right now, you’re likely above average in intelligence, which means that you, whatever you believe, should be extra vigilant against your intellect being commandeered by your animal impulses.
But how does one do that, exactly? How does an intelligent person avoid a disorder that preys specifically on intelligence?
The standard rationalist path is to try to avoid delusion by learning about cognitive biases and logical fallacies, but this can be counterproductive. Research suggests that teaching people about misinformation often just causes them to dismiss facts they don’t like as misinformation, while teaching them logic often results in them applying that logic selectively to justify whatever they want to believe.
Such outcomes make sense; if knowledge and reasoning are the tools by which intelligent people fool themselves, then giving them more knowledge and reasoning only makes them better at fooling themselves.
I’ve been tweeting about irrationality since 2017, and in that time I’ve noticed a disturbing pattern. Whenever I post about a cognitive bias or logical fallacy, my replies are soon invaded by leftists claiming it explains rightist beliefs, and by rightists claiming it explains leftist beliefs. In no case does anyone claim it explains their own beliefs. I’m likely guilty of this too; it feels effortless to diagnose others with biases and fallacies, but excruciatingly hard to diagnose oneself. As the famed psychologist Daniel Kahneman quipped, “I’ve studied cognitive biases my whole life and I’m no better at avoiding them.”
This is not to say that education is futile. Knowledge can help to limit motivated reasoning—but only if it’s accompanied by a far deeper kind of growth: that of one’s character.
Motivated reasoning occurs when we place our intelligence and learning in the service of irrational goals. The root of the problem is therefore not our intelligence or learning, but our goals. Most of the time, the goal of our thinking is not to reach objective truth but to justify what we wish to believe. Only one thing can motivate us to put our intelligence in the service of objective truth, and that is curiosity. Indeed, Kahan’s research found curiosity to be the strongest countermeasure against bias.
But how do we make ourselves curious? Is it even possible?
Good news: if you’re reading this, you’re probably quite curious already. But there’s something you can do to supercharge your curiosity: enter the curiosity zone. Basically, curiosity is the desire to fill gaps in knowledge. As such, curiosity occurs not when you know nothing about something but when you know a bit about it. So learn a little about as much as you can, and this will create “itches” that will spur you to learn even more.
Curiosity is essential to directing your intellect toward objective truth, but it’s not all you need. You must also have humility. This is because the source of our strongest biases is our ego; we often base our self-worth on being intelligent and being right, and this makes us not want to admit when we get things wrong, or to change our mind. And so, in order to protect our chosen identity, we stay wrong.
If you define your self-worth by your ability to reason—if you cling to the identity of a master-debater—then admitting to being wrong will hurt you, and you’ll do all you can to avoid it, which will stop you learning. So instead of defining yourself by your ability to reason, define yourself by your willingness to learn. Then admitting you’re wrong, instead of feeling like an attack, will become an opportunity for growth.
Anyone who’s sure they’re humble is probably not, so I can’t say whether I’ve succeeded in becoming humble. But I can say that I always try to be humble. And, well, there’s little difference between trying to be humble and actually being so.
For me, trying to be humble entails the constant interrogation of my own motives. Could my most cherished belief be a FIB? Why do I really believe what I believe? What other reasons beside reason could I have? My self-questioning makes me agonize over every word I write, but in the long term my hesitancy gives me confidence, for by being careful about what I think I develop trust in my thoughts.
Humility and curiosity, then, are what we most need to find truth. By seeking one we also seek the other: being curious makes us humble, because it shows us how little we know, and in turn, being humble makes us curious, because it helps us acknowledge that we need to learn more.
In the end, rationality is not about intelligence but about character. Without the right personal qualities, education and IQ won’t make you a master of your biases, they’ll only make you a better servant of them. So be open to the possibility that you may be wrong, and always be willing to change your mind—especially if you’re smart. By being humble and curious you may not win many arguments, but it won’t matter, for even losing arguments will become a victory that moves you toward the far grander prize of truth.