When Intelligence Defeats Itself

Book Review:

David Robson, "The Intelligence Trap: Why Smart People Make Stupid Mistakes -- and How to Make Wiser Decisions", Hodder & Stoughton, 2020. 

[Alternative subtitle: "Revolutionise your thinking and make wiser decisions".]

The Intelligence Trap is the story of how Nobel Prize-winning scientist Kary Mullis could believe in alien abductions, astrology, and AIDS denialism; how the Sherlock Holmes author Arthur Conan Doyle could believe in spiritualism and fairies; how Apple co-founder Steve Jobs could believe in a fruit juice diet as the cure for his cancer; how FBI fingerprint experts could have falsely linked Brandon Mayfield to the 2004 Madrid train bombings; and how a team of engineers could have missed the warning signs before the Deepwater Horizon disaster of 2010.

The idea that intelligent people can be foolish is not a new one: there is a volume titled Why Smart People Can Be So Stupid, edited by Robert J. Sternberg (2002), and various related books. Alexander Kruel wrote a blog post about "highly intelligent and successful people who hold weird beliefs". Nassim Nicholas Taleb talks about the "intellectual yet idiot". On Overcoming Bias/Less Wrong, Eliezer Yudkowsky warned us about people "with high g-factor who end up being less effective because they are too sophisticated as arguers". But BBC journalist and science writer David Robson's recent book, The Intelligence Trap, is perhaps the most wide-ranging popular-science overview of the topic.

The book is divided into four parts: Part 1 defines the "intelligence trap"; Part 2 identifies the new discipline of "evidence-based wisdom" as a potential remedy; Part 3 elaborates on the topics of learning and memory; and Part 4 examines stupidity on the group level. The Appendix features a "Taxonomy of Stupidity" and a "Taxonomy of Wisdom". You might already be starting to see the theme here: the author continually makes a distinction between intelligence and wisdom.

By "intelligence", Robson means the raw, general brainpower underlying memory, vocabulary, analogical, spatial and numerical reasoning, information processing, and related abstract thinking skills. Intelligence is measured by IQ as well as academic tests such as the Scholastic Aptitude Test (SAT) and Graduate Record Examinations (GRE). Actually, there are two aspects to the intelligence trap: the first relates to one's general intelligence or mental "energy", and the second relates to education and expertise. The two are closely intertwined, since schools often test for the same abstract abilities as IQ does (which might explain the Flynn effect). Even recruiters sometimes test prospective employees using the Wonderlic Personnel Test. To some extent, intelligence does matter: it predicts greater problem-solving skills, academic success, and achievement in white-collar careers such as law, medicine, accountancy, engineering and computer programming.

However, this kind of intelligence is far from the only thing that matters. As much as 70 percent of the variation in professional performance (as judged by managers) cannot be accounted for by IQ, and there are plenty of people who outperform those with higher IQ scores. Higher intelligence does not automatically correspond to motivation, leadership ability, communication skills, wise judgement, or even creativity. See also Scott Alexander's blog post "Against Individual IQ Worries", wherein he notes that "... on an individual level, we see that below-average IQ people sometimes become scientists, professors, engineers, and almost anything else you could hope for."

But The Intelligence Trap goes even further: in certain ways, intelligence can be a disadvantage! As David Robson writes in the Introduction:
"Intelligent and educated people are less likely to learn from their mistakes, for instance, or take advice from others. And when they do err, they are better able to build elaborate arguments to justify their reasoning, meaning that they become more and more dogmatic in their views. Worse still, they appear to have a bigger 'bias blind spot', meaning they are less able to recognise the holes in their logic." (p. 3)
Needless to say, the errors made by intelligent people can have adverse effects for the individual (e.g. for their health and career), for business (bankruptcy), and for society (miscarriages of justice, iatrogenics, and inaction in the face of climate change). Therefore, we need additional mental qualities that help us drive in the right direction and use our intelligence more safely. Robson explores these guardrail qualities across ten chapters and approximately 250 pages, always asking: "Why do smart people act stupidly? What skills and dispositions are they missing that can explain these mistakes? And how can we cultivate those qualities to protect us from those errors?" (p. 7). Those qualities all relate in one way or another to the idea of wisdom. Notably, Robson excludes Angela Duckworth's notion of "grit" from the scope of the book -- while perseverance is important, he says, it does not seem to solve the biases of intelligence. Below I summarize the chapters.

***

In Chapter 1, Robson discusses different conceptions of intelligence, ranging from "emotional intelligence" to Gardner's "multiple intelligences", but finds the most promising one to be Sternberg's "Triarchic Theory of Successful Intelligence". As the name suggests, there are three components:
  • Analytical intelligence -- corresponding to the traditional idea of general intelligence as measured by IQ and SAT scores;
  • Practical intelligence -- the ability to execute plans and overcome problems in a pragmatic way, including an awareness of one's strengths and weaknesses, the ability to read the motives of others and persuade them, tacit knowledge, and so on; 
  • Creative intelligence -- imagination and counterfactual thinking (including the ability to imagine and preempt the consequences of one's actions).
Research has shown that when SAT scores are supplemented with measures of practical and creative intelligence, students' grade point averages at university can be predicted more accurately. Practical intelligence also predicts leadership ability in the military. Another concept, developed by Soon Ang, is "cultural intelligence", or sensitivity to different cultural norms. This CQ has been linked to the performance of international salespeople and negotiators, among other things. All these measures are only weakly correlated with IQ.

What also does not correlate strongly with IQ? The answer, provided in Chapter 2, is your capacity "to form beliefs based on evidence, logic and sound reasoning" (epistemic rationality or epistemic accuracy) and "to make the optimal decisions needed to meet [your] goals" given the resources on hand (instrumental rationality). These notions of rationality will be very familiar to followers of LessWrong. Robson writes about the Canadian psychologist Keith Stanovich, who promotes the idea of "dysrationalia", a kind of learning difficulty in which intelligent people score poorly on tests of rationality, such as Stanovich's Comprehensive Assessment of Rational Thinking, which yields a "rationality quotient" (RQ). To score high on the RQ, one needs to understand the scientific method and statistical reasoning (and to avoid so-called contaminated mindware), and to be less susceptible to cognitive biases (like anchoring, framing, and the sunk cost fallacy). To be sure, there is a correlation between RQ and SAT scores, but a modest one that leaves plenty of room for dysrationalia.

The chapter provides further evidence that intelligent people don't necessarily make more rational decisions: for example, people with a high IQ are more likely to smoke or reach their credit limit, and one study found that rationality scores are better predictors of real-world behavior (e.g. missing a flight, getting an STD, being jailed) than general intelligence is. But perhaps most importantly, according to Robson, intelligent students are more vulnerable to motivated reasoning and the confirmation bias (or myside bias), being less likely to consider both sides of an argument. The bias blind spot compounds this.

Chapter 3 identifies the final aspect of the intelligence trap, which Robson calls the "curse of expertise" or curse of specialist knowledge. Of course, experts usually do perform better, but they still overestimate their own knowledge, forget how much they have forgotten since university (meta-forgetfulness) and are less likely to listen to those who disagree -- something that Victor Ottati calls "earned dogmatism". It gets worse: part of the expert's advantage comes from deeply ingrained psychological schemas (mental templates), but these also make the expert less flexible in the face of change, and less likely to notice fine details. This process is called entrenchment. In other words, the neurological architecture underlying expertise may come with inherent drawbacks. This can be especially dangerous in aviation, where pilots rely on automatic, scripted behaviors and may therefore miss the warning signs of disaster.

So these, then, are the faces of the intelligence trap:
  1. A lack of practical or creative intelligence, especially tacit knowledge and counterfactual thinking;
  2. Dysrationalia, motivated reasoning and the bias blind spot;
  3. Earned dogmatism and entrenched behaviors due to expertise.
The obvious question is: what can we do about these?

***

One of the heroes of Chapter 4 is Igor Grossmann, a pioneer in the movement toward evidence-based wisdom. His definition of "wisdom" was inspired by philosophy and ethnography, but formulated in terms of six metacognitive principles: the ability to take perspectives, to predict different ways in which a conflict might unfold, to recognize the likelihood of change, to look for a compromise, to predict the resolution of the conflict, and to have intellectual humility. These components can be tested and compared to measures of well-being, and Grossmann's team found that people with wiser reasoning also tend to be happier. But can we cultivate this kind of wisdom? Well, you can deliberately take the time to consider reasons why your initial viewpoint might be wrong (Robson calls this "actively open-minded thinking"), and practice self-distancing by describing the situation from a neutral, third-person perspective. This works because it reduces the emotionally charged "hot" cognition that feeds into motivated reasoning; indeed, Solomon's paradox refers to the observation that we (like the ancient Israelite king) show better judgment when reasoning about other people's problems than about our own. You can also imagine explaining the problem to a twelve-year-old child (see: the Socrates effect). Similarly, Philip Tetlock's Good Judgment Project showed that people make better predictions when they accept uncertainty and look for outside perspectives. We can also take inspiration from Benjamin Franklin, whose "moral algebra" consisted of weighing the advantages and disadvantages of a decision over a number of days. In his own words, he found this made him "less liable to make a rash step" (p. 100).
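
Franklin's procedure is concrete enough to express in a few lines of code. Here is a toy rendering in Python -- my own sketch, not Robson's; the items and weights are invented for illustration. (Franklin would also cross out a pro and a con of equal weight, pair by pair, revisiting the ledger over several days before reading off the balance.)

    # Toy "moral algebra": weight each pro and con of a decision,
    # then let the heavier column decide after a few days of reflection.
    pros = {"better pay": 3, "new skills": 2, "shorter commute": 1}
    cons = {"longer hours": 3, "less job security": 2, "unknown team": 2}

    balance = sum(pros.values()) - sum(cons.values())
    print("go ahead" if balance > 0 else "hold off", f"(balance = {balance})")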

To become more rational, however, we don't need to discard our emotions and intuitions. David Robson argues in Chapter 5 that our feelings can provide us with valuable information -- the challenge is to interpret them correctly and override them when needed. In fact, the much-cited work of Antonio Damasio shows that people struggle to make wise decisions if they have damage to the ventromedial prefrontal cortex. This brain area seems to be responsible for somatic markers: physiological changes (e.g. in heart rate, sweat, muscle tension) that precede conscious emotions and are based on past experiences. People with damage to this area score well on intelligence tests, but have reduced interoception, or access to the "gut instincts" that normally guide our choices. Research also suggests that traders who are more in tune with their visceral feelings earn higher profits. Of course, irrelevant feelings can still lead us astray. So how can we make better sense of our feelings?

One trick is to use a more precise emotional vocabulary in order to differentiate feelings. Then you can regulate those feelings through self-distancing, reappraisal, humor, or a change of scene. Robson prefers to call these skills -- interoception, differentiation and regulation -- "reflective thinking" or the "emotional compass" rather than emotional intelligence. Reflective skills tend to improve with age, but you can help the process by practicing mindfulness meditation (where mindfulness is the opposite of mindlessness -- a lack of insight into our actions and surroundings), dancing or singing, and routinely trying to distinguish between different emotions. Keep a journal of how your thoughts and feelings may have influenced your decisions. If you speak a second language, you may be surprised to find out that reasoning in the other language can reduce cognitive bias (the "foreign language effect"). Finally, note down your initial gut reaction before analyzing alternative hypotheses, so that you don't get overly focused on inconsequential background information. Reaching the level of expertise where one knows when to question one's intuitions -- which may be called reflective competence -- is key to overcoming the curses of expertise discussed earlier.

One of Robson's missions with this book is to combat fake news and its cousins, misinformation and conspiracy theories. That's why he devotes Chapter 6 to a "bullshit detection kit". To avoid being duped by lies and rumors, it is not enough to be intelligent or well-educated; it turns out that university graduates are more likely to believe misinformation about medicine and to use alternative medicine. What we need, according to Robson, are critical thinking skills based on cognitive reflection. Back in the Introduction of the book, the author presented the following test:
  • "Jack is looking at Anne but Anne is looking at George. Jack is married but George is not. Is a married person looking at an unmarried person? Yes, No, or Cannot Be Determined?" (p. 4)
Apparently, the majority of people answer "cannot be determined", but that is incorrect. The answer is yes: if Anne is married, then she (married) is looking at George (unmarried); if Anne is unmarried, then Jack (married) is looking at her. Either way, a married person is looking at an unmarried person. (A short brute-force check of both cases appears after the next puzzle.) In Chapter 6, the author presents some questions from the Cognitive Reflection Test, including the following:
  • "A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?" (p. 149)
Here, the answer is 5 cents, but many people incorrectly say 10 cents (if the ball costs x, then the bat costs x + $1.00, so 2x + $1.00 = $1.10 and x = $0.05). To get these kinds of questions right, you need to challenge assumptions, intuitions and cues that may seem obvious. And as it turns out, people who score higher on the CRT also score higher on Stanovich's rationality quotient and are less susceptible to magical thinking and to seeing meaning in "pseudo-profound bullshit" (e.g. vacuous statements like "hidden meaning transforms unparalleled abstract beauty"). Cognitive reflection may also help us detect the Moses illusion, which is when people fail to spot the error in the question: "How many animals of each kind did Moses take on the Ark?" (p. 142). Of course, it was Noah, not Moses, who had an ark in the Bible. Researcher Norbert Schwarz explains that we experience a sense of "truthiness" for statements that are familiar and fluent (i.e. easy to process), which is why our judgment can be influenced by the font in which a text is written, how often a statement is repeated, the ease with which we can pronounce someone's name, and even whether a statement rhymes. For the same reason, an attempt to debunk a myth can backfire: the debunking typically repeats the false claim, making it more familiar and therefore more "truthy". A better approach is to focus on the facts, to avoid over-complicating the argument, and to frame the issue in a less politically loaded way (which reduces motivated reasoning).
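
These puzzles lend themselves to mechanical checking. Below is a minimal sketch in Python -- my own, not anything from the book -- that brute-forces the marriage question over Anne's two possible states, and solves the bat-and-ball equation:

    # Marriage puzzle: Jack is married, George is not, Anne is unknown;
    # Jack looks at Anne, and Anne looks at George. Try both cases.
    for anne_married in (True, False):
        married = {"Jack": True, "Anne": anne_married, "George": False}
        looking = [("Jack", "Anne"), ("Anne", "George")]
        # In each case, some married person looks at an unmarried person.
        assert any(married[a] and not married[b] for a, b in looking)

    # Bat-and-ball: solve ball + (ball + 1.00) == 1.10 for the ball.
    ball = (1.10 - 1.00) / 2
    print(f"The ball costs ${ball:.2f}")  # $0.05, not the intuitive $0.10

The assertion holds in both branches, which is exactly why "cannot be determined" is the wrong answer.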

But how can you protect yourself? Well, you can attend a "cognitive inoculation program", which works by exposing you to bad arguments so that you can better spot bullshit in the future. You can read up on common logical fallacies. Learn to ask questions such as: Who is making the claim, what are the premises of the claim, and what is the evidence? What are the alternative explanations for their claim, and what further information may be needed to come to a conclusion? The skeptic and writer Michael Shermer offers "baloney detection" tips: check who is making the claim, what their source is, how strong the evidence is, and whether someone has tried to verify or debunk the claim. Study real-life examples of misinformation, such as the fake experts used by the tobacco industry, or the "anomalies-as-proof" tactic used by 9/11 conspiracy theorists and Holocaust deniers. But also try to understand the other side's worldview and keep an open mind.

***

Part 3 of The Intelligence Trap discusses two qualities that help us learn and remember: epistemic curiosity and the growth mindset (discussed in Chapter 7). David Robson then looks at the advantages of the East Asian approach to education, which emphasizes "desirable difficulties" or "eating bitter" as a means to deep learning (discussed in Chapter 8).

Richard Feynman is the star of Chapter 7, as someone renowned for being a genius despite not having a "genius-level" IQ. Feynman's secret was a constant yearning to build on his potential and grow his mind -- which can be summarized as curiosity (a genuine interest in the world) and a growth mindset (a belief that one's performance is not fixed, as the fixed mindset suggests, but can be improved with practice). Charles Darwin fell into the same category. Of course, Robson isn't arguing that anyone can be Feynman or Darwin, but that smart people who don't go outside their comfort zones are sabotaging their own chances of success. Interestingly, research shows that people with more curiosity remember more material even after accounting for the amount of work they put in, and they also report higher levels of well-being. The growth mindset might be related to deeper conceptual processing (indicated by temporal lobe activity), and like curiosity, it can even reduce dogmatic motivated reasoning. Perhaps unsurprisingly, a growth mindset is associated with intellectual humility. Robson argues that these characteristics are antidotes to cognitive miserliness, our tendency to make decisions based on intuition rather than analytical reasoning.

What you might not have known is that East Asian countries such as Japan do a better job of cultivating these qualities in their schoolchildren than Western countries do. Findings from neuroscience suggest that people learn more when they struggle, forget, get confused, and learn multiple things at a time. A drop in performance today might mean better performance in the future. What this implies, as Chapter 8 suggests, is that "desirable difficulties" like spaced learning, interleaving tasks, and pre-testing (trying to solve a problem before you learn the correct method) can improve long-term understanding. But few schools in the West make use of them. Meanwhile, Japanese teachers challenge their students ("productive struggle"), encourage the exploration of alternative solutions, and promote perseverance ("deliberate practice"). They are also less likely to oversimplify complex material. Research shows that on average, people in East Asian countries score higher on tests of flexible and critical thinking. But perhaps Western education can catch up if we follow the example of California's "Intellectual Virtues Academy", which emphasizes the virtues of curiosity, intellectual humility, intellectual autonomy, attentiveness, intellectual carefulness, intellectual thoroughness, open-mindedness, intellectual courage, and intellectual tenacity. We should also promote tolerance of ambiguity. Each individual, child or adult, can use better learning techniques too. Robson lists some: beware of slick, superficially fluent textbooks; vary your study environment; learn by teaching; look for multiple solutions to a problem; try to explain the source of your errors; and test yourself regularly, even on material you think you know well.
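
To make the idea of spaced self-testing concrete, here is a toy Leitner-style review scheduler in Python -- again my own sketch, not something Robson provides. A correct answer doubles the review interval (spacing), while a mistake resets it, so each successful retrieval happens just as the material is starting to fade:

    import datetime

    class Card:
        """A flashcard whose review interval grows with each success."""
        def __init__(self, prompt, answer):
            self.prompt, self.answer = prompt, answer
            self.interval_days = 1
            self.due = datetime.date.today()

        def review(self, correct):
            # Double the interval on success; reset on failure.
            self.interval_days = self.interval_days * 2 if correct else 1
            self.due = datetime.date.today() + datetime.timedelta(days=self.interval_days)

    # Usage: only quiz yourself on cards whose due date has arrived.
    card = Card("How much does the ball cost?", "$0.05")
    card.review(correct=True)   # success: next review in 2 days
    card.review(correct=False)  # mistake: back to a 1-day interval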

***

The final two chapters of The Intelligence Trap talk about wisdom on the group level. Chapter 9 challenges the notion that the more star players in a team, the better. The "too-much-talent effect" is the finding that a football or basketball team actually performs worse once the share of top talent exceeds roughly 60 percent. The author uses England's ignominious defeat to Iceland at Euro 2016 as an example. And this isn't limited to sports -- a study on "collective intelligence" found that performance on a group task is only weakly predicted by the average or highest IQ of the group's members. The better predictors? Social sensitivity, equal participation, and a higher proportion of women on the team. Furthermore, problems arise because groups of high-flying executives are more likely to experience status conflicts and less likely to share information, especially when they don't agree on the pecking order and have overlapping areas of expertise. So, it can be helpful to apply the following tips: (i) select people with good interpersonal skills; (ii) state each person's expertise at the start of each meeting; (iii) give each person a fixed amount of speaking time at the start; (iv) be clear about the decision-making procedure; and (v) encourage dissent to avoid poor decisions. A humble leader is valuable here.

Chapter 10 is about disasters, such as the Deepwater Horizon oil spill of 2010, the shuttle Columbia's disintegration in 2003, or the downfall of Nokia. Workplace cultures can contribute to "functional stupidity": a kind of narrow thinking characterized by a lack of reflection, curiosity, or consideration of long-term consequences, which may nevertheless bring short-term benefits (hence the "functional" part). According to Mats Alvesson and André Spicer, people tend to go with the flow and nod along because it may be to their individual advantage (e.g. they practice strategic ignorance), and because it boosts productivity for the organization. In fact, organizations encourage functional stupidity by focusing excessively on the specialization of labor (the Fachidiot takes a single-minded approach to a multifaceted problem), and by demanding positivity and corporate loyalty rather than the expression of doubt. In the long term, this breeds unwise thinking. While Robson doesn't discuss it in the text, he includes the Peter principle in the Appendix: the idea that employees keep getting promoted until they reach their level of incompetence. Another factor that increases the risk of failure is the outcome bias, whereby we fail to consider that a "successful" decision could have turned out differently if the initial circumstances had been slightly different and we had been less lucky -- this causes us to overlook warning signs. By contrast, a reliable organization will avoid complacency, reward employees for questioning assumptions, look for the root causes of anomalies, regularly discuss "near misses", and have managers who defer to expertise. Together these characteristics are known as "collective mindfulness". Some additional tips: try to reduce time pressure; do a pre-mortem (imagine that you have failed, and then consider which factors may have contributed); assign someone in your team to be the devil's advocate; and implement critical thinking programs. Train new employees using desirable difficulties, and cultivate a collective growth mindset. Note the similarity to the advice given in Tim Harford's Adapt, especially when it comes to resisting conformity and empowering whistle-blowers.

David Robson closes The Intelligence Trap with an Epilogue, in which he reiterates that evidence-based wisdom can "help anyone to maximise their potential" (p. 262) and even enhance the skills measured by IQ, SAT or GRE scores. He also believes (or hopes) that wiser reasoning can help us combat climate change, social inequality, political polarization, and misinformation.

***

Back in the 1920s, Stanford psychologist Lewis Terman sought out a cohort of extremely precocious young children, with IQs ranging from 140 to 192. His team followed these "Termites" over the next few decades, hoping to identify the next generation of leaders in science, art, and government. It was a landmark study in standardized testing, although some of the motives may smack of elitism to modern audiences (Terman supported eugenics, for example). Many Termites did stand out in their careers, but those with IQs above 180 were not significantly more successful than those with IQs around 150, and in old age many of them felt that they had not made full use of their talents. That Robson refers to this as "the rise and fall of the Termites" (my emphasis) suggests that it should serve as a cautionary tale about the pitfalls of IQ fetishism. Even in Terman's own lifetime there were skeptics, such as Walter Lippmann, who wrote the following:
"It is not possible to imagine a more contemptible proceeding than to confront a child with a set of puzzles, and after an hour's monkeying with them, proclaim to the child, or to his parents, that here is a C-individual." (p. 36)
Perhaps it is encouraging, then, that psychologists in recent years have been paying more attention to a broader range of cognitive skills, including intellectual humility, actively open-minded thinking, a refined emotional awareness, cognitive reflection, curiosity, and the growth mindset. Perhaps Eliezer Yudkowsky was on to something when he included curiosity and humility in his Twelve Virtues of Rationality.

***

The thing about The Intelligence Trap is that it has an agenda, yes, but I suppose that David Robson is partly trying to counterbalance all those people who keep telling us how important IQ is (although he doesn't go as far as Nassim Taleb, whose statistical argument against the validity of IQ is beyond the scope of this post). Furthermore, he doesn't actually spend much time bashing the notion of IQ (in fact, he acknowledges that it can be useful); the book is more about exploring a series of concepts that may help us mitigate the traps of motivated reasoning, the availability heuristic, and truthiness, among others. Rather than trying to undermine abstract reasoning skills and factual knowledge, Robson calls attention to the potential for them to coexist with, and be complemented by, other thinking skills in our education system. To the extent that this will help us use our intelligence with greater "insight, precision and humility" (p. 264), it seems like an agenda that any Rationalist can get on board with. The spirit of The Intelligence Trap may thus be summarized by a remark from René Descartes, whom Robson quotes on page 6:
"It is not enough to possess a good mind; the most important thing is to apply it correctly. ... The greatest minds are capable of the greatest vices as well as the greatest virtues; those who go forward but very slowly can get further, if they always follow the right road, than those who are in too much of a hurry and stray off it."
David Robson makes his case well, and despite not being a scientist himself, he seems to have done the research. He made it his mission to personally interview the key figures in his book, including James Flynn, Robert Sternberg, Keith Stanovich, Igor Grossmann, Norbert Schwarz, Michael Shermer, and Carol Dweck. He also gives credit to the Center for Practical Wisdom at the University of Chicago (which opened as recently as 2016), and in the Notes section cites David Perkins's Outsmarting IQ and Scott Barry Kaufman's Ungifted as inspirations. The references are there, although as with any pop-psych book, the reader has to trust that the author represented the original material accurately and minimized cherry-picking.

Stylistically, Robson's writing is clear and engaging. My only complaint would be that the structure of the chapters is somewhat meandering. A simple chapter overview or summary, plus the use of headings, would have helped. Overall, this is an accessible book that I can recommend to anyone interested in psychology or self-improvement. As I noted earlier, The Intelligence Trap is perhaps the best introduction to the topic for a general audience at the moment. I was tempted to rate the book five stars on Goodreads, but in the end I went with 4/5 because it lacks the originality or depth of the books that I've given top ratings.

It would be interesting to see the Center for Applied Rationality put some of the techniques from the book (e.g. actively open-minded thinking, self-distancing, keeping a journal of one's feelings, introducing desirable difficulties, etc.) to the test. I know that they've already incorporated the pre-mortem ("Murphyjitsu"), a kind of mindfulness ("Focusing"), and curiosity ("boggling") into their curriculum. It would also be interesting to discuss the implications of the intelligence trap for transhumanism -- would cognitive enhancement (in a narrow sense) make the world more politically polarized if untempered by these additional mental qualities?

If you want to check out more of David Robson's work, his portfolio of articles is available here, and his blog here. It looks like he's working on a second book, which might be about aging and longevity, judging by his most recent work. But it could be anything. For the time being, check out "Why smart people believe coronavirus myths", which highlights how "truthiness", cognitive miserliness, and the incentives of social media combine to help misinformation go viral.

***

"It was dysrationalia that did them in; they used their intelligence only to defeat itself."
-- Eliezer Yudkowsky (source)
