The Art of Human Rationality

If you go looking for evidence on only one side of an argument, chances are you’ll find what you’re looking for without much difficulty. For anyone who wants to believe that the Apollo moon landings were faked, or that the MMR vaccine causes autism, or even that we’re ruled by lizard people, an internet search will quickly oblige with plenty of sophisticated arguments – not to mention a supportive community of fellow believers – letting you believe exactly what you set out to believe.

An acquaintance of mine was once arguing that the object that struck the Pentagon on September 11, 2001 was not a passenger aircraft. His argument rested on details of the debris visible in photographs of the event, along with some assertions about the aerodynamics of low-altitude flight.

“But why”, I replied, “would we care about these details when we have the testimony of over a hundred eyewitnesses who saw the crash happen?” It turned out that even though he was intimately familiar with tiny details about the windows in the facade of the Pentagon and the location of debris on the lawn, somehow he had no familiarity with the reports of the people who were stuck in traffic on the freeway nearby and witnessed the entire event unfold.

This is one of the most insidious ways that human reasoning can go awry, especially because the people most prone to make this kind of mistake are often those who are doing a great many things exactly right. They’re actively seeking information, they’re being skeptical, they’re learning about the relevant science, they’re reasoning from the evidence. These are exactly the kinds of things one should do when trying to form rational beliefs. The problem is that although they’re doing these things right, they’re also doing them selectively, and that can be worse than not doing them at all.

If your goal when reasoning about something is to strengthen your current belief, then focusing on the evidence pointing in your favored direction and ignoring any evidence pointing the other way is a good strategy. But if you’d like to figure out what’s actually true, it is a poor one.

Unfortunately, once people choose a particular opinion or take a side on an issue, they tend to accumulate ever more reasons to hold it and to ignore or rationalize away arguments from the other side. It has even been shown that when people are exposed to a balanced set of pro and con arguments, their initial beliefs often become stronger regardless of which side they started on: evidence that seems to support the other side is treated with suspicion and meticulously inspected for flaws, while evidence supporting the favored position is accepted uncritically.[1]

Some refer to human rationality as an art, and this is a compelling way to see it. People can be given tools to improve their thinking – by teaching them about the scientific method, probabilistic reasoning, logical fallacies, cognitive biases, and so on – but this does not suddenly make them rational, just as handing someone a paintbrush does not suddenly make them a great artist. Like any other tools, these take study and practice to wield skillfully and safely. A powerful tool in the hands of a novice can easily do more harm than good, especially if the novice believes they are a master.

Psychologist Keith Stanovich has spent much of his career emphasizing that rational thinking skills are different from intelligence. Furthermore, he has shown that difficulty with rational thinking is quite common even in people of high intelligence – a condition he calls “dysrationalia” (by analogy with “dyslexia”, a difficulty with language skills despite adequate intelligence). His work suggests that there is an important component of rational thinking that isn’t measured by IQ tests and isn’t captured by the way we normally think about intelligence. He argues that by undervaluing and neglecting these rational thinking skills, we have put ourselves in a position where many of the people we have chosen – on the basis of their intelligence – to make our most important decisions lack the very skills needed to make those decisions rationally.

“It is useful to get a handle on dysrationalia and its causes because we are beset by problems that require increasingly more accurate, rational responses. In the 21st century, shallow processing can lead physicians to choose less effective medical treatments, can cause people to fail to adequately assess risks in their environment, can lead to the misuse of information in legal proceedings, and can make parents resist vaccinating their children. Millions of dollars are spent on unneeded projects by government and private industry when decision makers are dysrationalic, billions are wasted on quack remedies, unnecessary surgery is performed and costly financial misjudgments are made.”

– Keith Stanovich [2]

It seems that giving people thinking tools can affect them in two very different ways. In some cases it encourages them to think carefully, to examine and question their knowledge, and to be aware of the flaws built into human reasoning so that these can be identified and corrected for whenever possible. But in other cases it simply arms them with better argumentative weaponry with which to defend their favored beliefs, and with more sophisticated ways to deceive themselves.

A person can be taught to be skeptical, to reason from the evidence, and all that good stuff, and can still go horribly wrong if they are selectively skeptical, or if they reason from only the evidence they like. Beware those who approach their beliefs as possessions to be defended, rather than as maps to be drawn, tested, and improved.
