Fighting for Your Beliefs

“In a political debate you feel like the other side just doesn’t get your point of view, and if they could only see things with your clarity, they would understand and fall naturally in line with what you believe. They must not understand, because if they did they wouldn’t think the things they think. By contrast, you believe you totally get their point of view and you reject it. You see it in all its detail and understand it for what it is – stupid. You don’t need to hear them elaborate. So, each side believes they understand the other side better than the other side understands both their opponents and themselves.”

-David McRaney[1]

When someone disagrees with you, it can feel like they’re attacking you, so it’s natural for humans to think of debates as conflicts. This is a most unfortunate tendency.

If you think the goal of debate is to defeat the people who disagree with you, then debating is not an attempt to determine who is correct, but merely a contest of argumentative skill. You pit your arguments against each other, each of you hoping your own argument will prove the stronger and overpower your opponent’s. This is not an efficient method for forming beliefs.

Reasoning is not a zero-sum game, where for every winner there’s a loser. Instead, forming beliefs can be viewed as a cooperative game. A disagreement can then be more usefully thought of as an opportunity to combine the knowledge of two people and, perhaps, to improve the quality of their beliefs.

Rather than making two arguments duel, you can compare them side by side. Their logic can be examined, and the strengths and weaknesses of each can be weighed. You may both discover that there was more evidence on the other side than you realized, and adjust your opinions toward some compromise between your initial positions. Two people may even find that combining their knowledge supports a position more extreme than either of them initially held. Or you may discover that you’re actually trying to answer different questions, and don’t disagree at all. And if one argument turns out to rest on a broader base of evidence, to avoid a mistake the other makes, or to reflect a deeper understanding of the issue, the position supported by the weaker argument can simply be discarded in favor of the one supported by the stronger.
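
To see how combining knowledge can support a position more extreme than either person held alone, it helps to work through Bayes’ rule in odds form, where posterior odds equal prior odds times the likelihood ratio of the evidence. The short Python sketch below is purely illustrative, not from the original text: the names and numbers are hypothetical, and it assumes the two people’s pieces of evidence are independent.

    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    # All numbers are hypothetical; this assumes the two pieces of evidence
    # are statistically independent of each other.

    def update_odds(prior_odds, likelihood_ratio):
        return prior_odds * likelihood_ratio

    def odds_to_probability(odds):
        return odds / (1 + odds)

    prior = 1.0        # 1:1 odds -- 50% confidence before seeing any evidence
    evidence_a = 3.0   # Alice's evidence is 3x likelier if the claim is true
    evidence_b = 3.0   # Bob's independent evidence, also a 3x likelihood ratio

    alone = odds_to_probability(update_odds(prior, evidence_a))
    combined = odds_to_probability(
        update_odds(update_odds(prior, evidence_a), evidence_b))

    print(f"Either person alone: {alone:.2f}")   # 0.75
    print(f"Evidence combined:   {combined:.2f}")  # 0.90

Under these assumptions, each person alone is justified in being 75% confident, but pooling their independent evidence justifies 90% confidence, a stronger conclusion than either could reach separately.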

In practice, it is surprisingly difficult for humans to see a debate this way. The experimental psychology literature offers some good clues as to why.

Part of the story is that our beliefs rest on evidence and reasoning to a much lower degree than we think they do. For example, people tend to persist in beliefs even after the reasons for holding them have been entirely falsified. In an ingenious 1975 study[7], subjects were presented with an assortment of suicide notes and asked to identify which were real and which were fake. The experimenters then told each subject that they had done either very well or very poorly, leading some to believe they were very good at the task and others to believe they were not. Finally, it was revealed that this feedback had been completely random, and therefore said nothing at all about whether a subject was actually good at the task. Still, most subjects retained the belief that they were good or bad at the task even after the sole reason for holding that belief had been totally discredited.

After watching a presidential debate, about half of the people asked are unable to recall a single specific thing either candidate said[2], and yet 70–80% judge one candidate or the other the winner[3]. Irrelevant factors, such as the physical attractiveness of the person expressing an opinion[4][5] or the confidence of their delivery[6], can also have a significant impact on an audience’s beliefs.

Even the act of explaining a belief to someone else tends to make it more permanent[8]. In explaining a belief, we build it into our identity, so the very act of debating an idea makes it even harder to change our minds later.

We tend to think of beliefs as being like possessions. This can be seen in the words we use to talk about them. Beliefs are things you “have”, “find”, “lose”, “cherish”, “appraise”, “value”, and “abandon”. The prospect of having a belief challenged can feel like the prospect of losing something valuable.

This is why Alfred Korzybski’s metaphor of human knowledge as a map of the territory is so powerful. If you discover someone has a better map than you, you can simply photocopy their map and toss out your old one. You could take two different maps, like a political map and a topographical map, and combine them into a map with both sets of information. Or if you discover that two maps disagree on a matter of fact, you can go out and look at the territory to find out which one is right and which is wrong. Some maps will be useful for answering some questions about the territory, and other maps will be useful for answering other questions. And, of course, no map includes every detail of the territory, just as nobody’s knowledge includes every detail about the real world.

Approaching debate as a conflict is not a strategy that tends to lead you toward the truth. You want a strategy that maximizes the chance that, if you are wrong, you will discover it and change your mind.
