Reason and explanation

In Enlightenment 2.0 (2014), Joseph Heath describes how conservatives in the US are hostile to experts and prefer feelings (also known as "common sense") over reason. (If this makes you doubt Heath's even-handedness, note that he and Andrew Potter also wrote Rebel Sell, a book criticizing the Left's idea that political change will come through creating a counterculture.)

... conservatives have become enamored of the idea that politics is ultimately not about plans and policies, it's about "gut feelings" and "values." Elections are decided by appealing to people's hearts, not their heads. So, for example, when a Republican candidate says that he is going to "close down the Department of Energy," he doesn't really mean that he is going to close down the Department of Energy and fire all of its employees. After all, the U.S. Department of Energy is responsible for maintaining the nuclear reactors in U.S. military submarines, among other things. What it really means to say that you'll close down the Department of Energy is just "I feel very strongly that the federal government hates oil companies, and I want to change that." The objective is to communicate your feelings, not your thoughts.

This privileging of visceral, intuitive, gut feelings is central to the movement known as "common sense" conservatism, which has become a powerful force everywhere in the Western world, not just the United States. The central characteristic of common sense, according to Republican communication strategist Frank Luntz, is that it "doesn't require any fancy theories; it is self-evidently correct." To say that it is self-evident is to say that it is known to be correct without argument and without explanation.

There are at least two immediate problems with this approach. One is that if your beliefs are wrong, your feelings won't tell you this. Your beliefs, incorrect though they may be, will still feel correct to you.

A second problem is that our society depends on many complex institutions whose workings aren't at all self-evident: science and technology, the law, bureaucracies, corporations, banks, and so on. Naturally, the "common-sense" approach, impatient with anything complicated, is hostile to these institutions and, given free rein, may attempt to dismantle them.

So unless you're willing to use reason to examine your beliefs, you won't be able to correct them, and you may end up disconnected from reality. And you may end up destroying institutions simply because you don't understand them.

What do we mean by reason? Heath describes it as a mental faculty that takes time and effort: you need to sit down and put things in order, and express them using language, one step at a time.

The term reason traditionally refers to a particular mental faculty, one that is associated with a distinctive style of thinking. David Hume famously described reason as a "calm passion," and a degree of detachment and distance from immediate circumstances is a hallmark of the rational style. But perhaps the more significant feature of rational thought is that it can be made fully explicit. To the extent that we are reasoning, we are fully aware of what we are doing and we are able to explain fully what we have done--hence the connection between the faculty of reason and the practice of giving reasons, or argumentation and justification. For any particular claim, we must be able to explain what entitles us to make it and we must be willing to acknowledge what it commits us to.

This provides the basis for the traditional contrast between reason and intuition. An intuitive judgment is one that you make without being able to explain why you made it. Rational judgments, on the other hand, can always be explained.

Tom Stafford suggests that an excellent way to test one of your beliefs is to try to explain it, step by step:

A little over a decade ago Leonid Rozenblit and Frank Keil from Yale University suggested that in many instances people believe they understand how something works when in fact their understanding is superficial at best. They called this phenomenon "the illusion of explanatory depth". They began by asking their study participants to rate how well they understood how things like flushing toilets, car speedometers and sewing machines worked, before asking them to explain what they understood and then answer questions on it. The effect they revealed was that, on average, people in the experiment rated their understanding as much worse after it had been put to the test.

What happens, argued the researchers, is that we mistake our familiarity with these things for the belief that we have a detailed understanding of how they work. ...

It's a phenomenon that will be familiar to anyone who has ever had to teach something. Usually, it only takes the first moments when you start to rehearse what you'll say to explain a topic, or worse, the first student question, for you to realise that you don't truly understand it. All over the world, teachers say to each other "I didn't really understand this until I had to teach it". Or as researcher and inventor Mark Changizi quipped: "I find that no matter how badly I teach I still learn something".

... Research published last year on this illusion of understanding shows how the effect might be used to convince others they are wrong. The research team, led by Philip Fernbach, of the University of Colorado, reasoned that the phenomenon might hold as much for political understanding as for things like how toilets work. Perhaps, they figured, people who have strong political opinions would be more open to other viewpoints, if asked to explain exactly how they thought the policy they were advocating would bring about the effects they claimed it would.

Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons for why they held that view. ...

Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work [emphasis added]. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.

People in the second group--those who tried to explain how the policy would work--realized that their understanding of the issues wasn't as strong as they thought it was, and their views became less certain. In other words, they succeeded in recognizing that their beliefs might be incorrect.

If you want to be able to correct your false beliefs, you need to keep in mind that your beliefs may be wrong. This is surprisingly difficult; the natural tendency is to look for evidence reinforcing your existing beliefs ("confirmation bias"). Keeping in mind that you might be wrong is sometimes called "Cromwell's rule," after Oliver Cromwell's plea: "I beseech you, in the bowels of Christ, think it possible that you may be mistaken."
