Ain’t nobody got time for this?

There is an excellent paper by Tyler Cowen and Robin Hanson that discusses how to deal with self-deception. I have a longer blog post on it, but here is the most important bit:

For a truth-seeker, the key question must be how sure you can be that you, at the moment, are substantially more likely to have a truth-seeking, in-control, rational core than the people you now disagree with. This is because if either of you have some substantial degree of meta-rationality, then your relative intelligence and information are largely irrelevant except as they may indicate which of you is more likely to be self-deceived about being meta-rational.

How to stop fooling yourself

When I first wrote about self-deception, a little more than a year ago, I found the experience traumatising.

It can be very troubling to realise that inside my own head I may be tucking away truths that I am already aware of!

Now Eli Dourado has an essay in The Umlaut discussing a paper by Tyler Cowen and Robin Hanson that grapples with the issue of self-deception. Here’s the relevant bit:

Self-favoring priors, they note, can help to serve other functions besides arriving at the truth. People who “irrationally” believe in themselves are often more successful than those who do not. Because pursuit of the truth is often irrelevant in evolutionary competition, humans have an evolved tendency to hold self-favoring priors and self-deceive about the existence of these priors in ourselves, even though we frequently observe them in others.

Self-deception is in some ways a more serious problem than mere lack of intelligence. It is embarrassing to be caught in a logical contradiction, as a stupid person might be, because it is often impossible to deny. But when accused of disagreeing due to a self-favoring prior, such as having an inflated opinion of one’s own judgment, people can and do simply deny the accusation.

So how can we cope with self-deception?

Cowen and Hanson argue that we should be on the lookout for people who are “meta-rational,” honest truth-seekers who choose opinions as if they understand the problem of disagreement and self-deception. According to the theory of disagreement, meta-rational people will not have disagreements among themselves caused by faith in their own superior knowledge or reasoning ability. The fact that disagreement remains widespread suggests that most people are not meta-rational, or—what seems less likely—that meta-rational people cannot distinguish one another.

We can try to identify meta-rational people through their cognitive and conversational styles. Someone who is really seeking the truth should be eager to collect new information through listening rather than speaking, construe opposing perspectives in their most favorable light, and offer information of which the other parties are not aware, instead of simply repeating arguments the other side has already heard.

What sells in the name of journalism today is written to appease one side or the other. Opinions are important, but more important are the facts that help you arrive at them and the process you use to form them. A good journalist should be a meta-rational person. He should also be able to find other meta-rational people, so that the world is made aware of facts and arguments that take things a step forward.

The problem, of course, is that meta-rational people care less about success and more about seeking the truth and understanding the world. Their voices are present but harder to find. They get far less attention than they deserve.

You can stop reading now, if you like. But since we are on the subject of good journalism and meta-rational people, I’d like to discuss an example of exactly the opposite. Aleks Eror writes in VICE that the Chinese are engineering genius babies. This would be a great story, if it were true. Sadly, he bases the claim on the word of Geoffrey Miller, an American academic, who wrote a heavily biased article about it in January. What’s worse is that Miller’s argument had been criticised and debunked (sort of) almost immediately afterwards. But Eror chose to ignore that and rehash the fear-mongering story.

The central point of the argument is that the Chinese have an institute where they are sequencing the genomes of the world’s smartest people to find the genes that make us smart. This is not a new story. Eugenics in some shape or form has existed for a long time. The problem with Eror’s piece is that he doesn’t run Miller’s claims by an expert.

A good science journalist should always do that, because of the complex nature of the subjects we usually deal with. Experts may not be the perfect meta-rational candidates that we’d like, but they are close enough. They bring an external perspective, and I’ve found that invaluable in my own work.

The trouble with self-deception

I recently argued that becoming a strategic self-deceiver is difficult but worth the effort. In the article that led me to this thought, David Brooks argued that those who weren’t entirely honest in their self-narratives led impressive lives. My aim, as always, has been to understand how to use this knowledge to help us.

This idea of self-deception has, in some form or other, been on my mind since I first wrote about it. To be frank, it has been a troubling thought. We can lead lives where we successfully hide the truth from ourselves – isn’t that messed up?

As a scientist, I spend my life trying to uncover truths about the world. It’s my day job, and when one spends a large part of one’s life doing just that, it can be very troubling to realise that inside my own head I may be tucking away truths that I am already aware of!

The self-deception paradox

Philosophers argue that one cannot convince oneself that something is false when one knows it to be true, i.e. one cannot be successful at self-deception.

Here, of course, they are assuming that the person is trying to deceive themselves consciously. I argue that most self-deception occurs unconsciously, and that is why self-deceiving is not just possible but, to a certain extent, easy.

The helpers of self-deception

What aids self-deception is that we aren’t perfectly rational beings. Our thought process is flawed, of course, but beyond that even the machinery that runs it is far from perfect. We are the victims of our own biological shortcomings.

In The Seven Sins of Memory, Daniel Schacter shows that human beings have a remarkable ability to mess with their own memories. We don’t just forget things; we also create false memories and selectively block others. When so much of our lives is built on our past, I shudder to think that the memories that are the foundation of the building called ‘me’ may not really be ‘true’ memories.

When both my rational self and the knowledge of the world that I hold are being questioned, it’s not hard to see why exploring self-deception has been a troubling experience.

Photo credit: Annek