Ain’t nobody got time for this?

There is an excellent paper by Tyler Cowen and Robin Hanson that discusses how to deal with self-deception. I have a longer blog post on it, but here is the most important bit:

For a truth-seeker, the key question must be how sure you can be that you, at the moment, are substantially more likely to have a truth-seeking, in-control, rational core than the people you now disagree with. This is because if either of you have some substantial degree of meta-rationality, then your relative intelligence and information are largely irrelevant except as they may indicate which of you is more likely to be self-deceived about being meta-rational.

How to stop fooling yourself

When I first wrote about self-deception, a little more than a year ago, I found the experience traumatising.

It can be very troubling to realise that inside my own head I may be tucking away truths that I am already aware of!

Now Eli Dourado has an essay in The Umlaut that discusses a paper by Tyler Cowen and Robin Hanson that grapples with the issue of self-deception. Here’s the relevant bit:

Self-favoring priors, they note, can help to serve other functions besides arriving at the truth. People who “irrationally” believe in themselves are often more successful than those who do not. Because pursuit of the truth is often irrelevant in evolutionary competition, humans have an evolved tendency to hold self-favoring priors and self-deceive about the existence of these priors in ourselves, even though we frequently observe them in others.

Self-deception is in some ways a more serious problem than mere lack of intelligence. It is embarrassing to be caught in a logical contradiction, as a stupid person might be, because it is often impossible to deny. But when accused of disagreeing due to a self-favoring prior, such as having an inflated opinion of one’s own judgment, people can and do simply deny the accusation.

So how can we cope with self-deception?

Cowen and Hanson argue that we should be on the lookout for people who are “meta-rational,” honest truth-seekers who choose opinions as if they understand the problem of disagreement and self-deception. According to the theory of disagreement, meta-rational people will not have disagreements among themselves caused by faith in their own superior knowledge or reasoning ability. The fact that disagreement remains widespread suggests that most people are not meta-rational, or—what seems less likely—that meta-rational people cannot distinguish one another.

We can try to identify meta-rational people through their cognitive and conversational styles. Someone who is really seeking the truth should be eager to collect new information through listening rather than speaking, construe opposing perspectives in their most favorable light, and offer information of which the other parties are not aware, instead of simply repeating arguments the other side has already heard.

What sells in the name of journalism today is written to appease one side or the other. Opinions are important, but more important are the facts that help you arrive at them and the process you use to form them. A good journalist should be a meta-rational person. He should also be able to find other meta-rational people, so that the world is made aware of facts and arguments that help take things a step forward.

The problem, of course, is that meta-rational people care less about success and more about seeking the truth and understanding the world. Their voices are present but harder to find. They get far less attention than they deserve.

You can stop reading now, if you like. But since we are on the subject of good journalism and meta-rational people, I’d like to discuss an example of exactly the opposite. Aleks Eror writes in VICE that the Chinese are engineering genius babies. It would be a great story, if it were true. Sadly, he bases this claim on the word of Geoffrey Miller, an American academic, who wrote a heavily biased article about it in January. What’s worse is that Miller’s argument had been criticised and debunked (sort of) immediately afterwards. But Eror chose to ignore that and rehash the fear-mongering story.

The central claim is that the Chinese have an institute where they are sequencing the genomes of the world’s smartest people to find the genes that make us smart. This is not a new story. Eugenics in some shape or form has existed for a long time. The problem with Eror’s piece is that he doesn’t run Miller’s claims by an expert.

A good science journalist should always do that, because of the complex nature of the subjects we usually deal with. Experts may not be the perfect meta-rational candidates that we’d like, but they are close enough. They bring an external perspective, and I’ve found that invaluable in my own work.

The trouble with self-deception

I recently argued that becoming a strategic self-deceiver is difficult but worth the effort. In the article that led me to this thought, David Brooks argued that those who weren’t entirely honest in their self-narratives led impressive lives. My aim, as always, has been to understand how to use this knowledge to help us.

This idea of self-deception has, in some form or other, been on my mind since I first wrote about it. To be frank, it has been a troubling thought. We can lead lives where we successfully hide the truth from our own selves – isn’t that messed up?

As a scientist, my life is a pursuit to uncover truths about the world. It’s my day job, and when one spends a large part of one’s life doing just that, it can be very troubling to realise that inside my own head I may be tucking away truths that I am already aware of!

The self-deception paradox

Philosophers argue that one cannot convince oneself that something is false when one knows it to be true; that is, one cannot be successful at self-deception.

Here, of course, they are assuming that the person is trying to deceive themselves consciously. I argue that most self-deception occurs unconsciously, which is why self-deceiving is not just possible but, to a certain extent, easy.

The helpers of self-deception

What aids self-deception is that we aren’t perfectly rational beings. Our thought process is flawed, of course, but beyond that even the machinery that runs it is far from perfect. We are the victims of our own biological shortcomings.

In The Seven Sins of Memory, Daniel Schacter shows that human beings have a remarkable ability to mess with their own memories. We don’t just forget things but are also able to create false memories and selectively block out others. When so much of our lives is built on our past, I shudder to think that the memories that form the foundation of the building called me may not really be ‘true’ memories.

When both my rational self and the knowledge of the world that I hold are being questioned, it’s not hard to see why exploring self-deception has been a troubling experience.


Living under an illusion

Sure, I’d like to change the world. Expecting that I will is plain wrong though!

All my life, different things have motivated me to do what I have done. But for the past few years, a constant driving force behind the choices I make and the work I do has been the impact that those choices and work have on the world. Unfortunately, I have been deceiving myself into believing that what I am doing has had, or is going to have, a measurable impact on the world.

For the past three years I have been working on synthesising a large chunk of an even larger molecule. The way I put it to ninth graders recently was that I am attempting to stitch atoms together in a very restricted manner. I am using the technology that chemists have developed over the past two hundred years to produce in the lab something that took nature millions of years. Sounds cool, and it is.

And yet, when I finish writing my thesis, I am not sure if it will be read by more than a handful of chemists in its lifetime. The paper that will eventually be published in a reputed journal may be read by a few hundred chemists around the world, and a small percentage of them may even cite my work.

A total of ten man-years of work, including my three, and ~£1 million of taxpayers’ money will have what impact on the world of chemistry, or on the world in general? Maybe nothing and maybe a lot; I don’t know.

This blog will very shortly reach the 100,000-hits mark since it was brought back to life in June 2009. What impact has my writing had on the world? I don’t know.

“Some people will bring a small stone to the building called science and some people will bring a big one, but nevertheless no one can take that stone away from you.” These words of the Nobel laureate Jean-Marie Lehn may soothe my scientist soul, and maybe I can find such words to do the same for my writing soul. But I cannot deny that walking into something thinking it will make a measurable impact on the world is a little foolish.

Looking back at one’s activities, one may be able to understand what ‘impact’ those activities have had; looking forward, it is incredibly hard to predict that impact. But such is human nature that, as someone venturing into a new area of work, I find it hard to convince and motivate myself to keep working hard if I can’t see the impact of that work.

I posed this as a question to someone who has been working in sustainability for the past ten years, after having switched from a successful career as an accountant. The answer I got was an obvious one, but I think I needed to be told. He said, “The world is incredibly complex. One may never really be able to understand the impact of one’s work and, in this case, the only piece of advice I can give you is something that won’t be satisfying. Learn to let go of expectations and you will find it simpler to deal with the world and keep doing the incredible work that you are doing.”

Knowing this is one thing, applying it to my life is another.

Related: It should be about choices not goals

Becoming a Strategic Self-Deceiver

A few weeks ago David Brooks of the New York Times asked his many readers for a gift, “If you are over 70, I’d like you to write a brief report on your life so far, an evaluation of what you did well, of what you did not so well and what you learned along the way.”

His reason to make such a request was clear, “These essays will be useful to the young. Young people are educated in many ways, but they are given relatively little help in understanding how a life develops, how careers and families evolve, what are the common mistakes and the common blessings of modern adulthood.” From the many essays that he received, he tried to extract a few general life lessons. One of those many fantastic lessons took me somewhat by surprise.

Beware rumination. There were many long, detailed essays by people who are experts at self-examination. They could finely calibrate each passing emotion. But these people often did not lead the happiest or most fulfilling lives. It’s not only that they were driven to introspection by bad events. Through self-obsession, they seemed to reinforce the very emotions, thoughts and habits they were trying to escape.

Many of the most impressive people, on the other hand, were strategic self-deceivers. When something bad was done to them, they forgot it, forgave it or were grateful for it. When it comes to self-narratives, honesty may not be the best policy.

Self-development is something I am passionate about, and self-examination enables me to pursue that passion. Even though I wouldn’t call myself a self-examination expert by any stretch, reading the above made me question what it is that I actually do when I work on self-development. I don’t want an obsession with improving myself to lead to an unhappy life!

But, while re-reading the paragraph, I realised that the more impressive people were ‘strategic self-deceivers’. They selectively chose to lie to themselves about certain mistakes they had made, or certain faults they had, and moved on with life, eventually leading impressive lives.

So was the route I took to self-development through self-examination wrong? Not entirely. To be able to successfully deceive oneself, one must know oneself well enough. Self-examination is then a necessary tool.

What I gather from this reflection is that self-examination is a double-edged sword. It can give us exactly the information about ourselves that we need to build our confidence or, if overdone, it can reinforce the very thoughts and habits that we want to overcome.

But now let’s come to the more interesting part of the lesson – strategic self-deception.

Philosophers argue that one cannot, in principle, be successful at self-deception: one cannot convince oneself that something is false when one knows it to be true. But leaving the philosophers aside, there are practical ways in which we employ self-deception, I would argue, every day. I am ready to bet that anyone reading this is probably deceiving themselves about multiple truths right now (of course, if you are able to recognise exactly what it is that you are deceiving yourself about, then you will have failed at self-deception!).

An example of self-deception is when we try to forget the pain associated with a certain event. It could be a break-up or the passing away of a loved one. We may be capable of accepting what happened and moving on, but the shorter, faster route is to lie to oneself. It may also be an intermediate step towards acceptance (when unsuccessful at pulling off this self-deception, we are said to be in denial).

Forgetting events, overcoming fears, facing danger and taking risks all, to a certain extent, involve self-deception. We know, at some level, that we are very bad at analysing risks, and to be able to do something risky (like bungee jumping) we have to convince ourselves that it is not as risky as it seems. In that sense, self-deception enables a person to take risks that they would not normally take.

A strategic self-deceiver is one who is able to trick themselves about the right things. Of course, knowing what the right things are is the more difficult part.