Thinkers should wallow in the middle ground, but doers should choose a side

A widely accepted definition of progress is the improvement in the standard of living of the greatest number of people, and by that definition the world has progressed a great deal since the beginning of civilisation.

As a crude indicator of progress, over the last 2000 years per capita GDP (gross domestic product) has risen from a few hundred dollars to about $7000 (as of 2000). But even if humanity has, on average, been progressing rapidly, most of that progress has happened in fits and starts: at different times it has benefited different groups of people.

Consider, for example, the fact that real incomes in the UK scarcely doubled from the beginning of the common era to 1570. They then tripled from 1570 to 1875, and more than tripled from 1875 to 1975. Yet, from 1770 to about 1830, during the industrial revolution, real wages in Britain remained stagnant.

Ryan Avent, economics correspondent for The Economist, makes the case that technological progress disproportionately benefits those with capital before raising everyone’s income in the long term. During such short periods of intense innovation, rising inequality may be inevitable. (He further argues that we may be in just such a phase right now.)

This is why techno-optimists (including myself) need to be careful. There is an expectation among this breed that technology will always lead to progress within their lifetimes; try saying that to the textile workers of the industrial revolution. When slagging off technocritics such as Evgeny Morozov, it is worth keeping in mind that neither extreme of the argument is correct.

Neither left nor right

Another place where such extremes are common is the left-right political divide. Those on the left think progress will come through reducing inequality and providing everyone with the same opportunities. Those on the right think survival of the fittest through competition is the only way humanity has progressed so far. History proves both of them wrong.

Take the example of US presidents. Republican presidents, broadly representing the right, have held power for 88 years, whereas Democratic ones, broadly representing the left, have held it for 85. In the UK the corresponding numbers for prime ministers are skewed slightly to the left, but not by much.

More often than not, however, people elect a party with an opposing ideology once they get fed up with the policies of the ruling party. A long, unbroken hold on power by a single ideology is the exception rather than the norm.

This suggests that progress is often achieved through a mixture of left and right policies. Competition is good, but it can lead to crony capitalism. Egalitarianism is great, but it can lead to stagnation, as the history of communist governments makes clear.

(An exception here is the likes of China and Singapore, which have single-party rule and have still done spectacularly well when it comes to “progress”. So what I’m proposing here should be taken to apply to countries that conduct free and fair elections, at least to a large extent.)

Being in the middle is not cool

Politicians on the left and right bring their own baggage of biases to their time as leaders. The flip-flop between the ideologies of those elected to lead shows, in some ways, that people try to correct for the biases of their leaders. When the left-leaning party pushes a country too far to the left, say, by making it less competitive in the global market, people elect a right-leaning party to correct the situation. (There may be other factors at play, including randomness, but I would argue that, on the whole, pre-election voter sentiment seems to support this hypothesis.)

So if this is the case, why do centrist parties around the world have such a small following? I’m not sure, but I think the answer may lie in the fact that human herd behaviour works best when people believe in a certain set of tenets very strongly. This must work better when there is a left-right divide than when those in the middle borrow beliefs from either side.

Another reason may be that it is easier to act in unison on certain kinds of beliefs, say by being a blind techno-optimist, than it is to continuously re-evaluate which side to lean towards. In other words, rationality in an individual or a small group matters less than the rationality of a crowd that may be split into two moderately extreme sides.

All this leads me to conclude that, for a thinker, it may be good to wallow in the middle ground. But for a doer, it would be better to choose one side and stick to it.

Thanks to Alex Flint and Deeksha Sharma for reading a draft of this article.

Following the lead

In this highly complex world, it is extremely difficult to be able to make rational choices for even the most basic of human actions.

Take the consumption of food: what is the best food to eat to minimise our impact on the environment and improve our health? The answer “be a vegan” is not as simple and straightforward as it sounds; many have put forth convincing cases against it.

While this is a topic I’ve followed for quite some time and hold views about, there are many others that I can’t spend the same amount of time analysing. In those cases I’m not afraid of following the lead of some trustworthy “thought leaders”.

Ain’t nobody got time for this?

There is an excellent paper by Tyler Cowen and Robin Hanson that discusses how to deal with self-deception. I have a longer blog post on it, but here is the most important bit:

For a truth-seeker, the key question must be how sure you can be that you, at the moment, are substantially more likely to have a truth-seeking, in-control, rational core than the people you now disagree with. This is because if either of you have some substantial degree of meta-rationality, then your relative intelligence and information are largely irrelevant except as they may indicate which of you is more likely to be self-deceived about being meta-rational.

How to stop fooling yourself

When I first wrote about self-deception, a little more than a year ago, I found the experience traumatising.

It can be very troubling to realise that inside my own head I may be tucking away truths that I am already aware of!

Now Eli Dourado has an essay in The Umlaut that discusses a paper by Tyler Cowen and Robin Hanson grappling with the issue of self-deception. Here’s the relevant bit:

Self-favoring priors, they note, can help to serve other functions besides arriving at the truth. People who “irrationally” believe in themselves are often more successful than those who do not. Because pursuit of the truth is often irrelevant in evolutionary competition, humans have an evolved tendency to hold self-favoring priors and self-deceive about the existence of these priors in ourselves, even though we frequently observe them in others.

Self-deception is in some ways a more serious problem than mere lack of intelligence. It is embarrassing to be caught in a logical contradiction, as a stupid person might be, because it is often impossible to deny. But when accused of disagreeing due to a self-favoring prior, such as having an inflated opinion of one’s own judgment, people can and do simply deny the accusation.

So how can we cope with self-deception?

Cowen and Hanson argue that we should be on the lookout for people who are “meta-rational,” honest truth-seekers who choose opinions as if they understand the problem of disagreement and self-deception. According to the theory of disagreement, meta-rational people will not have disagreements among themselves caused by faith in their own superior knowledge or reasoning ability. The fact that disagreement remains widespread suggests that most people are not meta-rational, or—what seems less likely—that meta-rational people cannot distinguish one another.

We can try to identify meta-rational people through their cognitive and conversational styles. Someone who is really seeking the truth should be eager to collect new information through listening rather than speaking, construe opposing perspectives in their most favorable light, and offer information of which the other parties are not aware, instead of simply repeating arguments the other side has already heard.

What sells in the name of journalism today is written to appease one side or the other. Opinions are important, but more important are the facts that inform them and the process used to form them. A good journalist should be a meta-rational person. He should also be able to find other meta-rational people, so that the world is made aware of facts and arguments that move things a step forward.

The problem, of course, is that meta-rational people care less about success and more about seeking the truth and understanding the world. Their voices are out there but harder to find, and they get far less attention than they deserve.

You can stop reading now, if you like. But since we are on the subject of good journalism and meta-rational people, I’d like to discuss an example of exactly the opposite. Aleks Eror writes in VICE that the Chinese are engineering genius babies. It would be a great story, if it were true. Sadly, he bases the claim on the word of Geoffrey Miller, an American academic, who wrote a heavily biased article about it in January. What’s worse, Miller’s argument had been criticised and (sort of) debunked immediately afterwards. But Eror chose to ignore that and rehash the fear-mongering story.

The central claim is that the Chinese have an institute that is sequencing the genomes of the world’s smartest people to find the genes that make us smart. This is not a new story; eugenics in some shape or form has existed for a long time. The problem with Eror’s piece is that he doesn’t run Miller’s claims by an expert.

A good science journalist should always do that, because of the complex nature of the subjects we usually deal with. Experts may not be the perfect meta-rational candidates we’d like, but they are close enough. They bring an external perspective, and I’ve found that invaluable in my own work.

The God of the gaps

One side of the debate over God’s existence leans on the notion that what science cannot explain today must be the doing of God. Neil deGrasse Tyson neatly describes this God as “an ever-receding pocket of scientific ignorance”.

But what’s funnier is that the God-of-the-gaps argument was actually used by Henry Drummond, a 19th-century evangelist, to chide scientifically oriented Christians for pointing to the things science could not yet explain as evidence of God.

Fixing the bugs in our brain

We human beings should be famous for doing irrational things predictably. No, I am not joking.

Dan Ariely, a behavioral economist at Duke University, wrote a book about it. The take-home message from his TED talk is that the only way to stop making some of those mistakes is to challenge our intuitions. The inherent difficulty of doing so cannot be overstated, but that is exactly why we make those mistakes time and again without realising it.

Get Real

You find a lady clad in a filthy ragged coat. She is holding onto something in her right hand and flipping it at a regular pace. She is mumbling, almost as if chanting, and her eyes are transfixed on something. What separates her from what she is staring at is a glass window.

The scene is not set in a Buddhist monastery in Cambodia or a busy temple in India but in a chemistry lab. The lady is a chemist wearing a stained old lab coat, holding her lucky charm and staring at the flask in her fume cupboard, in a hope that her reaction works this time.

Get Real – Chemistry World, March 2011 issue