Eric Schmidt would approve of the new Quartz homepage

The homepages of all news websites are pretty much the same: some pictures and lots of headlines, all linked to full stories. Until last week, as far as I know, every news website in the world had a homepage except one. Now even that exceptional news website—Quartz—has succumbed.

Launched in 2012, Quartz wanted to be like The Economist but for the 21st century… “embodying the era in which it is being created”. When you visited qz.com, you didn’t reach a homepage. Instead you were dropped onto whatever was the top story at that moment. If you didn’t like it, you could simply scroll down to the second most important story, or choose one from the sidebar.

Their logic for this design was pretty simple: most people were coming to news websites through the side door of social media. A few months ago, a leaked New York Times report showed that unique visitors to the Times’s homepage fell from 160 million in 2011 to 60 million in 2013, which only reaffirmed the Quartz stance. Quartz proclaimed, while acknowledging that the argument was self-serving, that the “homepage is dead”.

Quartz’s audience growth since launch: unique visitors, trailing three-month average.

The audience growth chart suggests that the “no homepage” strategy was working. And yet, this week Quartz introduced a homepage. Living up to the company’s spirit of experimentation, the homepage design is unusual.

Homepages are boring…

Most homepage designs are boring, partly because of their function (and partly because of the old mindset of the newspaper industry). If you want to give readers what you think are the top stories and still leave them a choice, you need a page where headlines and pictures can be placed strategically, so that you can nudge readers towards the stories you deem important.

Not having a homepage may seem to be the lazy approach. The common belief is that an editorial staff is paid not only to produce important and interesting stories, but also to help readers navigate this complex world by showing them which stories are most worthy of their attention. When you don’t have a homepage, you are letting readers come to their own conclusions about what is important and what is not. And faced with this extra effort that you demand of them, they may decide not to read your website at all.

But Quartz didn’t seem to care, and neither did its readers. In the two years since launch, only 10% of visitors came to Quartz stories via qz.com. The rest took the side door: social media, direct referrals, search engines and email.

…but they still matter

Homepages are designed to increase reader loyalty, which is one reason that, despite falling traffic, they remain central to how news websites function. People go to news websites when they are bored at work or when they want to know what’s going on.

When you visit the website of a large news organisation, you are guaranteed to find at least one story (linking, of course, back to their own coverage) on the most important happenings in the world. But if you are a startup news website with a small editorial team, how do you compete with the big dogs?

Quartz found the answer in its email newsletter. In less than two years, the daily newsletter—the digital equivalent of a printed newspaper—was being sent to 70,000 subscribers. More than 40% of those subscribers opened it every day, a surprisingly large proportion.

The success of the newsletter—called the Daily Brief—spurred Quartz to create a homepage in a bid to leverage this loyalty further. The new homepage consists of continuously updated summaries about “your world right now”. Right now, there are 10 summaries with multiple links in each. And, like the newsletter, the links don’t always take you to a Quartz story.

Eric Schmidt would approve

Not linking only to Quartz stories is how the homepage can compete with the big dogs. This move gives Quartz the freedom to choose the best story from any news organisation in the world, and still build a loyal readership for its own homepage. Eric Schmidt, Google’s executive chairman, would approve. He recently said, “The best way to stay ahead is a laser focus on building great products that people need.”

What people need from a news website’s homepage is an update on the world around them and high-quality information to put things in context. It doesn’t matter to readers whether that information comes from your own reporters or from a Guardian reporter. As long as your homepage provides links to the best information, loyal readers will come back. This is one reason news aggregation websites have become so popular.

The downside is that readers may not click through to Quartz stories as often. But that trade-off is worth it if the total number of visitors to the homepage goes up, because then the absolute number of clicks will increase on both Quartz and non-Quartz stories.

The new-style homepage is also fertile ground for experiments. For instance, based on the number of clickthroughs, Quartz can gauge reader interest in particular stories. If a non-Quartz story is doing very well, that could prompt the newsroom to cover the story in Quartz’s own style. And when they do, they can simply swap the link and retain the reader.

If nothing else, as senior editor Zach Seward told Nieman Lab, “If you don’t build a homepage for people to go to, they’re not going to come to it.” I have a feeling that I will use the Quartz homepage more often than I use the Daily Brief.

An attempt at setting the information balance right

The problem is not that there is too much information, but that there is too little of the right kind

In his brilliant book A Short History of Nearly Everything, Bill Bryson shows through many examples how history often credits the wrong person. These examples show that being in the right place at the right time, or publishing your ideas in the right publication so that the right people notice them, is sometimes more important than having the idea itself.

For instance, today the great astronomer Edwin Hubble is credited with discovering that we live in an ever-expanding universe. However, it was an astronomer with the cheerily intergalactic name Vesto Slipher who should have got the credit.

Carl Scheele (Credit: Wikimedia)

Or take the example of Carl Scheele, a Swedish chemist who discovered eight new elements—oxygen, nitrogen, chlorine, fluorine, manganese, barium, molybdenum and tungsten—and received credit for none of them. His work was either overlooked or reached publication late, after someone else had made the same discovery independently. The credit instead went to chemists of the English-speaking world.

And, if you’re still not convinced, try Josiah Willard Gibbs, whom Bryson calls the “most brilliant person that most people have never heard of”. Between 1875 and 1878 he produced a series of papers on the thermodynamic principles of nearly everything, but published them in the Transactions of the Connecticut Academy of Arts and Sciences, a journal that “managed to be obscure even in Connecticut”. Although Gibbs was recognised later in life, most of his work remained hidden for too long, at great cost to the scientific enterprise.

Information games

The reason I am telling you all this is not just because it is interesting, but because there is a lesson in these examples. All three scientists who got scooped lived in an age when information was scarce and travelled no faster than a moving vehicle.

We live in an age of information excess, in which information travels as fast as it ever will (ie at the speed of light). However, we are still stuck with one problem those gentlemen of the 19th century faced, and perhaps it has become worse: who receives what information matters even more today. With the internet throwing up interesting things on our screens every day, are we getting the information we really need?

I’ll explain the problem with two examples. The first comes from how we learn history, and it came to my attention during my first months in Oxford. When I asked non-Indians what they thought about the British colonisation of India, I got a view quite different from what I was taught at school in India.

Partisan views

Most people acknowledged that there was imperial excess. They bemoaned the human cost of the partition of India and Pakistan, for instance. But they, particularly my British friends, also praised how the British gave Indians railways, law and order, the English language and, some even suggested, the Indian identity. Most importantly, they saw Indian independence as the British leaving India. This was a time, one said, when Britain was relinquishing control of other colonies too, following the domestic troubles that World War II had caused.

Sardar Patel. Credit: Govt of India

This wasn’t what I was taught at school. Indian textbooks, it seemed, glorify the independence struggle. I was surrounded by some of the smartest people I had ever met, so instead of questioning them I went back and read the history. What I soon realised, though, was that neither my British friends nor I had the balanced view one ought to have of this important period in world history.

I wasn’t expecting my British friends to know about Mangal Pandey or Bhagat Singh, but I thought they would be aware of the Quit India Movement and the Jallianwala Bagh massacre, and that they would recognise the role of freedom fighters, such as Sardar Patel, in accelerating India’s independence. Equally, I suspect my British friends thought I should have more appreciation for the things the British left in India.

The point here is that where the information came from changed how people viewed the world. In this case the topic was a well-studied period of history, and so I was able to educate myself enough to get a balanced view. But what about more recent events?

History’s value

In The Sceptical Patriot, Sidin Vadukut analysed the history textbooks used by Indian kids today. He found that none of them devotes any space to the post-Independence period. Beyond a little on the Indian constitution, there is nothing about the wars with Pakistan and China, or about the Naxal movement, which is considered the greatest threat to India’s internal security.

Why should Indian kids learn about this stuff? Vadukut’s story might give you an answer:

Sometime in 2002 or 2003, a group of Japanese Hibakusha, or atomic bomb survivors, visited Chennai. The city was a stop on what I think was a global tour to promote peace and condemn nuclear weapons. They decided to visit a primary school and tell the students why the idea of nuclear weapons was a bad one.

I read about the school visit in one of the local newspapers. I don’t recall which one, and no amount of searching online has thrown up the original news report. But the broad details of what happened are seared into my mind.

After the presentation, the Hibakusha asked the children: should countries go to war? No, they all said in chorus. Should countries use nuclear weapons? No. Should India use nuclear weapons? Never. What if the enemy is Pakistan? Oh, Pakistan is a special case, the kids said, we should totally nuke them.

Every time I retell this story at a public forum, there is an explosion of laughter followed by an awkward silence.

Absence of proper facts

This brings me to the second example, which is closer to me—science journalism. The media landscape of the west is changing. The old media houses are losing audience and revenue at such a pace that it feels like a crisis. The first to lose their jobs tend to be specialist journalists, such as those covering science, environment and health. Sometimes the sections live on, but they shrink in size and depth, and the reporting gets done by general reporters instead.

There have been some positive changes, with new media organisations trying to fill the gap. Yet even with shrinking newsrooms, western media’s science coverage remains far better than Indian media’s. A survey of Indian newspapers that I did a few years ago would give the same results today—proper science reporting doesn’t exist.

This is a problem. Most of the time, if you come across a science story in an Indian newspaper, it is one syndicated from an international outlet, such as AP, Reuters or The Guardian. Despite the international nature of science, Indian readers are fed the writing of western journalists. The stories become less relevant and thus less interesting. While the hard-science stories still hold some value, those about the environment, health or even technology don’t.

This matters because policy depends on the quality of information that decision-makers get. One more example from Vadukut’s book makes the case absolutely clear. In March 2008, Daggubati Purandeswari, then minister of state for human resource development, speaking about India’s education system, parroted “facts” about Indians abroad that had been circulating in a hoax chain email.

“Sir, as rightly pointed out by the honourable member, our students have been placed very well globally. For example 12% scientists in the United States are Indians. We have 38% of the doctors in the US who are again Indians. 36% of NASA scientists are again Indians. So, the students are doing very well, and they are reaching places which again reflects on the quality of education that is being provided to our children in the country.”

All of these “facts” can easily be verified as false, but the honourable minister did not think she needed to do that. Her words were then reported in the Times of India the next day as “facts”. And that is how facts get made up.

Science suffers

That aside, had there been good science journalists writing about the achievements of scientists in India, perhaps Purandeswari would not have relied on information from dodgy sources. India’s premier institutes, such as the Indian Institutes of Technology, never figure in the world university rankings. From the information available to the minister today, she cannot tell whether the absence of the IITs from such rankings reflects the poor quality of research or the poor communication of that research.

There is also another side to this story. In my time at Oxford, whenever a paper was published in an Indian or Chinese journal, I was explicitly advised not to give it too much weight. The suggestion was that research in these journals was not reliable, which in other words means that the researchers were making up data.

This kind of opinion may have been formed by experience. But mostly it was the result of western media reporting negative science stories emerging from China and India, which makes academics wary of trusting such research. This, I believe, must lead to genuinely good research produced in these countries being overlooked.

The solution, of course, is one that will require change from the big editorial houses. In my conversations with Indian journalists, I’ve been told many times that there is a thirst for good science content but no publication is ready to provide it.

A different solution, which a friend and I are pursuing, is to collate good science stories from around the web that focus on the other side of the world. While this doesn’t meet the need for more science reporting, it at least provides a central place to find good content related to India. Some of that content exists, but sadly it tends to be spread across different publications rather than gathered in one place. (You can sign up for our newsletter here.)

Those gentlemen of the 19th century suffered personal loss because key information didn’t reach the right audience. Today’s tragedy is the same, but the reason is different: key information is not reaching the right audience either because there is too much information and very poor filters, or because there aren’t enough people to collect and present the information that is so desperately needed.

Lead image: teresaling. Thanks to Deeksha Sharma and Vasudevan Mukunth for reading drafts of this post.

Soliciting negative feedback is hard, but you must do it often if you care about progressing

Last week, I tried an experiment in self-promotion. For my birthday, I made a wish that I shared with my Facebook friends: that they would read more of my writing. The experiment went well. That post became one of the most-read pieces since I moved my blog to this website last year. Lots of friends subscribed, and many told me how they had already been enjoying my work.

But I wanted feedback on whether this experiment was really worth it. After all, I didn’t want it to sound like a sales pitch. This was a genuine request, and I wanted to know if it came across that way.

I’m lucky to have a group of really smart people whom I can ask for critical feedback. The group approved of my experiment, and the prompt led to a valuable discussion on feedback, which started, as many things do, with Elon Musk.

Feedback loop

Most of the time we walk around thinking that we are doing the right thing. That is important, of course, because if we were not confident in our abilities then we would not be able to function. But from time to time we must solicit feedback to help us spot faults and find better ways of doing things.

This might seem like common sense, but Elon Musk, one of the most successful entrepreneurs alive, says that most people don’t seek the feedback that matters. He says we must not just seek feedback; we must specifically seek negative feedback. (As an aside, the operative word here should be critical, meaning negative and analytically founded.)

When asking for feedback, if you don’t explicitly ask for negative feedback, chances are you will never get it, because people usually withhold it for fear of hurting our feelings. This human tendency to be soft on others leads to ineffectiveness. Even when negative feedback has to be given, it is usually sugar-coated, which often fails to prompt the action that is needed.

Forget niceties

“Truth is a hard apple to throw and a hard apple to bite.” These are slightly modified words of the American author Donald Barthelme. One way of allowing such hard apples to reach you, at least at an individual level, is to set up a system for soliciting feedback anonymously. With such an option, those giving feedback can forget niceties and really get to the point. It is also easy to do. For instance, here is a simple Google form where you can leave anonymous feedback for me.

However, before you jump to setting up your own form, remember that negative feedback can (and will) hurt. You need to be sure that you are ready to hear nasty stuff. Smarter people than I have thought about this, and they’ve developed rules that might help.

If you are thinking of soliciting anonymous feedback, try to abide by Crocker’s Rules (reproduced in full at the end of this post):

Declaring yourself to be operating by “Crocker’s Rules” means that other people are allowed to optimise their messages for information, not for being nice to you.

It means that you have accepted full responsibility for the operation of your own mind—if you’re offended, it’s your fault. Anyone is allowed to call you a moron and claim to be doing you a favour.

While Crocker’s Rules are simple, they are not easy to follow. In launching my own anonymous form, I’m taking a risk. But I do believe that the payoff will be worth it.

Mass change

At an organisational level, most places already have regular appraisals in place. However, these tend to be too formal for their own good. This can hurt an organisation, especially one that is growing rapidly or one where roles change often.

For honest feedback to work at that level, behaviour will need to change, and behavioural change is hard. People will need to be encouraged to give feedback, and a system will need to be in place to help them manage it. Organisations can’t force people to follow Crocker’s Rules. But the human resources department can do something to help, if it wants such a culture to flourish.

An experiment that has worked at some leading tech firms is radical transparency. Except for purely personal emails, every email is shared with everyone else in the organisation. Someone new to a project can go and read all the emails, all the way back if they want, and problems are uncovered more quickly. It’s hard to pretend everything’s going well with the customer when the email thread shows it’s not. (Of course, email volume will be high, but filters and selective reading can go a long way.)

One way or another, you must do your best to solicit negative feedback and do it often. If you care about progressing quickly, that is.

***

Thanks to Alex Flint, Christo Fogelberg and Xiao Cai for ideas and feedback. Image: gforsythe

***


Crocker’s Rules in full

Declaring yourself to be operating by “Crocker’s Rules” means that other people are allowed to optimise their messages for information, not for being nice to you.

Crocker’s Rules means that you have accepted full responsibility for the operation of your own mind—if you’re offended, it’s your fault. Anyone is allowed to call you a moron and claim to be doing you a favour. (Which, in point of fact, they would be. One of the big problems with this culture is that everyone’s afraid to tell you you’re wrong, or they think they have to dance around it.)

Two people using Crocker’s Rules should be able to communicate all relevant information in the minimum amount of time, without paraphrasing or social formatting. Obviously, don’t declare yourself to be operating by Crocker’s Rules unless you have that kind of mental discipline.

These rules don’t mean you can insult people; it means that other people don’t have to worry about whether they are insulting you. Crocker’s Rules are a discipline, not a privilege. Taking advantage of Crocker’s Rules does not imply reciprocity. How could it? Crocker’s Rules are something you do for yourself, to maximise information received—not something you grit your teeth over and do as a favour. The rules are named after Lee Daniel Crocker.

If you are cursing Arvind Kejriwal for throwing away a golden opportunity, you are wrong

The Aam Aadmi Party has done the impossible. One month ago, the Facebook class of India loved the party for showing what Indian politics really needs. Now it hates the party for doing things we wouldn’t expect even from corrupt politicians.

To understand this schizophrenia that the AAP has created among the educated middle class of India, we need to look at our own expectations. Kejriwal’s success in the Delhi elections, we thought, was the result of our disgust with the current political class. We hate those politicians who go after vote banks only so that they can fill their own pockets with bribes. We wanted someone who could clean up the muck and put worthy leaders in charge. Like Obama, he was a beacon of hope in a mess that had been decaying for decades.

As soon as he took office as Delhi’s chief minister, we expected Kejriwal to behave like a model chief minister—honest, sober and efficient. We wanted him to get things done. But unlike Obama, who crushed the American dream only after being re-elected, Kejriwal’s party members seem on track to crush our dreams within months of entering government.

The law minister, Somnath Bharti, was accused of taking the law into his own hands. The resident poet, Kumar Vishwas, was caught on camera making racist comments. And the leader, Kejriwal, was accused of acting in an un-statesmanlike manner when he resorted to protesting against his own state’s police—something no chief minister had ever done before.

And yet, we must realise that Kejriwal is smart enough to know that pandering to our desires is impossible in the few months he has before the national elections. He is partly right to blame the media for running a hate campaign; for them, he has been a bonus. With Kejriwal, the media has one more top politician to poke beyond Rahul Gandhi and Narendra Modi.

Kejriwal’s best strategy for the national elections is the same strategy he took into the Delhi elections: win enough seats for the AAP to be a trouble-making opposition. To achieve that goal, it would be best to stick to the principles that won him Delhi, principles even the Congress and the BJP have admitted to learning from: offering people an alternative to the current crop of filthy politicians.

Of course, national elections are going to be a whole different game. Delhi’s population is all urban, but India remains largely rural.

In the Delhi elections, the AAP got 30% of the votes, slightly behind the BJP’s 33% and ahead of the Congress’s 25%. Kejriwal won not because of the few votes of the Facebook class, but mostly because of the votes of the poor—the rickshaw drivers, the slum-dwellers and the lower middle class (which can afford mobile phones but has no use for the internet).

He succeeded because he acted as he had for many years before—like a revolutionary. While it would be foolish to think that he does not enjoy the power of being chief minister, I would like to believe him when he says that he does not want that power if he has to compromise on his ideals. The Congress can withdraw its unconditional support and he won’t be chief minister any more.

At this point I should disclose that I do not support Kejriwal’s economic policies, at least the ones he has shown so far. But I’m willing to experiment with having a clean politician who improves the functioning of the government and exposes its predecessor’s wrongdoings, even if the immediate effect on people’s lives will not be beneficial.

Rise of the muffler man


I like Vir Sanghvi’s clarity of thought. That is why, when he said, “Kejriwal has become no more than a media-blaming, vote-bank politician”, I took his well-laid arguments seriously. Sadly, his conclusion is only partly right.

Yes, Kejriwal blames the media, but he is no vote-bank politician. Sanghvi’s analysis fails because it is short-sighted. He claims that giving away free water and cutting electricity prices in Delhi will win Kejriwal votes at the national level.

Kejriwal got into politics to play the long game. He recognises that what few votes he can get at the national level will come from exposing the tainted politicians, offering a good alternative and listening to the vast majority of people. He is moving the debate away from pitting personalities against each other to talking about values and ideas.

As in Delhi, there is no way the AAP can form a majority government at the national level this year. It would be lucky to play even a small part in forming the government. Instead, the AAP is aiming for the election of 2019. In five years the party will have matured, and the country will have grown tired of Modi. That is when the AAP will be a serious alternative—something Indians have wanted for decades.

Image credit: Kumar Vishwas (not original source)

My worst days as a kid gave me the most valuable productivity hack

Radiolab, one of my favourite things on the internet, ran an episode on morality a few years ago. The main question it asked was whether our sense of morality is something we are born with or something we learn. If it is the latter, then how? The one-hour episode is worth your time, but there is one aspect of its three-part story that I want to discuss.

In the first part, they observed chimps to find that even among our primate cousins there exists a rudimentary system of morality. Their example comes from how these chimps share food that is given to a group. Some neuroscientists and philosophers argue that each one of us has this rudimentary level of morality, inherited through millions of years of evolution—the “inner chimp” hypothesis, as one scientist puts it.

In humans, this inner chimp starts acting when we are two or three years old, and we then refine it as we go through life. Most of this refinement, not surprisingly, happens when we are kids, as we see in the second part of the episode, which involves an example from the school life of Amy O’Leary, now a New York Times reporter.

In grade four, one of Amy’s teachers got the kids to play a game that would help them learn some history. The game was called Homestead, and it was something like Monopoly: each student received certain resources and had to play by simple rules to win. The main rule of the game was: “Do what you think is right.”

Amy cheated. She looted some of her classmates who wanted to be part of an in-group and even flooded the market with fake money. The teacher who ran the game realised what was happening and called a meeting of those who were involved. He asked Amy, who had become a leader by now, “What are you going to do?”

When she said “nothing”, the teacher used it as one of those “teachable” moments and showed his disappointment in her without directly intervening in the game. That look on her teacher’s face has stuck with Amy all her life.

Just as she finished telling the story, I was flooded with memories of the many guilty moments that have left such moral lessons in my life.

These moments are painful. Most of them involve a scene at the end where my mum is tired of reprimanding me, or my dad is about to get angry, which happened on rare occasions and terrified the hell out of me. Some are absolutely clear, as if the expressions on people’s faces were recorded on a photographic plate in my head. Others are vague, with the characteristic fog that blurs the details leaving only a strong sense of shame.

This, I’m sure, has happened to many of us. I asked a few of my friends, and all shared anecdotes of times when they did something wrong but only realised it in hindsight. A painful lesson was learnt, and they carried it all their lives.

One such lesson, etched into me, has helped me tremendously. It involves mischief and feedback.

Beat me to it

As a naughty child I may have done some wrong things, but mostly I did annoying things. Of course, in my head, I was only having fun and that was no crime. But people around me seemed to have low tolerance, and I was told off too many times. I hated it, mostly because I wasn’t given a proper reason why my actions were annoying. And if I didn’t know why what I did was wrong, I wasn’t going to learn to do the right thing.

That is why the annual school open day was the worst day of the year. It gave the teachers an opportunity to complain to my parents and get their grievances off their chests. They did this in a polite manner, which I always thought was wrong. Wouldn’t it have been better to talk to me politely in the first place?

All that admonishment left scars. It might have taught me something about morality, but more importantly it gave me a really useful hack.

In hindsight, I realise that I didn’t understand why I was being stopped because my natural empathy levels were low. I know that partly because, even as an adult, I sometimes struggle with empathy. To counter that lack, I developed a system of wanting a constant stream of feedback. I want to know as often and as early as possible if I’m going wrong somewhere.

But I also needed to develop a thick skin. When you are open to feedback, some of it will be constructive and much of it hurtful. This, as the thinker Seth Godin puts it, is why adults don’t seek feedback as often as they should.

When you can get this system to work, however, it’s absolutely fantastic. It is the best productivity hack I know for becoming better at something quickly: ride the cycle of practice and feedback. It has been a painful but useful lesson.

Image credit: Zen

Most don’t understand the English passive, but they are ready to criticise it anyway

Geoffrey Pullum, professor of general linguistics at the University of Edinburgh, has published a wonderful paper titled Fear and Loathing of the English Passive. His main claim, which he demonstrates easily with many examples, is that most people, including professional writers, journalists and authors of usage guides, don’t understand the English passive well enough to criticise it properly.

Pullum is a regular blogger, and his 23-page academic paper is quite readable. But I’m pulling out a few examples here that make his frustration easy to understand. The first is from a respected book of English usage, The Elements of Style by William Strunk:

The active voice is usually more direct and vigorous than the passive:
I shall always remember my first visit to Boston.

This is much better than
My first visit to Boston will always be remembered by me.

The latter sentence is less direct, less bold, and less concise.

Pullum says, “Directness, boldness, and concision are not even relevant here, because Strunk’s disrecommended example … cannot be used in any normal kind of context.” And if you think about it, that makes sense. The passive construction is just odd and would never be used in spoken English, let alone written English.

Here’s another one from the BBC News Style Guide:

Compare these examples. The first is in the passive, the second active:

1. There were riots in several towns in Northern England last night, in which
police clashed with stone-throwing youths.

2. Youths throwing stones clashed with police during riots in several towns in
Northern England last night.

The main reason for recommending against the passive is that it tends to obscure or attenuate agency (ie the doer). But, as Pullum writes, “the former is not a passive, and no clear agency or responsibility issue arises (in both versions the youths threw the stones, and in neither version is the instigator of the riots named or implied).”

The best example comes quite early in the essay, thanks to its “strangely ill-chosen metaphor”. Sherry Roberts writes in 11 Ways to Improve Your Writing and Your Business:

A sentence written in passive voice is the shifty desperado who tries to win the gun-fight by shooting the sheriff in the back, stealing his horse, and sneaking out of town.

The giveaway is the word “written”, itself a passive. As Pullum writes, “Notice that she unthinkingly uses a passive while making the above statements.”

Finally, Pullum nails it with an analysis of Orwell’s own writing. It was Orwell who wrote in his now-famous essay Politics and the English Language: “Never use the passive when you can use the active.”

By my count, about 17% of the transitive verbs (those that require an object) in random prose are likely to be passive, while a careful count of the whole of Orwell’s essay shows that 26% are passive … Orwell uses more than one and a half times as many passives as typical writers.

In the paper, Pullum addresses the main criticisms that could be levelled against him: that maybe he is being too prescriptive, that language changes, and that people’s definition of the passive is broader than the one he lays out. But the sheer breadth of examples that his blog readers have brought to his notice makes it clear to him that this isn’t the case.

How and why has this happened?

Oversimplification and overkill by well-meaning advisers may have a lot to do with it. It is right and good, of course, to instruct students and novice writers in how they might improve their writing. But handing them simplistic prescriptions and prohibitions is not doing them any favors. ‘Avoid the passive’ is typical of such virtually useless advice.

As I have always understood it, the passive voice should be avoided if it affects clarity. But Pullum argues that, when most people cannot even recognise what is a passive and what is not, the standard teaching about shunning the passive “should be abandoned entirely”.

Even if they managed to follow the advice rigorously (which they can hardly do if it is not clear to them what a passive is), it would usually not improve their writing one whit. It would certainly make them write less like great writers of the past—and more like a little child.

Taking Pullum’s advice seriously, we should at least understand how to spot a passive. Here’s a short guide from the University of North Carolina’s Writing Centre (a toy code sketch of rule 1 follows the list):

  1. Look for the passive voice: “to be” + a past participle (usually, but not always, ending in “ed”). If you don’t see both components, move on.
  2. Does the sentence describe an action? If so, where is the actor? Is he/she/it in the grammatical subject position (at the front of the sentence) or in the object position (at the end of the sentence, or missing entirely)?
  3. Does the sentence end with “by…”? Many passive sentences include the actor at the end of the sentence in a “by” phrase, like “The ball was hit by the player” or “The shoe was chewed up by the dog.” “By” by itself isn’t a conclusive sign of the passive voice, but it can prompt you to take a closer look.
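
To see how crude rule 1 is in practice, here is a minimal sketch of that heuristic in Python. The regex and the function name are my own, purely illustrative; a real detector would need a part-of-speech tagger. As the comments note, this version misses exactly the kinds of passives discussed next.

```python
import re

# Forms of "to be" that can head a passive construction (rule 1).
BE_FORMS = r"(?:am|is|are|was|were|be|been|being)"

# Crude version of rule 1: a form of "to be" followed by a word ending
# in "ed". It only catches regular past participles, so irregular ones
# ("written", "hit") slip through, as do the prepositional, bare, get,
# adjectival and concealed passives listed below.
PASSIVE_PATTERN = re.compile(rf"\b{BE_FORMS}\s+\w+ed\b", re.IGNORECASE)

def looks_passive(sentence: str) -> bool:
    """Return True if the sentence matches the rule-1 heuristic."""
    return bool(PASSIVE_PATTERN.search(sentence))

print(looks_passive("The shoe was chewed up by the dog."))                   # True
print(looks_passive("My first visit to Boston will always be remembered."))  # True
print(looks_passive("Youths throwing stones clashed with police."))          # False
print(looks_passive("Marie got photographed."))  # False: a "get" passive the pattern misses
```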

There are, however, some problems with such simplistic advice. For example, rule 1 would exclude many types of passive:

  1. Prepositional passives: eg. He was laughed at.
  2. Bare passives: subject+past participle. eg. That said, however, I like girls. One of its ads shows a washed-out manager, arms folded, sitting in a corner.
  3. Get passives: got+past participle. eg. Marie got photographed.
  4. Adjectival passives: eg. The door seemed locked, as far as I could tell.
  5. Concealed passives: eg. The situation needs looking into by experts.

Similarly, rule 2 won’t catch an adjectival passive. And rule 3 won’t catch the many short passives that omit the by-phrase (where the actor may not be obvious).

All that to say: perhaps the best advice to follow on the debate about passives is that we must worry less about using (or spotting) passives and more about achieving clarity in writing by whatever means possible. This advice, I think, might be less controversial.

Thinkers should wallow in the middle ground, but doers should choose a side

A widely accepted definition of progress is the improvement in the standard of living of the greatest number of people, and by that definition the world has progressed a great deal since the beginning of civilisation.


As a crude indicator of progress, over the last 2,000 years per capita GDP (gross domestic product) has increased from a few hundred dollars to about $7,000 (in 2000). But even if on average humanity has progressed rapidly, most of that progress has happened in fits and starts, and at different times it has benefited different groups of people.

Consider, for example, the fact that real incomes in the UK scarcely doubled from the beginning of the common era to 1570. They then tripled from 1570 to 1875, and more than tripled from 1875 to 1975. Yet, from 1770 to about 1830, during the industrial revolution, real wages in Britain remained stagnant.

Ryan Avent, economics correspondent for The Economist, makes a case that technological progress disproportionately benefits those with capital, before raising everyone’s income in the long term. During these short periods of high innovation, the creation of inequality in society may be inevitable. (He further argues that we may be in just such a phase right now.)

This is why techno-optimists (myself included) need to be careful. There is an expectation among this breed that technology will always lead to progress within their lifetimes—try saying that to the textile workers of the industrial revolution. When slagging off technocritics like Evgeny Morozov, it is worth keeping in mind that neither extreme of the argument is correct.

Neither left nor right

Another place where disillusionment is common is on the left-right political divide. Those on the left think progress will come through reducing inequality and providing everyone with the same opportunities. Those on the right think survival of the fittest through competition is the only way humanity has progressed so far. History proves both of them wrong.

Take the example of US presidents. Republican presidents, broadly representing the right, have held power for 88 years, whereas Democratic ones, broadly representing the left, have held it for 85. In the UK the corresponding numbers for prime ministers are skewed slightly to the left, but not by much.

More often than not, however, people elect a party with an opposing ideology as they get fed up with the policies of the ruling party. Continuous power for the same ideology over a long period is the exception rather than the norm.

This suggests that progress is often achieved by a mixture of left and right policies. Competition is good, but it can lead to crony capitalism. Egalitarianism is great, but it can lead to stagnation, as the history of communist governments makes clear.

(An exception is the likes of China and Singapore, which have single-party rule and have still done spectacularly well when it comes to “progress”. So what I’m proposing here should be taken to apply to countries that conduct free and fair elections, at least to a large extent.)

Being in the middle is not cool

Politicians on the left and the right bring their own baggage of biases to their time as leaders. The flip-flop between the ideologies of those elected to lead shows, in some ways, that people try to correct for the biases of their leaders. When a left-leaning party pushes a country too far to the left, say by making it less competitive in the global market, people elect a right-leaning party to correct the situation. (There may be other factors at play, including randomness, but I would argue that, on the whole, pre-election voter sentiment agrees with this hypothesis.)

So if this is the case, why is the following of centrist parties around the world so small? I’m not sure, but I think the answer may lie in the fact that human herd behaviour works best when people believe in a certain set of tenets very strongly. That must work better when there is a left-right divide than when those in the middle take beliefs from either side.

Another reason may be that it is easier to act in unison on certain kinds of beliefs, say as a blind techno-optimist, than from a position where one is continuously re-evaluating which side to lean towards. In other words, rationality in an individual or a small group matters less than the rationality of a crowd that may be split into two moderately extreme sides.

All this leads me to conclude that, for a thinker, it may be good to wallow in the middle ground. But for a doer, it would be better to choose one side and stick to it.

Thanks to Alex Flint and Deeksha Sharma for reading a draft of this article.