Villagers armed with smartphones can help stem the rising rate of suicides in India

The stigma attached to mental illnesses is hurting India. Few are brave enough to speak about them to someone, and fewer still get treated. The result is that, for every 100,000 Indians between 15 and 29 years old, 36 commit suicide annually—the highest youth suicide rate in the world.

Worse still, according to Vikram Patel, professor of mental health at the London School of Hygiene and Tropical Medicine and one of Time magazine’s 100 most influential people of 2015, without urgent improvement in treating mental disorders, suicides will soon become the leading cause of death among the young.

Read more on Quartz. Also published in Lokmat Times.

Image credit: cgiarclimate under CC-BY-NC-SA license.


Glorifying the past is just a way of avoiding today’s grave problems

History beat out Marathi, marginally, as my least favourite subject at school. I would have loved history textbooks if I were allowed to read them like novels. But, no, we were made to mug up facts. The Battle of Plassey took place on 23 June 1757. The University of Oxford received its Royal Charter on 26 June 1214. Archduke Franz Ferdinand was assassinated on 28 June 1914 … and so the facts kept coming, in thick packets, with no time to digest them.

And I kept asking, “What’s the point of studying history?”, but never got a satisfactory answer until my teenage hormones had been supplanted by adult maturity. When I did get one, I could finally lay to rest all the unjust curses various historical figures had to bear for committing historically important acts on bizarre dates and under twisted circumstances.

The real value of history is not, as most think, in “teaching” us to avoid mistakes made in the past. History never repeats itself and no two years are ever alike, however much writers would love to draw parallels. Adam Gopnik of the New Yorker contends that not studying history would condemn humanity to the trouble of “presentism”, where we exaggerate “our present problems out of all proportion to those that have previously existed” and come to believe that “things are much worse than they have ever been”.

Making history do our bidding

In India, the exact opposite happens. Historical facts are misinterpreted or, worse, made up and turned into jingoistic propaganda. Instead of worrying about the troubles we face today, we boast about our historical achievements and claim that independent India’s potential is no different.

When talking about the country’s achievements, our leaders like to skip the period when conquerors pillaged and the British ruled, and look instead to a “golden past”, a time when, they believe, India’s wealth was unparalleled in the world and our achievements unprecedented.

Because many Indian kings of that supposed golden era were hardly benevolent rulers, these leaders choose instead to talk about our intellectual achievements, especially those in science. You must have heard from respectable people about how we had invented planes that could fly to Mars and back, how plastic surgery was used to stitch an elephant’s head onto a human, and how we made medicine to bring the dead back to life.

“This effort of creating a false history of science in India is a spectacularly bad example of the absurd lengths to which attempts at glorification of our past can go,” said leading scientist Roddam Narasimha in an editorial in Current Science.

If Gopnik’s worry for the West is that not studying history leads to presentism, then Indians need to worry about suffering from pastism. Our perception of our past is blocking us from working on the grave problems we face today.

And we find ourselves in this position for two reasons. First, we have not invested enough in studying the history of science in India. Second, we ignore the voices of the few scholars who have uncovered at least some of that true history.

In 2009, the Indian National Science Academy celebrated 50 years since the conception of the history of science programme. In an article that year, AK Bag, editor of the Indian Journal of History of Science, said that despite the programme’s efforts only about 40 source manuscripts have been thoroughly studied, leaving more than 100 such documents untouched in oriental libraries.

To be sure, ancient Indian science made some remarkable achievements. These include the first recorded use of plastic surgery to repair broken noses, the development and application of many key theorems in algebra, and even correct predictions of the motions of the solar system (centuries ahead of the Greeks). And, as we scour source documents, more are bound to be revealed. But that is no reason to make up fantastical notions of what our ancestors achieved.

This kind of behaviour may come about because, according to Narasimha, we do not have reliable history-of-science books for the masses. Without the right facts, teachers suffer, education is incomplete and it is easy to manipulate public perception. “Somebody needs to write such books,” Narasimha concluded.

First published in Lokmat Times. Image from Wikipedia. This post was corrected to attribute the Current Science quote to Narasimha.

Rotavac is not India’s first indigenous vaccine

While the recently released low-cost rotavirus vaccine, Rotavac, is a great achievement for the country, it is not the “first indigenously developed vaccine”, as the prime minister’s office claimed and newspapers then parroted. That honour goes to the bubonic plague vaccine developed in Bombay in 1897.

Nor is this just a matter of semantics, in which we ought to assume that by “indigenous” the prime minister’s office means developed in independent India. As we have seen in the past, the prime minister is only too happy to (wrongly) claim centuries-old achievements as “Indian innovations”.

India has played an important role in the history of using vaccines to fight disease. Their use means that the poorest of the poor today can live well beyond the age at which most kings died not too long ago. And the least we can do, as we take vaccination forward in India today, is to honour our past achievements.

Honouring history

The first vaccine was pioneered by the English scientist Edward Jenner in 1798. In the two centuries since, we have developed vaccines to fight 25 diseases. Fittingly, the disease against which the first vaccine was developed – smallpox – has been eradicated globally. The next disease that could be eradicated by the use of vaccines is polio.

But for nearly 100 years after the smallpox vaccine came into use, developing vaccines against other diseases remained difficult. This was because a vaccine then needed a naturally occurring weak form of the disease. In the case of smallpox, that weak form was found in cowpox.

However, almost by accident, Louis Pasteur developed a laboratory method to generate a weak form of a disease. He used the method to create vaccines against chicken cholera and anthrax, and this is what revolutionised the work against infectious diseases. When injected into or ingested by the human body, vaccines work by stimulating the immune system and preparing it for when the real thing attacks in the future. Many vaccines provide lifelong immunity to a disease.

A young Russian, Waldemar Haffkine, was keenly following Pasteur’s work. At the time, cholera epidemics were common worldwide, and Robert Koch had claimed to have isolated the bacterium that caused the disease. Despite Pasteur and Jenner’s work, many believed that the bacterium alone could not be the cause of cholera.

Haffkine, however, agreed with the theory and worked hard on developing a cholera vaccine. He achieved success in 1892 and conducted the first human trial of the vaccine on himself. Having survived, he made the findings public, but they were dismissed by senior scientists.

The Plague Laboratory

Determined to see his invention have some impact on the world, he travelled to India, where cholera epidemics had caused hundreds of thousands of deaths. His trials in Uttar Pradesh succeeded and he managed to vaccinate thousands. In 1895 he returned to France, having caught malaria. But in 1896 he was asked by the Governor of Bombay to help develop a vaccine against plague, which was ravaging the populations of Bombay and Poona.

Against the advice of his French doctor, Haffkine travelled back to India and worked persistently to develop a plague vaccine. He succeeded within months and, like the last time, tested the vaccine on himself. Within a few years, the vaccine was used to inoculate millions of people.

In 1899, a former residence of the Governor of Bombay was turned into the Plague Laboratory and Haffkine was made its director. The lab was renamed the Haffkine Institute in 1925, and it remains an active centre of biological research in the country.

Haffkine was decorated by Queen Victoria in 1897. A London magazine wrote this about the announcement: “a Russian Jew, trained in the schools of European science, saves the lives of helpless Hindoos and Mohammedans and is decorated by the descendant of William the Conqueror and Alfred the Great.”

If you forgive the colonial tone, it is an apt tribute to the rapidly globalising world then being created. Haffkine’s work is arguably no less “indigenous” to India than Rotavac, so it is sad that we forget such a legacy while celebrating the country’s new achievements.

First published in Lokmat Times. Image from Wikipedia

Why India must invest more in science

You must have read that 38% of doctors in the US are Indian, as are 36% of scientists at the space agency NASA and 34% of Microsoft employees. Daggubati Purandeswari, former MP and minister of state for human resource development during Manmohan Singh’s first term as prime minister, had heard this too. Unfortunately, Purandeswari parroted those numbers in a Rajya Sabha session without verifying them and, because those statistics are not true, she was ridiculed in the press for it.

The true numbers are estimated to be smaller (but still significant). About 5% of doctors in the US may be Indian or of Indian origin, as may be roughly 5% of NASA scientists and perhaps a similar proportion of Microsoft employees. What is probably also true is that most of these people with an Indian connection studied abroad in top universities before taking on these prestigious jobs.

We love to brag about what our fellow citizens achieve, and it is great to see that patriotism. But, sadly, we also love to exaggerate our achievements and inflate our egos in the process.

While it is great that Indians abroad have achieved and keep achieving great things, we must also reflect upon the fact that the country doesn’t do enough towards training its young to achieve those things at home. And a reminder of these facts and fables is necessary today in light of the recently released Indian budget.

Stopping the brain drain

India invests less than 1% of its GDP in scientific research. Compare that with China’s investment of nearly 2%, the US’s of about 3% and South Korea’s of almost 4%. India’s ambition of featuring among the big nations of the world is being cut short because of our poor investment in science, and the results show.

The Times Higher Education ranking of the world’s universities, released today, does not feature a single Indian educational institute in the top 100. Brazil, Russia and China—countries often considered India’s equals in the global economy—all feature at least one university in the prestigious list.

This matters because, while we like to proclaim proudly that we are a nation full of engineers and doctors, we must also face the reality that the best of them go abroad to fulfil their dreams and aspirations. India needs to do better at holding on to its talent, and the easiest way to ensure that is to invest more in creating world-class universities for higher education and research.

Many bright minds have paid attention to this problem, and there has been some progress. The new Indian Institutes of Technology and the Indian Institutes of Science Education and Research, for instance, are welcome developments, but they must be given enough support to flourish. The recent budget for science has not even managed to keep up with inflation—it is only 3.4% more than that of 2014.

Dheeraj Singh of the Indian Institute of Technology Kanpur thinks science departments need at least 15% more funds each year to live up to their promise. CNR Rao, former head of India’s Scientific Advisory Council, told Nature, “There are scientists in India who want to do cutting-edge science. But to be competitive you need more funds.”

Paisa wise, Rupee foolish

And this plea is not without precedent. According to the Space Foundation, for every $1 that the US government spends on NASA, the country gets about $10 back in economic benefit. Such a return is made possible by the knowledge and expertise that NASA creates: satellite-based weather forecasting and scores of spin-off technologies now found in everyday devices grew out of its research. The recent success of the Mars Orbiter Mission shows that Indian space research can do similar things with smaller investments.

Some may argue that India needs to invest in healthcare before it invents new rockets. While that argument has problems, even if we accept it, the recent budget did worse for healthcare than for science. A 15% cut to the healthcare budget means that India’s health crisis is going to get worse. For all that prime minister Narendra Modi has promised, his actions tell a different story: his government prefers short-term benefits to long-term growth.

First published in Lokmat Times. Image credit: NASA.

Drug resistance risks sending humanity back to the 19th century

Swine flu is back, and it is spreading with a vengeance. The total number of recorded cases in India has crossed the 20,000 mark, and the disease has caused more than 1,000 deaths. While this number is much smaller than the toll of the 2009 swine flu pandemic, it is worrying because the new strain appears to be harder to treat. According to Om Jaslok, director of the infectious diseases department at Jaslok Hospital in Mumbai, patients need to be treated with the anti-viral medicine oseltamivir for an average of 10 days, which is twice the length of treatment in previous outbreaks.

The World Health Organisation (WHO) had warned of such a swine flu outbreak as far back as 2013. The phenomenon of pathogens developing resistance to drugs is not new. The few pathogens that survive the onslaught of a drug pass on their traits to the next generation, giving birth to a drug-resistant strain. What is worrying is that drug resistance is spreading much faster than humans can come up with new drugs or treatments.

Disaster-in-waiting

While flu epidemics come and go, drug resistance in persistent diseases is even more worrying. Take the example of malaria, the biggest single killer of humans since our species began walking on this planet. In the last century, increased attention to malaria produced drugs that could effectively treat the disease. One such drug is chloroquine, which was widely used across the world and saved millions of lives. Since the 1990s, however, chloroquine has no longer been a weapon in the medical arsenal. The malaria parasites—which cause the disease and are spread by mosquitoes—have become resistant to the drug, and we have been forced to develop newer drugs.

A bigger worry is the development of antibiotic resistance. Antibiotics, first discovered in the early 20th century, turned lethal diseases such as pneumonia into treatable conditions. These drugs have become the bedrock of effective modern medical treatment. They are used in surgeries to protect patients from acquiring infections via open wounds, and given to cancer patients to shield their weakened immune systems against pathogens. But fewer and fewer of them remain effective today.

These antibiotic-resistant microbes are referred to as superbugs, and they kill ruthlessly. According to a report in the respected journal The Lancet, more than 58,000 newborns in India were killed in 2013 by such superbugs, which include resistant strains of Staphylococcus aureus and Clostridium difficile. One particular enzyme that helps create superbugs, New Delhi metallo-beta-lactamase 1 (NDM-1), has been named after the city where it is believed to have originated, and we currently have no way of countering its spread.

You can act

The economic cost of drug resistance is potentially huge. According to the 2014 Review on Antimicrobial Resistance, the accumulated loss to global output by 2050 could be as much as $100 trillion, which is more than 50 times the current GDP of India. Even such a dramatic prediction may be an underestimate, according to Jeremy Farrar, the director of the Wellcome Trust, one of the world’s biggest biomedical charities, because the review only counts direct costs. If routine surgeries become impossible because of antibiotic-resistant bugs, for instance, the effects will ripple through healthcare as a whole and lead to many more deaths.

There are ways in which we can fight drug resistance. One is to invest more in finding new and more effective drugs. But success down this route has become much harder to achieve; the pharmaceutical industry is undergoing a crisis and producing fewer new drugs. The other way is to slow down the march of drug-resistant microbes. This requires governments, doctors and citizens to work together.

Farrar notes that we have been taking antibiotics for granted, treating them as consumer goods. They seem to be “ours to demand from doctors and ours to take or stop taking as we see fit.” This is a huge problem. People stop taking pills as soon as they feel well, but that doesn’t mean the microbes have been completely eliminated. Instead, the partial course of antibiotics leaves a large number of mildly resistant microbes alive, which then spawn new generations and spread drug resistance.
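
The selection effect is easy to see in a toy calculation. The sketch below, in Python, uses entirely invented kill rates and growth rates (it illustrates the logic, not any real epidemiology) to compare stopping a course after three days with completing all ten:

```python
# Toy model only: every number here is invented for illustration.
def surviving_microbes(days_on_drug, total_days=14,
                       susceptible=1_000_000, resistant=1_000,
                       kill_susceptible=0.95, kill_resistant=0.60,
                       daily_growth=1.5):
    """Return (susceptible, resistant) counts after total_days."""
    for day in range(total_days):
        if day < days_on_drug:                 # patient still taking the pills
            susceptible *= (1 - kill_susceptible)
            resistant *= (1 - kill_resistant)  # mild resistance: harder to kill
        susceptible *= daily_growth            # whatever survives keeps multiplying
        resistant *= daily_growth
    return susceptible, resistant

for days in (3, 10):                           # abandoned course vs full course
    s, r = surviving_microbes(days_on_drug=days)
    print(f"{days:2d}-day course: {s + r:12,.0f} microbes survive, "
          f"{r:,.0f} of them resistant")
```

In this made-up setup, the interrupted course leaves tens of thousands of microbes alive, roughly a third of them resistant, while the completed course leaves only a handful for the immune system to mop up.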

If we are not to send healthcare back to the 19th century, when infectious diseases killed rampantly, we must learn to respect these wonder drugs. Doctors must be conservative in writing prescriptions. And, when antibiotics are the only option, the least patients can do is complete the full course prescribed.

First published in Lokmat Times. Image by NIAID.

Big Pharma cannot ignore developing world diseases anymore

We may not realise it, but many of us today live better lives than medieval kings did. Most people then died before the age of 40 and lived in constant fear of contracting diseases. The worst killers were infectious diseases, such as plague, caused by bacteria and spread by poor sanitation. The last century has seen us banish these evils with the help of antibiotics and better access to hygiene and sanitation.

Sadly, the benefits of medicine haven’t been spread evenly throughout the world. In developing countries, many diseases remain that were eliminated in the developed world decades ago. One of the biggest reasons is that Big Pharma finds treating them unprofitable. Because these diseases do not affect the rich world, the economics of drug discovery – which admittedly requires huge investments – do not work out in favour of developing treatments for poor people’s diseases.

A matter of timing

Alexander Fleming’s discovery of penicillin – the first antibiotic – in 1928 was an accident, but the world had been waiting a long time for something like it. The timing couldn’t have been better. By then the pharmaceutical industry had begun to search systematically for new drugs. Governments had recognised the role that new medicines could play in improving health and, in turn, economic productivity. By the time World War II started, they had set up the right safeguards to allow the industry to mass-produce drugs. Historical records suggest that Nazi Germany’s inability to produce enough penicillin may have played a role in its eventual defeat.

As the decades rolled on, pharmaceutical companies such as the German firm Bayer and the British firm Glaxo grew bigger and bigger. Today, a handful of these giants are collectively referred to as Big Pharma, and all of them have most of their operations in the rich world.

While India has a pharmaceutical industry that is playing an increasingly big role in the global market, drug development is too big a challenge for it. That is because the cost of drug development has been rising quickly: today, developing a single drug can cost billions of dollars (lakhs of crores of rupees). This has created a vicious cycle, in which Big Pharma mostly invests in drugs that will return the billions of dollars spent developing them. The upshot is that poor people’s diseases are neglected.

Ignore no more

So severe has this negligence been in recent years that the World Health Organisation now lists 17 diseases under a priority list of “Neglected Tropical Diseases”. These include dengue, chikungunya, rabies and leprosy. Even beyond that list, other diseases remain under-researched, including malaria, tuberculosis and diarrhoea. All of these are now rare in the developed world, but they cause millions of deaths in developing countries.

This must change. There are moral reasons why letting millions die from preventable causes is wrong, but the nature of modern corporations is such that moral reasons work only in extreme circumstances. Fortunately, there is now a growing economic case. As markets in the West become saturated, the pharma industry is looking to the emerging world, especially countries such as India and China, for new markets.

As they arrive, these companies will first cater to the rich and the growing middle class, but they won’t be able to survive without serving the poor too. For instance, India’s patent laws force Western firms either to accept compulsory licences or to sell their own drugs at cheaper rates if the country’s courts find the drugs essential but unaffordable. Instead of giving up their exclusivity, many firms are choosing the latter option. Profit margins will fall, but sales volumes will rise.

With such changes afoot, it is time Big Pharma also looked to cater to poor people’s diseases. While the economic case for such work is becoming stronger, governments could help with subsidies to get these companies to begin sooner. The expertise and knowledge they bring could revolutionise healthcare for the poor.

First published in Lokmat Times.

Genetic testing is all the rage, but its promise is limited

New technologies often take decades to reach Indian shores. Not so in the case of genetic testing. Within 10 years of the launch of the world’s first direct-to-consumer genetic testing service, the technology has found a booming market in India.

Your DNA, unless you have an identical twin, is unique. The idea behind any genetic test is to find out whether the sequence of bases in your DNA has something useful to tell you. The tests on offer in India can cost anywhere from ₹1,000 to ₹50,000.

Who’s your daddy?

One of the most popular genetic tests in India is used to test paternity. Be it a doubting husband or a long-lost son, these “peace-of-mind tests” can set the record straight. Their effectiveness is so high that Indian courts have used paternity tests as definitive evidence. Take the example of Congress politician ND Tiwari. In 2008, 28-year-old Rohit Shekhar claimed that Tiwari was his biological father. After a long-drawn battle, the court ordered a paternity test in 2012 and closed the case in favour of Shekhar.

This is how paternity testing works. A child inherits half their DNA from each parent. For the test, DNA samples in the form of cheek swabs are taken from all three individuals. The samples are then treated with restriction enzymes, which cut the DNA at predetermined places. The cut-up pieces are suspended in a solution and run through a gel, which lets shorter pieces travel faster than longer ones. The pieces show up as dark bands on a light background. If the parents are indeed those making the claim, the child’s band pattern will appear to be a combination of the patterns of the two parents.
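
The comparison at the end boils down to a simple matching rule: every band in the child’s lane should line up with a band in either the mother’s lane or the alleged father’s lane. A minimal sketch of that rule in Python, with invented band positions (real laboratories compare many genetic markers and report a statistical probability rather than a plain yes or no), might look like this:

```python
# Illustrative only: band positions are made-up gel-migration readings.
def bands_explained(child, mother, alleged_father, tolerance=0.5):
    """True if every band in the child's lane matches a band, within a small
    measurement tolerance, in at least one of the two parents' lanes."""
    def has_match(band, lane):
        return any(abs(band - other) <= tolerance for other in lane)
    return all(has_match(band, mother) or has_match(band, alleged_father)
               for band in child)

mother         = [4.1, 7.8, 12.3, 19.5]   # hypothetical band positions
alleged_father = [3.2, 9.0, 12.3, 22.7]
child          = [4.1, 9.0, 12.3, 22.7]

print(bands_explained(child, mother, alleged_father))  # True: consistent with paternity
```

A single unexplained band would make the check fail; in real testing, though, one mismatch is not conclusive on its own, and laboratories examine many markers before excluding an alleged father.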

This technique, called DNA fingerprinting, was developed in 1984 and has also been used to produce forensic evidence in thousands of criminal cases. Instead of comparing a DNA sample of a child with two others, say, it could be used to compare DNA found in some hair at a crime scene with that of the accused perpetrator.

Not the oracle

Not all genetic tests are so effective at giving useful information, though. Many companies market genetic test results as a fortune-telling scroll, claiming that, based on your genetic information, they can predict whether or not you will get a disease. This is far from the truth. At best, genetic testing for health outcomes is like a weather forecast: its predictions can be right, but quite often they aren’t.

Even if genetic testing companies make this clear in their fine print, they haven’t done enough to correct public perception. For instance, a 2010 European survey revealed that nearly half of those asked felt “all children will (soon) be tested at a young age to find out what disease they get at a later age”.

While certain diseases, such as Huntington’s disease, can be blamed on specific genetic mutations, most diseases arise from a combination of environment, lifestyle and genes. There is no “gene for breast cancer”. Genes are indeed powerful, and they influence our appearance, intelligence, behaviour and health. But, contrary to what the public believes, genes alone do not determine those outcomes.

These public beliefs matter because they can and will affect policy. After 13 years of debate, the US passed the Genetic Information Nondiscrimination Act in 2008 to ensure that insurance providers do not discriminate against customers based on their genes. Before the genetic testing market in India explodes to ₹800 crores by 2018, as some predict, we need a similar act to safeguard people’s privacy. And even then, any genetic test result should be treated with scepticism and care.

First published in Lokmat Times.