Jellyfish are the most energy-efficient swimmers, new metric confirms

Even though a blue whale is much heavier than a tuna, the mammal consumes less energy per unit weight than the fish when they travel the same distance. For years, such comparisons have dominated our understanding of the energy efficiency of animal movement, which is important for designing vehicles inspired by nature, such as underwater drones.

But Neelesh Patankar, professor of mechanical engineering at Northwestern University, believes that this measure has only limited benefit. Instead, with his colleagues, he has come up with a new measure that allows comparison of animals as small as bees or zebrafish with animals as large as albatrosses or blue whales.

The new measure has two implications. First, among animals with typical swimming and flying actions, which include most fish and all birds, each animal is as energy efficient as it can be. This means that, given its size and shape, each animal spends the least possible energy to travel the greatest possible distance. Second, this measure confirms a previous finding that jellyfish are unusually energy efficient, beating all the thousands of fish and birds Patankar studied.

“Put another way, a whale and a tuna are equally energy efficient,” Patankar said. “Except jellyfish, which have an unusual action that makes them more efficient.”

A new measure

To understand why jellyfish are special, we first need to ask why a new measure of energy efficiency is needed at all. Patankar offers an analogy: if two cars are of equal weight, would you expect them to have the same mileage? Just as with cars, animals’ motion varies based on factors other than their weight.

John Dabiri, professor of aeronautics and bioengineering at California Institute of Technology, said, “It is not immediately obvious how to compare the swimming efficiency of a bacterium and a blue whale, for example, but Patankar and colleagues have developed one.”

To make the comparison, Patankar borrowed from a well-known concept in physics called the Reynolds number, which captures the balance between two forces that act on any body moving through a fluid. The first is viscous force, which is, crudely put, the push you feel when you put your hand out of a moving vehicle. The second is inertial force, the tendency of a moving object to keep moving (or of a stationary object to remain stationary).

Depending on the size of a body and the speed at which it travels, the body faces either a low Reynolds number, where the forces acting on a body are mostly viscous forces, or a high Reynolds number, where inertial forces dominate. This creates a natural difference in how much energy is spent countering these forces.
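For reference, the textbook definition of the Reynolds number makes this balance concrete; this is the standard fluid-dynamics formula, not the new coefficient from Patankar’s paper:

```latex
% Standard definition of the Reynolds number for a body moving through a fluid:
%   \rho = fluid density, U = travel speed, L = body length, \mu = dynamic viscosity
Re = \frac{\rho \, U \, L}{\mu}
% Small, slow bodies give a small Re (viscous forces dominate);
% large, fast bodies give a large Re (inertial forces dominate).
```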

The Reynolds number was developed to study the flow around rigid bodies, such as aeroplanes and ships. But Patankar reckoned he could use it to help compare animals of different sizes. He gathered data from thousands of birds and fish to come up with a metric called the energy-consumption coefficient, which he has described in the Proceedings of the National Academy of Sciences. Using it, he found that all the animals he looked at (except jellyfish) are as energy-efficient as they can be.

Chart: note that the Y-axis shows the energy-consumption coefficient, not energy efficiency. Image credit: Rahul Bale

“The idea that animals are tuned for energy-efficient locomotion is not surprising, but the authors have devised a fresh approach to the issue of how to compare the efficiencies of different animals,” Dabiri said.

Patankar finds, as he had hoped, that small animals operate at low Reynolds numbers and large animals at high Reynolds numbers. This means they expend energy differently, which is what his coefficient captures. Using the coefficient, one can compare the energy efficiency of bodies weighing a few grams with those weighing many tonnes.

The coefficient also indicates that animals that fly are less energy-efficient than those that swim. This, Patankar thinks, must be because those in flight have to expend more energy to counteract gravity than those in water.

Jelly’s secrets

While working on the energy-consumption coefficient, he came across recent work by Dabiri and his colleagues showing that the unique contract-and-relax action of jellyfish allows them to recapture some of the energy spent on motion. This means a jellyfish, adjusted for its weight and size, can travel much farther than other animals for the same amount of energy.

When Patankar used Dabiri’s data and plotted it on his energy-consumption coefficient chart, he found that the only animals that were more energy efficient than he had predicted were jellyfish.

“We found that each swimming or flying animal can spend all the energy it has at its disposal. However, our coefficient is a fair way to conclusively show that indeed jellyfish are more efficient,” Patankar said.

Dabiri is already working on exploiting jellyfish propulsion. However, he thinks that, apart from providing a new metric to compare different types of animals on the energy-efficiency scale, Patankar’s measure could be used for evaluating the performance of aerial and underwater drones that are being developed, especially those with designs inspired by flying and swimming animals.

First published on The Conversation.

Search for alien life could remain fruitless

Given that we are unlikely to be visiting an exoplanet any time soon, astronomers have been contemplating whether it might be possible to detect indications of simple life – a biosignature – from a distance. Many think that the strongest case for extraterrestrial life would be the discovery of oxygen and methane on the same body. They also think that the likelihood of finding such a biosignature is greatest on an Earth-like planet that is orbiting a sun-like star.

Astronomers who hope to search for these biosignatures in exoplanets, however, may be in for a disappointment. New research finds that there is no way we can confirm that such a signature is actually the result of extraterrestrial life. The problem, it turns out, is that an exomoon’s atmosphere will be indistinguishable from that of the planet it orbits.

Finding E.T.

Searching for extraterrestrial life is no easy feat. Astronomers have to first search for a star that has planets. Then they have to ensure that there is at least one planet that orbits this star in the habitable zone, which is a region around the star in which we might expect liquid water. Finally, they have to record the faint light that originated from the bright star and was reflected off the exoplanet after having passed through its atmosphere.

This faint light, even if it amounts to only a handful of photons, can, when compared with light from the parent star, give some indication of the chemicals in the atmosphere of this planet. Life as we know it creates two gases that wouldn’t naturally be present in an atmosphere at the same time – oxygen from photosynthesis and methane from microbes.

Both oxygen and methane can be created independently by non-living processes, so their individual presence is of little interest. What scientists are looking for is both of them in the atmosphere of a single body. If these reactive gases are not constantly replenished by living things, they will react with each other, creating carbon dioxide and water. As a result, we should not observe them in the same atmosphere without a large, living source.
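The chemistry behind this is the ordinary oxidation of methane; left unreplenished, the two gases slowly consume each other:

```latex
% Net reaction between methane and oxygen; without a constant source,
% the two gases cannot coexist in an atmosphere for long.
CH_4 + 2\,O_2 \;\longrightarrow\; CO_2 + 2\,H_2O
```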

False hopes

In the new study, published in the Proceedings of the National Academy of Sciences, Hanno Rein at the University of Toronto and his colleagues wanted to know whether anything else could mimic this biosignature. While working through potential false positives – signals that look like signs of life but have non-living origins – he found a big one: exomoons. Rein found that observers on Earth will not be able to tell whether the signs of methane and oxygen originate from a single celestial body or come from two nearby worlds.

This could happen because, just as Earth has a moon, there is a chance that exoplanets will have exomoons. While we have yet to find an exomoon, looking at the various moons of our solar system’s planets suggests that exomoons ought to be plentiful. However, even if they are plentiful, chances are that exomoons will be difficult to spot.

If both these celestial bodies have an atmosphere, and the exoplanet has oxygen while the exomoon has methane (or vice versa), then an observer on Earth will record an oxygen-methane biosignature. This might seem like evidence for life, whereas in reality both gases are being produced by non-living processes on two separate celestial bodies. Since they can’t react with each other, they can build up to high levels.

Futile technology

“Even if we somehow developed ways of finding exomoons, we won’t be able to tease out the difference between their atmospheres given the limited amount of light that reaches us,” Rein said. This fundamental limit on the light that reaches us is called photon noise.

Rein limited his analysis to biosignatures coming from Earth-like planets orbiting a sun-like star, which is the combination that astronomers are betting has the greatest chance of hosting life. The American space agency NASA recently announced that it had found such an Earth-sized planet less than 500 light years away, although the star it orbits isn’t sun-like.

Their analysis might seem quite restrictive, and it involves a number of assumptions, but that does not really matter: the interpretation of biosignatures needs to be flawless. According to David Cullen at Cranfield University, “This study seems to highlight a real issue that will need to be considered when interpreting biosignatures.”

Rein himself was surprised to find such a limitation. However, he sees the results of his work in a positive light. “Finding such a limitation tells us what we should focus on in the future. Rather than a restricted search for Earth-like planets orbiting sun-like stars, we should broaden our search,” he said.

What this research shows is a need to move away from the highly focused search for extraterrestrial life that is currently in place. Rein points out that the chances of eliminating such false-positive biosignatures increase as the star becomes dimmer or larger planets are considered. Perhaps alien life is not just unlike that on Earth; perhaps it also resides in a place that is unlike Earth.

First published on The Conversation. Image credit: bflv.

My year with the real wonks: how academia enriches journalism

I stepped out of a chemistry lab to receive a shiny doctorate a little more than two years ago. Then, against the wisdom of many, I decided to become a journalist. That decision was made not because I despised academia, but because it seemed to me that journalism was where my strengths would give me the best chance to succeed.

In doing so, I was leaving behind a world that I had tremendous respect for. Dedicating one’s life to pursuing hard questions in a narrow field of knowledge enriches the world in countless ways. That enrichment is the result of two things: the production of new knowledge and of new knowledge-bearers (ie students). What you read in the popular press about universities is mostly what new research has found about the world. A less talked about, and perhaps greater, contribution that universities make is in educating new students.

Teaching the same course year after year sounds boring to me, but I’ve been assured by many that it is one of the reasons they enjoy being academics. This yearly practice of coming up with new and better ways of explaining fundamental concepts, combined with their struggles at the edge of knowledge in a particular field, gives these academics the ability to convey complex concepts in simple and powerful ways.

A new experiment

If I were asked to give one reason for choosing science journalism, it would be that I get to learn new things about the world all the time. Hardly a day goes by when there isn’t something awesome in science news to read and write about. That is why, when I was offered the chance a year ago today to be the launch editor for the science and technology section of The Conversation’s UK edition, I wasn’t going to let the job go.

But there was another reason why the job appealed to me: the idea was to get academics to write for the public. The hope was that, with their expertise and skills at explaining ideas, they would help put news in broader context and convey the “meaning” of events to help improve public dialogue on important topics.

While the scientist in me was dancing with joy, the journalist was sceptical. What academics usually write is meant for fellow academics. Their use of the passive voice and jargon can put off even the most interested non-experts. They also work on vastly different timescales. Journal articles can take months to years to get published. News articles usually take only a few hours or days to get to the reader.

Marrying the two professions for a public service project was a great idea, but would it work? Could the third major contribution of universities be educating the public (not just a promise, but a reality)?

Is there demand?

“Professors, we need you!” said Nicholas Kristof in the New York Times. The Conversation Media Group, founded in 2011 in Australia, got to work before Kristof made the public demand. By the time it launched in the UK in May 2013, it had shown that the Australian public had an appetite for this experiment.

The success down under was swift for one more reason: The Conversation represented a “third choice”. Until 2011, most newspapers and online news websites were owned by either Fairfax or News Corp, which allowed The Conversation to tap into a readership eagerly looking for alternatives.

The UK was different. It had (and still has) some of the most respected publications in the world. There was plenty of choice for an average reader across the political divide. Yet, it seemed that The Conversation stood a chance. Many of the best publications were under financial constraints, cutting staff, especially specialist reporters in science, environment and health. There was scope for explaining news better, and bringing new stories that journalists missed or didn’t have the time to cover.

Readership figures show that the experiment has been successful so far. In recent months, less than a year since launch, the UK edition alone has been reaching more than 2 million readers, and that number is growing quickly. All this with a small team (seven editors at launch, 14 since February) and no marketing budget.

Because The Conversation publishes under a Creative Commons licence, its authors and their articles have featured in some of the top publications worldwide, with different aims and leanings: The Guardian, Washington Post, New York Times, The Independent, The Hindu, Daily Mail, New Statesman, The Week, The Atlantic, Quartz, Business Insider, Scientific American, Popular Science, Discover Magazine, Ars Technica and Slate, among others.

Much of my scepticism about this job was reasonable. But, right from the start, I was pleasantly surprised at both the quality and the speed of the writing. When given a brief and a deadline, academics usually delivered. Sure, first-time authors needed (and still need) lots of help, but most of them were also prepared to learn and improve in this form of communication. What surprised me the most was their enthusiasm. Whoever thinks academics don’t like to engage with the public should spend just one day in our office.

For the first few months, about four in five stories were ones where I had to approach an academic with an idea and commission them to write an article. But as The Conversation’s name started spreading, I began getting more pitches. This was what I had been waiting for: academics who understood what The Conversation does, who got what the public reads, and who were willing to spend the time to write such articles. These academics were bringing through new stories, or new angles on old stories, that journalists had missed. Here, I realised, were the true wonks.

What is true wonkery?

Recently Felix Salmon of Reuters asked, “Is there a wonk bubble?” In answering that question, he mainly referred to the launch of two websites: Vox.com, which wants to “explain the news”, and FiveThirtyEight.com, which wants to use data to tell news stories. I agree with Salmon that both these experiments are great for journalism, but I don’t think they represent “wonkery” in the true sense.

The new publications are being built on the back of the wonkery of their editors-in-chief: Ezra Klein (politics and economics wonk) and Nate Silver (data wonk). The rest of the editorial staff, while quite capable and of high calibre, can’t all be classed as wonks, certainly not under Salmon’s narrow definition of journalists who know their subject really well and who built their reputations through blogging (mostly about policy and politics).

The definition of a wonk as “a person who is obsessively interested in a specified subject” is actually a much better fit for academics (or even PhD students). That is why I class them as the true wonks. Being able to tap into this wonkery, or expertise (as most people would call it), can bring through stories that journalists would just not find on their own.

Economists such as Tyler Cowen, Paul Krugman, Simon Wren-Lewis and David Blanchflower command large audiences already. Scientists have a long tradition of popularising science, be it Carl Sagan or Brian Cox. Now, beyond promoting the good work of already engaged academics, what The Conversation provides is a platform for new and diverse voices with fresh ideas, which would otherwise have remained in the ivory towers. More than 11,000 academics from over 700 institutions have already contributed to this new conversation.

To give you a flavour of what I mean, I have selected some of my favourite science stories on The Conversation from the past year. They have been split into four categories: the first is explanatory (The Contextual) and the other three are stories that journalists missed or couldn’t dig up (The Newsworthy, The Amazing and The Strange). I trust you can judge for yourself whether the experiment is worth it.

The Contextual

The Newsworthy

The Amazing

The Strange

Image credit: Lucas Warren

Education, breastfeeding and gender affect the microbes on our bodies

Trillions of microbes live in and on our bodies. We don’t yet fully understand how these microbial ecosystems develop or the full extent to which they influence our health. Some provide essential nutrients, while others cause disease. A new study now reveals some unexpected influences on the make-up of these communities: scientists have found that life history, including level of education, can affect the sorts of microbes that flourish. They think this could help in the diagnosis and treatment of disease.

A healthy human provides a home for about 100 trillion bacteria and other microbes. These microbes are known as the microbiome, and normally they live on the body in communities, with specialised populations on different organs.

Evolution has ensured that both humans and bacteria benefit from this relationship. In exchange for somewhere to live, bacteria protect their hosts from harmful pathogens. Past analysis of the gut microbiome has shown that, when this beneficial relationship breaks down, it can lead to illnesses such as Crohn’s disease, a chronic digestive disorder.

You’ve been swabbed

One of the largest research projects looking at the delicate connection between humans and their resident microbes is called the Human Microbiome Project (HMP). As part of the project, hundreds of individuals are being sampled for microbes on various parts of their bodies, with the hope that the data will reveal interesting relationships.

In the new study, published in Nature, Patrick Schloss at the University of Michigan and his colleagues set out to use data from the HMP to investigate whether events in a person’s life could influence their microbiome.

Their data came from 300 healthy individuals, with men and women equally represented, ranging in age from 18 to 40. Life history events, such as level of education, country of birth, diet and recent use of antibiotics, were among about 160 pieces of data recorded. Finally, samples were swabbed from 18 places across the body to analyse their microbial communities at two time points, 12 to 18 months apart.

Those swabs underwent genomic analysis. Four types of bacterial community were singled out to test what proportion of each was found on different body parts, and that data was then compared with the life history events. Only three of the roughly 160 life history events tested could be associated with a specific microbial community: gender, level of education, and whether or not the subject was breastfed as a child.
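The statistics behind this amount to a test of association between two categorical variables – the community type found at a body site and a life history trait – repeated across the roughly 160 traits with a correction for multiple testing. A minimal sketch of that kind of test, using invented counts rather than the actual HMP data, might look like this:

```python
# Sketch of a categorical association test of the kind described above.
# The counts are invented for illustration; the real study used HMP samples
# and its own community-typing and correction procedures.
from scipy.stats import chi2_contingency

# Rows: community types observed at one body site; columns: breastfed yes/no.
# Each cell counts the subjects falling into that combination.
contingency = [
    [30, 12],  # community type A
    [18, 25],  # community type B
    [10, 15],  # community type C
    [7, 20],   # community type D
]

chi2, p_value, dof, expected = chi2_contingency(contingency)

# With ~160 traits tested, a simple Bonferroni correction would demand
# p < 0.05 / 160 before declaring an association significant.
threshold = 0.05 / 160
print(f"chi2 = {chi2:.2f}, p = {p_value:.4g}, "
      f"significant after Bonferroni: {p_value < threshold}")
```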

Untangling these associations may help with the diagnosis and treatment of illnesses. “If a certain community of bacteria is associated with a specific life history trait,” Schloss said, “it is not such a stretch to imagine that there may be microbiome communities associated with illnesses such as cancer.”

To be sure, these associations are only correlations. Neither Schloss nor hundreds of other scientists working on microbiome data can be sure why certain communities end up on certain body parts of only certain individuals. “We really don’t have a good idea for what determines the type of community you’ll have at any given body site,” Schloss said.

Lack of such knowledge means that Schloss cannot explain odd correlations, such as why women with a baccalaureate degree have specific communities in their vaginal microbiome. Because level of education is also associated with a range of other factors, such as wealth and social status, we can’t know that it is only education affecting the vaginal microbiome. Janneke Van de Wijgert at the University of Liverpool said, “I think that it is impossible to tease out the individual effects of education, sexual behaviour, vaginal hygiene behaviour, ethnicity, and social status.”

Van de Wijgert believes the data has other limitations. “The study population of a mere 300 was homogenous and healthy – young, white women and men from Houston and St Louis – which likely means that much additional microbiome variation has been missed.”

With better tools, genomic data analysis has substantially improved since the project launched in 2008. Van de Wijgert thinks that future studies need to sample a lot more individuals and look for changes at shorter time intervals.

She is hopeful that microbiome data can be used to improve medicine and make it more tailored to the individual. But before manipulations of the microbiome are used to treat illnesses, she said, it should be confirmed that the offending bacterial communities cause – and are not a symptom of – disease. If the bacteria cause an illness, then efforts can be made – such as a change in diet or a microbial transplant – to treat the disease.

Written with Declan Perry. First published on The Conversation. Image: NIAID

Scientists pinpoint when harmless bacteria became flesh-eating monsters

Bacterial diseases cause millions of deaths every year. Most of these bacteria were benign at some point in their evolutionary past, and we don’t always understand what turned them into disease-causing pathogens. In a new study, researchers have tracked down when this switch happened in flesh-eating bacteria. They think the knowledge might help predict future epidemics.

The flesh-eating culprit in question is called GAS, or Group A β-hemolytic streptococcus, a highly infective bacteria. Apart from causing flesh-eating disease, GAS is also responsible for a range of less harmful infections. It affects more than 600m people every year, and causes an estimated 500,000 deaths.

These bacteria appear to have affected humans since the 1980s. Scientists think that GAS must have evolved from a less harmful streptococcus strain. The new study, published in the Proceedings of the National Academy of Sciences, reconstructs that evolutionary history.

James Musser of the Methodist Hospital Research Institute, the study’s lead researcher, said, “This is the first time we have been able to pull back the curtain to reveal the mysterious processes that give rise to a virulent pathogen.”

Genetic gymnastics

Musser’s work required analysis of bacterial genetic data from across the world – a total of about 3,600 streptococcus strains were collected and their genomes sequenced. It revealed that a series of distinct genetic events turned this bacteria rogue.

First, foreign DNA moved into the original harmless streptococcus by horizontal gene transfer – a phenomenon that is common among bacteria. Such DNA is often provided by bacteriophages, viruses that specifically target bacteria. Picking up foreign genes can be useful because it can improve the bacteria’s survival.

In this case, the foreign DNA that was incorporated in the host’s genome allowed the streptococcus cell to produce two harmful toxins. A further mutation to one of these toxin genes made it even more virulent.

According to Musser, another horizontal gene transfer event made a good disease-causing pathogen into a very good one. The additional set of genes allowed it to produce proteins that suppress the immune system of those infected, making the infection worse.

Marco Oggioni of the University of Leicester said, “Because this study used data of the entire genome, all the genetic change could be observed. This makes it possible to identify molecular events responsible for virulence, as you get the full picture.”

Musser’s team could also accurately date the genetic changes in GAS by using statistical models to, as it were, turn back the clock on evolution. They say the last genetic change, which made GAS highly virulent, must have occurred in 1983.
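The dating rests on the molecular-clock idea: if substitutions accumulate at a roughly constant rate, the number of differences between sampled genomes is a proxy for the time since they diverged. As a back-of-the-envelope illustration only – the study itself used more sophisticated statistical models, and the symbols below are generic rather than taken from the paper:

```latex
% Back-of-the-envelope molecular clock:
%   k   = substitutions observed between two contemporary genomes,
%   L   = genome length in sites,
%   \mu = substitution rate per site per year.
% The factor of 2 reflects mutations accumulating along both lineages
% since their common ancestor.
t \;\approx\; \frac{k}{2\,L\,\mu}
```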

Continental drift

That timing makes a lot of sense. “The date we deduced coincided with numerous mentions of streptococcus epidemics in the literature,” Musser said. Since 1983, there have been several outbreaks of streptococcus infections across the world. For example, in the UK, streptococcus infections increased in number and severity between 1983 and 1985.

It is the same story for many other countries, with Sweden, Norway, Canada and Australia falling victim to what is now an inter-continental epidemic. The symptoms ranged from pharyngitis to the flesh-eating disease, necrotizing fasciitis.

“In the short term, this discovery will help us determine the pattern of genetic change within a bacteria, and may help us work out how often bacterial vaccines need to be updated,” Musser said. “In the long term, this technique may have an important predictive application – we may be able to nip epidemics in the bud before they even start.”

What Musser is suggesting is that if enough bacterial genomes are regularly recorded and monitored, there is a chance that mutations or gene transfers, such as those GAS experienced, could be found ahead of time.

But Oggioni is sceptical. “While making such predictions may not be possible, this research will have other applications,” he said. “Knowing which genetic changes happen when can help tailor drug discovery research in a certain direction.”

Oggioni added that Musser’s work with GAS is only a model. Using Musser’s methods to record the evolutionary histories of other pathogens could be quite useful to tackle the diseases they cause now and, perhaps, even those that they may cause in the future.

Written with Declan Perry. First published on The Conversation. Image credit: Zappys Technology

Massive asteroid may have kickstarted the movement of continents

Earth was still a violent place shortly after life began, with regular impactors arriving from space. For the first time, scientists have modelled the effects of one such violent event – the strike of a giant asteroid. The effects were so catastrophic that, along with the large earthquakes and tsunamis it created, this asteroid may have also set continents into motion.

The asteroid to blame for this event would have been at least 37km in diameter, roughly four times the size of the asteroid thought to have caused the death of the dinosaurs. It would have hit the surface of the Earth at a speed of about 72,000km/h and created a 500km-wide crater.

At the time of the event, about 3.26 billion years ago, such an impact would have caused magnitude-10.8 earthquakes – roughly 100 times the size of the 2011 Japanese earthquake, which is among the biggest in recent history. The impact would have thrown vaporised rock into the atmosphere, which would have encircled the globe before condensing and falling back to the surface. As the debris re-entered, the temperature of the atmosphere would have increased and the heat wave would have caused the upper oceans to boil.

Image credit: AGU

Donald Lowe and Norman Sleep at Stanford University, who published their research in the journal Geochemistry, Geophysics, Geosystems, were able to say all this based on tiny, spherical rocks found in the Barberton greenstone belt in South Africa. These rocks are the only remnants of the cataclysmic event.

According to Simon Redfern at the University of Cambridge, there are two reasons why Lowe and Sleep were able to find these rocks. First, the Barberton greenstone belt is located on a craton, which is the oldest and most stable part of the crust. Second, at the time of the event, this area was at the bottom of the ocean, with ongoing volcanic activity. The tiny rocks, after being thrown into the atmosphere, cooling and falling to the bottom of the ocean, ended up trapped in fractures created by that volcanic activity.

This impact may have been among the last few major impacts from the Late Heavy Bombardment period between 3 and 4 billion years ago. The evidence of most of these impacts has been lost because of erosion and the movement of the Earth’s crust, which recycles the surface over geological time.

However, despite providing such rich detail about the impact, Lowe and Sleep are not able to pinpoint where the asteroid struck. It would have been within thousands of kilometres of the Barberton greenstone system, but that is about all they can say. The exact location may not be that important, Lowe argued: “With this study, we are trying to understand the forces that shaped our planet early in its evolution and the environments in which life evolved.”

One of the most intriguing suggestions the authors make is that this three-billion-year-old impact may have initiated the movement of tectonic plates, which created the continents that we observe on the planet.

The continents ride on plates that make up Earth’s thin crust; the crust sits on top of the mantle, which is above a core of liquid iron and nickel. The heat trapped in the mantle creates convection, which pushes against the overlying plates.

All the rocky planets in our solar system – Mercury, Venus, Earth and Mars – have the same internal structure. But only Earth’s crust shows signs of plate motion.

A possible reason why Earth has moving plates may be to do with the heat trapped in the mantle. Other planets may not have trapped as much heat when they formed, which means their convection may not be strong enough to move the plates.

However, according to Redfern: “Even with a hot mantle you would need something to destabilise the crust.” And it is possible that an asteroid impact of this magnitude could have achieved that.

First published on The Conversation.

Cassini points to a hidden ocean on Saturn’s icy moon

Finding liquid water on a celestial body within the solar system is exciting. The only thing that is probably more exciting is finding an ocean full of it. Today such news comes via Cassini, which has made measurements that show that Saturn’s moon Enceladus has a hidden ocean beneath its icy surface.

While orbiting Saturn in 2005, Cassini found jets of salty water spewing from the south polar region of Enceladus. According to Luciano Iess of Sapienza University of Rome, lead author of the new study published in Science, “The discovery of the jets was unexpected.”

Geysers require liquid water, and we wouldn’t expect Enceladus to have any. It is too far from the Sun to absorb much energy and too small (just 500km in diameter) to have trapped enough internal energy to keep its core molten. The answer to how the water got there might lie in the details of the moon’s internal structure.

Water beneath an icy crust

The data to understand Enceladus’s internal structure came from measuring changes in Cassini’s speed as it flew close to the moon. When passing the denser parts of the moon, it sped up by a few extra thousandths of a metre per second. That minute change was tracked through recordings of the radio signals Cassini was sending to NASA’s Deep Space Network station.
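Those speed changes show up as tiny Doppler shifts in the frequency of Cassini’s radio carrier. As a rough, order-of-magnitude illustration – the numbers below are assumptions for the sake of the arithmetic, not values from the study:

```latex
% Non-relativistic Doppler relation used in spacecraft tracking:
%   \Delta f / f = \Delta v / c
% Assuming \Delta v \approx 5\times10^{-3}\ \mathrm{m/s} and c = 3\times10^{8}\ \mathrm{m/s}:
\frac{\Delta f}{f} = \frac{\Delta v}{c} \approx \frac{5\times10^{-3}}{3\times10^{8}} \approx 1.7\times10^{-11}
% On an X-band carrier near 8.4 GHz, that corresponds to a shift of roughly 0.14 Hz.
```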

In making such tiny measurements, scientists had to filter out other factors that could influence Cassini’s speed. These include pressure on the spacecraft from sunlight, the nudge from heat radiating from its nuclear-powered electrical generator, and the drag of the particles it strikes as it passes through the south polar plumes.

Iess and his colleagues have produced a model of the internal structure of Enceladus using the measurements. They conclude that there is a core that is roughly 200km in diameter; above that lies a 10km-thick layer of liquid water, which is followed by 40km of ice crust. The water layer may extend all the way to the north pole, but its thickest part lies at the south pole.

Image credit: NASA/JPL-Caltech

It is possible that Saturn’s powerful gravity is responsible for the liquid water under Enceladus’s surface. Its pull could heat up the interior through a process called tidal kneading, which creates tides in the ocean causing internal friction and thus heat.

After the initial discovery of the plumes, Cassini’s minders put a lot of effort into determining Enceladus’s internal structure, but it still took nearly ten years to do so. This is because the time the spacecraft spends around Saturn is very valuable, and there are lots of other things worth studying.

Cassini can only make a handful of flybys near Enceladus while still paying attention to other moons, such as Titan. When approaching Enceladus, the controllers also had to make a choice about how to study the moon because of a limitation in how Cassini’s instruments are arranged. When making gravitational recordings it needs to point its antenna towards Earth, but in doing so all its other instruments face away from Enceladus. Of the 19 flybys, only three were used to make gravitational recordings.

“After spending eight years in the Saturnian system, one may think that the measurements are becoming repetitive and that Cassini has discovered everything in the reach of its instruments. This is far from being true,” Iess said.

Time is running out

“The evidence adds up to a large and active body of water under Enceladus’s southern polar region,” said Helen Maynard-Casely of the Australian Nuclear Science and Technology Organisation. But she warned: “It is going to be a long time before we can verify if this ocean is there, if ever.”

The plutonium-powered spacecraft has enough energy to power itself until 2017. The trouble is that, in three years, it will only be able to make three more flybys of Enceladus, which is not enough to take more gravity data. Its end is slated to come when controllers drive it into Saturn’s atmosphere for incineration, because scientists are keen to avoid having it crash into Saturn’s pristine moons.

There is a push to send another mission to Saturn, but Jupiter’s moon Europa might be a better candidate to search for life. At 3,100km in diameter, it is much larger than Enceladus, and, in December, astronomers spotted water vapour coming from its south pole, as well.

The possibility of finding a large amount of liquid water is exciting because, for life to exist as we understand it, we need liquid water. Even on Earth, whenever untouched sources of liquid water, such as Lakes Vostok and Ellsworth under Antarctica, are studied, there is always the hope that we may discover new forms of life.

First published on The Conversation. Image credit: NASA/JPL/SSI/J Major