Scientific American reports: Songbirds stutter, babble when young, become mute if parts of their brains are damaged, learn how to sing from their elders and can even be “bilingual” — in other words, songbirds’ vocalizations share a lot of traits with human speech. However, that similarity goes beyond behavior, researchers have found. Even though humans and birds are separated by millions of years of evolution, the genes that give us our ability to learn speech have much in common with those that lend birds their warble.
A four-year-long effort involving more than 100 researchers around the world put the power of nine supercomputers into analyzing the genomes of 48 species of birds. The results, published this week in a package of eight articles in Science and 20 papers in other journals, provide the most complete picture of the bird family tree thus far. The project has also uncovered genetic signatures in song-learning bird brains that bear surprising similarities to the genetics of speech in humans, a finding that could help scientists study human speech.
The analysis suggests that most modern birds arose in an impressive speciation event, a “big bang” of avian diversification, in the 10 million years immediately following the extinction of dinosaurs. This period is more recent than posited in previous genetic analyses, but it lines up with the fossil record. By delving deeper into the rich data set, research groups identified when birds lost their teeth, investigated the relatively slow evolution of crocodiles and outlined the similarities between birds’ and humans’ vocal learning ability, among other findings. [Continue reading…]
Patrick Bateson writes: I am disturbed by the way we have created a social environment in which so much emphasis is laid on competition – on forging ahead while trampling on others. The ideal of social cooperation has come to be treated as high-sounding flabbiness, while individual selfishness is regarded as the natural and sole basis for a realistic approach to life. The image of the struggle for existence lies at the back of it, seriously distorting the view we have of ourselves and wrecking mutual trust.
The fashionable philosophy of individualism draws its respectability in part from an appeal to biology and specifically to the Darwinian theory of evolution by natural selection. Now, Darwin’s theory remains the most powerful explanation for the way that each plant and animal evolved so that it is exquisitely adapted to its environment. The theory works just as well for behaviour as it does for anatomy. Individual animals differ in the way they behave. Those that behave in a manner that is better suited to the conditions in which they live are more likely to survive. Finally, if their descendants resemble them in terms of behaviour, then in the course of evolution, the better adapted forms of behaviour will replace those that are not so effective in keeping the individual alive.
It is the Darwinian concept of differential survival that has been picked up and used so insistently in political rhetoric. Biology is thought to be all about competition – and that supposedly means constant struggle. This emphasis has had an insidious effect on the public mind and has encouraged the belief in individual selfishness and in confrontation. Competition is now widely seen as the mainspring of human activity, at least in Western countries. Excellence in the universities and in the arts is thought to be driven by the same ruthless process that supposedly works so well on the sportsfield or the market place, and they all have a lot in common with what supposedly happens in the jungle. The image of selfish genes, competing with each other in the course of evolution has fused imperceptibly with the notion of selfish individuals competing with each other in the course of their life-times. Individuals only thrive by winning. The argument has become so much a part of conventional belief that it is hard at first to see what is wrong with it.
To put it bluntly, thought has been led seriously astray by the rhetoric. [Continue reading…]
The discovery of what appear to have been deliberately etched markings made by a human ancestor, Homo erectus, on the surface of a shell calls for a reconsideration of assumptions that have been made about the origins of abstract thought.
While the meaning of these zigzag markings will most likely remain forever unknown, it can reasonably be inferred that for the individual who created them, the marks had some significance.
In a report in Nature, Josephine Joordens, a biologist at Leiden University whose team discovered the markings, says:
“We’ve looked at all possibilities, but in the end we are really certain that this must have been made by an agent who did a very deliberate action with a very sharp implement,” says Joordens. Her team tried replicating the pattern on fresh and fossilized shells, “and that made us realize how difficult it really was”, she says.
Saying much more about the engraving is tricky. “If you don’t know the intention of the person who made it, it’s impossible to call it art,” says Joordens.
“But on the other hand, it is an ancient drawing. It is a way of expressing yourself. What was meant by the person who did this, we simply don’t know,” she adds. “It could have been to impress his girlfriend, or to doodle a bit, or to mark the shell as his own property.”
Clive Finlayson, a zoologist at the Gibraltar Museum who was part of the team that described cross-hatch patterns linked to Neanderthals, is also agnostic about whether to call the H. erectus doodles art. What is more important, he says, is the growing realization that abilities such as abstract thinking, once ascribed to only H. sapiens, were present in other archaic humans, including, now, their ancestors.
“I’ve been suggesting increasingly strongly that a lot of these things that are meant to be modern human we’re finding in other hominids,” he says. “We really need to revisit these concepts and take stock.”
Palaeoanthropology, by necessity, is a highly speculative discipline — therein lies both its strength and its weakness.
The conservatism of hard science recoils at the idea that some scratches on a single shell amount to sufficient evidence to prompt a reconsideration about the origins of the human mind, and yet to refrain from such speculation seems like an effort to restrain the powers of the very thing we are trying to understand.
Rationally, there is as much reason to assume that abstract thinking long predates modern humans (in which case searching for evidence of it and finding none should leave us agnostic about its presence or absence) as there is to assume that at some juncture it was born.
My inclination is to believe that any living creature that has some capacity to construct a neurological representation of their surroundings is by that very capacity employing something akin to abstract thinking.
This ability for the inner to mirror the outer has no doubt evolved, becoming progressively more complex and more deeply abstract, and yet mind, if defined as world-mirroring, seems to have been born when life first moved.
Live Science: Human ancestors may have begun evolving the knack for consuming alcohol about 10 million years ago, long before modern humans began brewing booze, researchers say.
The ability to break down alcohol likely helped human ancestors make the most out of rotting, fermented fruit that fell onto the forest floor, the researchers said. Therefore, knowing when this ability developed could help researchers figure out when these human ancestors began moving to life on the ground, as opposed to mostly in trees, as earlier human ancestors had lived.
“A lot of aspects about the modern human condition — everything from back pain to ingesting too much salt, sugar and fat — goes back to our evolutionary history,” said lead study author Matthew Carrigan, a paleogeneticist at Santa Fe College in Gainesville, Florida. “We wanted to understand more about the modern human condition with regards to ethanol,” he said, referring to the kind of alcohol found in rotting fruit and that’s also used in liquor and fuel. [Continue reading…]
Ed Yong writes: In the late 17th century, the Dutch naturalist Anton van Leeuwenhoek looked at his own dental plaque through a microscope and saw a world of tiny cells “very prettily a-moving.” He could not have predicted that a few centuries later, the trillions of microbes that share our lives — collectively known as the microbiome — would rank among the hottest areas of biology.
These microscopic partners help us by digesting our food, training our immune systems and crowding out other harmful microbes that could cause disease. In return, everything from the food we eat to the medicines we take can shape our microbial communities — with important implications for our health. Studies have found that changes in our microbiome accompany medical problems from obesity to diabetes to colon cancer.
As these correlations have unfurled, so has the hope that we might fix these ailments by shunting our bugs toward healthier states. The gigantic probiotics industry certainly wants you to think that, although there is little evidence that swallowing a few billion yogurt-borne bacteria has more than a small impact on the trillions in our guts. The booming genre of microbiome diet books — self-help manuals for the bacterial self — peddles a similar line, even though our knowledge of microbe-manipulating menus is still in its infancy.
This quest for a healthy microbiome has led some people to take measures that are far more extreme than simply spooning up yogurt. [Continue reading…]
Kat McGowan writes: Tales about the origins of our species always start off like this: A small band of hunter-gatherers roams the savannah, loving, warring, and struggling for survival under the African sun. They do not start like this: A fat guy falls off a New York City subway platform onto the tracks.
But what happens next is a quintessential story of who we are as human beings.
On Feb. 17, 2013, around 2:30 a.m., Garrett O’Hanlon, a U.S. Air Force Academy cadet third class, was out celebrating his 22nd birthday in New York City. He and his sister were in the subway waiting for a train when a sudden silence came over the platform, followed by a shriek. People pointed down to the tracks.
O’Hanlon turned and saw a man sprawled facedown on the tracks. “The next thing that happened, I was on the tracks, running toward him,” he says. “I honestly didn’t have a thought process.”
O’Hanlon grabbed the unconscious man by the shoulders, lifting his upper body off the tracks, but struggled to move him. He was deadweight. According to the station clock, the train would arrive in less than two minutes. From the platform, O’Hanlon’s sister was screaming at him to save himself.
Suddenly other arms were there: Personal trainer Dennis Codrington Jr. and his friend Matt Foley had also jumped down to help. “We grabbed him, one by the legs, one by the shoulders, one by the chest,” O’Hanlon says. They got the man to the edge of the platform, where a dozen or more people muscled him up and over. More hands seized the rescuers’ arms and shoulders, helping them up to safety as well.
In the aftermath of the rescue, O’Hanlon says he has been surprised that so many people have asked him why he did it. “I get stunned by the question,” he says. In his view, anybody else would’ve done the same thing. “I feel like it’s a normal reaction,” he says. “To me that’s just what people do.”
More precisely, it is something only people do, according to developmental psychologist Michael Tomasello, codirector of the Max Planck Institute for Evolutionary Anthropology.
For decades Tomasello has explored what makes humans distinctive. His conclusion? We cooperate. Many species, from ants to orcas to our primate cousins, cooperate in the wild. But Tomasello has identified a special form of cooperation. In his view, humans alone are capable of shared intentionality—they intuitively grasp what another person is thinking and act toward a common goal, as the subway rescuers did. This supremely human cognitive ability, Tomasello says, launched our species on its extraordinary trajectory. It forged language, tools, and cultures—stepping-stones to our colonization of every corner of the planet. [Continue reading…]
The Guardian reports: Paintings of wild animals and hand markings left by adults and children on cave walls in Indonesia are at least 35,000 years old, making them some of the oldest artworks known.
The rock art was originally discovered in caves on the island of Sulawesi in the 1950s, but dismissed as younger than 10,000 years old because scientists thought older paintings could not possibly survive in a tropical climate.
But fresh analysis of the pictures by an Australian-Indonesian team has stunned researchers by dating one hand marking to at least 39,900 years old, and two paintings of animals, a pig-deer or babirusa, and another animal, probably a wild pig, to at least 35,400 and 35,700 years ago respectively.
The work reveals that rather than Europe being at the heart of an explosion of creative brilliance when modern humans arrived from Africa, the early settlers of Asia were creating their own artworks at the same time or even earlier.
Archaeologists have not ruled out that the different groups of colonising humans developed their artistic skills independently of one another, but an enticing alternative is that the modern human ancestors of both were artists before they left the African continent.
“Our discovery on Sulawesi shows that cave art was made at opposite ends of the Pleistocene Eurasian world at about the same time, suggesting these practices have deeper origins, perhaps in Africa before our species left this continent and spread across the globe,” said Dr Maxime Aubert, an archaeologist at the University of Wollongong. [Continue reading…]
Quanta Magazine: In his fourth-floor lab at Harvard University, Michael Desai has created hundreds of identical worlds in order to watch evolution at work. Each of his meticulously controlled environments is home to a separate strain of baker’s yeast. Every 12 hours, Desai’s robot assistants pluck out the fastest-growing yeast in each world — selecting the fittest to live on — and discard the rest. Desai then monitors the strains as they evolve over the course of 500 generations. His experiment, which other scientists say is unprecedented in scale, seeks to gain insight into a question that has long bedeviled biologists: If we could start the world over again, would life evolve the same way?
Many biologists argue that it would not, that chance mutations early in the evolutionary journey of a species will profoundly influence its fate. “If you replay the tape of life, you might have one initial mutation that takes you in a totally different direction,” Desai said, paraphrasing an idea first put forth by the biologist Stephen Jay Gould in the 1980s.
Desai’s yeast cells call this belief into question. According to results published in Science in June, all of Desai’s yeast varieties arrived at roughly the same evolutionary endpoint (as measured by their ability to grow under specific lab conditions) regardless of which precise genetic path each strain took. It’s as if 100 New York City taxis agreed to take separate highways in a race to the Pacific Ocean, and 50 hours later they all converged at the Santa Monica pier.
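Desai’s selection protocol can be caricatured in a few lines of code. The sketch below is not his experiment — the effect sizes, the fitness plateau and the 100 lineages are invented for illustration — it only shows how selection with diminishing returns can drive independently mutating lineages toward nearly the same endpoint, whatever random path each one takes:

```python
import random

def evolve_strain(generations=500, seed=0):
    """Toy lineage: random mutations whose benefit shrinks as
    fitness approaches a plateau (diminishing returns), with
    selection keeping only the improvements."""
    rng = random.Random(seed)
    fitness = 1.0
    for _ in range(generations):
        # candidate mutation; its effect scales down near the plateau
        delta = rng.gauss(0, 0.02) * (2.0 - fitness)
        if delta > 0:  # selection: deleterious mutants are discarded
            fitness += delta
    return fitness

# 100 independent lineages, each following its own mutational path
endpoints = [evolve_strain(seed=s) for s in range(100)]
print(f"endpoints span {min(endpoints):.3f} to {max(endpoints):.3f}")
```

Despite 100 different random histories, the endpoints cluster tightly under the plateau, which is the qualitative pattern the Science paper reports.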
The findings also suggest a disconnect between evolution at the genetic level and at the level of the whole organism. [Continue reading…]
Noah Berlatsky writes: Chance is an uncomfortable thing. So Curtis Johnson argues in Darwin’s Dice: The Idea of Chance in the Thought of Charles Darwin, and he makes a compelling case. The central controversy, and the central innovation, in Darwin’s work is not the theory of natural selection itself, according to Johnson, but Darwin’s more basic, and more innovative, turn to randomness as a way to explain natural phenomena. This application of randomness was so controversial, Johnson argues, that Darwin tried to cover it up, replacing words like “accident” and “chance” with terms like “spontaneous variation” in later editions of his work. Nonetheless, the terminological shift was cosmetic: Randomness remained, and still remains, the disturbing center of Darwin’s theories.
Johnson, a political theorist at Lewis & Clark College, explains that there are two basic kinds of chance in Darwin’s thought. The first—most familiar and least disconcerting—is chance as probability. According to the theory of natural selection, individuals with advantageous adaptations are most likely to survive. A giraffe with a longer neck has a better shot of reaching those lofty leaves and living to munch another day; a polar bear blessed with a warmer coat has a higher probability of surviving a frigid winter than one with less hair. The long-necked giraffe may not always win—it may, for example, be pulverized by a meteor before it can pass on its long-necked genes. But over time, the odds will go its way. There is randomness here, but it is controlled and predictable: It works in accordance with a rule. Natural selection makes sense.
The second kind of chance in Darwin’s work, though, is more mysterious. For natural selection to work, you need to have a range of traits to select among. That range is provided by individual variation, the fact that two different animals (whether giraffe or bear) are different from each other. Some giraffes have longer necks than others. Some bears have thicker fur than others. Why should this be? Darwin’s answer was chance. [Continue reading…]
Michael Graziano writes: About four thousand years ago, somewhere in the Middle East — we don’t know where or when, exactly — a scribe drew a picture of an ox head. The picture was rather simple: just a face with two horns on top. It was used as part of an abjad, a set of characters that represent the consonants in a language. Over thousands of years, that ox-head icon gradually changed as it found its way into many different abjads and alphabets. It became more angular, then rotated to its side. Finally it turned upside down entirely, so that it was resting on its horns. Today it no longer represents an ox head or even a consonant. We know it as the capital letter A.
The moral of this story is that symbols evolve.
Long before written symbols, even before spoken language, our ancestors communicated by gesture. Even now, a lot of what we communicate to each other is non-verbal, partly hidden beneath the surface of awareness. We smile, laugh, cry, cringe, stand tall, shrug. These behaviours are natural, but they are also symbolic. Some of them, indeed, are pretty bizarre when you think about them. Why do we expose our teeth to express friendliness? Why do we leak lubricant from our eyes to communicate a need for help? Why do we laugh?
One of the first scientists to think about these questions was Charles Darwin. In his 1872 book, The Expression of the Emotions in Man and Animals, Darwin observed that all people express their feelings in more or less the same ways. He argued that we probably evolved these gestures from precursor actions in ancestral animals. A modern champion of the same idea is Paul Ekman, the American psychologist. Ekman categorised a basic set of human facial expressions — happy, frightened, disgusted, and so on — and found that they were the same across widely different cultures. People from tribal Papua New Guinea make the same smiles and frowns as people from the industrialised USA.
Our emotional expressions seem to be inborn, in other words: they are part of our evolutionary heritage. And yet their etymology, if I can put it that way, remains a mystery. Can we trace these social signals back to their evolutionary root, to some original behaviour of our ancestors? To explain them fully, we would have to follow the trail back until we left the symbolic realm altogether, until we came face to face with something that had nothing to do with communication. We would have to find the ox head in the letter A.
I think we can do that. [Continue reading…]
Embedded in the mud, glistening green and gold and black, was a butterfly, very beautiful and very dead.
“Not a little thing like that! Not a butterfly!” cried Eckels.
It fell to the floor, an exquisite thing, a small thing that could upset balances and knock down a line of small dominoes and then big dominoes and then gigantic dominoes, all down the years across Time. Eckels’ mind whirled. It couldn’t change things. Killing one butterfly couldn’t be that important! Could it? — Ray Bradbury, A Sound of Thunder, 1952
As one of the massive and probably irreversible consequences of climate change, the melting of the Northern Hemisphere’s permafrost is not an example of the butterfly effect. Yet the discovery of a giant virus which has come back to life after 30,000 years of frozen dormancy suggests many possibilities, including some akin to those envisaged by Ray Bradbury in his famous science fiction story.
Whereas his narrative required that the reader suspend disbelief by entertaining the idea of time travel, the thawing tundra may produce a very real kind of time travel if any viruses or other microbes were to emerge as new invasive species.
Rather than being transported geographically as a result of human activity, these will spring suddenly from a distant past into an environment that may lack necessary evolutionary adaptations to accommodate their presence.
We are assured that Pithovirus sibericum poses no threat to humans — it just attacks amoebas. But our concern shouldn’t be limited to fears about the reemergence of something like an ancient strain of smallpox.
The rebirth of a pathogen that could strike phytoplankton — producers of half the world’s oxygen — would have a devastating impact on the planet.
BBC News reports: The ancient pathogen was discovered buried 30m (100ft) down in the frozen ground.
Called Pithovirus sibericum, it belongs to a class of giant viruses that were discovered 10 years ago.
These are all so large that, unlike other viruses, they can be seen under a microscope. And this one, measuring 1.5 micrometres in length, is the biggest that has ever been found.
The last time it infected anything was more than 30,000 years ago, but in the laboratory it has sprung to life once again.
Tests show that it attacks amoebas, which are single-celled organisms, but does not infect humans or other animals.
Co-author Dr Chantal Abergel, also from the CNRS, said: “It comes into the cell, multiplies and finally kills the cell. It is able to kill the amoeba – but it won’t infect a human cell.”
However, the researchers believe that other more deadly pathogens could be locked in Siberia’s permafrost.
“We are addressing this issue by sequencing the DNA that is present in those layers,” said Dr Abergel.
“This would be the best way to work out what is dangerous in there.”
The researchers say this region is under threat. Since the 1970s, the permafrost has retreated and reduced in thickness, and climate change projections suggest it will decrease further.
It has also become more accessible, and is being eyed for its natural resources.
Prof Claverie warns that exposing the deep layers could expose new viral threats.
He said: “It is a recipe for disaster. If you start having industrial explorations, people will start to move around the deep permafrost layers. Through mining and drilling, those old layers will be penetrated and this is where the danger is coming from.”
He told BBC News that ancient strains of the smallpox virus, which was declared eradicated 30 years ago, could pose a risk. [Continue reading…]
Helmholtz Centre for Environmental Research: Plants are also able to make complex decisions. At least this is what scientists from the Helmholtz Center for Environmental Research (UFZ) and the University of Göttingen have concluded from their investigations on Barberry (Berberis vulgaris), which is able to abort its own seeds to prevent parasite infestation. The results are the first ecological evidence of complex behaviour in plants. They indicate that this species has a structural memory, is able to differentiate between inner and outer conditions as well as anticipate future risks, scientists write in the renowned journal American Naturalist — the premier peer-reviewed American journal for theoretical ecology.
The European barberry or simply Barberry (Berberis vulgaris) is a species of shrub distributed throughout Europe. It is related to the Oregon grape (Mahonia aquifolium) that is native to North America and that has been spreading through Europe for years. Scientists compared both species to find a marked difference in parasite infestation: “a highly specialized species of tephritid fruit fly, whose larvae actually feed on the seeds of the native Barberry, was found to have a tenfold higher population density on its new host plant, the Oregon grape”, reports Dr. Harald Auge, a biologist at the UFZ.
This led scientists to examine the seeds of the Barberry more closely. Approximately 2000 berries were collected from different regions of Germany, examined for signs of piercing and then cut open to check for infestation by the larvae of the tephritid fruit fly (Rhagoletis meigenii). This parasite punctures the berries in order to lay its eggs inside them. If the larva is able to develop, it will often feed on all of the seeds in the berry. A special characteristic of the Barberry is that each berry usually has two seeds and that the plant is able to stop the development of its seeds in order to save its resources. This mechanism is also employed to defend it against the tephritid fruit fly. If the plant leaves an infested seed in place, the developing larva will feed on both seeds. If, however, the plant aborts the infested seed, then the parasite in that seed also dies and the second seed in the berry is saved. [Read more…]
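The plant’s trade-off can be stated as a simple expected-value calculation. The probability below is made up for illustration — the study does not report a development probability in these terms — but it shows why aborting one seed beats keeping it once the larva is likely enough to develop:

```python
def expected_seeds(abort, p_larva_develops=0.75):
    """Expected surviving seeds from a two-seed berry with one
    pierced seed. Keeping the seed risks losing both if the larva
    develops; aborting sacrifices one seed but saves the other.
    (p_larva_develops is an illustrative number, not from the study.)"""
    if abort:
        return 1.0  # infested seed lost, second seed safe
    # keep: both seeds survive only if the egg fails to develop
    return 2.0 * (1.0 - p_larva_develops)

for p in (0.25, 0.5, 0.75):
    best = "abort" if expected_seeds(True, p) > expected_seeds(False, p) else "keep"
    print(f"P(larva develops)={p}: {best}")
```

In this toy model the break-even point is a development probability of 0.5; above it, abortion is the better bet, which is the conditional behaviour the UFZ team ascribes to the plant.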
Paul Willis writes: Science is not a democracy. A consensus of evidence may be interesting, but technically it may not be significant. The thoughts of a majority of scientists don’t mean a hill of beans. It’s all about the evidence. The science is never settled.
These are refrains that I and other science communicators have been using over and over again when we turn to analysing debates and discussions based on scientific principles. I think we get torn between remaining true to the philosophical principles by which science is conducted and trying to make those principles familiar to an audience that probably does not understand them.
So let me introduce a concept that is all-too-often overlooked in science discussions, that can actually shed some light deep into the mechanisms of science and explain the anatomy of a scientific debate. It’s the phonically beautiful term ‘consilience’.
Consilience means drawing on several different lines of inquiry that converge on the same or similar conclusions. The more independent investigations you have that reach the same result, the more confidence you can have that the conclusion is correct. Moreover, if one independent investigation produces a result that is at odds with the consilience of several other investigations, that is an indication that the error probably lies in the methods of the outlying investigation, not in the conclusions of the consilience.
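The arithmetic behind consilience is easy to sketch in Bayesian terms, with invented numbers for illustration: if the lines of inquiry are genuinely independent, their likelihood ratios multiply, so several individually modest lines produce overwhelming combined support, and one dissenting line barely dents it:

```python
def posterior_odds(prior_odds, likelihood_ratios):
    """Combine independent lines of evidence by multiplying
    their likelihood ratios (Bayes' rule in odds form)."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

# five independent lines, each only 3x likelier under the hypothesis
combined = posterior_odds(1.0, [3.0] * 5)
print(combined)  # 243.0 -> posterior probability ~0.996
# add one dissenting line at 3x against the hypothesis
print(posterior_odds(1.0, [3.0] * 5 + [1 / 3.0]))  # ~81: still decisive
```

No single factor of 3 is conclusive on its own, which is exactly the structure of the Darwin example that follows.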
Let’s take an example to unpack this concept, an example where I first came across the term and it is a beautiful case of consilience at work. Charles Darwin’s On the Origin of Species is a masterpiece of consilience. Each chapter is a separate line of investigation and, within each chapter, there are numerous examples, investigations and experiments that all join together to reach the same conclusion: that life changes through time and that life has evolved on Earth. Take apart On the Origin of Species case by case and no single piece of evidence that Darwin mustered conclusively demonstrates that evolution is true. But add those cases back together and the consilience is clear: evidence from artificial breeding, palaeontology, comparative morphology and a host of other independent lines of investigation combine to confirm the same inescapable conclusion.
That was 1859. Since then yet more investigations have been added to the consilience for evolution. What’s more, these investigations within the biological and geological sciences have been joined with others from physics and chemistry as well as completely new areas of science such as genetics, radiometric dating and molecular biology. Each independent line of investigation builds the consilience that the world and the universe are extremely old and that life has evolved through unfathomable durations of time here on our home planet.
So, when a new line of investigation comes along claiming evidence and conclusions contrary to evolution, how can that be accommodated within the consilience? How does it relate to so many independent strains conjoined by a similar conclusion at odds with the newcomer? Can one piece of evidence overthrow such a huge body of work?
Such is the thinking of those pesky creationists who regularly come up with “Ah-Ha!” and “Gotcha!” factoids that apparently overturn, not just evolution, but the whole consilience of science. [Continue reading…]
Evolution explains how life changes, but it doesn’t explain how it came into existence. A young physicist at MIT has now come up with a mathematical formula which suggests that given the right set of conditions, the emergence of living forms is not merely possible; it almost seems inevitable.
Let there be light, shining on atoms, and there will eventually be life.
Quanta magazine: Why does life exist?
Popular hypotheses credit a primordial soup, a bolt of lightning and a colossal stroke of luck. But if a provocative new theory is correct, luck may have little to do with it. Instead, according to the physicist proposing the idea, the origin and subsequent evolution of life follow from the fundamental laws of nature and “should be as unsurprising as rocks rolling downhill.”
From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat. Jeremy England, a 31-year-old assistant professor at the Massachusetts Institute of Technology, has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life.
“You start with a random clump of atoms, and if you shine light on it for long enough, it should not be so surprising that you get a plant,” England said.
England’s theory is meant to underlie, rather than replace, Darwin’s theory of evolution by natural selection, which provides a powerful description of life at the level of genes and populations. “I am certainly not saying that Darwinian ideas are wrong,” he explained. “On the contrary, I am just saying that from the perspective of the physics, you might call Darwinian evolution a special case of a more general phenomenon.”
His idea, detailed in a recent paper and further elaborated in a talk he is delivering at universities around the world, has sparked controversy among his colleagues, who see it as either tenuous or a potential breakthrough, or both.
England has taken “a very brave and very important step,” said Alexander Grosberg, a professor of physics at New York University who has followed England’s work since its early stages. The “big hope” is that he has identified the underlying physical principle driving the origin and evolution of life, Grosberg said.
“Jeremy is just about the brightest young scientist I ever came across,” said Attila Szabo, a biophysicist in the Laboratory of Chemical Physics at the National Institutes of Health who corresponded with England about his theory after meeting him at a conference. “I was struck by the originality of the ideas.”
Others, such as Eugene Shakhnovich, a professor of chemistry, chemical biology and biophysics at Harvard University, are not convinced. “Jeremy’s ideas are interesting and potentially promising, but at this point are extremely speculative, especially as applied to life phenomena,” Shakhnovich said.
England’s theoretical results are generally considered valid. It is his interpretation — that his formula represents the driving force behind a class of phenomena in nature that includes life — that remains unproven. But already, there are ideas about how to test that interpretation in the lab. [Continue reading…]
(Note: Because of the misleading way in which Pew presents its own findings, multiple reports ran with a headline similar to this one in USA Today: “One-third of Americans reject human evolution.” That would appear to imply that two-thirds of Americans accept the theory of evolution that provides the foundation for evolutionary biology. However, the rejectionists the survey identifies are those who believe in the literal truth of Genesis, Adam and Eve, etc. Those who subscribe to Intelligent Design or other non-scientific Creationist evolutionary narratives are counted by Pew as believing in human evolution.)
I am not a militant atheist. I have little patience for the anti-religion campaigning engaged in by Richard Dawkins, Sam Harris, and their ilk. The idea of trying to rid the world of religion makes no more sense than trying to abolish sport.
Human beings are not governed by reason, and people who become enslaved by rationality inevitably become emotionally malformed. The human capacity to express and experience love is a capacity without which we would cease to be human. As Pascal said: “The heart has its reasons which reason knows not.”
We live in a world constructed by thought and shared ideas and our ability to make sense of life springs in large part from the fact that we continuously filter our experience through stories — stories through which we tell ourselves who we are, where we live, and why we live.
Because of this, I don’t think that science should or can be thrust down anyone’s throat…
And yet to learn that less than a third of Americans believe in evolution is deeply depressing — even if not surprising.
Those who want to put a strong political spin on the results of a new Pew Research Center poll on views about evolution are emphasizing the fact that skepticism about evolution is most concentrated among Republicans, while pointing to the 67% of Democrats who believe in evolution.
The pollsters, however, fudged the basic question by implying that it’s possible to believe in evolution without accepting its scientific basis.
Pew’s primary interest was in differentiating between those Americans who take Genesis literally and those who don’t. Those Americans who believe “a supreme being guided the evolution of living things for the purpose of creating humans and other life in the form it exists today” are counted as believing in evolution, even though they don’t believe in natural selection.
The fact that Pew chose to slice the question in this way is itself illustrative of the weak influence science has in American culture. “Evolution” is being treated as an object of belief coming in many varieties, rather than as hard, incontrovertibly proven scientific fact.
No one would conduct a poll asking Americans whether they believe the Earth revolves around the Sun, and yet when it comes to the subject of evolution, the deference to religious belief is so ingrained that evolution is treated as a completely subjective term — evolution, whatever that means to you.
Why does this matter?
The world cannot tackle climate change if America turns its back on science. And yet as a culture, America currently stands somewhere between the sixteenth and the twentieth century. Copernicus was successful, but the jury’s still out on Darwin.
If two-thirds of the population is skeptical about evolution, what chance is there of persuading them that climate change is caused by human activity?
It hardly seems coincidental that almost exactly the same number of Americans who believe in human-caused climate change also believe in evolution through natural selection. (I would hazard a guess that it’s not just the same number, but also the same Americans.)
Earth Island Journal: In your new book, Cooked, you explore the art of cooking through the elements of Fire, Water, Air, and Earth. I’m sure you love all your children equally, but of those four, which taught you the most?
Michael Pollan: Fermentation – without a doubt. I began this education about microbiology. I’ve always been interested in nature and other species, and this symbiotic relationship we have with them, and I have mostly paid attention to it in the plant world. I just had no idea of how rich our engagement with microbes was, and how invisible it is to us. I began it when I was doing the Air section and learning about sourdough cultures. But then I got into that last chapter and started learning about fermentation: how much of our food is fermented, the fact that you could cook without the use of any heat, and the fact that we are dependent on these microbes. They’re using us; we’re using them. For me that was most fascinating.
You point out that our feelings about microbes are an expression of our attitude toward the natural world.
Yeah, and our drive for control, at all costs. Microbes are frightening for a couple reasons. One is, they’re invisible. They’re an unseen enemy. And they are pathogens, I mean some of them. You know, conquering infectious disease was a tremendous achievement for our civilization. But as so often happens, we cast things in black and white. So microbes are all bad because some microbes cause disease, and we fail to realize how dependent we are on them for our health. I think we’re going to get to a point where we will discover the unit in evolution and natural selection is not the species as an individual, but what is called the “holobiont,” the group of species that travel together. And that’s what selection is acting on very often, is the super-organism of humans or cats or plants.
Plants, you know, they, too, have their own microbiome; I didn’t talk about this in the piece, but their microbiome is outside their bodies. It surrounds their roots. It’s in what’s called the rhizosphere. There’s a little ecosystem around the root of every plant, and I think we’re going to come to learn that it’s as important to plant health as our flora is to us. I think we’re going to start looking at all species as collectivities, and microbes will be the part of that. And that changes a lot. It changes how you approach agriculture. It certainly changes how you approach health. So I think we’re really on the verge of a paradigm shift around that. [Continue reading…]
David Dobbs writes: A couple of years ago, at a massive conference of neuroscientists — 35,000 attendees, scores of sessions going at any given time — I wandered into a talk that I thought would be about consciousness but proved (wrong room) to be about grasshoppers and locusts. At the front of the room, a bug-obsessed neuroscientist named Steve Rogers was describing these two creatures — one elegant, modest, and well-mannered, the other a soccer hooligan.
The grasshopper, he noted, sports long legs and wings, walks low and slow, and dines discreetly in solitude. The locust scurries hurriedly and hoggishly on short, crooked legs and joins hungrily with others to form swarms that darken the sky and descend to chew the farmer’s fields bare.
Related, yes, just as grasshoppers and crickets are. But even someone as insect-ignorant as I could see that the hopper and the locust were wildly different animals — different species, doubtless, possibly different genera. So I was quite amazed when Rogers told us that grasshopper and locust are in fact the same species, even the same animal, and that, as Jekyll is Hyde, one can morph into the other at alarmingly short notice.
Not all grasshopper species, he explained (there are some 11,000), possess this morphing power; some always remain grasshoppers. But every locust was, and technically still is, a grasshopper — not a different species or subspecies, but a sort of hopper gone mad. If faced with clues that food might be scarce, such as hunger or crowding, certain grasshopper species can transform within days or even hours from their solitudinous hopper states to become part of a maniacally social locust scourge. They can also return quickly to their original form.
In the most infamous species, Schistocerca gregaria, the desert locust of Africa, the Middle East and Asia, these phase changes (as this morphing process is called) occur when crowding spurs a temporary spike in serotonin levels, which causes changes in gene expression so widespread and powerful they alter not just the hopper’s behaviour but its appearance and form. Legs and wings shrink. Subtle camo colouring turns conspicuously garish. The brain grows to manage the animal’s newly complicated social world, which includes the fact that, if a locust moves too slowly amid its million cousins, the cousins directly behind might eat it.
How does this happen? Does something happen to their genes? Yes, but — and here was the point of Rogers’s talk — their genes don’t actually change. That is, they don’t mutate or in any way alter the genetic sequence or DNA. Nothing gets rewritten. Instead, this bug’s DNA — the genetic book with millions of letters that form the instructions for building and operating a grasshopper — gets reread so that the very same book becomes the instructions for operating a locust. Even as one animal becomes the other, as Jekyll becomes Hyde, its genome stays unchanged. Same genome, same individual, but, I think we can all agree, quite a different beast.
Transforming the hopper is gene expression — a change in how the hopper’s genes are ‘expressed’, or read out. Gene expression is what makes a gene meaningful, and it’s vital for distinguishing one species from another. We humans, for instance, share more than half our genomes with flatworms; about 60 per cent with fruit flies and chickens; 80 per cent with cows; and 99 per cent with chimps. Those genetic distinctions aren’t enough to create all our differences from those animals — what biologists call our particular phenotype, which is essentially the recognisable thing a genotype builds. This means that we are human, rather than wormlike, flylike, chickenlike, bovine, or excessively simian, less because we carry different genes from those other species than because our cells read differently our remarkably similar genomes as we develop from zygote to adult. The writing varies — but hardly as much as the reading.
This raises a question: if merely reading a genome differently can change organisms so wildly, why bother rewriting the genome to evolve? How vital, really, are actual changes in the genetic code? Do we even need DNA changes to adapt to new environments? Is the importance of the gene as the driver of evolution being overplayed?
You’ve probably noticed that these questions are not gracing the cover of Time or haunting Oprah, Letterman, or even TED talks. Yet for more than two decades they have been stirring a heated argument among geneticists and evolutionary theorists. As evidence of the power of rapid gene expression mounts, these questions might (or might not, for pesky reasons we’ll get to) begin to change not only mainstream evolutionary theory but our more everyday understanding of evolution. [Continue reading…]
Paul Davies writes: The recent announcement by a team of astronomers that there could be as many as 40 billion habitable planets in our galaxy has further fueled the speculation, popular even among many distinguished scientists, that the universe is teeming with life.
The astronomer Geoffrey W. Marcy of the University of California, Berkeley, an experienced planet hunter and co-author of the study that generated the finding, said that it “represents one great leap toward the possibility of life, including intelligent life, in the universe.”
But “possibility” is not the same as likelihood. If a planet is to be inhabited rather than merely habitable, two basic requirements must be met: the planet must first be suitable and then life must emerge on it at some stage.
What can be said about the chances of life starting up on a habitable planet? Darwin gave us a powerful explanation of how life on Earth evolved over billions of years, but he would not be drawn on the question of how life got going in the first place. “One might as well speculate about the origin of matter,” he quipped. In spite of intensive research, scientists are still very much in the dark about the mechanism that transformed a nonliving chemical soup into a living cell. But without knowing the process that produced life, the odds of its happening can’t be estimated.
When I was a student in the 1960s, the prevailing view among scientists was that life on Earth was a freak phenomenon, the result of a sequence of chemical accidents so rare that they would be unlikely to have happened twice in the observable universe. “Man at last knows he is alone in the unfeeling immensity of the universe, out of which he has emerged only by chance,” wrote the biologist Jacques Monod. Today the pendulum has swung dramatically, and many distinguished scientists claim that life will almost inevitably arise in Earthlike conditions. Yet this decisive shift in view is based on little more than a hunch, rather than an improved understanding of life’s origin. [Continue reading…]