Mysterious new genes may arise from ‘junk’ DNA

Emily Singer writes: Genes, like people, have families — lineages that stretch back through time, all the way to a founding member. That ancestor multiplied and spread, morphing a bit with each new iteration.

For most of the last 40 years, scientists thought that this was the primary way new genes were born — they simply arose from copies of existing genes. The old version went on doing its job, and the new copy became free to evolve novel functions.

Certain genes, however, seem to defy that origin story. They have no known relatives, and they bear no resemblance to any other gene. They’re the molecular equivalent of a mysterious beast discovered in the depths of a remote rainforest, a biological enigma seemingly unrelated to anything else on earth.

The mystery of where these orphan genes came from has puzzled scientists for decades. But in the past few years, a once-heretical explanation has quickly gained momentum — that many of these orphans arose out of so-called junk DNA, or non-coding DNA, the mysterious stretches of DNA between genes. “Genetic function somehow springs into existence,” said David Begun, a biologist at the University of California, Davis. [Continue reading…]

The difference between Americans who do or don’t believe in evolution

Dan Kahan writes: It’s well established that there is no meaningful correlation between what a person says he or she “believes” about evolution and having the rudimentary understanding of natural selection, random mutation, and genetic variance necessary to pass a high school biology exam (Bishop & Anderson 1990; Shtulman 2006).

There is a correlation between “belief” in evolution and possession of the kinds of substantive knowledge and reasoning skills essential to science comprehension generally.

But what the correlation is depends on religiosity: a relatively nonreligious person is more likely to say he or she “believes in” evolution, while a relatively religious person is less likely to do so, as his or her science comprehension capacity goes up (Kahan 2015).

That’s what “belief in” evolution of the sort measured in a survey item signifies: who one is, not what one knows.

Americans don’t disagree about evolution because they have different understandings of or commitments to science. They disagree because they subscribe to competing cultural worldviews that invest positions on evolution with identity-expressive significance. [Continue reading…]

Everything we thought we knew about the genome is turning out to be wrong

Claire Ainsworth writes: Ask me what a genome is, and I, like many science writers, might mutter about it being the genetic blueprint of a living creature. But then I’ll confess that “blueprint” is a lousy metaphor since it implies that the genome is two-dimensional, prescriptive and unresponsive.

Now two new books about the genome show the limitation of that metaphor for something so intricate, complex, multilayered and dynamic. Both underscore the risks of taking metaphors too literally, not just in undermining popular understanding of science, but also in trammelling scientific enquiry. They are for anyone interested in how new discoveries and controversies will transform our understanding of biology and of ourselves.

John Parrington is an associate professor in molecular and cellular pharmacology at the University of Oxford. In The Deeper Genome, he provides an elegant, accessible account of the profound and unexpected complexities of the human genome, and shows how many ideas developed in the 20th century are being overturned.

Take DNA. It’s no simple linear code, but an intricately wound, 3D structure that coils and uncoils as its genes are read and spliced in myriad ways. Forget genes as discrete, protein-coding “beads on a string”: only a tiny fraction of the genome codes for proteins, and anyway, no one knows exactly what a gene is any more. [Continue reading…]

The dysevolution of humanity

Jeff Wheelwright writes: I sat in my padded desk chair, hunched over, alternately entering notes on my computer and reading a book called The Story of the Human Body. It was the sort of book guaranteed to make me increasingly, uncomfortably aware of my own body. I squirmed to relieve an ache in my lower back. When I glanced out the window, the garden looked fuzzy. Where were my glasses? My toes felt hot and itchy: My athlete’s foot was flaring up again.

I returned to the book. “This chapter focuses on just three behaviors … that you are probably doing right now: wearing shoes, reading, and sitting.” OK, I was. What could be more normal?

According to the author, a human evolutionary biologist at Harvard named Daniel Lieberman, shoes, books and padded chairs are not normal at all. My body had good reason to complain because it wasn’t designed for these accessories. Too much sitting caused back pain. Too much focusing on books and computer screens at a young age fostered myopia. Enclosed, cushioned shoes could lead to foot problems, including bunions, fungus between the toes and plantar fasciitis, an inflammation of the tissue below weakened arches.

Those are small potatoes compared with obesity, Type 2 diabetes, osteoporosis, heart disease and many cancers also on the rise in the developed and developing parts of the world. These serious disorders share several characteristics: They’re chronic, noninfectious, aggravated by aging and strongly influenced by affluence and culture. Modern medicine has come up with treatments for them, but not solutions; the deaths and disabilities continue to climb.

An evolutionary perspective is critical to understanding the body’s pitfalls in a time of plenty, Lieberman suggests. [Continue reading…]

Were we happier in the Stone Age?

Yuval Noah Harari writes: Over the last decade, I have been writing a history of humankind, tracking down the transformation of our species from an insignificant African ape into the master of the planet. It was not easy to understand what turned Homo sapiens into an ecological serial killer; why men dominated women in most human societies; or why capitalism became the most successful religion ever. It wasn’t easy to address such questions because scholars have offered so many different and conflicting answers. In contrast, when it came to assessing the bottom line – whether thousands of years of inventions and discoveries have made us happier – it was surprising to realise that scholars have neglected even to ask the question. This is the largest lacuna in our understanding of history.

Though few scholars have studied the long-term history of happiness, almost everybody has some idea about it. One common preconception – often termed “the Whig view of history” – sees history as the triumphal march of progress. Each passing millennium witnessed new discoveries: agriculture, the wheel, writing, print, steam engines, antibiotics. Humans generally use newly found powers to alleviate miseries and fulfil aspirations. It follows that the exponential growth in human power must have resulted in an exponential growth in happiness. Modern people are happier than medieval people, and medieval people were happier than stone age people.

But this progressive view is highly controversial. Though few would dispute the fact that human power has been growing since the dawn of history, it is far less clear that power correlates with happiness. The advent of agriculture, for example, increased the collective power of humankind by several orders of magnitude. Yet it did not necessarily improve the lot of the individual. For millions of years, human bodies and minds were adapted to running after gazelles, climbing trees to pick apples, and sniffing here and there in search of mushrooms. Peasant life, in contrast, included long hours of agricultural drudgery: ploughing, weeding, harvesting and carrying water buckets from the river. Such a lifestyle was harmful to human backs, knees and joints, and numbing to the human mind.

In return for all this hard work, peasants usually had a worse diet than hunter-gatherers, and suffered more from malnutrition and starvation. Their crowded settlements became hotbeds for new infectious diseases, most of which originated in domesticated farm animals. Agriculture also opened the way for social stratification, exploitation and possibly patriarchy. From the viewpoint of individual happiness, the “agricultural revolution” was, in the words of the scientist Jared Diamond, “the worst mistake in the history of the human race”.

The case of the agricultural revolution is not a single aberration, however. The march of progress from the first Sumerian city-states to the empires of Assyria and Babylonia was accompanied by a steady deterioration in the social status and economic freedom of women. The European Renaissance, for all its marvellous discoveries and inventions, benefited few people outside the circle of male elites. The spread of European empires fostered the exchange of technologies, ideas and products, yet this was hardly good news for millions of Native Americans, Africans and Aboriginal Australians.

The point need not be elaborated further. Scholars have thrashed the Whig view of history so thoroughly that the only question left is: why do so many people still believe in it? [Continue reading…]

Resurrecting ancient proteins to illuminate the origins of life

Emily Singer writes: About 4 billion years ago, molecules began to make copies of themselves, an event that marked the beginning of life on Earth. A few hundred million years later, primitive organisms began to split into the different branches that make up the tree of life. In between those two seminal events, some of the greatest innovations in existence emerged: the cell, the genetic code and an energy system to fuel it all. All three of these are essential to life as we know it, yet scientists know disappointingly little about how any of these remarkable biological innovations came about.

“It’s very hard to infer even the relative ordering of evolutionary events before the last common ancestor,” said Greg Fournier, a geobiologist at the Massachusetts Institute of Technology. Cells may have appeared before energy metabolism, or perhaps it was the other way around. Without fossils or DNA preserved from organisms living during this period, scientists have had little data to work from.

Fournier is leading an attempt to reconstruct the history of life in those evolutionary dark ages — the hundreds of millions of years between the time when life first emerged and when it split into what would become the endless tangle of existence.

He is using genomic data from living organisms to infer the DNA sequence of ancient genes as part of a growing field known as paleogenomics. In research published online in March in the Journal of Molecular Evolution, Fournier showed that the last chemical letter added to the code was a molecule called tryptophan — an amino acid most famous for its presence in turkey dinners. The work supports the idea that the genetic code evolved gradually. [Continue reading…]

The man who drank cholera and launched the yogurt craze

Lina Zeldovich writes: What do Jamie Lee Curtis, gut bacteria, and a long-forgotten Russian scientist have in common? Why, yogurt, of course. But wait, the answer is not that easy. Behind it stretches a tale that shows you can never predict cultural influence. It wends its way through the Pasteur Institute, the Nobel Prize, one of the hottest fields of scientific research today, the microbiome, and one of the trendiest avenues in nutrition, probiotics. It all began in the 19th century with a hyperactive kid in Russia who had a preternatural ability to connect dots where nobody saw dots at all.

When Ilya Metchnikoff was 8 and running around on his parents’ Panassovka estate in Little Russia, now Ukraine, he was making notes on the local flora like a junior botanist. He gave science lectures to his older brothers and local kids, whose attendance he ensured by paying them from his pocket money. Metchnikoff earned the nickname “Quicksilver” because he was in constant motion, always wanting to see, taste, and try everything, from studying how his father played card games to learning to sew and embroider with the maids. His wife later wrote in The Life of Elie Metchnikoff that Metchnikoff asked the “queerest” questions, often exasperating his caretakers. “He could only be kept quiet when his curiosity was awakened by observation of some natural objects such as an insect or a butterfly.”

At 16, Metchnikoff borrowed a microscope from a university professor to study the lower organisms. Darwin’s On the Origin of Species shaped his comparative approach to science during his university years — he viewed all organisms, and the physiological processes that took place in them, as interconnected and related.

That ability led him to the discovery of a particular cell and enabled him to link digestive processes in primitive creatures to the human body’s immune defenses. In lower organisms, which lack an abdominal cavity and intestines, digestion is accomplished by a particular type of cell — mobile mesodermal cells — that move around engulfing and dissolving food particles. While staring at mesodermal cells inside transparent starfish larvae, Metchnikoff, 37 at the time, had a thought. “It struck me that similar cells might serve in the defense of the organisms against intruders,” he wrote. He fetched a few rose thorns from the garden and stuck them into the larvae. If his hypothesis was correct, the larva’s body would recognize the thorns as intruders and mesodermal cells would aggregate around the thorns in an attempt to gobble them up. As Metchnikoff expected, the mesodermal cells surrounded the thorns, proving his theory. He named his cells phagocytes, which in Greek means “devouring cells,” and likened them to an “army hurling itself upon the enemy.” [Continue reading…]

Inside the new social science of genetics

David Dobbs writes: A few years ago, Gene Robinson, of Urbana, Illinois, asked some associates in southern Mexico to help him kidnap some 1,000 newborns. For their victims they chose bees. Half were European honeybees, Apis mellifera ligustica, the sweet-tempered kind most beekeepers raise. The other half were ligustica’s genetically close cousins, Apis mellifera scutellata, the African strain better known as killer bees. Though the two subspecies are nearly indistinguishable, the latter defend territory far more aggressively. Kick a European honeybee hive and perhaps a hundred bees will attack you. Kick a killer bee hive and you may suffer a thousand stings or more. Two thousand will kill you.

Working carefully, Robinson’s conspirators — researchers at Mexico’s National Center for Research in Animal Physiology, in the high resort town of Ixtapan de la Sal — jiggled loose the lids from two African hives and two European hives, pulled free a few honeycomb racks, plucked off about 250 of the youngest bees from each hive, and painted marks on the bees’ tiny backs. Then they switched each set of newborns into the hive of the other subspecies.

Robinson, back in his office at the University of Illinois at Urbana-Champaign’s Department of Entomology, did not fret about the bees’ safety. He knew that if you move bees to a new colony in their first day, the colony accepts them as its own. Nevertheless, Robinson did expect the bees would be changed by their adoptive homes: He expected the killer bees to take on the European bees’ moderate ways and the European bees to assume the killer bees’ more violent temperament. Robinson had discovered this in prior experiments. But he hadn’t yet figured out how it happened.

He suspected the answer lay in the bees’ genes. He didn’t expect the bees’ actual DNA to change: Random mutations aside, genes generally don’t change during an organism’s lifetime. Rather, he suspected the bees’ genes would behave differently in their new homes — wildly differently.

This notion was both reasonable and radical. Scientists have known for decades that genes can vary their level of activity, as if controlled by dimmer switches. Most cells in your body contain every one of your 22,000 or so genes. But in any given cell at any given time, only a tiny percentage of those genes is active, sending out chemical messages that affect the activity of the cell. This variable gene activity, called gene expression, is how your body does most of its work. [Continue reading…]

The genetic code is less like a blueprint than a first draft

Nessa Carey writes: When President Obama delivered a speech at MIT in 2009, he used a common science metaphor: “We have always been about innovation,” he said. “We have always been about discovery. That’s in our DNA.” Deoxyribonucleic acid, the chemical into which our genes are encoded, has become the metaphor of choice for a whole constellation of ideas about essence and identity. A certain mystique surrounds it. As Evelyn Fox Keller argues in her book The Century of the Gene, the genome is, in the popular imagination at least, the secret of life, the holy grail. It is a master builder, the ultimate computer program, and a modern-day echo of the soul, all wrapped up in one. This fantasy does not sit easily, however, with geneticists who have grown more aware over the last several decades that the relationship between genes and biological traits is much less than certain.

The popular understanding of DNA as a blueprint for organisms, with a one-to-one correspondence between genes and traits (called phenotypes), is the legacy of the early history of genetics. The term “gene” was coined in 1909 to refer to abstract units of inheritance, predating the discovery of DNA by forty years. Biologists came to think of genes like beads on a string that lined up neatly into chromosomes, with each gene determining a single phenotype. But, while some genes do correspond to traits in a straightforward way, as in eye color or blood group, most phenotypes are far more complex, set in motion by many different genes as well as by the environment in which the organism lives.

It turns out that the genetic code is less like a blueprint and more like a movie script, subject to revision and reinterpretation by a director. This process is called epigenetic modification (“epi” meaning “above” or “in addition to”). Just as a script can be altered with crossed-out words, sentences or scenes, epigenetic editing allows entire sections of DNA to be activated or de-activated. Genes can be as finely tuned as actors responding to stage directions to shout, whisper, or cackle. [Continue reading…]

Darwin learned more about evolution from plants than from Galapagos finches

Henry Nicholls writes: When the HMS Beagle dropped anchor on San Cristobal, the easternmost island in the Galapagos archipelago, in September 1835, the ship’s naturalist Charles Darwin eagerly went ashore to gather samples of the insects, birds, reptiles, and plants living there. At first, he didn’t think much of the arid landscape, which appeared to be “covered by stunted, sun-burnt brushwood…as leafless as our trees during winter.” But this did not put him off. By the time the Beagle left these islands some five weeks later, he had amassed a spectacular collection of Galapagos plants.

It is fortunate that he took such trouble. Most popular narratives of Darwin and the Galapagos concentrate on the far more celebrated finches or the giant tortoises. Yet when he finally published On the Origin of Species almost 25 years later, Darwin made no mention of these creatures. In his discussion of the Galapagos, he dwelt almost exclusively on the islands’ plants.

By the early 19th century, there was increasing interest in what we now refer to as biogeography, the study of the distribution of species around the globe. Many people still imagined that God had been involved in the creation of species, putting fully formed versions down on Earth that continued to reproduce themselves, dispersing from a divine “center of creation” to occupy their current habitats. To explain how the plants and animals reached far-flung places such as the isolated Galapagos, several naturalists imagined that there had to have been land bridges, long-since subsided, that had once connected them to a continent. But in the wake of the Beagle voyage, the collection of Galapagos plants suggested an alternate scenario.

Even if there had once been a land bridge to the islands, it could not account for the fact that half of the plant species Darwin collected were unique to the Galapagos, and that most of them were particular to just one island. “I never dreamed that islands, about fifty or sixty miles apart, and most of them in sight of each other, formed of precisely the same rocks, placed under a quite similar climate, rising to a nearly equal height, would have been differently tenanted,” wrote Darwin in his Journal of Researches. His observations could be best explained if species were not fixed in nature but somehow changed as the seeds traveled to different locations. [Continue reading…]

Republicans still afraid of evolution

Mark Oppenheimer writes: On Wednesday, in an interview in London, Gov. Scott Walker of Wisconsin, a potential Republican presidential candidate, sidestepped the question of whether he believed in evolution.

“I’m going to punt on that one,” he said to an audience at a research organization in London, which he was visiting for a trade mission. “I’m here to talk about trade, not to pontificate on other issues. I love the evolution of trade in Wisconsin.”

Mr. Walker’s response was not all that surprising — evolution is a sensitive issue for the evangelical Christian base of the Republican Party, and presidential candidates have had to tread carefully around it.

The theory of evolution may be supported by a consensus of scientists, but none of the likely Republican candidates for 2016 seem to be convinced. Former Gov. Jeb Bush of Florida said it should not be taught in schools. Former Gov. Mike Huckabee of Arkansas is an outright skeptic. Senator Ted Cruz of Texas will not talk about it. When asked, in 2001, what he thought of the theory, Gov. Chris Christie of New Jersey said, “None of your business.”

After Mr. Walker’s response, the interviewer in London, an incredulous Justin Webb of the BBC, said to the governor: “Any British politician, right or left wing, would laugh and say, ‘Yes, of course evolution is true.’ ” [Continue reading…]

Charles Darwin, natural novelist

Adam Gopnik writes: Darwin’s Delay is by now nearly as famous as Hamlet’s, and involves a similar cast of characters: a family ghost, an unhappy lover, and a lot of men digging up old bones. Although it ends with vindication and fame, rather than with slaughter and self-knowledge, it was resolved by language, too — by inner soliloquy forcing itself out into the world, except that in this case the inner voice had the certainties and the outer one the hesitations.

The delay set in between Darwin’s first intimations of his Great Idea, the idea of evolution by natural selection, in the eighteen-thirties (he was already toying with it during his famous voyage on the H.M.S. Beagle), and the publication of “On the Origin of Species,” in 1859. By legend, the two events were in the long run one: Darwin saw the adapted beaks of his many finches, brooded on what they meant, came up with a theory, sought evidence for it, and was prodded into print at last by an unwelcome letter from an obscure naturalist named Alfred Russel Wallace, who had managed to arrive at the same idea.

It seems to have been more complicated than that. One reason Darwin spent so long getting ready to write his masterpiece without getting it written was that he knew what it would mean for faith and life, and, as Janet Browne’s now standard biography makes plain, he was frightened about being attacked by the powerful and the bigoted. Darwin was not a brave man — had the Inquisition been in place in Britain, he never would have published — but he wasn’t a humble man or a cautious thinker, either. He sensed that his account would end any intellectually credible idea of divine creation, and he wanted to break belief without harming the believer, particularly his wife, Emma, whom he loved devotedly and with whom he had shared, before he sat down to write, a private tragedy that seemed tolerable to her only through faith. The problem he faced was also a rhetorical one: how to say something that had never been said before in a way that made it sound like something everybody had always known — how to make an idea potentially scary and subversive sound as sane and straightforward as he believed it to be.

He did it, and doing it was, in some part, a triumph of style. Darwin is the one indisputably great scientist whose scientific work is still read by amateurs. [Continue reading…]

Ancient skull sheds light on human dispersal out of Africa

Ivan Semeniuk reports: Francesco Berna still remembers his first visit to Manot Cave, accidentally discovered in 2008 on a ridge in northern Israel. A narrow passage steeply descends into darkness. It then opens onto a 60-metre-long cavern with side chambers, all dramatically ornamented with stalactites and stalagmites. “It’s a spectacular cave,” said Dr. Berna, a geoarcheologist at Simon Fraser University in Burnaby, B.C. “It’s basically untouched.”

Now Manot Cave has yielded a tantalizing sign of humanity’s initial emergence out of Africa and a possible forerunner of the first modern humans in Europe, an international team of researchers that includes Dr. Berna said on Wednesday.

The find also establishes the Levant region (including Israel, Lebanon and part of Syria) as a plausible setting where our species interbred with its Neanderthal cousins.

The team’s key piece of evidence is a partial human skull found during the initial reconnaissance of the cave.

Based on its features and dimensions, the skull is unquestionably that of an anatomically modern human, the first such find in the region. The individual would probably have looked like the first Homo sapiens that appeared in Africa about 200,000 years ago and been physically indistinguishable from humans today.

“He or she would look very modern. With a tie on, you would not be able to tell the difference,” said Israel Hershkovitz, a biological anthropologist at Tel Aviv University and lead author of a paper published this week in the journal Nature that documents the Manot Cave find.

The age of the fossil is the crucial detail. The team’s analysis shows it is about 55,000 years old. That is more recent than the fragmentary remains of some not-so-modern-looking humans that drifted into the region at an earlier stage. But it coincides exactly with a period when a wetter climate may have opened the door to the first modern human migration out of Africa.

Fossils of modern humans that are only slightly less old than the Manot Cave skull have been found in the Czech Republic and Romania, making the new find a potential forerunner of the first Europeans. [Continue reading…]

Much of the reporting on these findings makes reference to “the first Europeans,” and even though anthropologists might be clear about what they mean when they use the term Europe, they might consider avoiding it, given the common meaning usually attached to the word.

Indeed, the lead researcher cited above, Israel Hershkovitz, illustrates the problem as he reinforces cultural stereotypes by implying that a human has fully evolved once he dons the symbol of European, masculine power: a necktie. The irony is compounded by the fact that he and his team were trumpeting the significance of their discovery of a woman’s skull.

(No doubt many Europeans and others with European affectations have been disturbed this week to see Greece’s new prime minister, in the birthplace of democracy, assuming power without a necktie.)

The Oxford archeologist Barry Cunliffe has referred to the region of land only recently dubbed “Europe” as “the westerly excrescence of the continent of Asia.”

Europeans might object to the suggestion that they inhabit an excrescence — especially since the term suggests an abnormality — but in terms of continental topography, it points to Europe’s unique feature: its eastern boundaries have always been elastic and somewhat arbitrary.

More importantly, when it comes to human evolution, framing this story in terms of an advance into Europe revives so many echoes of nineteenth-century racism.

It cannot be overstated that the first Europeans were not European.

Europe is an idea that has only been around for a few hundred years, during which time it has been under constant revision.

Migration is also a misleading term since it evokes images of migrants: people who travel vast distances to inhabit new lands.

Human dispersal most likely involved rather short hops, one generation at a time, interspersed with occasional actual migrations driven by events like floods or famine.

The strange inevitability of evolution

Philip Ball writes: Is the natural world creative? Just take a look around it. Look at the brilliant plumage of tropical birds, the diverse pattern and shape of leaves, the cunning stratagems of microbes, the dazzling profusion of climbing, crawling, flying, swimming things. Look at the “grandeur” of life, the “endless forms most beautiful and most wonderful,” as Darwin put it. Isn’t that enough to persuade you?

Ah, but isn’t all this wonder simply the product of the blind fumbling of Darwinian evolution, that mindless machine which takes random variation and sieves it by natural selection? Well, not quite. You don’t have to be a benighted creationist, nor even a believer in divine providence, to argue that Darwin’s astonishing theory doesn’t fully explain why nature is so marvelously, endlessly inventive. “Darwin’s theory surely is the most important intellectual achievement of his time, perhaps of all time,” says evolutionary biologist Andreas Wagner of the University of Zurich. “But the biggest mystery about evolution eluded his theory. And he couldn’t even get close to solving it.”

What Wagner is talking about is how evolution innovates: as he puts it, “how the living world creates.” Natural selection supplies an incredibly powerful way of pruning variation into effective solutions to the challenges of the environment. But it can’t explain where all that variation came from. As the biologist Hugo de Vries wrote in 1905, “natural selection may explain the survival of the fittest, but it cannot explain the arrival of the fittest.” Over the past several years, Wagner and a handful of others have been starting to understand the origins of evolutionary innovation. Thanks to their findings so far, we can now see not only how Darwinian evolution works but why it works: what makes it possible. [Continue reading…]

Massive genetic effort confirms bird songs related to human speech

Scientific American reports: Songbirds stutter, babble when young, become mute if parts of their brains are damaged, learn how to sing from their elders and can even be “bilingual” — in other words, songbirds’ vocalizations share a lot of traits with human speech. However, that similarity goes beyond behavior, researchers have found. Even though humans and birds are separated by millions of years of evolution, the genes that give us our ability to learn speech have much in common with those that lend birds their warble.

A four-year-long effort involving more than 100 researchers around the world put the power of nine supercomputers into analyzing the genomes of 48 species of birds. The results, published this week in a package of eight articles in Science and 20 papers in other journals, provide the most complete picture of the bird family tree thus far. The project has also uncovered genetic signatures in song-learning bird brains that have surprising similarities to the genetics of speech in humans, a finding that could help scientists study human speech.

The analysis suggests that most modern birds arose in an impressive speciation event, a “big bang” of avian diversification, in the 10 million years immediately following the extinction of dinosaurs. This period is more recent than posited in previous genetic analyses, but it lines up with the fossil record. By delving deeper into the rich data set, research groups identified when birds lost their teeth, investigated the relatively slow evolution of crocodiles and outlined the similarities between birds’ and humans’ vocal learning ability, among other findings. [Continue reading…]

Co-operation

Patrick Bateson writes: I am disturbed by the way we have created a social environment in which so much emphasis is laid on competition – on forging ahead while trampling on others. The ideal of social cooperation has come to be treated as high-sounding flabbiness, while individual selfishness is regarded as the natural and sole basis for a realistic approach to life. The image of the struggle for existence lies at the back of it, seriously distorting the view we have of ourselves and wrecking mutual trust.

The fashionable philosophy of individualism draws its respectability in part from an appeal to biology and specifically to the Darwinian theory of evolution by natural selection. Now, Darwin’s theory remains the most powerful explanation for the way that each plant and animal evolved so that it is exquisitely adapted to its environment. The theory works just as well for behaviour as it does for anatomy. Individual animals differ in the way they behave. Those that behave in a manner that is better suited to the conditions in which they live are more likely to survive. Finally, if their descendants resemble them in terms of behaviour, then in the course of evolution, the better adapted forms of behaviour will replace those that are not so effective in keeping the individual alive.

It is the Darwinian concept of differential survival that has been picked up and used so insistently in political rhetoric. Biology is thought to be all about competition – and that supposedly means constant struggle. This emphasis has had an insidious effect on the public mind and has encouraged the belief in individual selfishness and in confrontation. Competition is now widely seen as the mainspring of human activity, at least in Western countries. Excellence in the universities and in the arts is thought to be driven by the same ruthless process that supposedly works so well on the sports field or in the marketplace, and they all have a lot in common with what supposedly happens in the jungle. The image of selfish genes competing with each other in the course of evolution has fused imperceptibly with the notion of selfish individuals competing with each other in the course of their lifetimes. Individuals only thrive by winning. The argument has become so much a part of conventional belief that it is hard at first to see what is wrong with it.

To put it bluntly, thought has been led seriously astray by the rhetoric. [Continue reading…]

The thoughts of our ancient ancestors

The discovery of what appear to have been deliberately etched markings made by a human ancestor, Homo erectus, on the surface of a shell calls for a reconsideration of assumptions that have been made about the origins of abstract thought.

While the meaning of these zigzag markings will most likely remain forever unknown, it can reasonably be inferred that for the individual who created them, the marks had some significance.

In a report in Nature, Josephine Joordens, a biologist at Leiden University whose team discovered the markings, says:

“We’ve looked at all possibilities, but in the end we are really certain that this must have been made by an agent who did a very deliberate action with a very sharp implement,” says Joordens. Her team tried replicating the pattern on fresh and fossilized shells, “and that made us realize how difficult it really was”, she says.

Saying much more about the engraving is tricky. “If you don’t know the intention of the person who made it, it’s impossible to call it art,” says Joordens.

“But on the other hand, it is an ancient drawing. It is a way of expressing yourself. What was meant by the person who did this, we simply don’t know,” she adds. “It could have been to impress his girlfriend, or to doodle a bit, or to mark the shell as his own property.”

Clive Finlayson, a zoologist at the Gibraltar Museum who was part of the team that described cross-hatch patterns linked to Neanderthals, is also agnostic about whether to call the H. erectus doodles art. What is more important, he says, is the growing realization that abilities such as abstract thinking, once ascribed to only H. sapiens, were present in other archaic humans, including, now, their ancestors.

“I’ve been suggesting increasingly strongly that a lot of these things that are meant to be modern human we’re finding in other hominids,” he says. “We really need to revisit these concepts and take stock.”

Palaeoanthropology, by necessity, is a highly speculative discipline — therein lies both its strength and its weakness.

The conservatism of hard science recoils at the idea that some scratches on a single shell amount to sufficient evidence to prompt a reconsideration of the origins of the human mind, and yet to refrain from such speculation seems like an effort to restrain the powers of the very thing we are trying to understand.

Rationally, there is as much reason to assume that abstract thinking long predates modern humans (in which case finding no evidence of it would leave us agnostic about its presence or absence) as there is to assume that at some juncture it was born.

My inclination is to believe that any living creature that has some capacity to construct a neurological representation of its surroundings is, by that very capacity, employing something akin to abstract thinking.

This ability for the inner to mirror the outer has no doubt evolved, becoming progressively more complex and more deeply abstract, and yet mind, if defined as world-mirroring, seems to have been born when life first moved.

Long before we learned how to make wine, our ancestors acquired a taste for rotten fruit

Live Science: Human ancestors may have begun evolving the knack for consuming alcohol about 10 million years ago, long before modern humans began brewing booze, researchers say.

The ability to break down alcohol likely helped human ancestors make the most of rotting, fermented fruit that fell onto the forest floor, the researchers said. Therefore, knowing when this ability developed could help researchers figure out when these human ancestors began moving to life on the ground, as opposed to living mostly in trees, as earlier human ancestors had.

“A lot of aspects about the modern human condition — everything from back pain to ingesting too much salt, sugar and fat — goes back to our evolutionary history,” said lead study author Matthew Carrigan, a paleogeneticist at Santa Fe College in Gainesville, Florida. “We wanted to understand more about the modern human condition with regards to ethanol,” he said, referring to the kind of alcohol found in rotting fruit and that’s also used in liquor and fuel. [Continue reading…]
