Eric D. Green, James D. Watson & Francis S. Collins write: Twenty-five years ago, the newly created US National Center for Human Genome Research (now the National Human Genome Research Institute; NHGRI), which the three of us have each directed, joined forces with US and international partners to launch the Human Genome Project (HGP). What happened next represents one of the most historically significant scientific endeavours: a 13-year quest to sequence all three billion base pairs of the human genome.
Even just a few years ago, discussions surrounding the HGP focused mainly on what insights the project had brought or would bring to our understanding of human disease. Only now is it clear that, as well as dramatically accelerating biomedical research, the HGP initiated a new way of doing science.
As biology’s first large-scale project, the HGP paved the way for numerous consortium-based research ventures. The NHGRI alone has been involved in launching more than 25 such projects since 2000. These have presented new challenges to biomedical research — demanding, for instance, that diverse groups from different countries and disciplines come together to share and analyse vast data sets. [Continue reading…]
The Independent reports: The most comprehensive study of the human genome has found that a sizeable minority of people are walking around with some of their genes missing, without any apparent ill effects.
A project to sequence and analyse the entire genetic code of more than 2,500 people drawn from 26 different ethnic populations from around the world has revealed that some genes do not seem to be as essential for health and life as previously believed.
The finding is just one to have emerged from the 1,000 Genomes Project, set up in 2008 to study genetic variation in at least this number of people in order to understand the variety of DNA types within the human population, the researchers said. [Continue reading…]
From the earliest of times, philosophers and scientists have tried to understand the relationship between animate and inanimate matter. But the origin of life remains one of the major scientific riddles to be solved.
The building blocks of life as we know it essentially consist of four groups of chemicals: proteins, nucleic acids, lipids (fats) and carbohydrates. There was much excitement about the possibility of finding amino acids (the ingredients for proteins) on comets or distant planets because some scientists believe that life on Earth, or at least its building blocks, may have originally come from outer space and been deposited by meteorites.
But there are now extensive examples of how natural processes on Earth can convert simple molecules into these building blocks. Scientists have demonstrated in the lab how to make amino acids, simple sugars, lipids and even nucleotides – the basic units of DNA – from very simple chemicals, under conditions that could have existed on the early Earth. What still eludes them is the point in the process when a chemical stew becomes an organism. How did the first lifeforms become alive?
Scientists recently suggested that the Earth’s sixth mass extinction has begun. As terrifying as that sounds, surely humans are too smart and too important to get wiped out? Palaeontologists have long tried to shed light on this question by looking for general rules that might predict the survival of a species.
While this is not exactly a straightforward exercise, research so far indicates that the odds are not in our favour.
As a wildlife veterinarian, I often get asked about bats. I like bats, and I am always eager to talk about how interesting they are. Unfortunately the question is often not about biology but instead “what should I do about the ones in my roof?”.
With some unique talents and remarkable sex lives, bats are actually one of the most interesting, diverse and misunderstood groups of animals. Contrary to popular belief, they are beautiful creatures. Not necessarily in the cuddly, human-like sense – although some fruit bats with doe-like brown eyes and button noses could be considered so – but they are beautifully designed.
This couldn’t be illustrated better than by the discovery of the oldest known complete bat fossil, more than 53 million years old yet with a similar wing design to those flying around today. To put it in perspective, 50m years ago our ancestors were still swinging from the trees and would certainly not be recognised as human. But even then bats already had the combination of thin, long forearms and fingers covered by an extremely thin, strong membrane, which allowed them to master the art of powered, agile flight.
Soon afterwards, fossils record another game-changing adaptation in the evolution of most bats, and that is the ability to accurately locate prey using sound (what we call echolocation). These two adaptations early in their history gave bats an evolutionary edge compared to some other mammals, and allowed them to diversify into almost all habitats, on every continent except Antarctica.
Emily Singer writes: Genes, like people, have families — lineages that stretch back through time, all the way to a founding member. That ancestor multiplied and spread, morphing a bit with each new iteration.
For most of the last 40 years, scientists thought that this was the primary way new genes were born — they simply arose from copies of existing genes. The old version went on doing its job, and the new copy became free to evolve novel functions.
Certain genes, however, seem to defy that origin story. They have no known relatives, and they bear no resemblance to any other gene. They’re the molecular equivalent of a mysterious beast discovered in the depths of a remote rainforest, a biological enigma seemingly unrelated to anything else on earth.
The mystery of where these orphan genes came from has puzzled scientists for decades. But in the past few years, a once-heretical explanation has quickly gained momentum — that many of these orphans arose out of so-called junk DNA, or non-coding DNA, the mysterious stretches of DNA between genes. “Genetic function somehow springs into existence,” said David Begun, a biologist at the University of California, Davis. [Continue reading…]
Dan Kahan writes: It’s well established that there is no meaningful correlation between what a person says he or she “believes” about evolution and having the rudimentary understanding of natural selection, random mutation, and genetic variance necessary to pass a high school biology exam (Bishop & Anderson 1990; Shtulman 2006).
There is a correlation between “belief” in evolution and possession of the kinds of substantive knowledge and reasoning skills essential to science comprehension generally.
But what the correlation is depends on religiosity: a relatively nonreligious person is more likely to say he or she “believes in” evolution, but a relatively religious person less likely to do so, as their science comprehension capacity goes up (Kahan 2015).
That’s what “belief in” evolution of the sort measured in a survey item signifies: who one is, not what one knows.
Americans don’t disagree about evolution because they have different understandings of or commitments to science. They disagree because they subscribe to competing cultural worldviews that invest positions on evolution with identity-expressive significance. [Continue reading…]
Claire Ainsworth writes: Ask me what a genome is, and I, like many science writers, might mutter about it being the genetic blueprint of a living creature. But then I’ll confess that “blueprint” is a lousy metaphor since it implies that the genome is two-dimensional, prescriptive and unresponsive.
Now two new books about the genome show the limitation of that metaphor for something so intricate, complex, multilayered and dynamic. Both underscore the risks of taking metaphors too literally, not just in undermining popular understanding of science, but also in trammelling scientific enquiry. They are for anyone interested in how new discoveries and controversies will transform our understanding of biology and of ourselves.
John Parrington is an associate professor in molecular and cellular pharmacology at the University of Oxford. In The Deeper Genome, he provides an elegant, accessible account of the profound and unexpected complexities of the human genome, and shows how many ideas developed in the 20th century are being overturned.
Take DNA. It’s no simple linear code, but an intricately wound, 3D structure that coils and uncoils as its genes are read and spliced in myriad ways. Forget genes as discrete, protein-coding “beads on a string”: only a tiny fraction of the genome codes for proteins, and anyway, no one knows exactly what a gene is any more. [Continue reading…]
Jeff Wheelwright writes: I sat in my padded desk chair, hunched over, alternately entering notes on my computer and reading a book called The Story of the Human Body. It was the sort of book guaranteed to make me increasingly, uncomfortably aware of my own body. I squirmed to relieve an ache in my lower back. When I glanced out the window, the garden looked fuzzy. Where were my glasses? My toes felt hot and itchy: My athlete’s foot was flaring up again.
I returned to the book. “This chapter focuses on just three behaviors … that you are probably doing right now: wearing shoes, reading, and sitting.” OK, I was. What could be more normal?
According to the author, a human evolutionary biologist at Harvard named Daniel Lieberman, shoes, books and padded chairs are not normal at all. My body had good reason to complain because it wasn’t designed for these accessories. Too much sitting caused back pain. Too much focusing on books and computer screens at a young age fostered myopia. Enclosed, cushioned shoes could lead to foot problems, including bunions, fungus between the toes and plantar fasciitis, an inflammation of the tissue below weakened arches.
Those are small potatoes compared with obesity, Type 2 diabetes, osteoporosis, heart disease and many cancers also on the rise in the developed and developing parts of the world. These serious disorders share several characteristics: They’re chronic, noninfectious, aggravated by aging and strongly influenced by affluence and culture. Modern medicine has come up with treatments for them, but not solutions; the deaths and disabilities continue to climb.
An evolutionary perspective is critical to understanding the body’s pitfalls in a time of plenty, Lieberman suggests. [Continue reading…]
Yuval Noah Harari writes: Over the last decade, I have been writing a history of humankind, tracking down the transformation of our species from an insignificant African ape into the master of the planet. It was not easy to understand what turned Homo sapiens into an ecological serial killer; why men dominated women in most human societies; or why capitalism became the most successful religion ever. It wasn’t easy to address such questions because scholars have offered so many different and conflicting answers. In contrast, when it came to assessing the bottom line – whether thousands of years of inventions and discoveries have made us happier – it was surprising to realise that scholars have neglected even to ask the question. This is the largest lacuna in our understanding of history.
Though few scholars have studied the long-term history of happiness, almost everybody has some idea about it. One common preconception – often termed “the Whig view of history” – sees history as the triumphal march of progress. Each passing millennium witnessed new discoveries: agriculture, the wheel, writing, print, steam engines, antibiotics. Humans generally use newly found powers to alleviate miseries and fulfil aspirations. It follows that the exponential growth in human power must have resulted in an exponential growth in happiness. Modern people are happier than medieval people, and medieval people were happier than stone age people.
But this progressive view is highly controversial. Though few would dispute the fact that human power has been growing since the dawn of history, it is far less clear that power correlates with happiness. The advent of agriculture, for example, increased the collective power of humankind by several orders of magnitude. Yet it did not necessarily improve the lot of the individual. For millions of years, human bodies and minds were adapted to running after gazelles, climbing trees to pick apples, and sniffing here and there in search of mushrooms. Peasant life, in contrast, included long hours of agricultural drudgery: ploughing, weeding, harvesting and carrying water buckets from the river. Such a lifestyle was harmful to human backs, knees and joints, and numbing to the human mind.
In return for all this hard work, peasants usually had a worse diet than hunter-gatherers, and suffered more from malnutrition and starvation. Their crowded settlements became hotbeds for new infectious diseases, most of which originated in domesticated farm animals. Agriculture also opened the way for social stratification, exploitation and possibly patriarchy. From the viewpoint of individual happiness, the “agricultural revolution” was, in the words of the scientist Jared Diamond, “the worst mistake in the history of the human race”.
The case of the agricultural revolution is not a single aberration, however. The march of progress from the first Sumerian city-states to the empires of Assyria and Babylonia was accompanied by a steady deterioration in the social status and economic freedom of women. The European Renaissance, for all its marvellous discoveries and inventions, benefited few people outside the circle of male elites. The spread of European empires fostered the exchange of technologies, ideas and products, yet this was hardly good news for millions of Native Americans, Africans and Aboriginal Australians.
The point need not be elaborated further. Scholars have thrashed the Whig view of history so thoroughly that the only question left is: why do so many people still believe in it? [Continue reading…]
Emily Singer writes: About 4 billion years ago, molecules began to make copies of themselves, an event that marked the beginning of life on Earth. A few hundred million years later, primitive organisms began to split into the different branches that make up the tree of life. In between those two seminal events, some of the greatest innovations in existence emerged: the cell, the genetic code and an energy system to fuel it all. All three of these are essential to life as we know it, yet scientists know disappointingly little about how any of these remarkable biological innovations came about.
“It’s very hard to infer even the relative ordering of evolutionary events before the last common ancestor,” said Greg Fournier, a geobiologist at the Massachusetts Institute of Technology. Cells may have appeared before energy metabolism, or perhaps it was the other way around. Without fossils or DNA preserved from organisms living during this period, scientists have had little data to work from.
Fournier is leading an attempt to reconstruct the history of life in those evolutionary dark ages — the hundreds of millions of years between the time when life first emerged and when it split into what would become the endless tangle of existence.
He is using genomic data from living organisms to infer the DNA sequence of ancient genes as part of a growing field known as paleogenomics. In research published online in March in the Journal of Molecular Evolution, Fournier showed that the last chemical letter added to the code was a molecule called tryptophan — an amino acid most famous for its presence in turkey dinners. The work supports the idea that the genetic code evolved gradually. [Continue reading…]
Lina Zeldovich writes: What do Jamie Lee Curtis, gut bacteria, and a long forgotten Russian scientist have in common? Why, yogurt, of course. But wait, the answer is not that easy. Behind it stretches a tale that shows you can never predict cultural influence. It wends its way through the Pasteur Institute, the Nobel Prize, one of the hottest fields of scientific research today, the microbiome, and one of the trendiest avenues in nutrition, probiotics. It all began in the 19th century with a hyperactive kid in Russia who had a preternatural ability to connect dots where nobody saw dots at all.
When Ilya Metchnikoff was 8 and running around on his parents’ Panassovka estate in Little Russia, now Ukraine, he was making notes on the local flora like a junior botanist. He gave science lectures to his older brothers and local kids, whose attendance he ensured by paying them from his pocket money. Metchnikoff earned the nickname “Quicksilver” because he was in constant motion, always wanting to see, taste, and try everything, from studying how his father played card games to learning to sew and embroider with the maids. His wife later wrote in The Life of Elie Metchnikoff that Metchnikoff asked the “queerest” questions, often exasperating his caretakers. “He could only be kept quiet when his curiosity was awakened by observation of some natural objects such as an insect or a butterfly.”
At 16, Metchnikoff borrowed a microscope from a university professor to study the lower organisms. Darwin’s On the Origin of Species shaped his comparative approach to science during his university years — he viewed all organisms, and physiological processes that took place in them, as interconnected and related.
That ability led him to the discovery of a particular cell and enabled him to link digestive processes in primitive creatures to the human body’s immune defenses. In lower organisms, which lack the abdominal cavity and intestines, digestion is accomplished by a particular type of cells — mobile mesodermal cells — that move around engulfing and dissolving food particles. While staring at mesodermal cells inside transparent starfish larvae, Metchnikoff, 37 at the time, had a thought. “It struck me that similar cells might serve in the defense of the organisms against intruders,” he wrote. He fetched a few rose thorns from the garden and stuck them into the larvae. If his hypothesis was correct, the larva’s body would recognize thorns as intruders and mesodermal cells would aggregate around the thorns in an attempt to gobble them up. As Metchnikoff expected, the mesodermal cells surrounded the thorns, proving his theory. He named his cells phagocytes, which in Greek means “devouring cells,” and likened them to an “army hurling itself upon the enemy.” [Continue reading…]
David Dobbs writes: A few years ago, Gene Robinson, of Urbana, Illinois, asked some associates in southern Mexico to help him kidnap some 1,000 newborns. For their victims they chose bees. Half were European honeybees, Apis mellifera ligustica, the sweet-tempered kind most beekeepers raise. The other half were ligustica’s genetically close cousins, Apis mellifera scutellata, the African strain better known as killer bees. Though the two subspecies are nearly indistinguishable, the latter defend territory far more aggressively. Kick a European honeybee hive and perhaps a hundred bees will attack you. Kick a killer bee hive and you may suffer a thousand stings or more. Two thousand will kill you.
Working carefully, Robinson’s conspirators — researchers at Mexico’s National Center for Research in Animal Physiology, in the high resort town of Ixtapan de la Sal — jiggled loose the lids from two African hives and two European hives, pulled free a few honeycomb racks, plucked off about 250 of the youngest bees from each hive, and painted marks on the bees’ tiny backs. Then they switched each set of newborns into the hive of the other subspecies.
Robinson, back in his office at the University of Illinois at Urbana-Champaign’s Department of Entomology, did not fret about the bees’ safety. He knew that if you move bees to a new colony in their first day, the colony accepts them as its own. Nevertheless, Robinson did expect the bees would be changed by their adoptive homes: He expected the killer bees to take on the European bees’ moderate ways and the European bees to assume the killer bees’ more violent temperament. Robinson had discovered this in prior experiments. But he hadn’t yet figured out how it happened.
He suspected the answer lay in the bees’ genes. He didn’t expect the bees’ actual DNA to change: Random mutations aside, genes generally don’t change during an organism’s lifetime. Rather, he suspected the bees’ genes would behave differently in their new homes — wildly differently.
This notion was both reasonable and radical. Scientists have known for decades that genes can vary their level of activity, as if controlled by dimmer switches. Most cells in your body contain every one of your 22,000 or so genes. But in any given cell at any given time, only a tiny percentage of those genes is active, sending out chemical messages that affect the activity of the cell. This variable gene activity, called gene expression, is how your body does most of its work. [Continue reading…]
Nessa Carey writes: When President Obama delivered a speech at MIT in 2009, he used a common science metaphor: “We have always been about innovation,” he said. “We have always been about discovery. That’s in our DNA.” Deoxyribonucleic acid, the chemical into which our genes are encoded, has become the metaphor of choice for a whole constellation of ideas about essence and identity. A certain mystique surrounds it. As Evelyn Fox Keller argues in her book The Century of the Gene, the genome is, in the popular imagination at least, the secret of life, the holy grail. It is a master builder, the ultimate computer program, and a modern-day echo of the soul, all wrapped up in one. This fantasy does not sit easily, however, with geneticists who have grown more aware over the last several decades that the relationship between genes and biological traits is much less than certain.
The popular understanding of DNA as a blueprint for organisms, with a one-to-one correspondence between genes and traits (called phenotypes), is the legacy of the early history of genetics. The term “gene” was coined in 1909 to refer to abstract units of inheritance, predating the discovery of DNA by forty years. Biologists came to think of genes like beads on a string that lined up neatly into chromosomes, with each gene determining a single phenotype. But, while some genes do correspond to traits in a straightforward way, as in eye color or blood group, most phenotypes are far more complex, set in motion by many different genes as well as by the environment in which the organism lives.
It turns out that the genetic code is less like a blueprint and more like a movie script, subject to revision and reinterpretation by a director. This process is called epigenetic modification (“epi” meaning “above” or “in addition to”). Just as a script can be altered with crossed-out words, sentences or scenes, epigenetic editing allows entire sections of DNA to be activated or de-activated. Genes can be as finely tuned as actors responding to stage directions to shout, whisper, or cackle. [Continue reading…]
Henry Nicholls writes: When the HMS Beagle dropped anchor on San Cristobal, the easternmost island in the Galapagos archipelago, in September 1835, the ship’s naturalist Charles Darwin eagerly went ashore to gather samples of the insects, birds, reptiles, and plants living there. At first, he didn’t think much of the arid landscape, which appeared to be “covered by stunted, sun-burnt brushwood…as leafless as our trees during winter.” But this did not put him off. By the time the Beagle left these islands some five weeks later, he had amassed a spectacular collection of Galapagos plants.
It is fortunate that he took such trouble. Most popular narratives of Darwin and the Galapagos concentrate on the far more celebrated finches or the giant tortoises. Yet when he finally published On the Origin of Species almost 25 years later, Darwin made no mention of these creatures. In his discussion of the Galapagos, he dwelt almost exclusively on the islands’ plants.
By the early 19th century, there was increasing interest in what we now refer to as biogeography, the study of the distribution of species around the globe. Many people still imagined that God had been involved in the creation of species, putting fully formed versions down on Earth that continued to reproduce themselves, dispersing from a divine “center of creation” to occupy their current habitats. To explain how the plants and animals reached far-flung places such as the isolated Galapagos, several naturalists imagined that there had to have been land bridges, long-since subsided, that had once connected them to a continent. But in the wake of the Beagle voyage, the collection of Galapagos plants suggested an alternate scenario.
Even if there had once been a land bridge to the islands, it could not account for the fact that half of the plant species Darwin collected were unique to the Galapagos, and that most of them were particular to just one island. “I never dreamed that islands, about fifty or sixty miles apart, and most of them in sight of each other, formed of precisely the same rocks, placed under a quite similar climate, rising to a nearly equal height, would have been differently tenanted,” wrote Darwin in his Journal of Researches. His observations could be best explained if species were not fixed in nature but somehow changed as the seeds traveled to different locations. [Continue reading…]
Mark Oppenheimer writes: On Wednesday, in an interview in London, Gov. Scott Walker of Wisconsin, a potential Republican presidential candidate, sidestepped the question of whether he believed in evolution.
“I’m going to punt on that one,” he said to an audience at a research organization in London, which he was visiting for a trade mission. “I’m here to talk about trade, not to pontificate on other issues. I love the evolution of trade in Wisconsin.”
Mr. Walker’s response was not all that surprising — evolution is a sensitive issue for the evangelical Christian base of the Republican Party and presidential candidates have had to tread carefully around it.
The theory of evolution may be supported by a consensus of scientists, but none of the likely Republican candidates for 2016 seem to be convinced. Former Gov. Jeb Bush of Florida said it should not be taught in schools. Former Gov. Mike Huckabee of Arkansas is an outright skeptic. Senator Ted Cruz of Texas will not talk about it. When asked, in 2001, what he thought of the theory, Gov. Chris Christie of New Jersey said, “None of your business.”
After Mr. Walker’s response, the interviewer in London, an incredulous Justin Webb of the BBC, said to the governor: “Any British politician, right or left wing, would laugh and say, ‘Yes, of course evolution is true.’ ” [Continue reading…]
Adam Gopnik writes: Darwin’s Delay is by now nearly as famous as Hamlet’s, and involves a similar cast of characters: a family ghost, an unhappy lover, and a lot of men digging up old bones. Although it ends with vindication and fame, rather than with slaughter and self-knowledge, it was resolved by language, too — by inner soliloquy forcing itself out into the world, except that in this case the inner voice had the certainties and the outer one the hesitations.
The delay set in between Darwin’s first intimations of his Great Idea, the idea of evolution by natural selection, in the eighteen-thirties (he was already toying with it during his famous voyage on the H.M.S. Beagle), and the publication of “On the Origin of Species,” in 1859. By legend, the two events were in the long run one: Darwin saw the adapted beaks of his many finches, brooded on what they meant, came up with a theory, sought evidence for it, and was prodded into print at last by an unwelcome letter from an obscure naturalist named Alfred Russel Wallace, who had managed to arrive at the same idea.
It seems to have been more complicated than that. One reason Darwin spent so long getting ready to write his masterpiece without getting it written was that he knew what it would mean for faith and life, and, as Janet Browne’s now standard biography makes plain, he was frightened about being attacked by the powerful and the bigoted. Darwin was not a brave man — had the Inquisition been in place in Britain, he never would have published — but he wasn’t a humble man or a cautious thinker, either. He sensed that his account would end any intellectually credible idea of divine creation, and he wanted to break belief without harming the believer, particularly his wife, Emma, whom he loved devotedly and with whom he had shared, before he sat down to write, a private tragedy that seemed tolerable to her only through faith. The problem he faced was also a rhetorical one: how to say something that had never been said before in a way that made it sound like something everybody had always known — how to make an idea potentially scary and subversive sound as sane and straightforward as he believed it to be.
He did it, and doing it was, in some part, a triumph of style. Darwin is the one indisputably great scientist whose scientific work is still read by amateurs. [Continue reading…]
Ivan Semeniuk reports: Francesco Berna still remembers his first visit to Manot Cave, accidentally discovered in 2008 on a ridge in northern Israel. A narrow passage steeply descends into darkness. It then opens onto a 60-metre-long cavern with side chambers, all dramatically ornamented with stalactites and stalagmites. “It’s a spectacular cave,” said Dr. Berna, a geoarcheologist at Simon Fraser University in Burnaby, B.C. “It’s basically untouched.”
Now Manot Cave has yielded a tantalizing sign of humanity’s initial emergence out of Africa and a possible forerunner of the first modern humans in Europe, an international team of researchers that includes Dr. Berna said on Wednesday.
The find also establishes the Levant region (including Israel, Lebanon and part of Syria) as a plausible setting where our species interbred with its Neanderthal cousins.
The team’s key piece of evidence is a partial human skull found during the initial reconnaissance of the cave.
Based on its features and dimensions, the skull is unquestionably that of an anatomically modern human, the first such find in the region. The individual would probably have looked like the first Homo sapiens that appeared in Africa about 200,000 years ago and been physically indistinguishable from humans today.
“He or she would look very modern. With a tie on, you would not be able to tell the difference,” said Israel Hershkovitz, a biological anthropologist at Tel Aviv University and lead author of a paper published this week in the journal Nature that documents the Manot Cave find.
The age of the fossil is the crucial detail. The team’s analysis shows it is about 55,000 years old. That is more recent than the fragmentary remains of some not-so-modern-looking humans that drifted into the region at an earlier stage. But it coincides exactly with a period when a wetter climate may have opened the door to the first modern human migration out of Africa.
Fossils of modern humans that are only slightly less old than the Manot Cave skull have been found in the Czech Republic and Romania, making the new find a potential forerunner of the first Europeans. [Continue reading…]
Much of the reporting on these findings makes reference to “the first Europeans,” and even though anthropologists might be clear about what they mean when they use the term Europe, they might consider avoiding it, given the common meaning usually attached to the word.
Indeed, the lead researcher cited above, Israel Hershkovitz, illustrates the problem as he reinforces cultural stereotypes by implying that a human has fully evolved once he dons the symbol of European, masculine power: a necktie. The irony is compounded by the fact that he and his team were trumpeting the significance of their discovery of a woman’s skull.
(No doubt many Europeans and others with European affectations have been disturbed this week to see Greece’s new prime minister, in the birthplace of democracy, assuming power without a necktie.)
The Oxford archaeologist Barry Cunliffe has referred to the region of land that recently got dubbed “Europe” as “the westerly excrescence of the continent of Asia.”
Europeans might object to the suggestion that they inhabit an excrescence — especially since the term suggests an abnormality — but in terms of continental topography, it points to Europe’s unique feature: its eastern boundaries have always been elastic and somewhat arbitrary.
More importantly, when it comes to human evolution, to frame this in terms of the advance into Europe revives so many echoes of nineteenth century racism.
It cannot be overstated that the first Europeans were not European.
Europe is an idea that has only been around for a few hundred years during which time it has been under constant revision.
Migration is also a misleading term since it evokes images of migrants: people who travel vast distances to inhabit new lands.
Human dispersal most likely involved rather short hops, one generation at a time, interspersed with occasional actual migrations driven by events like floods or famine.