Category Archives: Anthropology

Wade Davis challenges Jared Diamond’s perspective on traditional societies

In a review of The World Until Yesterday: What Can We Learn from Traditional Societies?, Wade Davis says that Jared Diamond’s approach to anthropology is rooted in many of the prejudices of the nineteenth century, which saw societies, from traditional to modern, as stages in a linear progression of advancement.

The other peoples of the world are not failed attempts at modernity, let alone failed attempts to be us. They are unique expressions of the human imagination and heart, unique answers to a fundamental question: what does it mean to be human and alive? When asked this question, the cultures of the world respond in 7,000 different voices, and these answers collectively comprise our human repertoire for dealing with all the challenges that will confront us as a species as we continue this never-ending journey.

It is against this backdrop that one must consider the popular but controversial writings of Jared Diamond, a wide-ranging scholar variously described as biogeographer, evolutionary biologist, psychologist, ornithologist and physiologist. In Guns, Germs and Steel, Diamond set out to solve what was for him a conundrum. Why was it that some cultures such as our own rose to technological, economic and political predominance, while others such as the Aborigines of Australia did not? Rejecting notions of race, intelligence, or innate biological differences of any kind, he finds his explanation in environment and geography. Advanced civilisations arose where the environment allowed for plant domestication, leading to the generation of surplus and population growth, which in turn led to political centralisation and social stratification. No surprises there.

In Collapse, Diamond returned to the theme of environmental determinism as he pondered why and how great civilisations come to an end. Evoking the ecological fable of Easter Island, he suggests that cultures fall as people fail to meet the challenges imposed by nature, as they misuse natural resources, and ultimately drift blindly beyond a point of no return.

Again nothing to suggest controversy, save for the shallowness of the arguments, and it is this characteristic of Diamond’s writings that drives anthropologists to distraction. The very premise of Guns, Germs and Steel is that a hierarchy of progress exists in the realm of culture, with measures of success that are exclusively material and technological; the fascinating intellectual challenge is to determine just why the west ended up on top. In the posing of this question, Diamond evokes 19th-century thinking that modern anthropology fundamentally rejects. The triumph of secular materialism may be the conceit of modernity, but it does very little to unveil the essence of culture or to account for its diversity and complexity.

Consider Diamond’s discussion of the Australian Aborigines in Guns, Germs and Steel. In accounting for their simple material culture, their failure to develop writing or agriculture, he laudably rejects notions of race, noting that there is no correlation between intelligence and technological prowess. Yet in seeking ecological and climatic explanations for the development of their way of life, he is as certain of their essential primitiveness as were the early European settlers who remained unconvinced that Aborigines were human beings. The thought that the hundreds of distinct tribes of Australia might simply represent different ways of being, embodying the consequences of unique sets of intellectual and spiritual choices, does not seem to have occurred to him.

In truth, as the anthropologist WEH Stanner long appreciated, the visionary realm of the Aborigines represents one of the great experiments in human thought. In place of technological wizardry, they invented a matrix of connectivity, an intricate web of social relations based on more than 100 named kin relationships. If they failed to embrace European notions of progress, it was not because they were savages, as the settlers assumed, but rather because in their intellectual universe, distilled in a devotional philosophy known as the Dreaming, there was no notion of linear progression whatsoever, no idealisation of the possibility or promise of change. There was no concept of past, present, or future. In not one of the hundreds of Aboriginal dialects and languages was there a word for time. The entire purpose of humanity was not to improve anything; it was to engage in the ritual and ceremonial activities deemed to be essential for the maintenance of the world precisely as it was at the moment of creation. Imagine if all of Western intellectual and scientific passion had focused from the beginning of time on keeping the Garden of Eden precisely as it was when Adam and Eve had their fateful conversation. [Continue reading…]


Jared Diamond: The World Until Yesterday

Jared Diamond: In Guns, Germs, and Steel, I set out to explain why, after the end of the last Ice Age, the most powerful and technologically advanced societies developed first in the Fertile Crescent and spread from there to Europe and North America. In Collapse, I asked why many societies disintegrated or vanished while others avoided that fate, and what lessons those varying outcomes hold for us today.

Since publication of Collapse, I have been attempting to understand the revolutionary changes in human societies and social relations brought about by the emergence of state government, after seven million years of simpler forms of organization. The differences are so profound, and we take state government as so completely natural, that I couldn’t even pose the differences as questions until I had experienced them first-hand through living in New Guinea, a window on our past.

I first went to New Guinea 41 years ago to study birds and to have adventures. I knew intellectually that New Guineans constituted most of the world’s last remaining “primitive” peoples, who until a few decades ago still used stone tools, little clothing, and no writing. That was what the whole world used to be like until 7,000 years ago, a mere blink of an eye in the history of the human species. Only in a couple of other parts of the world besides New Guinea did our original long-prevailing “primitive” ways survive into the 20th century.

Stone tools, little clothing, and no writing proved to be only the least of the differences between our past and our present. There were other differences that I noticed within my first year in New Guinea: murderous hostility towards any stranger, marriageable young people having no role in choosing their spouse, lack of awareness of the existence of an outside world, and routine multi-lingualism from childhood.

But there were also more profound features, which took me a long time even to notice, because they are so at odds with modern experience that neither New Guineans nor I could even articulate them. Each of us took some aspects of our lifestyle for granted and couldn’t conceive of an alternative. Those other New Guinea features included the non-existence of “friendship” (associating with someone just because you like them), a much greater awareness of rare hazards, war as an omnipresent reality, morality in a world without judicial recourse, and a vital role of very old people.

I’ve encountered myself some features of our primitive lifestyle among Inuits, Amazonian Indians, and Aboriginal Australians. Others have described such features among Kalahari Bushmen, African Pygmies, Ainus, California Indians, and other peoples. Of course, all these peoples differ from each other. New Guinea’s 1,000 tribes themselves are diverse: they constitute 1,000 radically different experiments in constructing a human society. But they all share (or shared) some basic features that used to characterize all human societies until the rise of state societies with laws and government, beginning around 5,500 years ago in the Fertile Crescent and now established over the entire world.

I am once again treating a huge question about human societies, using the methods of long-term comparative history. In this case I am placing much heavier weight on one area of the world, New Guinea. Yes, I include other parts of the world: I’ve already been gathering material on the Mongols, Cherokees, Zulus, and many others. But New Guinea will play a central role, as the standard of comparison. That’s as it should be, because New Guinea’s 1,000 languages make up one-sixth of all surviving languages, and because New Guinea contained by far the largest number of people and tribes still living under pre-state conditions in modern times.

The other difference is that my studies of New Guinea are based much more heavily on my personal experiences and observations, and less on publications by others. Many of my experiences in New Guinea have been intense — a sudden encounter at night with a wild man, the prolonged agony of a nearly-fatal boat accident, one broken little stick in the forest warning us that nomads might be about to catch us as trespassers … Those stories are more than exciting adventures that have shaped my outlook; they carry a bigger message. They give us a first-hand picture of the human past as it has been for millions of years — a past that has almost vanished within our lifetimes, and that no one will ever be able to experience again.

I’ve moved on from wrestling with difficult abstract questions about the structure of societies—about their rise from agricultural origins in Guns, Germs, and Steel, and about their decline or maintenance in Collapse. The abstractness of those questions forced me to work hard to put human faces on them. The New Guinea work is more passionate, personal, and from the heart.

In conversation with John Brockman at Edge, Diamond recounts some of the stories from his new book, The World Until Yesterday.


The human wanderlust

Map of the South Pacific made by the Polynesian navigator Tupaia.

National Geographic: In the winter of 1769, the British explorer Captain James Cook, early into his first voyage across the Pacific, received from a Polynesian priest named Tupaia an astonishing gift — a map, the first that any European had ever encountered showing all the major islands of the South Pacific. Some accounts say Tupaia sketched the map on paper; others that he described it in words. What’s certain is that this map instantly gave Cook a far more complete picture of the South Pacific than any other European possessed. It showed every major island group in an area some 3,000 miles across, from the Marquesas west to Fiji. It matched what Cook had already seen, and showed much he hadn’t.

Cook had granted Tupaia a berth on the Endeavour in Tahiti. Soon after that, the Polynesian wowed the crew by navigating to an island unknown to Cook, some 300 miles south, without ever consulting compass, chart, clock, or sextant. In the weeks that followed, as he helped guide the Endeavour from one archipelago to another, Tupaia amazed the sailors by pointing on request, at any time, day or night, cloudy or clear, precisely toward Tahiti.

Cook, uniquely among European explorers, understood what Tupaia’s feats meant. The islanders scattered across the South Pacific were one people, who long ago, probably before Britain was Britain, had explored, settled, and mapped this vast ocean without any of the navigational tools that Cook found essential—and had carried the map solely in their heads ever since.

Two centuries later a global network of geneticists analyzing DNA bread-crumb trails of modern human migration would prove Cook right: Tupaia’s ancestors had colonized the Pacific 2,300 years before. Their improbable migration across the Pacific continued a long eastward march that had begun in Africa 70,000 to 50,000 years earlier. Cook’s journey, meanwhile, continued a westward movement started by his own ancestors, who had left Africa around the same time Tupaia’s ancestors had. In meeting each other, Cook and Tupaia closed the circle, completing a journey their forebears had begun together, so many millennia before. [Continue reading…]


The price of human domestication

Civilization is overrated — and often confused with culture, whose development predates civilization by tens of thousands of years.

The popular view is that once we started ploughing fields and building cities, we could rise above the needs of mere survival and start cultivating our higher faculties through art and science, and that did indeed happen — for a privileged few. For the mass of humanity, however, civilization turned people into herded animals.

We didn’t just domesticate plants and livestock but also human populations. And it turns out that, as with every other kind of domestication, the conditions suited to mass reproduction also serve as breeding grounds for harmful mutations.

For humans, the vast majority of harmful mutations have occurred in the last 5,000 to 10,000 years, with the highest concentration among Europeans.

redOrbit.com: In a world that’s more than 4 billion years old, humans have only existed for a fraction of that — roughly 200,000 years. For most of those 200,000 years, little is known about genetic mutation; the picture only sharpens as we close in on the last 5 to 10 thousand years. It is within that time that researchers believe nearly 75 percent of gene mutations have occurred, making our DNA distinctly different now than it was way back when.

This finding has been calculated in new research from the University of Washington, published in this week’s issue of the journal Nature. The results, based on a genetic study of roughly 6,500 Americans (4,298 European-Americans and 2,217 African-Americans), were gleaned from studying 1 million single-letter variations in the human DNA code. These variations revealed that most of the mutations seen are of recent origin. And more than 86 percent of the harmful protein-coding mutations found occurred during the past 10 millennia. In all, about 14 percent of mutations identified were found to be harmful.

While the researchers found instances of harmful mutations, most were benign and had no effect on people, and a few may even have been beneficial. While each specific mutation is rare, the findings of the study suggest that the human population acquired an abundance of single-nucleotide genetic variants in a relatively short time.

“Recent human history has profoundly shaped patterns of genetic variation present in contemporary populations,” study researcher Joshua Akey, of the University of Washington, told Business Insider in an email. “Our results suggest that ~90% of evolutionary deleterious variants arose in the last 200-400 generations.”
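As a quick aside, Akey’s figure of 200 to 400 generations squares with the 5,000-to-10,000-year window cited above if one assumes a human generation time of roughly 25 years, a conventional round number that is my assumption rather than a figure from the study. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope conversion of generations to years.
# GENERATION_YEARS is an assumed round figure (~25 years), not from the study.
GENERATION_YEARS = 25

for generations in (200, 400):
    years = generations * GENERATION_YEARS
    print(f"{generations} generations ~= {years:,} years")
# 200 generations ~= 5,000 years
# 400 generations ~= 10,000 years
```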

Akey said the rapid growth of the human population has allowed DNA errors to accumulate more quickly. He noted that people of European ancestry show the greatest number of these new deleterious mutations because the population boom was more recent among Europeans, and natural selection has not yet had time to remove them.

“There’s an enormous amount of recently arisen, rare mutations that’s directly attributable to the explosive population growth over the last two to four generations,” Akey told Business Week’s Elizabeth Lapatto in a phone interview.

The population of the planet has just soared beyond 7 billion, according to US Census Bureau data. That’s nearly triple the 1950 population of 2.5 billion. Such a rapid increase in population could allow unusual combinations of gene mutations to affect more people, even while remaining relatively rare, Akey said.

While some mutations are seen in the lettering of our genes, other mutations change the way the proteins made from those genes act. Some of these deleterious mutations can have negative impacts on humans’ ability to survive and reproduce, while others could be evolutionary fodder for improving the human race.

“Each generation, humanity incurs on the order of 10^11 new mutations,” Akey said. “The vast majority of these either have no phenotypic or functional consequences, or are deleterious. However, a small fraction are expected to be advantageous [sic].”

“What specific traits they may influence would just be pure speculation, but we can reasonably posit they exist and will be potential substrates for natural selection to act on in the future,” Akey wrote.

Akey added that as the population continues to balloon, so too will new mutations. The growing population makes it more likely that new mutations, such as those linked to autism, will be introduced, leading to an increase in such diseases. [Continue reading…]


The fate of the species

Charles C. Mann writes: About 75,000 years ago, a huge volcano exploded on the island of Sumatra. The biggest blast for several million years, the eruption created Lake Toba, the world’s biggest crater lake, and ejected the equivalent of as much as 3,000 cubic kilometers of rock, enough to cover the District of Columbia in a layer of magma and ash that would reach to the stratosphere. A gigantic plume spread west, enveloping southern Asia in tephra (rock, ash, and dust). Drifts in Pakistan and India reached as high as six meters. Smaller tephra beds blanketed the Middle East and East Africa. Great rafts of pumice filled the sea and drifted almost to Antarctica.
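Mann’s image can be roughly checked against his own numbers, assuming a land area for the District of Columbia of about 177 square kilometers and a stratosphere base of roughly 10 to 17 kilometers; both are approximations I am supplying, not figures from the essay. A minimal sketch:

```python
# Rough sanity check of the "cover DC to the stratosphere" comparison.
# The DC area (~177 km^2) and stratosphere base (~10-17 km) are assumed
# approximations, not figures from Mann's essay.
ejecta_km3 = 3000        # upper estimate of Toba ejecta quoted in the passage
dc_area_km2 = 177        # approximate land area of the District of Columbia

depth_km = ejecta_km3 / dc_area_km2
print(f"Layer depth over DC: ~{depth_km:.0f} km")  # ~17 km, well into the stratosphere
```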

In the long run, the eruption raised Asian soil fertility. In the short term, it was catastrophic. Dust hid the sun for as much as a decade, plunging the earth into a years-long winter accompanied by widespread drought. A vegetation collapse was followed by a collapse in the species that depended on vegetation, followed by a collapse in the species that depended on the species that depended on vegetation. Temperatures may have remained colder than normal for a thousand years. Orangutans, tigers, chimpanzees, cheetahs—all were pushed to the verge of extinction.

At about this time, many geneticists believe, Homo sapiens’ numbers shrank dramatically, perhaps to a few thousand people—the size of a big urban high school. The clearest evidence of this bottleneck is also its main legacy: humankind’s remarkable genetic uniformity. Countless people have viewed the differences between races as worth killing for, but compared to other primates—even compared to most other mammals—human beings are almost indistinguishable, genetically speaking. DNA is made from exceedingly long chains of “bases.” Typically, about one out of every 2,000 of these “bases” differs between one person and the next. The equivalent figure from two E. coli (human gut bacteria) might be about one out of twenty. The bacteria in our intestines, that is, have a hundredfold more innate variability than their hosts—evidence, researchers say, that our species is descended from a small group of founders.

Uniformity is hardly the only effect of a bottleneck. When a species shrinks in number, mutations can spread through the entire population with astonishing rapidity. Or genetic variants that may have already been in existence—arrays of genes that confer better planning skills, for example—can suddenly become more common, effectively reshaping the species within a few generations as once-unusual traits become widespread.

Did Toba, as theorists like Richard Dawkins have argued, cause an evolutionary bottleneck that set off the creation of behaviorally modern people, perhaps by helping previously rare genes—Neanderthal DNA or an opportune mutation—spread through our species? Or did the volcanic blast simply clear away other human species that had previously blocked H. sapiens’ expansion? Or was the volcano irrelevant to the deeper story of human change?

For now, the answers are the subject of careful back-and-forth in refereed journals and heated argument in faculty lounges. All that is clear is that about the time of Toba, new, behaviorally modern people charged so fast into the tephra that human footprints appeared in Australia within as few as 10,000 years, perhaps within 4,000 or 5,000. Stay-at-home Homo sapiens 1.0, a wallflower that would never have interested Lynn Margulis, had been replaced by aggressively expansive Homo sapiens 2.0. Something happened, for better and worse, and we were born.

One way to illustrate what this upgrade looked like is to consider Solenopsis invicta, the red imported fire ant. Geneticists believe that S. invicta originated in northern Argentina, an area with many rivers and frequent floods. The floods wipe out ant nests. Over the millennia, these small, furiously active creatures have acquired the ability to respond to rising water by coalescing into huge, floating, pullulating balls—workers on the outside, queen in the center—that drift to the edge of the flood. Once the waters recede, colonies swarm back into previously flooded land so rapidly that S. invicta actually can use the devastation to increase its range.

In the 1930s, Solenopsis invicta was transported to the United States, probably in ship ballast, which often consists of haphazardly loaded soil and gravel. As a teenaged bug enthusiast, Edward O. Wilson, the famed biologist, spotted the first colonies in the port of Mobile, Alabama. He saw some very happy fire ants. From the ant’s point of view, it had been dumped into an empty, recently flooded expanse. S. invicta took off, never looking back.

The initial incursion watched by Wilson was likely just a few thousand individuals—a number small enough to suggest that random, bottleneck-style genetic change played a role in the species’ subsequent history in this country. In their Argentine birthplace, fire-ant colonies constantly fight each other, reducing their numbers and creating space for other types of ant. In the United States, by contrast, the species forms cooperative supercolonies, linked clusters of nests that can spread for hundreds of miles. Systematically exploiting the landscape, these supercolonies monopolize every useful resource, wiping out other ant species along the way—models of zeal and rapacity. Transformed by chance and opportunity, new-model S. invicta needed just a few decades to conquer most of the southern United States.

Homo sapiens did something similar in the wake of Toba. For hundreds of thousands of years, our species had been restricted to East Africa (and, possibly, a similar area in the south). Now, abruptly, new-model Homo sapiens were racing across the continents like so many imported fire ants. The difference between humans and fire ants is that fire ants specialize in disturbed habitats. Humans, too, specialize in disturbed habitats—but we do the disturbing. [Continue reading…]

While Mann’s long essay is duly cautious about the future because we do indeed have a great capacity to mess things up, he notes major strides in human progress — such as the widespread abolition of slavery in the nineteenth century — which he sees as an indication of our capacity for enlightened change.

I have a less sanguine view. On that specific point, the abolition of slavery, it’s worth noting that it coincided with the industrial revolution and the growth of capitalism. The more skill workers require, the less practical it becomes to enslave them, but this is a matter of expedience — not the liberation of human potential.

As fewer and fewer people are required to operate machines, the need to control their behavior has not diminished. The control of human hands has been supplanted by the control of human desires. The enduring fact remains: throughout the world, small groups of people exert enormous power in shaping the lives of the rest of humanity.

Because of this division, it is very hard for the controlling elite to grasp the idea that ultimately we face a common fate. There is too much historical evidence supporting the view that however much misery might prevail in the world, it will always be possible for those with sufficient resources to insulate themselves from every new peril. By the time it becomes inescapably evident that this cannot always be true, it will already be too late.


I cry, therefore I am

Michael Trimble writes: In 2008, at a zoo in Münster, Germany, a gorilla named Gana gave birth to a male infant, who died after three months. Photographs of Gana, looking stricken and inconsolable, were ubiquitous. “Heartbroken gorilla cradles her dead baby,” Britain’s Daily Mail declared. Crowds thronged the zoo to see the grieving mother.

Sad as the scene was, the humans, not Gana, were the only ones crying. The notion that animals can weep — apologies to Dumbo, Bambi and Wilbur — has no scientific basis. Years of observations by the primatologists Dian Fossey, who observed gorillas, and Jane Goodall, who worked with chimpanzees, could not prove that animals cry tears from emotion.

In his book “The Emotional Lives of Animals,” the only tears the biologist Marc Bekoff was certain of were his own. Jeffrey Moussaieff Masson and Susan McCarthy, the authors of “When Elephants Weep,” admit that “most elephant watchers have never seen them weep.”

It’s true that many mammals shed tears, especially in response to pain. Tears protect the eye by keeping it moist, and they contain antimicrobial proteins. But crying as an embodiment of empathy is, I maintain, unique to humans and has played an essential role in human evolution and the development of human cultures.

Within two days an infant can imitate sad and happy faces. If a newborn mammal does not cry out (typically, in the first few weeks of life, without tears) it is unlikely to get the attention it needs to survive. Around three to four months, the relationship between the human infant and its environment takes on a more organized communicative role, and tearful crying begins to serve interpersonal purposes: the search for comfort and pacification. As we get older, crying becomes a tool of our social repertory: grief and joy, shame and pride, fear and manipulation.

Tears are as universal as laughter, though less culturally contingent, and tragedy is more complex than joy — an insight Tolstoy and many others have offered. But although we all cry, we do so in different ways. [Continue reading…]


The evolution of running

With the civilizational bias that skews most people’s perceptions of human history, we have come to regard the notion of ‘primitive’ through its connotations: crude, unsophisticated, and poorly developed. Yet what is primitive is primary. It is the origin from which we have strayed and the essential we have largely forgotten.

Daniel Lieberman: [I]ncreases in brain size were not really an early event in human evolution, and in fact, they didn’t occur until after hunting and after the invention of hunting and gathering, and not even until cooking and various other technological inventions, which gave us the energy necessary to have really large brains.

Brains are very costly. Right now, just sitting here, my brain (even though I’m not doing much other than talking) is consuming about 20 to 25 percent of my resting metabolic rate. That’s an enormous amount of energy, and to pay for that, I need to eat quite a lot of calories a day, maybe about 600 calories a day, which back in the Paleolithic was quite a difficult amount of energy to acquire. So having a brain of 1,400 cubic centimeters, about the size of my brain, is a fairly recent event and very costly.

The idea then is at what point did our brains become so important that we got the idea that brain size and intelligence really mattered more than our bodies? I contend that the answer was never, and certainly not until the Industrial Revolution.

Why did brains get so big? There are a number of obvious reasons. One of them, of course, is for culture and for cooperation and language and various other means by which we can interact with each other, and certainly those are enormous advantages. If you think about other early humans like Neanderthals, their brains are as large or even larger than the typical brain size of human beings today. Surely those brains are so costly that there would have had to be a strong benefit to outweigh the costs. So cognition and intelligence and language and all of those important tasks that we do must have been very important.

We mustn’t forget that those individuals were also hunter-gatherers. They worked extremely hard every day to get a living. A typical hunter-gatherer has to walk between nine and 15 kilometers a day. A typical female might walk 9 kilometers a day, a typical male hunter-gatherer might walk 15 kilometers a day, and that’s every single day. That’s day-in, day-out, there’s no weekend, there’s no retirement, and you do that for your whole life. Over a year, it’s about the distance you would cover walking from Washington, DC to LA. That’s how much walking hunter-gatherers did every single year.
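As a quick check on that comparison, the daily figures in the passage add up over a year to roughly the straight-line distance from Washington, DC to Los Angeles, which I am taking to be about 3,700 kilometers; that distance is my approximation, not a figure from the talk. A minimal sketch:

```python
# Rough check of the DC-to-LA comparison using the figures quoted above.
# The ~3,700 km DC-LA distance is an assumed approximation, not from the talk.
km_per_day_low, km_per_day_high = 9, 15
days_per_year = 365
dc_to_la_km = 3700

low = km_per_day_low * days_per_year     # ~3,285 km per year
high = km_per_day_high * days_per_year   # ~5,475 km per year
print(f"Annual walking: {low:,}-{high:,} km vs. DC-LA ~{dc_to_la_km:,} km")
```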

In addition, they’re constantly digging, they’re climbing trees, and they’re using their bodies intensely. I would argue that cognition was an extremely important factor in human evolution, along with language, theory of mind — all those cognitive developments that make us so sophisticated. But they weren’t a triumph of cognition over brute force; it was a combination. It was not brains over brawn, it was brains plus brawn, and that made possible the hunter-gatherer way of life.

What hunter-gatherers really do is they use division of labor, they have intense cooperation, they have intense social interactions, and they have group memory. All of those behaviors enable hunter-gatherers to interact in ways such that they can increase the rate at which they can acquire energy and have offspring at a higher rate than chimpanzees. It’s a very energetically intensive way of life that’s made possible by a combination of extraordinary intelligence, inventiveness, creativity, language, but also daily physical exercise.

The other reason we often discount the importance of brawn in our lives is that we have a very strange idea of what constitutes athleticism. Think about the events that we care about most in the Olympics. They’re the power sports. They’re the 100-meter dash, the 100-meter freestyle events. Most athletes, the ones we really value the most, are physically very powerful. But if you think about it this way, most humans are wimps.

Usain Bolt, who is the world’s fastest human being today, can run about 10.4 meters a second, and he can do so for about ten or 20 seconds. My dog, any goat, any sheep I can study in my lab, can run about twice as fast as Usain Bolt without any training, without any practice, any special technology, any drugs or whatever. Humans, the very fastest human beings, are incredibly slow compared to most mammals. Not only in terms of brute speed, but also in terms of how long they can go at a given speed. Usain Bolt can go 10.4 meters a second for about ten to 20 seconds. My dog or a goat or a lion or a gazelle or some antelope in Africa can run 20 meters a second for about four minutes. So there’s no way Usain Bolt could ever outrun any lion or for that matter run down any animal.
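Taking only the figures in the passage, the gap in endurance is easy to make concrete: at 10.4 meters a second for 20 seconds, Bolt covers about 200 meters before he has to stop, while an animal holding 20 meters a second for four minutes covers nearly five kilometers. A minimal sketch of that arithmetic:

```python
# Distances implied by the speeds and durations quoted in the passage.
bolt_speed_ms, bolt_duration_s = 10.4, 20         # upper end of "ten to 20 seconds"
animal_speed_ms, animal_duration_s = 20, 4 * 60   # "about four minutes"

print(f"Bolt, flat out:     ~{bolt_speed_ms * bolt_duration_s:,.0f} m")      # ~208 m
print(f"Dog/goat/antelope:  ~{animal_speed_ms * animal_duration_s:,.0f} m")  # ~4,800 m
```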

A typical chimpanzee is between about two and five times more powerful than a human being. A chimpanzee, who weighs less than a human, can just rip somebody’s arm off or rip their face off (as recently happened in Connecticut). It’s not that the chimpanzee is remarkably strong, it’s that we are remarkably weak. We have this notion that humans are terrible natural athletes. But we’ve been looking at the wrong kind of athleticism. What we’re really good at is not power, what we’re really phenomenal at is endurance. We’re the tortoises of the animal world, not the hares of the animal world. Humans can actually outrun most animals over very, very long distances.

David Attenborough follows the San people of the Kalahari desert, the last tribe on earth to use persistence hunting.


Eating meat may have ‘made us human’

Science Daily reports: A skull fragment unearthed by anthropologists in Tanzania shows that our ancient ancestors were eating meat at least 1.5 million years ago, shedding new light into the evolution of human physiology and brain development.

“Meat eating has always been considered one of the things that made us human, with the protein contributing to the growth of our brains,” said Charles Musiba, Ph.D., associate professor of anthropology at the University of Colorado Denver, who helped make the discovery. “Our work shows that 1.5 million years ago we were not opportunistic meat eaters, we were actively hunting and eating meat.”

The study was published October 3 in the peer-reviewed journal PLOS ONE.

The two-inch skull fragment was found at the famed Olduvai Gorge in northern Tanzania, a site that for decades has yielded numerous clues into the evolution of modern humans and is sometimes called ‘the cradle of mankind.’

The fragment belonged to a 2-year-old child and showed signs of porotic hyperostosis associated with anemia. According to the study, the condition was likely caused by a diet suddenly lacking in meat.

“The presence of anemia-induced porotic hyperostosis…indicates indirectly that by at least the early Pleistocene meat had become so essential to proper hominin functioning that its paucity or lack led to deleterious pathological conditions,” the study said. “Because fossils of very young hominin children are so rare in the early Pleistocene fossil record of East Africa, the occurrence of porotic hyperostosis in one…suggests we have only scratched the surface in our understanding of nutrition and health in ancestral populations of the deep past.”

Musiba said the evidence showed that the juvenile’s diet was deficient in vitamins B12 and B9. Meat seems to have been cut from the child’s diet during the weaning process.

“He was not getting the proper nutrients and probably died of malnutrition,” he said.

The study offers insights into the evolution of hominins including Homo sapiens. Musiba said the movement from a scavenger, largely plant-eating lifestyle to a meat-eating one may have provided the protein needed to grow our brains and give us an evolutionary boost. [Continue reading…]


With science, new portrait of the cave artist

The New York Times reports: Stone Age artists were painting red disks, handprints, clublike symbols and geometric patterns on European cave walls long before previously thought, in some cases more than 40,000 years ago, scientists reported on Thursday, after completing more reliable dating tests that raised a possibility that Neanderthals were the artists.

A more likely situation, the researchers said, is that the art — 50 samples from 11 caves in northwestern Spain — was created by anatomically modern humans fairly soon after their arrival in Europe.

The findings seem to put an exclamation point to a run of recent discoveries: direct evidence from fossils that Homo sapiens populations were living in England 41,500 to 44,200 years ago and in Italy 43,000 to 45,000 years ago, and that they were making flutes in German caves about 42,000 years ago. Then there is the new genetic evidence of modern human-Neanderthal interbreeding, suggesting a closer relationship than had been generally thought.

The successful application of a newly refined uranium-thorium dating technique is also expected to send other scientists to other caves to see if they can reclaim prehistoric bragging rights.

In the new research, an international team led by Alistair W. G. Pike of the University of Bristol in England determined that the red disk in the cave known as El Castillo was part of the earliest known wall decorations, at a minimum of 40,800 years old. That makes it the earliest cave art found so far in Europe, perhaps 4,000 years older than the paintings at Grotte Chauvet in France. [Continue reading…]


What crows can teach people

Are you as smart as a crow? Take this test to find out.

I’m proud to say I got it right the first time — but I probably had an advantage: I tamed two crows when I was a kid so I’ve spent some time staring them in the eye.

As a nine-year-old, had I been growing up in an indigenous tribe on another continent, I dare say my feat of being able to call a crow from a tree and have it swoop down and land on my outstretched arm might have set me on course for training as a shaman. Instead, after one of these crows swooped down to examine what kind of tasty morsel was tucked inside a baby stroller, protests from a distraught mother meant that the over-inquisitive birds were no longer welcome in our neighborhood. I had to promptly take them away and let them cause trouble someplace else.

Forty-five years ago crows were under-appreciated. Now they are recognized as among the most intelligent creatures on the planet, able to pass on knowledge from one generation to the next and with tool-making skills that surpass those of chimpanzees. (To learn more, watch the video below.) Perhaps most intriguing, crows are able to recognize individual human faces. While we find it difficult to tell one crow from another, they can spot the differences between us from hundreds of feet away.

Professor John Marzluff at the University of Washington has studied crows’ face recognition abilities and speculates that crows need to be able to differentiate between people so that they can spot individuals who pose a threat. Robert Krulwich, describing Marzluff’s explanation for this aspect of crow intelligence, says: “It pays for a crow to pay attention, while we people, we are not threatened or helped by individual crows. So they need to know about us but we don’t need to know about them individually. And that’s what I call the crow paradox.”

I’ll come back to this paradox shortly, but I’m not persuaded by the idea that crows need to be able to spot dangerous people.

The ancient and continued use of scarecrows is a testament to the enduring need of farmers to scare crows, along with farmers’ stubborn persistence in using a technique that is largely ineffective. For the wary crow, the truly dangerous farmer is all too easy to spot — not by his face but by the unmistakable warnings he often posts: shot crows strung up like mascots, promising a similar fate to those who venture too near his valuable crops.

A crow’s ability to recognize a human face may actually have nothing to do with a need to differentiate individual people from one another. It may instead be a by-product of the fine level of discrimination crows need to be able to differentiate one crow from another inside their own complex social systems.

To an eye for which no two crows look alike, the differences between any two much larger people must seem all the more extreme. As for why crows would home in on facial differences, it would be for exactly the same reason that people do: because we change attire, and the face is the one feature of appearance and identity that maintains day-to-day continuity.

Returning then to Krulwich’s supposed crow paradox — that they need to be able to differentiate among us individually while we have no need to be able to distinguish one crow from another — there is in this idea a very anthropomorphic image of social structure where important individuals stand out and the unimportant blend into a homogenous mass. The implication is that the crow pays attention to the dangerous person while ignoring everyone else.

This is a very human idea — that some people don’t matter — and reflects a social deficit we incur by constructing social structures that stretch far beyond our perceptual horizons.

For the crow and probably every other non-human social animal, the world, circumscribed by the reach of the senses, is not filtered through preoccupations about a wider world. We, on the other hand, are perpetually inattentive to our immediate surroundings because we place our attention elsewhere — on thoughts, feelings, memories, expectations — on the things that we tell ourselves matter most.

Through this bifurcation between that which matters and that which supposedly doesn’t, we lose the keen alertness in which for birds and animals a much wider spectrum of the present, bound within a perceptual horizon, always matters.


New evidence suggests Stone Age hunters from Europe discovered America

The Washington Post reports: When the crew of the Virginia scallop trawler Cinmar hauled a mastodon tusk onto the deck in 1970, another oddity dropped out of the net: A dark, tapered stone blade, nearly eight inches long and still sharp.

Forty years later, this rediscovered prehistoric slasher has reopened debate on a radical theory about who the first Americans were and when they got here.

Archaeologists have long held that North America remained unpopulated until about 15,000 years ago, when Siberian people walked or boated into Alaska and down the West Coast.

But the mastodon relic turned out to be 22,000 years old, suggesting the blade was just as ancient.

Whoever fashioned that blade was not supposed to be here.

Its makers likely paddled from Europe and arrived in America thousands of years ahead of the western migration, argues Smithsonian Institution anthropologist Dennis Stanford, making them the first Americans.

“I think it’s feasible,” said Tom Dillehay, a prominent archaeologist at Vanderbilt University. “The evidence is building up and it certainly warrants discussion.”

At the height of the last Ice Age, Stanford says, mysterious stone-age European people known as the Solutreans paddled along an ice cap jutting into the North Atlantic. They lived like Inuits, harvesting seals and seabirds.

The Solutreans eventually spread across North America, Stanford argues, hauling their distinctive blades with them.

When Stanford proposed this “Solutrean hypothesis” in 1999, colleagues roundly rejected it. One prominent archaeologist suggested Stanford was throwing his career away.

But now, 13 years later, Stanford and Exeter University archaeologist Bruce Bradley lay out a detailed case – bolstered by the curious blade and other stone tools recently found in the mid-Atlantic – in a new book, Across Atlantic Ice.

“I drank the Solutrean Kool-aid,” said Steve Black, an archaeologist at Texas State University in San Marcos. “I had been very dubious. It’s something a lot of [archaeologists] have dismissed out of hand. But I came away from the book feeling like it’s an extremely credible idea that needs to be taken seriously.”
