Category Archives: Anthropology

The most arrogant creatures on Earth

Dominique Mosbergen writes: Researchers from the University of Adelaide in Australia argue in an upcoming book, The Dynamic Human, that humans really aren’t much smarter than other creatures — and that some animals may actually be brighter than we are.

“For millennia, all kinds of authorities — from religion to eminent scholars — have been repeating the same idea ad nauseam, that humans are exceptional by virtue that they are the smartest in the animal kingdom,” the book’s co-author Dr. Arthur Saniotis, a visiting research fellow with the university’s School of Medical Sciences, said in a written statement. “However, science tells us that animals can have cognitive faculties that are superior to human beings.”

Not to mention, ongoing research on intelligence and primate brain evolution backs the idea that humans aren’t the cleverest creatures on Earth, co-author Dr. Maciej Henneberg, a professor also at the School of Medical Sciences, told The Huffington Post in an email.

The researchers said the belief in the superiority of human intelligence can be traced back around 10,000 years to the Agricultural Revolution, when humans began domesticating animals. The idea was reinforced with the advent of organized religion, which emphasized human beings’ superiority over other creatures. [Continue reading…]

At various times in my life, I’ve crossed paths with people possessing immense wealth and power, providing me with glimpses of the mindset of those who regard themselves as the most important people on this planet.

From what I can tell, the concentration of great power does not coincide with the expression of great intelligence. What is far more evident is a great sense of entitlement, which is to say a self-validating sense that power rests where power belongs and that the inequality in its distribution is a reflection of some kind of natural order.

Since this self-serving perception of hierarchical order operates among humans and since humans as a species wield so much more power than any other, it’s perhaps not surprising that we exhibit the same kind of hubris collectively that we see individually in the most dominant among us.

Nevertheless, it is becoming increasingly clear that our sense of superiority is rooted in ignorance.

Amit Majmudar writes: There may come a time when we cease to regard animals as inferior, preliminary iterations of the human—with the human thought of as the pinnacle of evolution so far—and instead regard all forms of life as fugue-like elaborations of a single musical theme.

Animals are routinely superhuman in one way or another. They outstrip us in this or that perceptual or physical ability, and we think nothing of it. It is only our kind of superiority (in the use of tools, basically) that we select as the marker of “real” superiority. A human being with an elephant’s hippocampus would end up like Funes the Memorious in the story by Borges; a human being with a dog’s olfactory bulb would become a Vermeer of scent, but his art would be lost on the rest of us, with our visually dominated brains. The poetry of the orcas is yet to be translated; I suspect that the whale sagas will have much more interesting things in them than the tablets and inscriptions of Sumer and Akkad.

If science should ever persuade people of this biological unity, it would be of far greater benefit to the species than penicillin or cardiopulmonary bypass; of far greater benefit to the planet than the piecemeal successes of environmental activism. We will have arrived, by study and reasoning, at the intuitive, mystical insights of poets.


How computers are making people stupid

The pursuit of artificial intelligence has been driven by the assumption that if human intelligence can be replicated or improved upon by machines, then this accomplishment will in various ways serve the human good. At the same time, thanks to the technophobia promoted in some dystopian science fiction, there is a popular fear that if machines become smarter than people, we will end up becoming their slaves.

It turns out that even if there are some irrational fears wrapped up in technophobia, there are good reasons to regard computing devices as a threat to human intelligence.

It’s not that we are creating machines that harbor evil designs to take over the world, but simply that each time we delegate a function of the brain to an external piece of circuitry, our mental faculties inevitably atrophy.

“Use it or lose it” applies just as much to the brain as it does to any other part of the body.

Carolyn Gregoire writes: Take a moment to think about the last time you memorized someone’s phone number. Was it way back when, perhaps circa 2001? And when was the last time you were at a dinner party or having a conversation with friends, when you whipped out your smartphone to Google the answer to someone’s question? Probably last week.

Technology changes the way we live our daily lives, the way we learn, and the way we use our faculties of attention — and a growing body of research has suggested that it may have profound effects on our memory (particularly short-term, or working, memory), altering and in some cases impairing its function.

The implications of a poor working memory for our brain functioning and overall intelligence are difficult to overestimate.

“The depth of our intelligence hinges on our ability to transfer information from working memory, the scratch pad of consciousness, to long-term memory, the mind’s filing system,” Nicholas Carr, author of The Shallows: What The Internet Is Doing To Our Brains, wrote in Wired in 2010. “When facts and experiences enter our long-term memory, we are able to weave them into the complex ideas that give richness to our thought.”

While our long-term memory has a nearly unlimited capacity, the short-term memory has more limited storage, and that storage is very fragile. “A break in our attention can sweep its contents from our mind,” Carr explains.

Meanwhile, new research has found that taking photos — an increasingly ubiquitous practice in our smartphone-obsessed culture — actually hinders our ability to remember that which we’re capturing on camera.

Concerned about premature memory loss? You probably should be. Here are five things you should know about the way technology is affecting your memory.

1. Information overload makes it harder to retain information.

Even a single session of Internet usage can make it more difficult to file away information in your memory, says Erik Fransén, computer science professor at Sweden’s KTH Royal Institute of Technology. And according to Tony Schwartz, productivity expert and author of The Way We’re Working Isn’t Working, most of us aren’t able to effectively manage the overload of information we’re constantly bombarded with. [Continue reading…]

As I pointed out in a recent post, the externalization of intelligence long preceded the creation of smartphones and personal computers. Indeed, it goes all the way back to the beginning of civilization, when we first learned how to transform language into a material form as the written word, thereby creating a substitute for memory.

Plato foresaw the consequences of writing.

In Phaedrus, he describes an exchange between the god Thamus, king and ruler of all Egypt, and the god Theuth, who has invented writing. Theuth, who is very proud of what he has created, says: “This invention, O king, will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.” But Thamus points out that while one man has the ability to invent, the ability to judge an invention’s usefulness or harmfulness belongs to another.

If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.

Bedazzled by our ingenuity and its creations, we are fast forgetting the value of this quality that can never be implanted in a machine (or a text): wisdom.


Worried about terrorism? You should be more afraid of bread!

David Perlmutter, MD writes: While gluten makes up the lion’s share of protein in wheat, research reveals that modern wheat is capable of producing more than 23,000 different proteins, any one of which could trigger a potentially damaging inflammatory response. One protein in particular is wheat germ agglutinin (WGA). WGA is classified as a lectin — a term for a protein produced by an organism to protect itself from predation.

All grains produce lectins, which selectively bind to unique proteins on the surfaces of bacteria, fungi, and insects. These proteins are found throughout the animal kingdom. One protein in particular for which WGA has an extremely high affinity is N-Acetylglucosamine. N-Acetylglucosamine richly adorns the casing of insects and plays an important role in the structure of the cellular walls of bacteria. More importantly, it is a key structural component in humans in a variety of tissues, including tendons, joint surfaces, cartilage, the lining of the entire digestive tract, and even the lining of the hundreds of miles of blood vessels found within each of us.

It is precisely the ability of WGA to bind to proteins lining the gut that raises concern amongst medical researchers. When WGA binds to these proteins, it may leave these cells less well protected against the harmful effects of the gut contents.

WGA may also have direct toxic effects on the heart, endocrine, and immune systems, and even the brain. In fact, so readily does WGA make its way into the brain that scientists are actually testing it as a possible means of delivering medicines in an attempt to treat Alzheimer’s disease.

And again, the concern here is not just for a small segment of the population who happen to have inherited a susceptibility to gluten sensitivity. This is a concern as it relates to all humans. As medical researcher Sayer Ji stated, “What is unique about WGA is that it can do direct damage to the majority of tissues in the human body without requiring a specific set of genetic susceptibilities and/or immune-mediated articulations. This may explain why chronic inflammatory and degenerative conditions are endemic to wheat-consuming populations even when overt allergies or intolerances to wheat gluten appear exceedingly rare.”

The gluten issue is indeed very real and threatening. But it now seems clear that lectin proteins found in wheat may harbor the potential for even more detrimental effects on human health. It is particularly alarming to consider the fact that there is a move to actually genetically modify wheat to enhance its WGA content.

Scientific research is now giving us yet another reason to reconsider the merits of our daily bread. The story of WGA’s potential destructive effects on human health is just beginning to be told. We should embrace the notion that low levels of exposure to any toxin over an extended period can lead to serious health issues. And this may well characterize the under-recognized threat of wheat consumption for all humans.


Baffling 400,000-year-old clue to human origins

The New York Times reports: Scientists have found the oldest DNA evidence yet of humans’ biological history. But instead of neatly clarifying human evolution, the finding is adding new mysteries.

In a paper in the journal Nature, scientists reported Wednesday that they had retrieved ancient human DNA from a fossil dating back about 400,000 years, shattering the previous record of 100,000 years.

The fossil, a thigh bone found in Spain, had previously seemed to many experts to belong to a forerunner of Neanderthals. But its DNA tells a very different story. It most closely resembles DNA from an enigmatic lineage of humans known as Denisovans. Until now, Denisovans were known only from DNA retrieved from 80,000-year-old remains in Siberia, 4,000 miles east of where the new DNA was found.

The mismatch between the anatomical and genetic evidence surprised the scientists, who are now rethinking human evolution over the past few hundred thousand years. It is possible, for example, that there are many extinct human populations that scientists have yet to discover. They might have interbred, swapping DNA. Scientists hope that further studies of extremely ancient human DNA will clarify the mystery.

“Right now, we’ve basically generated a big question mark,” said Matthias Meyer, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and a co-author of the new study. [Continue reading…]


The Western European roots of Native Americans

The New York Times reports: The genome of a young boy buried at Mal’ta near Lake Baikal in eastern Siberia some 24,000 years ago has turned out to hold two surprises for anthropologists.

The first is that the boy’s DNA matches that of Western Europeans, showing that during the last Ice Age people from Europe had reached farther east across Eurasia than previously supposed. Though none of the Mal’ta boy’s skin or hair survives, his genes suggest he would have had brown hair, brown eyes and freckled skin.

The second surprise is that his DNA also matches a large proportion — about 25 percent — of the DNA of living Native Americans. The first people to arrive in the Americas have long been assumed to have descended from Siberian populations related to East Asians. It now seems that they may be a mixture between the Western Europeans who had reached Siberia and an East Asian population.

The Mal’ta boy was 3 to 4 years old and was buried under a stone slab wearing an ivory diadem, a bead necklace and a bird-shaped pendant. Elsewhere at the same site about 30 Venus figurines were found of the kind produced by the Upper Paleolithic cultures of Europe. The remains were excavated by Russian archaeologists over a 20-year period ending in 1958 and stored in museums in St. Petersburg.

There they lay for some 50 years until they were examined by a team led by Eske Willerslev of the University of Copenhagen. Dr. Willerslev, an expert in analyzing ancient DNA, was seeking to understand the peopling of the Americas by searching for possible source populations in Siberia. He extracted DNA from bone taken from the child’s upper arm, hoping to find ancestry in the East Asian peoples from whom Native Americans are known to be descended.

But the first results were disappointing. The boy’s mitochondrial DNA belonged to the lineage known as U, which is commonly found among the modern humans who first entered Europe about 44,000 years ago. The lineages found among Native Americans are those designated A, B, C, D and X, so the U lineage pointed to contamination of the bone by the archaeologists or museum curators who had handled it, a common problem with ancient DNA projects. “The study was put on low speed for about a year because I thought it was all contamination,” Dr. Willerslev said.

His team proceeded anyway to analyze the nuclear genome, which contains the major part of human inheritance. They were amazed when the nuclear genome also turned out to have partly European ancestry. Examining the genome from a second Siberian grave site, that of an adult who died 17,000 years ago, they found the same markers of European origin. Together, the two genomes indicate that descendants of the modern humans who entered Europe had spread much farther east across Eurasia than had previously been assumed and occupied Siberia during an extremely cold period starting 20,000 years ago that is known as the Last Glacial Maximum.

The other surprise from the Mal’ta boy’s genome was that it matched to both Europeans and Native Americans but not to East Asians. Dr. Willerslev’s interpretation was that the ancestors of Native Americans had already separated from the East Asian population when they interbred with the people of the Mal’ta culture, and that this admixed population then crossed over the Beringian land bridge that then lay between Siberia and Alaska to become a founding population of Native Americans. [Continue reading…]


Social complexity and facial diversity among primates

UCLA Newsroom: Why do the faces of some primates contain so many different colors — black, blue, red, orange and white — that are mixed in all kinds of combinations and often striking patterns while other primate faces are quite plain?

UCLA biologists reported last year on the evolution of 129 primate faces in species from Central and South America. This research team now reports on the faces of 139 Old World African and Asian primate species that have been diversifying over some 25 million years.

With these Old World monkeys and apes, the species that are more social have more complex facial patterns, the biologists found. Species that have smaller group sizes tend to have simpler faces with fewer colors, perhaps because the presence of more color patches in the face results in greater potential for facial variation across individuals within species. This variation could aid in identification, which may be a more difficult task in larger groups.

Species that live in the same habitat with other closely related species tend to have more complex facial patterns, suggesting that complex faces may also aid in species recognition, the life scientists found.

“Humans are crazy for Facebook, but our research suggests that primates have been relying on the face to tell friends from competitors for the last 50 million years and that social pressures have guided the evolution of the enormous diversity of faces we see across the group today,” said Michael Alfaro, an associate professor of ecology and evolutionary biology in the UCLA College of Letters and Science and senior author of the study.

“Faces are really important to how monkeys and apes can tell one another apart,” he said. “We think the color patterns have to do both with the importance of telling individuals of your own species apart from closely related species and for social communication among members of the same species.” [Continue reading…]


Allergies and the ‘farm effect’

Moises Velasquez-Manoff writes: Will the cure for allergies come from the cowshed?

Allergies are often seen as an accident. Your immune system misinterprets a harmless protein, such as one found in dust or peanuts, as a threat, and when you encounter it, you pay the price with sneezing, wheezing and, in the worst cases, death.

What prompts some immune systems to err like this, while others never do? Some of the vulnerability is surely genetic. But comparative studies highlight the importance of environment, beginning, it seems, in the womb. Microbes are one intriguing protective factor. Certain ones seem to stimulate a mother’s immune system during pregnancy, preventing allergic disease in children.

By emulating this naturally occurring phenomenon, scientists may one day devise a way to prevent allergies.

This task, though still in its infancy, has some urgency. Depending on the study and population, the prevalence of allergic disease and asthma increased between two- and threefold in the late 20th century, a mysterious trend often called the “allergy epidemic.”

These days, one in five American children have a respiratory allergy like hay fever, and nearly one in 10 have asthma.

Nine people die daily from asthma attacks. While the increase in respiratory allergies shows some signs of leveling off, the prevalence of food and skin allergies continues to rise. Five percent of children are allergic to peanuts, milk and other foods, half again as many as 15 years ago. And each new generation seems to have more severe, potentially life-threatening allergic reactions than the last.

Some time ago, I visited a place where seemingly protective microbes occurred spontaneously. It wasn’t a spotless laboratory in some university somewhere. It was a manure-spattered cowshed in Indiana’s Amish country.

My guide was Mark Holbreich, an allergist in Indianapolis. He’d recently discovered that the Amish people who lived in the northern part of the state were remarkably free of allergies and asthma.

About half of Americans have evidence of allergic sensitization, which increases the risk of allergic disease. But judging from skin-prick tests, just 7.2 percent of the 138 Amish children whom Dr. Holbreich tested were sensitized to tree pollens and other allergens. That yawning difference positions the Indiana Amish among the least allergic populations ever described in the developed world.

This invulnerability isn’t likely to be genetic. The Amish originally came to the United States from the German-speaking part of Switzerland, and these days Swiss children, a genetically similar population, are about as allergic as Americans.

Ninety-two percent of the Amish children Dr. Holbreich tested either lived on farms or visited one frequently. Farming, Dr. Holbreich thinks, is the Amish secret. This idea has some history. Since the late 1990s, European scientists have investigated what they call the “farm effect.” [Continue reading…]


Skull of Homo erectus throws story of human evolution into disarray

The Guardian reports: The spectacular fossilised skull of an ancient human ancestor that died nearly two million years ago in central Asia has forced scientists to rethink the story of early human evolution.

Anthropologists unearthed the skull at a site in Dmanisi, a small town in southern Georgia, where other remains of human ancestors, simple stone tools and long-extinct animals have been dated to 1.8m years old.

Experts believe the skull is one of the most important fossil finds to date, but it has proved as controversial as it is stunning. Analysis of the skull and other remains at Dmanisi suggests that scientists have been too ready to name separate species of human ancestors in Africa. Many of those species may now have to be wiped from the textbooks.

The latest fossil is the only intact skull ever found of a human ancestor that lived in the early Pleistocene, when our predecessors first walked out of Africa. The skull adds to a haul of bones recovered from Dmanisi that belong to five individuals, most likely an elderly male, two other adult males, a young female and a juvenile of unknown sex.

The site was a busy watering hole that human ancestors shared with giant extinct cheetahs, sabre-toothed cats and other beasts. The remains of the individuals were found in collapsed dens where carnivores had apparently dragged the carcasses to eat. They are thought to have died within a few hundred years of one another.

“Nobody has ever seen such a well-preserved skull from this period,” said Christoph Zollikofer, a professor at Zurich University’s Anthropological Institute, who worked on the remains. “This is the first complete skull of an adult early Homo. They simply did not exist before,” he said. Homo is the genus of great apes that emerged around 2.4m years ago and includes modern humans.

Other researchers said the fossil was an extraordinary discovery. “The significance is difficult to overstate. It is stunning in its completeness. This is going to be one of the real classics in paleoanthropology,” said Tim White, an expert on human evolution at the University of California, Berkeley. [Continue reading…]


Is war natural?

There are all kinds of problems in posing the question, is war natural? If we conclude it is unnatural, then we are likely to treat it as an aberration that might be avoided if we were to simply know better — that in some sense all wars happen by mistake.

If we conclude that war-making is an intrinsic feature of human nature, then we assume a kind of fatalism that views war as ugly but unavoidable.

In an essay probing the question of whether humans have an instinct to make war, the evolutionary biologist David P. Barash makes an important distinction between violence and war: the former is an adaptation, the latter a capacity. He explains the distinction between adaptation and capacity in this way:

Language is almost certainly an adaptation, something that all normal human beings can do, although the details vary with circumstance. By contrast, reading and writing are capacities, derivative traits that are unlikely to have been directly selected for, but have developed through cultural processes. Similarly, walking and probably running are adaptations; doing cartwheels or handstands are capacities.

Barash writes:

[Napoleon Chagnon’s] best-selling book The Fierce People (1968) [on the Yanomami people of the Venezuelan/Brazilian Amazon] has been especially influential in enshrining an image of tribal humanity as living in a state of ‘chronic warfare’.

Chagnon has been the subject of intense criticism but, to my mind, there is simply no question about the empirical validity and theoretical value of his research. In a field (call it evolutionary psychology or, as I prefer, human sociobiology) that has often been criticised for a relative absence of hard data, his findings, however politically distasteful, have been welcome indeed. Among these, one of the most convincing has been Chagnon’s demonstration that, among the Yanomami, not only is inter-village ‘warfare’ frequent and lethal, but that Yanomami men who have killed other men experience significantly higher reproductive success — evolutionary fitness — than do non-killers. His data, although disputed by other specialists, appear altogether reliable and robust.

So I admire the man, and his work, but I have a growing sense of discomfort about the way that Chagnon’s Yanomami research has been interpreted and the inferences that have been drawn from it.

I fear that many of my colleagues have failed, as previously have I, to distinguish between the relatively straightforward evolutionary roots of human violence and the more complex, multifaceted and politically fraught question of human war. To be blunt, violence is almost certainly deeply entrenched in human nature; warfare, not so much. A fascination with the remarkably clear correlation between Yanomami violence and male fitness has blinded us to the full range of human non-violence, causing us to ignore and undervalue realms of peacemaking in favour of a focus on exciting and attention-grabbing patterns of war-making.

As an evolutionary scientist, I have been enthusiastic about identifying the adaptive significance — the evolutionary imprint — of apparently universal human traits. For a long time, it seemed that Chagnon’s finding of the reproductive success of Yanomami men who were killers was one of the most robust pieces of evidence for this. Now I am not so sure, and this is my mea culpa.

There has also been a tendency among evolutionary thinkers to fix upon certain human groups as uniquely revelatory, not simply because the research about them is robust, but also because their stories are both riveting and consistent with our pre-existing expectations. They are just plain fun to talk about, especially for men.

Remember, too, the journalists’ edict: ‘If it bleeds, it leads.’ You are unlikely to see a newspaper headline announcing that ‘France and Germany Did Not Go To War’, whereas a single lethal episode, anywhere in the world, is readily pounced upon as news. Language conventions speak volumes, too. It is said that the Bedouin have nearly 100 different words for camels, distinguishing between those that are calm, energetic, aggressive, smooth-gaited, or rough, etc. Although we carefully identify a multitude of wars — the Hundred Years War, the Thirty Years War, the American Civil War, the Vietnam War, and so forth — we don’t have a plural form for peace.

It makes evolutionary sense that human beings pay special attention to episodes of violence, whether interpersonal or international: they are matters of life and death, after all. But when serious scientists do the same and, what is more, when they base ‘normative’ conclusions about the human species on what is simply a consequence of their selective attention, we all have a problem.

The most serious problem with Chagnon’s influence on our understanding of human nature is one familiar to many branches of science: generalising from one data set — however intensive — to a wider universe of phenomena. Academic psychologists, for example, are still reeling from a 2010 study by the University of British Columbia which found that the majority of psychological research derives from college students who are ‘Western, Educated, Industrialised, Rich, and Democratic’ — in short, WEIRD. [See “The Weirdest People in the World“.] Similarly, the Yanomami are only one of a large number of very different, tribal human societies. Given the immense diversity of human cultural traditions, any single group of Homo sapiens must be considered profoundly unrepresentative of the species as a whole.

Just as the Yanomami can legitimately be cited as notably violence-prone — at both the individual and group level — many other comparable tribal peoples do not engage in anything remotely resembling warfare. These include the Batek of Malaysia, the Hadza of Tanzania, the Martu of Australia, a half-dozen or more indigenous South Indian forager societies, and numerous others, each of whom is no less human than those regularly trotted out to ‘prove’ our inherent war-proneness.

In the Dark Ages of biology, taxonomists used to identify a ‘type species’ thought to represent each genus, but the idea no longer has any currency in biology. The great evolutionary biologist Ernst Mayr effectively demonstrated that statistical and population thinking trumps the idea of a Platonic concept of ‘types’, independent of the actual diversity of living things, not least Homo sapiens. Yet anthropologists (and biologists, who should know better) seem to have fallen into the trap of seizing upon a few human societies, and generalising them as representative of Homo sapiens as a whole. Regrettably, this tendency to identify ‘type societies’ has been especially acute when it comes to establishing the supposed prevalence of human warfare.

In his justly admired book The Better Angels of our Nature (2011), the evolutionary psychologist Steven Pinker made a powerful case that human violence — interpersonal as well as warring — has diminished substantially in recent times. But in his eagerness to emphasise the ameliorating effects of historically recent social norms, Pinker exaggerated our pre-existing ‘natural’ level of war-proneness, claiming that ‘chronic raiding and feuding characterised life in a state of nature’. The truth is otherwise. As recent studies by the anthropologist Douglas Fry and others have shown, the overwhelmingly predominant way of life for most of our evolutionary history — in fact, pretty much the only one prior to the Neolithic revolution — was that of nomadic hunter-gatherers. And although such people engage in their share of interpersonal violence, warfare in the sense of group-based lethal violence directed at other groups is almost non-existent, having emerged only with early agricultural surpluses and the elaboration of larger-scale, tribal organisation, complete with a warrior ethos and proto-military leadership. [Continue reading…]


Europe’s prehistoric milk revolution

Nature reports: In the 1970s, archaeologist Peter Bogucki was excavating a Stone Age site in the fertile plains of central Poland when he came across an assortment of odd artefacts. The people who had lived there around 7,000 years ago were among central Europe’s first farmers, and they had left behind fragments of pottery dotted with tiny holes. It looked as though the coarse red clay had been baked while pierced with pieces of straw.

Looking back through the archaeological literature, Bogucki found other examples of ancient perforated pottery. “They were so unusual — people would almost always include them in publications,” says Bogucki, now at Princeton University in New Jersey. He had seen something similar at a friend’s house that was used for straining cheese, so he speculated that the pottery might be connected with cheese-making. But he had no way to test his idea.

The mystery potsherds sat in storage until 2011, when Mélanie Roffet-Salque pulled them out and analysed fatty residues preserved in the clay. Roffet-Salque, a geochemist at the University of Bristol, UK, found signatures of abundant milk fats — evidence that the early farmers had used the pottery as sieves to separate fatty milk solids from liquid whey. That makes the Polish relics the oldest known evidence of cheese-making in the world.

Roffet-Salque’s sleuthing is part of a wave of discoveries about the history of milk in Europe. Many of them have come from a €3.3-million (US$4.4-million) project that started in 2009 and has involved archaeologists, chemists and geneticists. The findings from this group illuminate the profound ways that dairy products have shaped human settlement on the continent.

During the most recent ice age, milk was essentially a toxin to adults because — unlike children — they could not produce the lactase enzyme required to break down lactose, the main sugar in milk. But as farming started to replace hunting and gathering in the Middle East around 11,000 years ago, cattle herders learned how to reduce lactose in dairy products to tolerable levels by fermenting milk to make cheese or yogurt. Several thousand years later, a genetic mutation spread through Europe that gave people the ability to produce lactase — and drink milk — throughout their lives. That adaptation opened up a rich new source of nutrition that could have sustained communities when harvests failed. [Continue reading…]


The decline of human intelligence

Huffington Post reports: Our technology may be getting smarter, but a provocative new study suggests human intelligence is on the decline. In fact, it indicates that Westerners have lost 14 I.Q. points on average since the Victorian Era.

What exactly explains this decline? Study co-author Dr. Jan te Nijenhuis, professor of work and organizational psychology at the University of Amsterdam, points to the fact that women of high intelligence tend to have fewer children than do women of lower intelligence. This negative association between I.Q. and fertility has been demonstrated time and again in research over the last century.

But this isn’t the first evidence of a possible decline in human intelligence.

“The reduction in human intelligence (if there is any reduction) would have begun at the time that genetic selection became more relaxed,” Dr. Gerald Crabtree, professor of pathology and developmental biology at Stanford University, told The Huffington Post in an email. “I projected this occurred as our ancestors began to live in more supportive high density societies (cities) and had access to a steady supply of food. Both of these might have resulted from the invention of agriculture, which occurred about 5,000 to 12,000 years ago.” [Continue reading…]


Australia’s Aboriginals

Before visiting Matamata, a lost-in-the-bush village of 25 or so people in Australia’s Northern Territory, Michael Finkel needed permission from the village’s matriarch, Phyllis Batumbil. She agreed and asked him to bring dinner for everyone:

I unloaded two duffels of personal effects and a dozen bags of groceries. Dinner for 25, I mentioned, is quite a load. Batumbil nodded. Take a look at all that food, she said. Could you imagine catching that much in one day using only a spear? And then again the next day and the day after that? I said it would be just about impossible. Aboriginal people, she said, have been doing it every day for at least 50,000 years.

For 49,800 of those years they had the continent to themselves. There were once about 250 distinct Aboriginal languages, hundreds more dialects, and many more clans and subgroups. But there is deep spiritual and cultural overlap among them, and indigenous Australians I spoke with said it was not insulting to combine everyone together under the general title of Aboriginal. They call themselves Aboriginals. They lived for a couple of thousand generations in small, nomadic bands, as befits a hunter-gatherer existence, moving in their own rhythms about the vast expanse of Australia. Then on April 29, 1770, British explorer James Cook landed his ship, the Endeavour, on the southeastern shore. The next two centuries were a horror show of cultural obliteration — massacres, disease, alcoholism, forced integration, surrender.

More than a half million Aboriginals currently live in Australia, less than 3 percent of the population. Few have learned to perform an Aboriginal dance or hunt with a spear. Many anthropologists credit Aboriginals with possessing the world’s longest enduring religion as well as the longest continuing art forms — the cross-hatched and dot-patterned painting styles once inscribed in caves and rock shelters. They are one of the most durable societies the planet has ever known. But the traditional Aboriginal way of life is now, by any real measure, almost extinct.

Almost. There remain a few places. Foremost is a region known as Arnhem Land, where Matamata is located, along with a couple dozen other communities, all connected by rough dirt roads passable only in dry weather. [Continue reading…]


Were Neanderthals the mental equals of modern humans?

[Photo: Red disks next to stenciled hand-prints in El Castillo cave, Spain, one at least 40,800 years old, might be the creation of Neanderthals.]

Tim Appenzeller writes: [D]id the Neanderthals, once caricatured as brute cavemen, have minds like our own, capable of abstract thinking, symbolism and even art? It is one of the most haunting questions about the people who once shared a continent with us, then mysteriously vanished.

An early date for the paintings [found in El Castillo cave, Spain] would also be a vindication for the slight, dark-haired man watching as Pike works [taking samples of calcite accretions formed on the surface of the paintings]: João Zilhão, who has emerged as the leading advocate for Neanderthals, relentlessly pressing the case that these ice-age Europeans were our cognitive equals. Zilhão, an archaeologist at the Catalan Institution for Research and Advanced Studies at the University of Barcelona in Spain, believes that other signs of sophisticated Neanderthal culture have already proved his point. But he is willing to debate on his opponents’ terms. “To my mind, we don’t need that evidence,” he says of the paintings. “But I guess for many of my colleagues this would be the smoking gun.”

The front line in the Neanderthal wars runs through another cave: Grotte du Renne, 1,000 kilometres away in central France. As early as the 1950s, excavations there unearthed a collection of puzzling artefacts. Among them were bone awls, distinctive stone blades and palaeolithic baubles — the teeth of animals such as foxes or marmots, grooved or pierced so that they could be worn on a string. They were buried beneath artefacts typical of the first modern humans in Europe, suggesting that these objects were older. A startling possibility loomed: that artefacts of this style, collectively known as the Châtelperronian industry, were made by Neanderthals. [Continue reading…]


Chimpanzees can engage in metacognition — they can think about their own thinking

Georgia State University: Humans’ closest animal relatives, chimpanzees, have the ability to “think about thinking” – what is called “metacognition,” according to new research by scientists at Georgia State University and the University at Buffalo.

Michael J. Beran and Bonnie M. Perdue of the Georgia State Language Research Center (LRC) and J. David Smith of the University at Buffalo conducted the research, published in Psychological Science, a journal of the Association for Psychological Science.

“The demonstration of metacognition in nonhuman primates has important implications regarding the emergence of self-reflective mind during humans’ cognitive evolution,” the research team noted.

Metacognition is the ability to recognize one’s own cognitive states. For example, a game show contestant must decide whether to “phone a friend” or risk it all, depending on how confident he or she is in knowing the answer.

“There has been an intense debate in the scientific literature in recent years over whether metacognition is unique to humans,” Beran said.

Chimpanzees at Georgia State’s LRC have been trained to use a language-like system of symbols to name things, giving researchers a unique way to query animals about their states of knowing or not knowing.

In the experiment, researchers tested the chimpanzees on a task that required them to use symbols to name what food was hidden in a location. If a piece of banana was hidden, the chimpanzees would report that fact and gain the food by touching the symbol for banana on their symbol keyboards.

But then, the researchers provided the chimpanzees with either complete or incomplete information about the identity of the food rewards.

In some cases, the chimpanzees had already seen which food item was in the hidden location and could immediately name it by touching the correct symbol, without needing to go and look at it first.

In other cases, the chimpanzees could not know what food item was in the hidden location, either because they had not yet seen any food on that trial or because the item they had seen may not have been the one moved to the hidden location.

In those cases, they should have first gone to look in the hidden location before trying to name any food.

In the end, chimpanzees named items immediately and directly when they knew what was there, but they sought out more information before naming when they did not already know.

The research team said, “This pattern of behavior reflects a controlled information-seeking capacity that serves to support intelligent responding, and it strongly suggests that our closest living relative has metacognitive abilities closely related to those of humans.”

The research was supported in part by the National Institutes of Health and the National Science Foundation.


Monkeys don’t like selfish people

Jalees Rehman writes: When we observe an interaction between two other human beings (Person A and Person B), we sometimes draw conclusions about the personality traits or character of these two individuals. For example, if we see that Person A is being rude to Person B, we may be less likely to trust Person A, even though we are merely “third-party” evaluators, i.e., not directly involved in the interaction. Multiple studies with humans have already documented such third-party social evaluation, which can even occur among children. A study published in 2010 showed that 3-year-old children were less likely to help adults who had previously acted in a harmful manner in front of the kids, i.e., had torn up a picture drawn by another adult in a staged experiment.

Do animals who observe humans also conduct such third-party social evaluations? In the recent study “Third-party social evaluation of humans by monkeys,” published in Nature Communications, James Anderson and colleagues staged interactions with human actors in front of tufted capuchin monkeys (Cebus apella). The researchers found that the monkeys indeed evaluate humans after witnessing third-party interactions involving either helpful interventions or a failure to help fellow humans.

In front of each monkey, two actors performed either “helper” sessions or “non-helper” sessions. In the “helper” sessions, Actor A tried to get a toy out of a container and requested help from Actor B, who complied and helped out Actor A. In the “non-helper” sessions, Actor B refused to help. After the sessions, both actors offered a piece of food to the monkey. In the helper sessions, monkeys readily accepted food from both actors. In the non-helper sessions, by contrast, monkeys accepted food more frequently from Actor A (the requester of help) than from Actor B (the non-helper). [Continue reading…]


Self-knowledge is required for human survival

E.O. Wilson writes: Evolutionary biologists have searched for the grandmaster of advanced social evolution, the combination of forces and environmental circumstances that bestowed greater longevity and more successful reproduction on the possession of high social intelligence. At present there are two competing theories of the principal force. The first is kin selection: individuals favor collateral kin (relatives other than offspring), making it easier for altruism to evolve among members of the same group. Altruism in turn engenders complex social organization and, in the one case that involves big mammals, human-level intelligence.

In the second, more recently argued theory (full disclosure: I am one of the modern version’s authors), the grandmaster is multilevel selection. This formulation recognizes two levels at which natural selection operates: individual selection based on competition and cooperation among members of the same group, and group selection, which arises from competition and cooperation between groups. Multilevel selection is gaining in favor among evolutionary biologists because of a recent mathematical proof that kin selection can arise only under special conditions that demonstrably do not exist, and because of the better fit of multilevel selection to all of the two dozen known animal cases of eusocial evolution.

The roles of both individual and group selection are indelibly stamped (to borrow a phrase from Charles Darwin) upon our social behavior. As expected, we are intensely interested in the minutiae of behavior of those around us. Gossip is a prevailing subject of conversation, everywhere from hunter-gatherer campsites to royal courts. The mind is a kaleidoscopically shifting map of others, each of whom is drawn emotionally in shades of trust, love, hatred, suspicion, admiration, envy and sociability. We are compulsively driven to create and belong to groups, variously nested, overlapping or separate, and large or small. Almost all groups compete with those of similar kind in some manner or other. We tend to think of our own as superior, and we find our identity within them.

The existence of competition and conflict, the latter often violent, has been a hallmark of societies as far back as archaeological evidence is able to offer. These and other traits we call human nature are so deeply resident in our emotions and habits of thought as to seem just part of some greater nature, like the air we all breathe, and the molecular machinery that drives all of life. But they are not. Instead, they are among the idiosyncratic hereditary traits that define our species.

The major features of the biological origins of our species are coming into focus, and with this clarification the potential of a more fruitful contact between science and the humanities. The convergence between these two great branches of learning will matter hugely when enough people have thought it through. On the science side, genetics, the brain sciences, evolutionary biology, and paleontology will be seen in a different light. Students will be taught prehistory as well as conventional history, the whole presented as the living world’s greatest epic.

We will also, I believe, take a more serious look at our place in nature. Exalted we are indeed, risen to be the mind of the biosphere without a doubt, our spirits capable of awe and ever more breathtaking leaps of imagination. But we are still part of earth’s fauna and flora. We are bound to it by emotion, physiology, and not least, deep history. It is dangerous to think of this planet as a way station to a better world, or continue to convert it into a literal, human-engineered spaceship. Contrary to general opinion, demons and gods do not vie for our allegiance. We are self-made, independent, alone and fragile. Self-understanding is what counts for long-term survival, both for individuals and for the species.
