Category Archives: Anthropology

How climate may have shaped languages

Scientific American reports: Opera singers and dry air don’t get along. In fact, the best professional singers require humid settings to help them achieve the right pitch. “When your vocal cords are really dry, they’re a little less elastic,” says Caleb Everett, an anthropological linguist at the University of Miami. As a result, singers experience tiny variations in pitch, called jitter, as well as wavering volume—both of which contribute to rougher refrains.

If the amount of moisture in the air influences musical pitch, Everett wondered, has that translated into the development of fewer tonal languages in arid locations? Tonal languages, such as Mandarin Chinese and Cherokee, rely on variations in pitch to differentiate meaning: the same syllable can specify different words depending on whether it is spoken at a higher or lower pitch, or with a rising or falling tone.

In a survey of more than 3,700 languages, Everett and his collaborators found that those with complex tones do indeed occur less frequently in dry areas than they do in humid ones, even after accounting for the clustering of related languages. For instance, more than half of the hundreds of languages spoken in tropical sub-Saharan locations feature complex tones, whereas none of the two dozen languages in the Sahara do. Overall, only one in 30 complex tonal languages flourished in dry areas; one in three nontonal languages cropped up in those same regions. The results appeared in February in the Proceedings of the National Academy of Sciences USA. [Continue reading…]


How useful is the idea of the Anthropocene?

Jedediah Purdy writes: As much as a scientific concept, the Anthropocene is a political and ethical gambit. Saying that we live in the Anthropocene is a way of saying that we cannot avoid responsibility for the world we are making. So far so good. The trouble starts when this charismatic, all-encompassing idea of the Anthropocene becomes an all-purpose projection screen and amplifier for one’s preferred version of ‘taking responsibility for the planet’.

Peter Kareiva, the controversial chief scientist of the Nature Conservancy, uses the theme ‘Conservation in the Anthropocene’ to trash environmentalism as philosophically naïve and politically backward. Kareiva urges conservationists to give up on wilderness and embrace what the writer Emma Marris calls the ‘rambunctious garden’. Specifically, Kareiva wants to rank ecosystems by the quality of ‘ecosystem services’ they provide for human beings instead of ‘pursuing the protection of biodiversity for biodiversity’s sake’. He wants a pro‑development stance that assumes that ‘nature is resilient rather than fragile’. He insists that: ‘Instead of scolding capitalism, conservationists should partner with corporations in a science-based effort to integrate the value of nature’s benefits into their operations and cultures.’ In other words, the end of nature is the signal to carry on with green-branded business as usual, and the business of business is business, as the Nature Conservancy’s partnerships with Dow, Monsanto, Coca-Cola, Pepsi, J P Morgan, Goldman Sachs and the mining giant Rio Tinto remind us.

Kareiva is a favourite of Andrew Revkin, the roving environmental maven of The New York Times Magazine, who touts him as a paragon of responsibility-taking, a leader among ‘scholars and doers who see that new models for thinking and acting are required in this time of the Anthropocene’. This pair and their friends at the Breakthrough Institute in California can be read as making a persistent effort to ‘rebrand’ environmentalism as humanitarian and development-friendly (and capture speaking and consultancy fees, which often seem to be the major ecosystem services of the Anthropocene). This is itself a branding strategy, an opportunity to slosh around old plonk in an ostentatiously shiny bottle. [Continue reading…]


A deficit in patience produces the illusion of a shortage of time

Chelsea Wald writes: Not long ago I diagnosed myself with the recently identified condition of sidewalk rage. It’s most pronounced when it comes to a certain friend who is a slow walker. Last month, as we sashayed our way to dinner, I found myself biting my tongue, thinking, I have to stop going places with her if I ever want to … get there!

You too can measure yourself on the “Pedestrian Aggressiveness Syndrome Scale,” a tool developed by University of Hawaii psychologist Leon James. While walking in a crowd, do you find yourself “acting in a hostile manner (staring, presenting a mean face, moving closer or faster than expected)” and “enjoying thoughts of violence?”

Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines — they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age.

Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin and yang balance, a finely tuned internal timer that told us when we had waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.

“Why are we impatient? It’s a heritage from our evolution,” says Marc Wittmann, a psychologist at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. Impatience made sure we didn’t die from spending too long on a single unrewarding activity. It gave us the impulse to act.

But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough — or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay. [Continue reading…]


Ruins of the ‘City of the Monkey God’ discovered in Honduran rain forest

Douglas Preston writes: The ruins were first identified in May 2012, during an aerial survey of a remote valley in La Mosquitia, a vast region of swamps, rivers, and mountains containing some of the last scientifically unexplored places on earth.

For a hundred years, explorers and prospectors told tales of the white ramparts of a lost city glimpsed above the jungle foliage. Indigenous stories speak of a “white house” or a “place of cacao” where Indians took refuge from Spanish conquistadores — a mystical, Eden-like paradise from which no one ever returned.

Since the 1920s, several expeditions had searched for the White City, or Ciudad Blanca. The eccentric explorer Theodore Morde mounted the most famous of these in 1940, under the aegis of the Museum of the American Indian (now part of the Smithsonian Institution).

Morde returned from Mosquitia with thousands of artifacts, claiming to have entered the City. According to Morde, the indigenous people there said it contained a giant, buried statue of a monkey god. He refused to divulge the location out of fear, he said, that the site would be looted. He later committed suicide and his site—if it existed at all—was never identified.

More recently, documentary filmmakers Steve Elkins and Bill Benenson launched a search for the lost city.

They identified a crater-shaped valley, encircled by steep mountains, as a possible location.

To survey it, in 2012 they enlisted the help of the Center for Airborne Laser Mapping at the University of Houston. A Cessna Skymaster, carrying a million-dollar lidar scanner, flew over the valley, probing the jungle canopy with laser light. Lidar — “Light Detection and Ranging” — is able to map the ground even through dense rain forest, delineating any archaeological features that might be present.

When the images were processed, they revealed unnatural features stretching for more than a mile through the valley. When the archaeologist Chris Fisher analyzed the images, he found that the terrain along the river had been almost entirely reshaped by human hands. [Continue reading…]


Jawbone’s discovery sheds light on early human ancestor

ASU News: The earliest evidence of our human genus – Homo – was found in Ethiopia by a team of Arizona State University scientists and students during field research in 2013.

The fossil, the left side of a lower jaw with five teeth, has been dated to 2.8 million years ago, which predates the previously known fossils of the Homo lineage by approximately 400,000 years.

The discovery is being published for the first time in the March 4 online version of the journal Science.

For decades, scientists who study the origins of modern-day humans have been searching for fossils documenting the earliest phases of the Homo lineage.

Researchers have found fossils that are 3 million years old and older. The most famous example of those human ancestors is the skeleton of Lucy, found in northeastern Africa in 1974 by ASU researcher Donald Johanson. Lucy and her relatives, though they walked on two feet, were smaller-brained and more apelike than later members of the human family tree.

Scientists have also found fossils that are 2.3 million years old and younger. These ancestors are in the genus Homo and are closer to modern day humans.


But very little had been found in between – that 700,000-year gap had turned up few fossils with which to determine the evolution from Lucy to the genus Homo. Because of that gap, there has been little agreement on the time of origin of the Homo lineage.

With this find, that mysterious time period has gotten a little clearer. [Continue reading…]

The Los Angeles Times adds: The significance of this discovery, according to some researchers, is that it firmly fixes the origins of Homo in East Africa and fits the hypothesis that climate change drove key developments in a variety of mammals, including our early forebears.

When Lucy roamed Ethiopia roughly 3.2 million years ago, the region enjoyed long rainy seasons that supported the growth of many trees and a wide variety of vegetation, according to researchers.

By the time of Homo’s first established appearance in the Horn of Africa, however, things had become much drier and the landscape had transformed into a vast, treeless expanse of grasslands with a few rivers and lakes — a scene very similar to today’s Serengeti plains or Kalahari.

It was an unforgiving climate when it came to survival.

But the hallmark of the genus that includes Homo sapiens is resourcefulness. Larger brains, the ability to fashion stone tools, and teeth suited to chewing a variety of foods would have given our early ancestors the flexibility to live in an inflexible environment, researchers say. [Continue reading…]


Crows understand analogies


Scientific American reports: People are fascinated by the intelligence of animals. In fact, cave paintings dating back some 40,000 years suggest that we have long harbored keen interest in animal behavior and cognition. Part of that interest may have been practical: animals can be dangerous, they can be sources of food and clothing, and they can serve as sentries or mousers.

But another part of that fascination is purely theoretical. Because animals resemble us in form, perhaps they also resemble us in thought. For many philosophers — including René Descartes and John Locke — granting intelligence to animals was a bridge too far. They especially deemed abstract reasoning to be uniquely human and to perfectly distinguish people from “brutes.” Why? Because animals do not speak, the reasoning went, they must have no thoughts.

Nevertheless, undeterred by such pessimistic pronouncements, informed by Darwin’s theory of evolution, and guided by the maxim that “actions speak more loudly than words,” researchers today are fashioning powerful behavioral tests that provide nonverbal ways for animals to disclose their intelligence to us. Although animals may not use words, their behavior may serve as a suitable substitute; its study may allow us to jettison the stale convention that thought without language is impossible. [Continue reading…]


Ancient skull sheds light on human dispersal out of Africa

Ivan Semeniuk reports: Francesco Berna still remembers his first visit to Manot Cave, accidentally discovered in 2008 on a ridge in northern Israel. A narrow passage steeply descends into darkness. It then opens onto a 60-metre-long cavern with side chambers, all dramatically ornamented with stalactites and stalagmites.

“It’s a spectacular cave,” said Dr. Berna, a geoarcheologist at Simon Fraser University in Burnaby, B.C. “It’s basically untouched.”

Now Manot Cave has yielded a tantalizing sign of humanity’s initial emergence out of Africa and a possible forerunner of the first modern humans in Europe, an international team of researchers that includes Dr. Berna said on Wednesday.

The find also establishes the Levant region (including Israel, Lebanon and part of Syria) as a plausible setting where our species interbred with its Neanderthal cousins.

The team’s key piece of evidence is a partial human skull found during the initial reconnaissance of the cave.

Based on its features and dimensions, the skull is unquestionably that of an anatomically modern human, the first such find in the region. The individual would probably have looked like the first Homo sapiens that appeared in Africa about 200,000 years ago and been physically indistinguishable from humans today.

“He or she would look very modern. With a tie on, you would not be able to tell the difference,” said Israel Hershkovitz, a biological anthropologist at Tel Aviv University and lead author of a paper published this week in the journal Nature that documents the Manot Cave find.

The age of the fossil is the crucial detail. The team’s analysis shows it is about 55,000 years old. That is more recent than the fragmentary remains of some not-so-modern-looking humans that drifted into the region at an earlier stage. But it coincides exactly with a period when a wetter climate may have opened the door to the first modern human migration out of Africa.

Fossils of modern humans that are only slightly less old than the Manot Cave skull have been found in the Czech Republic and Romania, making the new find a potential forerunner of the first Europeans. [Continue reading…]

Much of the reporting on these findings makes reference to “the first Europeans,” and even though anthropologists might be clear about what they mean when they use the term Europe, they might consider avoiding it, given the common meaning that is usually attached to the word.

Indeed, the lead researcher cited above, Israel Hershkovitz, illustrates the problem as he reinforces cultural stereotypes by implying that a human is fully evolved once he dons the symbol of European, masculine power: a necktie. The irony is compounded by the fact that he and his team were trumpeting the significance of their discovery of a woman’s skull.

(No doubt many Europeans and others with European affectations have been disturbed this week to see Greece’s new prime minister, in the birthplace of democracy, assuming power without a necktie.)

The Oxford archaeologist Barry Cunliffe has referred to the region of land that recently got dubbed “Europe” as “the westerly excrescence of the continent of Asia.”

Europeans might object to the suggestion that they inhabit an excrescence — especially since the term suggests an abnormality — but in terms of continental topography, it points to Europe’s unique feature: its eastern boundaries have always been elastic and somewhat arbitrary.

More importantly, when it comes to human evolution, to frame this in terms of the advance into Europe revives so many echoes of nineteenth century racism.

It cannot be overstated that the first Europeans were not European.

Europe is an idea that has only been around for a few hundred years during which time it has been under constant revision.

Migration is also a misleading term since it evokes images of migrants: people who travel vast distances to inhabit new lands.

Human dispersal most likely involved rather short hops, one generation at a time, interspersed with occasional actual migrations driven by events like floods or famine.


How civilization has given humans brittle bones

Nicholas St. Fleur writes: Somewhere in a dense forest of ash and elm trees, a hunter readies his spear for the kill. He hurls his stone-tipped weapon at his prey, an unsuspecting white-tailed deer he has tracked since morning. The crude projectile pierces the animal’s hide, killing it and giving the hunter food to bring back to his family many miles away. Such was survival circa 5,000 B.C. in ancient North America.

But today, the average person barely has to lift a finger, let alone throw a spear, to quell their appetite. The next meal is a mere online order away. And according to anthropologists, this convenient, sedentary way of life is making bones weak. Ahead, there’s a future of fractures, breaks, and osteoporosis. But for some anthropologists, the key to preventing aches in bones lies in better understanding the skeletons of our hunter-gatherer ancestors.

“Over the vast majority of human prehistory, our ancestors engaged in far more activity over longer distances than we do today,” said Brian Richmond, an anthropologist from the American Museum of Natural History in New York, in a statement. “We cannot fully understand human health today without knowing how our bodies evolved to work in the past, so it is important to understand how our skeletons evolved within the context of those high levels of activity.”

For thousands of years, Native American hunter-gatherers trekked on strenuous ventures for food. And for those same thousands of years, dense skeletons supported their movements. But about 6,000 years later, with the advent of agriculture, the bones and joints of Native Americans became less rigid and more fragile. Similar transitions occurred across the world as populations shifted from foraging to farming, according to two new papers published Monday in the Proceedings of the National Academy of Sciences. [Continue reading…]


The art of not trying

John Tierney writes: Just be yourself.

The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.

But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?

It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.

He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.

Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]


Massive genetic effort confirms bird songs related to human speech

Scientific American reports: Songbirds stutter, babble when young, become mute if parts of their brains are damaged, learn how to sing from their elders and can even be “bilingual” — in other words, songbirds’ vocalizations share a lot of traits with human speech. However, that similarity goes beyond behavior, researchers have found. Even though humans and birds are separated by millions of years of evolution, the genes that give us our ability to learn speech have much in common with those that lend birds their warble.

A four-year-long effort involving more than 100 researchers around the world put the power of nine supercomputers into analyzing the genomes of 48 species of birds. The results, published this week in a package of eight articles in Science and 20 papers in other journals, provide the most complete picture of the bird family tree thus far. The project has also uncovered genetic signatures in song-learning bird brains that have surprising similarities to the genetics of speech in humans, a finding that could help scientists study human speech.

The analysis suggests that most modern birds arose in an impressive speciation event, a “big bang” of avian diversification, in the 10 million years immediately following the extinction of dinosaurs. This period is more recent than posited in previous genetic analyses, but it lines up with the fossil record. By delving deeper into the rich data set, research groups identified when birds lost their teeth, investigated the relatively slow evolution of crocodiles and outlined the similarities between birds’ and humans’ vocal learning ability, among other findings. [Continue reading…]


The thoughts of our ancient ancestors

The discovery of what appear to have been deliberately etched markings made by a human ancestor, Homo erectus, on the surface of a shell calls for a reconsideration of assumptions that have been made about the origins of abstract thought.

While the meaning of these zigzag markings will most likely remain forever unknown, it can reasonably be inferred that for the individual who created them, the marks had some significance.

In a report in Nature, Josephine Joordens, a biologist at Leiden University whose team discovered the markings, says:

“We’ve looked at all possibilities, but in the end we are really certain that this must have been made by an agent who did a very deliberate action with a very sharp implement,” says Joordens. Her team tried replicating the pattern on fresh and fossilized shells, “and that made us realize how difficult it really was”, she says.

Saying much more about the engraving is tricky. “If you don’t know the intention of the person who made it, it’s impossible to call it art,” says Joordens.

“But on the other hand, it is an ancient drawing. It is a way of expressing yourself. What was meant by the person who did this, we simply don’t know,” she adds. “It could have been to impress his girlfriend, or to doodle a bit, or to mark the shell as his own property.”

Clive Finlayson, a zoologist at the Gibraltar Museum who was part of the team that described cross-hatch patterns linked to Neanderthals, is also agnostic about whether to call the H. erectus doodles art. What is more important, he says, is the growing realization that abilities such as abstract thinking, once ascribed to only H. sapiens, were present in other archaic humans, including, now, their ancestors.

“I’ve been suggesting increasingly strongly that a lot of these things that are meant to be modern human we’re finding in other hominids,” he says. “We really need to revisit these concepts and take stock.”

Palaeoanthropology, by necessity, is a highly speculative discipline — therein lies both its strength and its weakness.

The conservatism of hard science recoils at the idea that some scratches on a single shell amount to sufficient evidence to prompt a reconsideration of the origins of the human mind, and yet to refrain from such speculation seems like an effort to restrain the powers of the very thing we are trying to understand.

Rationally, there is as much reason to assume that abstract thinking long predates modern humans, in which case a search that turned up no evidence of it would leave us agnostic about its presence or absence, as there is to assume that at some juncture it was born.

My inclination is to believe that any living creature that has some capacity to construct a neurological representation of their surroundings is by that very capacity employing something akin to abstract thinking.

This ability for the inner to mirror the outer has no doubt evolved, becoming progressively more complex and more deeply abstract, and yet mind, if defined as world-mirroring, seems to have been born when life first moved.


Long before we learned how to make wine our ancestors acquired a taste for rotten fruit

Live Science: Human ancestors may have begun evolving the knack for consuming alcohol about 10 million years ago, long before modern humans began brewing booze, researchers say.

The ability to break down alcohol likely helped human ancestors make the most out of rotting, fermented fruit that fell onto the forest floor, the researchers said. Therefore, knowing when this ability developed could help researchers figure out when these human ancestors began moving to life on the ground, as opposed to mostly in trees, as earlier human ancestors had lived.

“A lot of aspects about the modern human condition — everything from back pain to ingesting too much salt, sugar and fat — goes back to our evolutionary history,” said lead study author Matthew Carrigan, a paleogeneticist at Santa Fe College in Gainesville, Florida. “We wanted to understand more about the modern human condition with regards to ethanol,” he said, referring to the kind of alcohol found in rotting fruit and that’s also used in liquor and fuel. [Continue reading…]


Societies in harsh environments more likely to believe in moralizing high gods

EurekAlert!: Just as physical adaptations help populations prosper in inhospitable habitats, belief in moralizing, high gods might be similarly advantageous for human cultures in poorer environments. A new study from the National Evolutionary Synthesis Center (NESCent) suggests that societies with less access to food and water are more likely to believe in these types of deities.

“When life is tough or when it’s uncertain, people believe in big gods,” says Russell Gray, a professor at the University of Auckland and a founding director of the Max Planck Institute for the Science of Human History in Jena, Germany. “Prosocial behavior maybe helps people do well in harsh or unpredictable environments.”

Gray and his coauthors found a strong correlation between belief in high gods who enforce a moral code and other societal characteristics. Political complexity – namely, a social hierarchy beyond the local community – and the practice of animal husbandry were both strongly associated with a belief in moralizing gods.

The emergence of religion has long been explained as a result of either culture or environmental factors but not both. The new findings imply that complex practices and characteristics thought to be exclusive to humans arise from a medley of ecological, historical, and cultural variables.

“When researchers discuss the forces that shaped human history, there is considerable disagreement as to whether our behavior is primarily determined by culture or by the environment,” says primary author Carlos Botero, a researcher at the Initiative for Biological Complexity at North Carolina State University. “We wanted to throw away all preconceived notions regarding these processes and look at all the potential drivers together to see how different aspects of the human experience may have contributed to the behavioral patterns we see today.” [Continue reading…]


Cooperation is what makes us human

Kat McGowan writes: Tales about the origins of our species always start off like this: A small band of hunter-gatherers roams the savannah, loving, warring, and struggling for survival under the African sun. They do not start like this: A fat guy falls off a New York City subway platform onto the tracks.

But what happens next is a quintessential story of who we are as human beings.

On Feb. 17, 2013, around 2:30 a.m., Garrett O’Hanlon, a U.S. Air Force Academy cadet third class, was out celebrating his 22nd birthday in New York City. He and his sister were in the subway waiting for a train when a sudden silence came over the platform, followed by a shriek. People pointed down to the tracks.

O’Hanlon turned and saw a man sprawled facedown on the tracks. “The next thing that happened, I was on the tracks, running toward him,” he says. “I honestly didn’t have a thought process.”

O’Hanlon grabbed the unconscious man by the shoulders, lifting his upper body off the tracks, but struggled to move him. He was deadweight. According to the station clock, the train would arrive in less than two minutes. From the platform, O’Hanlon’s sister was screaming at him to save himself.

Suddenly other arms were there: Personal trainer Dennis Codrington Jr. and his friend Matt Foley had also jumped down to help. “We grabbed him, one by the legs, one by the shoulders, one by the chest,” O’Hanlon says. They got the man to the edge of the platform, where a dozen or more people muscled him up and over. More hands seized the rescuers’ arms and shoulders, helping them up to safety as well.

In the aftermath of the rescue, O’Hanlon says he has been surprised that so many people have asked him why he did it. “I get stunned by the question,” he says. In his view, anybody else would’ve done the same thing. “I feel like it’s a normal reaction,” he says. “To me that’s just what people do.”

More precisely, it is something only people do, according to developmental psychologist Michael Tomasello, codirector of the Max Planck Institute for Evolutionary Anthropology.

For decades Tomasello has explored what makes humans distinctive. His conclusion? We cooperate. Many species, from ants to orcas to our primate cousins, cooperate in the wild. But Tomasello has identified a special form of cooperation. In his view, humans alone are capable of shared intentionality—they intuitively grasp what another person is thinking and act toward a common goal, as the subway rescuers did. This supremely human cognitive ability, Tomasello says, launched our species on its extraordinary trajectory. It forged language, tools, and cultures—stepping-stones to our colonization of every corner of the planet. [Continue reading…]


35,000-year-old Indonesian cave paintings suggest art came out of Africa

The Guardian reports: Paintings of wild animals and hand markings left by adults and children on cave walls in Indonesia are at least 35,000 years old, making them some of the oldest artworks known.

The rock art was originally discovered in caves on the island of Sulawesi in the 1950s, but dismissed as younger than 10,000 years old because scientists thought older paintings could not possibly survive in a tropical climate.

But fresh analysis of the pictures by an Australian-Indonesian team has stunned researchers by dating one hand marking to at least 39,900 years old, and two paintings of animals, a pig-deer or babirusa, and another animal, probably a wild pig, to at least 35,400 and 35,700 years ago respectively.

The work reveals that rather than Europe being at the heart of an explosion of creative brilliance when modern humans arrived from Africa, the early settlers of Asia were creating their own artworks at the same time or even earlier.

Archaeologists have not ruled out that the different groups of colonising humans developed their artistic skills independently of one another, but an enticing alternative is that the modern human ancestors of both were artists before they left the African continent.

“Our discovery on Sulawesi shows that cave art was made at opposite ends of the Pleistocene Eurasian world at about the same time, suggesting these practices have deeper origins, perhaps in Africa before our species left this continent and spread across the globe,” said Dr Maxime Aubert, an archaeologist at the University of Wollongong. [Continue reading…]


Ants are cool but teach us nothing

E.O. Wilson writes: For nearly seven decades, starting in boyhood, I’ve studied hundreds of kinds of ants around the world, and this qualifies me, I believe, to offer some advice on ways their lives can be applied to ours. I’ll start with the question I’m most often asked: “What can I do about the ants in my kitchen?” My response comes from the heart: Watch your step, be careful of little lives. Ants especially like honey, tuna and cookie crumbs. So put down bits of those on the floor, and watch as the first scout finds the bait and reports back to her colony by laying an odor trail. Then, as a little column follows her out to the food, you will see social behavior so strange it might be on another planet. Think of kitchen ants not as pests or bugs, but as your personal guest superorganism.

Another question I hear a lot is, “What can we learn of moral value from the ants?” Here again I will answer definitively: nothing. Nothing at all can be learned from ants that our species should even consider imitating. For one thing, all working ants are female. Males are bred and appear in the nest only once a year, and then only briefly. They are pitiful creatures with wings, huge eyes, small brains and genitalia that make up a large portion of their rear body segment. They have only one function in life: to inseminate the virgin queens during the nuptial season. They are built to be robot flying sexual missiles. Upon mating or doing their best to mate, they are programmed to die within hours, usually as victims of predators.

Many kinds of ants eat their dead — and their injured, too. You may have seen ant workers retrieve nestmates that you have mangled or killed underfoot (accidentally, I hope), thinking it battlefield heroism. The purpose, alas, is more sinister. [Continue reading…]


How ancient DNA is rewriting human history

Michael White writes: There are no written records of the most important developments in our history: the transition from hunting and gathering to farming, the initial colonization of regions outside Africa, and, most crucially, the appearance of modern humans and the vanishing of archaic ones. Our primary information sources about these “pre-historic” events are ancient tools, weapons, bones, and, more recently, DNA. Like an ancient text that has picked up interpolations over the millennia, our genetic history can be difficult to recover from the DNA of people alive today. But with the invention of methods to read DNA taken from ancient bones, we now have access to much older copies of our genetic history, and it’s radically changing how we understand our deep past. What seemed like an episode of Lost turns out to be much more like Game of Thrones: instead of a story of small, isolated groups that colonized distant new territory, human history is a story of ancient populations that migrated and mixed all over the world.

There is no question that most human evolutionary history took place in Africa. But by one million years ago — long before modern humans evolved — archaic human species were already living throughout Asia and Europe. By 30,000 years ago, the archaic humans had vanished, and modern humans had taken their place. How did that happen?

From the results of early DNA studies in the late 1980s and early ’90s, scientists argued that anatomically modern humans evolved in Africa, and then expanded into Asia, Oceania, and Europe, beginning about 60,000 years ago. The idea was that modern humans colonized the rest of the world in a succession of small founding groups — each one a tiny sampling of the total modern human gene pool. These small, isolated groups settled new territory and replaced the archaic humans that lived there. As a result, humans in different parts of the world today have their own distinctive DNA signature, consisting of the genetic quirks of their ancestors who first settled the area, as well as the genetic adaptations to the local environment that evolved later.
This view of human history, called the “serial founder effect model,” has big implications for our understanding of how we came to be who we are. Most importantly, under this model, genetic differences between geographically separated human populations reflect deep branchings in the human family tree, branches that go back tens of thousands of years. It also declares that people have evolutionary adaptations that are matched to their geographical area, such as lighter skin in Asians and Europeans or high altitude tolerance among Andeans and Tibetans. With a few exceptions, such as the genetic mixing after Europeans colonized the Americas, our geography reflects our deep ancestry.

Well, it’s time to scrap this picture of human history. There are very few isolated branches of the human family tree. People in nearly every part of the world are a product of many different ancient populations, and sometimes surprisingly close relationships span a wide geographical distance. [Continue reading…]
