Yuval Noah Harari writes: Over the last decade, I have been writing a history of humankind, tracking down the transformation of our species from an insignificant African ape into the master of the planet. It was not easy to understand what turned Homo sapiens into an ecological serial killer; why men dominated women in most human societies; or why capitalism became the most successful religion ever. It wasn’t easy to address such questions because scholars have offered so many different and conflicting answers. In contrast, when it came to assessing the bottom line – whether thousands of years of inventions and discoveries have made us happier – it was surprising to realise that scholars have neglected even to ask the question. This is the largest lacuna in our understanding of history.
Though few scholars have studied the long-term history of happiness, almost everybody has some idea about it. One common preconception – often termed “the Whig view of history” – sees history as the triumphal march of progress. Each passing millennium witnessed new discoveries: agriculture, the wheel, writing, print, steam engines, antibiotics. Humans generally use newly found powers to alleviate miseries and fulfil aspirations. It follows that the exponential growth in human power must have resulted in an exponential growth in happiness. Modern people are happier than medieval people, and medieval people were happier than stone age people.
But this progressive view is highly controversial. Though few would dispute the fact that human power has been growing since the dawn of history, it is far less clear that power correlates with happiness. The advent of agriculture, for example, increased the collective power of humankind by several orders of magnitude. Yet it did not necessarily improve the lot of the individual. For millions of years, human bodies and minds were adapted to running after gazelles, climbing trees to pick apples, and sniffing here and there in search of mushrooms. Peasant life, in contrast, included long hours of agricultural drudgery: ploughing, weeding, harvesting and carrying water buckets from the river. Such a lifestyle was harmful to human backs, knees and joints, and numbing to the human mind.
In return for all this hard work, peasants usually had a worse diet than hunter-gatherers, and suffered more from malnutrition and starvation. Their crowded settlements became hotbeds for new infectious diseases, most of which originated in domesticated farm animals. Agriculture also opened the way for social stratification, exploitation and possibly patriarchy. From the viewpoint of individual happiness, the “agricultural revolution” was, in the words of the scientist Jared Diamond, “the worst mistake in the history of the human race”.
The case of the agricultural revolution is not a single aberration, however. The march of progress from the first Sumerian city-states to the empires of Assyria and Babylonia was accompanied by a steady deterioration in the social status and economic freedom of women. The European Renaissance, for all its marvellous discoveries and inventions, benefited few people outside the circle of male elites. The spread of European empires fostered the exchange of technologies, ideas and products, yet this was hardly good news for millions of Native Americans, Africans and Aboriginal Australians.
The point need not be elaborated further. Scholars have thrashed the Whig view of history so thoroughly that the only question left is: why do so many people still believe in it? [Continue reading…]
For such a large and culturally diverse place, Europe has surprisingly little genetic variety. Learning how and when the modern gene-pool came together has been a long journey. But thanks to new technological advances, a picture is slowly emerging of repeated colonisation by peoples from the east with more efficient lifestyles.
In a new study, we have added a piece to the puzzle: the Y chromosomes of the majority of European men can be traced back to just three individuals living between 3,500 and 7,300 years ago. How their lineages came to dominate Europe makes for interesting speculation. One possibility could be that their DNA rode across Europe on a wave of new culture brought by nomadic people from the Steppe known as the Yamnaya.
Stone Age Europe
The first known people to enter Europe were the Neanderthals – and though they have left some genetic legacy, it is later waves who account for the majority of modern European ancestry. The first “anatomically modern humans” arrived in the continent around 40,000 years ago. These were the Palaeolithic hunter-gatherers sometimes called the Cro-Magnons. They populated Europe quite sparsely and lived a lifestyle not very different from that of the Neanderthals they replaced.
Then something revolutionary happened in the Middle East – farming, which allowed for enormous population growth. We know that from around 8,000 years ago a wave of farming and population growth exploded into both Europe and South Asia. But what has been much less clear is the mechanism of this spread. How much was due to the children of the farmers moving into new territories and how much was due to the neighbouring hunter-gatherers adopting this new way of life?
Smithsonian magazine: Approximately 3.3 million years ago someone began chipping away at a rock by the side of a river. Eventually, this chipping formed the rock into a tool used, perhaps, to prepare meat or crack nuts. And this technological feat occurred before humans even showed up on the evolutionary scene.
That’s the conclusion of an analysis published today in Nature of the oldest stone tools yet discovered. Unearthed in a dried-up riverbed in Kenya, the shards of scarred rock, including what appear to be early hammers and cutting instruments, predate the previous record holder by around 700,000 years. Though it’s unclear who made the tools, the find is the latest and most convincing in a string of evidence that toolmaking began before any members of the Homo genus walked the Earth.
“This discovery challenges the idea that the main characters that make us human — making stone tools, eating more meat, maybe using language — all evolved at once in a punctuated way, near the origins of the genus Homo,” says Jason Lewis, a paleoanthropologist at Rutgers University and co-author of the study. [Continue reading…]
Is the Earth now spinning through the “Age of Humans”? More than a few scientists think so. They’ve suggested, in fact, that we modify the name of the current geological epoch (the Holocene, which began roughly 12,000 years ago) to the “Anthropocene.” It’s a term first put into wide circulation by Nobel-Prize winning atmospheric chemist Paul Crutzen in an article published in Nature in 2002. And it’s stirring up a good deal of debate, not only among geologists.
The idea is that we needed a new planetary marker to account for the scale of human changes to the Earth: extensive land transformation, mass extinctions, control of the nitrogen cycle, large-scale water diversion, and especially change of the atmosphere through the emission of greenhouse gases. Although naming geological epochs isn’t usually a controversial act, the Anthropocene proposal is radical because it means that what had been an environmental fixture against which people acted, the geological record, is now just another expression of the human presence.
It seems to be a particularly bitter pill to swallow for nature preservationists, heirs to the American tradition led by writers, scientists and activists such as John Muir, Aldo Leopold, David Brower, Rachel Carson and Edward Abbey. That’s because some have argued the traditional focus on the goal of wilderness protection rests on a view of “pristine” nature that is simply no longer viable on a planet hurtling toward nine billion human inhabitants.
Given this situation, we felt the time was ripe to explore the impact of the Anthropocene on the idea and practice of nature preservation. Our plan was to create a salon, a kind of literary summit. But we wanted to cut to the chase: What does it mean to “save American nature” in the age of humans?
We invited a distinguished group of environmental writers – scientists, philosophers, historians, journalists, agency administrators and activists – to give it their best shot. The essays appear in the new collection, After Preservation: Saving American Nature in the Age of Humans.
Scott Atran recently addressed the UN Security Council’s Ministerial Debate on “The Role of Youth in Countering Violent Extremism and Promoting Peace.” This post is an adaptation of his remarks: I am an anthropologist. Anthropologists, as a group, study the diversity of human cultures to understand our commonalities and differences, and to use the knowledge of what is common to us all to help us bridge our differences. My research aims to help reduce violence between peoples, by first trying to understand thoughts and behaviors as different from my own as any I can imagine: such as suicide actions that kill masses of people innocent of direct harm to others. The key, as Margaret Mead taught me long ago, when I worked as her assistant at the American Museum of Natural History in New York, was to empathize with people, without always sympathizing: to participate in their lives to the extent you feel is morally possible. And then report.
I’ve spent much time observing, interviewing and carrying out systematic studies among people on six continents who are drawn to violent action for a group and its cause. Most recently, I worked with colleagues last month in Kirkuk, Iraq, among young men who had killed for ISIS, and with young adults in the banlieues of Paris and the barrios of Barcelona who seek to join it.
Drawing on insights from social science research, I will try to outline a few conditions that may help move such youth away from the path of violent extremism.
But first, who are these young people? None of the ISIS fighters we interviewed in Iraq had more than a primary school education; some had wives and young children. When asked “what is Islam?” they answered “my life.” They knew nothing of the Quran or Hadith, or of the early caliphs Omar and Othman, but had learned of Islam from Al Qaeda and ISIS propaganda, teaching that Muslims like them were targeted for elimination unless they first eliminated the impure. This isn’t an outlandish proposition in their lived circumstances: they told of growing up after the fall of Saddam Hussein in a hellish world of constant guerrilla war, family deaths and dislocation, and of not even being able to go out of their homes or temporary shelters for months on end. [Continue reading…]
Scientific American reports: Opera singers and dry air don’t get along. In fact, the best professional singers require humid settings to help them achieve the right pitch. “When your vocal cords are really dry, they’re a little less elastic,” says Caleb Everett, an anthropological linguist at the University of Miami. As a result, singers experience tiny variations in pitch, called jitter, as well as wavering volume—both of which contribute to rougher refrains.
If the amount of moisture in the air influences musical pitch, Everett wondered, has that translated into the development of fewer tonal languages in arid locations? Tonal languages, such as Mandarin Chinese and Cherokee, rely on variations in pitch to differentiate meaning: the same syllable spoken at a higher pitch can specify a different word if spoken at a lower pitch or in a rising or falling tone.
In a survey of more than 3,700 languages, Everett and his collaborators found that those with complex tones do indeed occur less frequently in dry areas than they do in humid ones, even after accounting for the clustering of related languages. For instance, more than half of the hundreds of languages spoken in tropical sub-Saharan locations feature complex tones, whereas none of the two dozen languages in the Sahara do. Overall, only one in 30 complex tonal languages flourished in dry areas; one in three nontonal languages cropped up in those same regions. The results appeared in February in the Proceedings of the National Academy of Sciences USA. [Continue reading…]
Jedediah Purdy writes: As much as a scientific concept, the Anthropocene is a political and ethical gambit. Saying that we live in the Anthropocene is a way of saying that we cannot avoid responsibility for the world we are making. So far so good. The trouble starts when this charismatic, all-encompassing idea of the Anthropocene becomes an all-purpose projection screen and amplifier for one’s preferred version of ‘taking responsibility for the planet’.
Peter Kareiva, the controversial chief scientist of the Nature Conservancy, uses the theme ‘Conservation in the Anthropocene’ to trash environmentalism as philosophically naïve and politically backward. Kareiva urges conservationists to give up on wilderness and embrace what the writer Emma Marris calls the ‘rambunctious garden’. Specifically, Kareiva wants to rank ecosystems by the quality of ‘ecosystem services’ they provide for human beings instead of ‘pursuing the protection of biodiversity for biodiversity’s sake’. He wants a pro‑development stance that assumes that ‘nature is resilient rather than fragile’. He insists that: ‘Instead of scolding capitalism, conservationists should partner with corporations in a science-based effort to integrate the value of nature’s benefits into their operations and cultures.’ In other words, the end of nature is the signal to carry on with green-branded business as usual, and the business of business is business, as the Nature Conservancy’s partnerships with Dow, Monsanto, Coca-Cola, Pepsi, J P Morgan, Goldman Sachs and the mining giant Rio Tinto remind us.
Kareiva is a favourite of Andrew Revkin, the roving environmental maven of The New York Times Magazine, who touts him as a paragon of responsibility-taking, a leader among ‘scholars and doers who see that new models for thinking and acting are required in this time of the Anthropocene’. This pair and their friends at the Breakthrough Institute in California can be read as making a persistent effort to ‘rebrand’ environmentalism as humanitarian and development-friendly (and capture speaking and consultancy fees, which often seem to be the major ecosystem services of the Anthropocene). This is itself a branding strategy, an opportunity to slosh around old plonk in an ostentatiously shiny bottle. [Continue reading…]
Chelsea Wald writes: Not long ago I diagnosed myself with the recently identified condition of sidewalk rage. It’s most pronounced when it comes to a certain friend who is a slow walker. Last month, as we sashayed our way to dinner, I found myself biting my tongue, thinking, I have to stop going places with her if I ever want to … get there!
You too can measure yourself on the “Pedestrian Aggressiveness Syndrome Scale,” a tool developed by University of Hawaii psychologist Leon James. While walking in a crowd, do you find yourself “acting in a hostile manner (staring, presenting a mean face, moving closer or faster than expected)” and “enjoying thoughts of violence?”
Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines — they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age.
Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin and yang balance, a finely tuned internal timer that told us when we had waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.
“Why are we impatient? It’s a heritage from our evolution,” says Marc Wittmann, a psychologist at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. Impatience made sure we didn’t die from spending too long on a single unrewarding activity. It gave us the impulse to act.
But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough — or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay. [Continue reading…]
Douglas Preston writes: The ruins were first identified in May 2012, during an aerial survey of a remote valley in La Mosquitia, a vast region of swamps, rivers, and mountains containing some of the last scientifically unexplored places on earth.
For a hundred years, explorers and prospectors told tales of the white ramparts of a lost city glimpsed above the jungle foliage. Indigenous stories speak of a “white house” or a “place of cacao” where Indians took refuge from Spanish conquistadores — a mystical, Eden-like paradise from which no one ever returned.
Since the 1920s, several expeditions had searched for the White City, or Ciudad Blanca. The eccentric explorer Theodore Morde mounted the most famous of these in 1940, under the aegis of the Museum of the American Indian (now part of the Smithsonian Institution).
Morde returned from Mosquitia with thousands of artifacts, claiming to have entered the City. According to Morde, the indigenous people there said it contained a giant, buried statue of a monkey god. He refused to divulge the location out of fear, he said, that the site would be looted. He later committed suicide and his site—if it existed at all—was never identified.
They identified a crater-shaped valley, encircled by steep mountains, as a possible location.
To survey it, in 2012 they enlisted the help of the Center for Airborne Laser Mapping at the University of Houston. A Cessna Skymaster, carrying a million-dollar lidar scanner, flew over the valley, probing the jungle canopy with laser light. Lidar — “Light Detection and Ranging” — is able to map the ground even through dense rain forest, delineating any archaeological features that might be present.
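The reason lidar can map ground hidden beneath canopy is that each laser pulse produces multiple returns: earlier returns bounce off leaves and branches, while the last return is the most likely to have reached the soil. Gridding the lowest last returns approximates a bare-earth surface on which earthworks stand out. The toy sketch below illustrates that idea only; the point format, filtering rule, and data are simplified assumptions, not the Houston team's actual processing pipeline.

```python
from collections import defaultdict

def bare_earth_grid(points, cell=1.0):
    """points: iterable of (x, y, z, is_last_return).
    Returns {(i, j): lowest last-return z} as a crude bare-earth model."""
    grid = defaultdict(lambda: float("inf"))
    for x, y, z, is_last in points:
        if not is_last:                       # earlier returns hit the canopy
            continue
        key = (int(x // cell), int(y // cell))
        grid[key] = min(grid[key], z)         # lowest last return ≈ ground
    return dict(grid)

# Synthetic swath: flat ground at z = 10 m under canopy returns at z = 30 m,
# with a 2 m-high "mound" hidden beneath the trees near x ≈ 3.
pts = []
for x in range(6):
    pts.append((x + 0.5, 0.5, 30.0, False))   # canopy (first) return
    ground = 12.0 if x == 3 else 10.0
    pts.append((x + 0.5, 0.5, ground, True))  # ground (last) return

dem = bare_earth_grid(pts)
print(dem[(3, 0)] - dem[(2, 0)])  # mound relief: 2.0
```

Real workflows classify ground points with far more robust filters (slope- and morphology-based), but the principle is the same: discard canopy returns, keep the lowest surface.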
When the images were processed, they revealed unnatural features stretching for more than a mile through the valley. When Fisher analyzed the images, he found that the terrain along the river had been almost entirely reshaped by human hands. [Continue reading…]
ASU News: The earliest evidence of our human genus – Homo – was found in Ethiopia by a team of Arizona State University scientists and students during field research in 2013.
The fossil, the left side of a lower jaw with five teeth, has been dated to 2.8 million years ago, which predates the previously known fossils of the Homo lineage by approximately 400,000 years.
The discovery is being published for the first time in the March 4 online version of the journal Science.
For decades, scientists who study the origins of modern-day humans have been searching for fossils documenting the earliest phases of the Homo lineage.
Researchers have found fossils that are 3 million years old and older. The most famous example of those human ancestors is the skeleton of Lucy, found in northeastern Africa in 1974 by ASU researcher Donald Johanson. Lucy and her relatives, though they walked on two feet, were smaller-brained and more apelike than later members of the human family tree.
Scientists have also found fossils that are 2.3 million years old and younger. These ancestors are in the genus Homo and are closer to modern day humans.
But very little had been found in between – that 700,000-year gap had turned up few fossils with which to determine the evolution from Lucy to the genus Homo. Because of that gap, there has been little agreement on the time of origin of the Homo lineage.
With this find, that mysterious time period has gotten a little clearer. [Continue reading…]
The Los Angeles Times adds: The significance of this discovery, according to some researchers, is that it firmly fixes the origins of Homo in East Africa and fits the hypothesis that climate change drove key developments in a variety of mammals, including our early forebears.
When Lucy roamed Ethiopia roughly 3.2 million years ago, the region enjoyed long rainy seasons that supported the growth of many trees and a wide variety of vegetation, according to researchers.
By the time of Homo’s first established appearance in the Horn of Africa, however, things had become much drier and the landscape had transformed into a vast, treeless expanse of grasslands with a few rivers and lakes — a scene very similar to today’s Serengeti plains or Kalahari.
It was an unforgiving climate when it came to survival.
But the hallmark of the genus that includes Homo sapiens is resourcefulness. Larger brains, the ability to fashion stone tools, and teeth suited to chewing a variety of foods would have given our early ancestors the flexibility to live in an inflexible environment, researchers say. [Continue reading…]
Scientific American reports: People are fascinated by the intelligence of animals. In fact, cave paintings dating back some 40,000 years suggest that we have long harbored keen interest in animal behavior and cognition. Part of that interest may have been practical: animals can be dangerous, they can be sources of food and clothing, and they can serve as sentries or mousers.
But another part of that fascination is purely theoretical. Because animals resemble us in form, perhaps they also resemble us in thought. For many philosophers — including René Descartes and John Locke — granting intelligence to animals was a bridge too far. They especially deemed abstract reasoning to be uniquely human and to perfectly distinguish people from “brutes.” Why? Because animals do not speak, the reasoning went, they must have no thoughts.
Nevertheless, undeterred by such pessimistic pronouncements, informed by Darwin’s theory of evolution, and guided by the maxim that “actions speak more loudly than words,” researchers today are fashioning powerful behavioral tests that provide nonverbal ways for animals to disclose their intelligence to us. Although animals may not use words, their behavior may serve as a suitable substitute; its study may allow us to jettison the stale convention that thought without language is impossible. [Continue reading…]
Ivan Semeniuk reports: Francesco Berna still remembers his first visit to Manot Cave, accidentally discovered in 2008 on a ridge in northern Israel. A narrow passage steeply descends into darkness. It then opens onto a 60-metre-long cavern with side chambers, all dramatically ornamented with stalactites and stalagmites. “It’s a spectacular cave,” said Dr. Berna, a geoarcheologist at Simon Fraser University in Burnaby, B.C. “It’s basically untouched.”
Now Manot Cave has yielded a tantalizing sign of humanity’s initial emergence out of Africa and a possible forerunner of the first modern humans in Europe, an international team of researchers that includes Dr. Berna said on Wednesday.
The find also establishes the Levant region (including Israel, Lebanon and part of Syria) as a plausible setting where our species interbred with its Neanderthal cousins.
The team’s key piece of evidence is a partial human skull found during the initial reconnaissance of the cave.
Based on its features and dimensions, the skull is unquestionably that of an anatomically modern human, the first such find in the region. The individual would probably have looked like the first Homo sapiens that appeared in Africa about 200,000 years ago and been physically indistinguishable from humans today.
“He or she would look very modern. With a tie on, you would not be able to tell the difference,” said Israel Hershkovitz, a biological anthropologist at Tel Aviv University and lead author of a paper published this week in the journal Nature that documents the Manot Cave find.
The age of the fossil is the crucial detail. The team’s analysis shows it is about 55,000 years old. That is more recent than the fragmentary remains of some not-so-modern-looking humans that drifted into the region at an earlier stage. But it coincides exactly with a period when a wetter climate may have opened the door to the first modern human migration out of Africa.
Fossils of modern humans that are only slightly less old than the Manot Cave skull have been found in the Czech Republic and Romania, making the new find a potential forerunner of the first Europeans. [Continue reading…]
Much of the reporting on these findings makes reference to “the first Europeans,” and even though anthropologists may be clear about what they mean when they use the term Europe, they might consider avoiding it, given the meaning commonly attached to the word.
Indeed, the lead researcher cited above, Israel Hershkovitz, illustrates the problem as he reinforces cultural stereotypes by implying that the human is fully evolved once he dons the symbol of European, masculine power: a necktie. The irony is compounded by the fact that he and his team were trumpeting the significance of their discovery of a woman’s skull.
(No doubt many Europeans and others with European affectations have been disturbed this week to see Greece’s new prime minister, in the birthplace of democracy, assuming power without a necktie.)
The Oxford archeologist, Barry Cunliffe, has referred to the region of land that recently got dubbed “Europe” as “the westerly excrescence of the continent of Asia.”
Europeans might object to the suggestion that they inhabit an excrescence — especially since the term suggests an abnormality — but in terms of continental topography, it points to Europe’s unique feature: its eastern boundaries have always been elastic and somewhat arbitrary.
More importantly, when it comes to human evolution, to frame this in terms of the advance into Europe revives so many echoes of nineteenth century racism.
It cannot be overstated that the first Europeans were not European.
Europe is an idea that has only been around for a few hundred years during which time it has been under constant revision.
Migration is also a misleading term since it evokes images of migrants: people who travel vast distances to inhabit new lands.
Human dispersal most likely involved rather short hops, one generation at a time, interspersed with occasional actual migrations driven by events like floods or famine.
Nicholas St. Fleur writes: Somewhere in a dense forest of ash and elm trees, a hunter readies his spear for the kill. He hurls his stone-tipped weapon at his prey, an unsuspecting white-tailed deer he has tracked since morning. The crude projectile pierces the animal’s hide, killing it and giving the hunter food to bring back to his family many miles away. Such was survival circa 5,000 B.C. in ancient North America.
But today, the average person barely has to lift a finger, let alone throw a spear, to quell their appetite. The next meal is a mere online order away. And according to anthropologists, this convenient, sedentary way of life is making bones weak. Ahead, there’s a future of fractures, breaks, and osteoporosis. But for some anthropologists, the key to preventing aches in bones lies in better understanding the skeletons of our hunter-gatherer ancestors.
“Over the vast majority of human prehistory, our ancestors engaged in far more activity over longer distances than we do today,” said Brian Richmond, an anthropologist from the American Museum of Natural History in New York, in a statement. “We cannot fully understand human health today without knowing how our bodies evolved to work in the past, so it is important to understand how our skeletons evolved within the context of those high levels of activity.”
For thousands of years, Native American hunter-gatherers trekked on strenuous ventures for food. And for those same thousands of years, dense skeletons supported their movements. But about 6,000 years ago, with the advent of agriculture, the bones and joints of Native Americans became less rigid and more fragile. Similar transitions occurred across the world as populations shifted from foraging to farming, according to two new papers published Monday in the Proceedings of the National Academy of Sciences. [Continue reading…]
John Tierney writes: Just be yourself.
The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.
But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?
It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.
He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.
Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gatherer clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]
Scientific American reports: Songbirds stutter, babble when young, become mute if parts of their brains are damaged, learn how to sing from their elders and can even be “bilingual” — in other words, songbirds’ vocalizations share a lot of traits with human speech. However, that similarity goes beyond behavior, researchers have found. Even though humans and birds are separated by millions of years of evolution, the genes that give us our ability to learn speech have much in common with those that lend birds their warble.
A four-year-long effort involving more than 100 researchers around the world put the power of nine supercomputers into analyzing the genomes of 48 species of birds. The results, published this week in a package of eight articles in Science and 20 papers in other journals, provide the most complete picture of the bird family tree thus far. The project has also uncovered genetic signatures in song-learning bird brains that have surprising similarities to the genetics of speech in humans, a finding that could help scientists study human speech.
The analysis suggests that most modern birds arose in an impressive speciation event, a “big bang” of avian diversification, in the 10 million years immediately following the extinction of dinosaurs. This period is more recent than posited in previous genetic analyses, but it lines up with the fossil record. By delving deeper into the rich data set, research groups identified when birds lost their teeth, investigated the relatively slow evolution of crocodiles and outlined the similarities between birds’ and humans’ vocal learning ability, among other findings. [Continue reading…]
The discovery of what appear to have been deliberately etched markings made by a human ancestor, Homo erectus, on the surface of a shell, calls for a reconsideration of assumptions that have been made about the origins of abstract thought.
While the meaning of these zigzag markings will most likely remain forever unknown, it can reasonably be inferred that for the individual who created them, the marks had some significance.
In a report in Nature, Josephine Joordens, a biologist at Leiden University whose team discovered the markings, says:
“We’ve looked at all possibilities, but in the end we are really certain that this must have been made by an agent who did a very deliberate action with a very sharp implement,” says Joordens. Her team tried replicating the pattern on fresh and fossilized shells, “and that made us realize how difficult it really was”, she says.
Saying much more about the engraving is tricky. “If you don’t know the intention of the person who made it, it’s impossible to call it art,” says Joordens.
“But on the other hand, it is an ancient drawing. It is a way of expressing yourself. What was meant by the person who did this, we simply don’t know,” she adds. “It could have been to impress his girlfriend, or to doodle a bit, or to mark the shell as his own property.”
Clive Finlayson, a zoologist at the Gibraltar Museum who was part of the team that described cross-hatch patterns linked to Neanderthals, is also agnostic about whether to call the H. erectus doodles art. What is more important, he says, is the growing realization that abilities such as abstract thinking, once ascribed only to H. sapiens, were present in other archaic humans, including, now, their ancestors.
“I’ve been suggesting increasingly strongly that a lot of these things that are meant to be modern human we’re finding in other hominids,” he says. “We really need to revisit these concepts and take stock.”
Palaeoanthropology, by necessity, is a highly speculative discipline — therein lie both its strength and its weakness.
The conservatism of hard science recoils at the idea that some scratches on a single shell amount to sufficient evidence to prompt a reconsideration of the origins of the human mind, and yet to refrain from such speculation seems like an effort to restrain the powers of the very thing we are trying to understand.
Rationally, there is as much reason to assume that abstract thinking long predates modern humans (in which case searching for evidence of it and finding none would leave us agnostic about its presence or absence) as there is reason to assume that at some juncture it was born.
My inclination is to believe that any living creature that has some capacity to construct a neurological representation of its surroundings is, by that very capacity, employing something akin to abstract thinking.
This ability for the inner to mirror the outer has no doubt evolved, becoming progressively more complex and more deeply abstract, and yet mind, if defined as world-mirroring, seems to have been born when life first moved.
Live Science: Human ancestors may have begun evolving the knack for consuming alcohol about 10 million years ago, long before modern humans began brewing booze, researchers say.
The ability to break down alcohol likely helped human ancestors make the most of rotting, fermented fruit that fell onto the forest floor, the researchers said. Knowing when this ability developed could therefore help pinpoint when these ancestors began shifting to life on the ground, rather than living mostly in trees as earlier ancestors had.
“A lot of aspects about the modern human condition — everything from back pain to ingesting too much salt, sugar and fat — goes back to our evolutionary history,” said lead study author Matthew Carrigan, a paleogeneticist at Santa Fe College in Gainesville, Florida. “We wanted to understand more about the modern human condition with regards to ethanol,” he said, referring to the kind of alcohol found in rotting fruit and that’s also used in liquor and fuel. [Continue reading…]
EurekAlert!: Just as physical adaptations help populations prosper in inhospitable habitats, belief in moralizing, high gods might be similarly advantageous for human cultures in poorer environments. A new study from the National Evolutionary Synthesis Center (NESCent) suggests that societies with less access to food and water are more likely to believe in these types of deities.
“When life is tough or when it’s uncertain, people believe in big gods,” says Russell Gray, a professor at the University of Auckland and a founding director of the Max Planck Institute for the Science of Human History in Jena, Germany. “Prosocial behavior maybe helps people do well in harsh or unpredictable environments.”
Gray and his coauthors found a strong correlation between belief in high gods who enforce a moral code and other societal characteristics. Political complexity, namely a social hierarchy beyond the local community, and the practice of animal husbandry were both strongly associated with a belief in moralizing gods.
The emergence of religion has long been explained as a result of either cultural or environmental factors, but not both. The new findings imply that complex practices and characteristics thought to be exclusive to humans arise from a medley of ecological, historical, and cultural variables.
“When researchers discuss the forces that shaped human history, there is considerable disagreement as to whether our behavior is primarily determined by culture or by the environment,” says primary author Carlos Botero, a researcher at the Initiative for Biological Complexity at North Carolina State University. “We wanted to throw away all preconceived notions regarding these processes and look at all the potential drivers together to see how different aspects of the human experience may have contributed to the behavioral patterns we see today.” [Continue reading…]