Roc Morin writes: One of the first words that Koko used to describe herself was Queen. The gorilla was only a few years old when she first made the gesture — sweeping a paw diagonally across her chest as if tracing a royal sash.
“It was a sign we almost never used!” Koko’s head caretaker Francine Patterson laughed. “Koko understands that she’s special because of all the attention she’s had from professors, and caregivers, and the media.”
The cause of the primate’s celebrity is her extraordinary aptitude for language. Over the past 43 years, since Patterson began teaching Koko at the age of 1, the gorilla has learned more than 1,000 words of modified American Sign Language—a vocabulary comparable to that of a 3-year-old human child. While there have been many attempts to teach human languages to animals, none have been more successful than Patterson’s achievement with Koko.
If Koko is a queen, then her kingdom is a sprawling research facility in the mountains outside Santa Cruz, California. It was there, under a canopy of stately redwoods, that I met research assistant Lisa Holliday.
“You came on a good day,” Holliday smiled. “Koko’s in a good mood. She was playing the spoon game all morning! That’s when she takes the spoon and runs off with it so you can’t give her another bite. She’s an active girl. She’s always got her dolls, and in the afternoon, her kittens — or as we call them, her kids.”
It was a winding stroll up a sun-spangled trail toward the cabin where Patterson was busy preparing a lunch of diced apples and nuts for Koko. The gorilla’s two kitten playmates romped in a crate by her feet. We would go deliver the meal together shortly, but first I had some questions for the 68-year-old researcher. I wanted to understand more about her famous charge and the rest of our closest living relatives. [Continue reading…]
Claire Cameron writes: English speakers, among others, are highly egocentric when it comes to orienting themselves in the world. Objects and people exist to the left, right, front, and back of you. You move forward and backward in relation to the direction you are facing. For an Aboriginal group in north Queensland, Australia, called the Guugu Yimithirr, such a “me me me” approach to spatial information makes no sense. Instead, they use cardinal directions to express spatial information. So rather than “Can you move to my left?” they would say “Can you move to the west?”
Linguist Guy Deutscher says that Guugu Yimithirr speakers have a kind of “internal compass” that is imprinted from an extremely young age. In the same way that English-speaking infants learn to use different tenses when they speak, so do Guugu Yimithirr children learn to orient themselves along compass lines rather than relative to themselves. In fact, says Deutscher, if a Guugu Yimithirr speaker wants to direct your attention to the direction behind him, he “points through himself, as if he were thin air and his own existence were irrelevant.” Whether that translates into less egocentric worldviews is a matter for further study and debate.
Other studies have shown that speakers of languages that use cardinal directions to express locations have fantastic spatial memory and navigation skills — perhaps because their experience of an event is so well-defined by the directions it took place in. [Continue reading…]
Discover a society with no absolutes, populated by the ultimate empiricists — people happy without God
Daniel Everett summarizes the lesson for linguistics from his research of the Pirahã people and their language:
The lesson is that language is not something mysterious that lies outside the bounds of natural selection, or that just popped into being through some mutated gene. Rather, language is a human invention to solve a human problem. Other creatures can’t use it for the same reason they can’t use a shovel: it was invented by humans, for humans, and its success is judged by humans.
Francie Diep writes: Once upon a time, 4,000 to 8,000 years after humanity invented agriculture, something very strange happened to human reproduction. Across the globe, for every 17 women who reproduced, passing on genes that are still around today, only one man did the same.
“It wasn’t like there was a mass death of males. They were there, so what were they doing?” asks Melissa Wilson Sayres, a computational biologist at Arizona State University, and a member of a group of scientists who uncovered this moment in prehistory by analyzing modern genes.
Another member of the research team, a biological anthropologist, hypothesizes that somehow only a few men accumulated lots of wealth and power, leaving nothing for others. These men could then pass their wealth on to their sons, perpetuating this pattern of elitist reproductive success. Then, as thousands more years passed, the number of men reproducing, compared with women, rose again. “Maybe more and more people started being successful,” Wilson Sayres says. In more recent history, as a global average, about four or five women reproduced for every one man. [Continue reading…]
Sebastian Junger writes: In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone — a figure that rose to 95 percent among families considered “well-educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to trigger fears that make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.
More broadly, in most human societies, almost nobody sleeps alone. Sleeping in family groups of one sort or another has been the norm throughout human history and is still commonplace in most of the world. Again, Northern European societies are among the few where people sleep alone or with a partner in a private room. When I was with American soldiers at a remote outpost in Afghanistan, we slept in narrow plywood huts where I could reach out and touch three other men from where I slept. They snored, they talked, they got up in the middle of the night to use the piss tubes, but we felt safe because we were in a group. The Taliban attacked the position regularly, and the most determined attacks often came at dawn. Another unit in a nearby valley was almost overrun and took 50 percent casualties in just such an attack. And yet I slept better surrounded by those noisy, snoring men than I ever did camping alone in the woods of New England.
Many soldiers will tell you that one of the hardest things about coming home is learning to sleep without the security of a group of heavily armed men around them. In that sense, being in a war zone with your platoon feels safer than being in an American suburb by yourself. I know a vet who felt so threatened at home that he would get up in the middle of the night to build fighting positions out of the living-room furniture. This is a radically different experience from what warriors in other societies go through, such as the Yanomami, of the Orinoco and Amazon Basins, who go to war with their entire age cohort and return to face, together, whatever the psychological consequences may be. As one anthropologist pointed out to me, trauma is usually a group experience, so trauma recovery should be a group experience as well. But in our society it’s not.
“Our whole approach to mental health has been hijacked by pharmaceutical logic,” I was told by Gary Barker, an anthropologist whose group, Promundo, is dedicated to understanding and preventing violence. “PTSD is a crisis of connection and disruption, not an illness that you carry within you.”
This individualizing of mental health is not just an American problem, or a veteran problem; it affects everybody. A British anthropologist named Bill West told me that the extreme poverty of the 1930s and the collective trauma of the Blitz served to unify an entire generation of English people. “I link the experience of the Blitz to voting in the Labour Party in 1945, and the establishing of the National Health Service and a strong welfare state,” he said. “Those policies were supported well into the 60s by all political parties. That kind of cultural cohesiveness, along with Christianity, was very helpful after the war. It’s an open question whether people’s problems are located in the individual. If enough people in society are sick, you have to wonder whether it isn’t actually society that’s sick.”
Ideally, we would compare hunter-gatherer society to post-industrial society to see which one copes better with PTSD. When the Sioux, Cheyenne, and Arapaho fighters returned to their camps after annihilating Custer and his regiment at Little Bighorn, for example, were they traumatized and alienated by the experience — or did they fit right back into society? There is no way to know for sure, but less direct comparisons can still illuminate how cohesiveness affects trauma. In experiments with lab rats, for example, a subject that is traumatized — but not injured — after an attack by a larger rat usually recovers within 48 hours unless it is kept in isolation, according to data published in 2005 in Neuroscience & Biobehavioral Reviews. The ones that are kept apart from other rats are the only ones that develop long-term traumatic symptoms. And a study of risk factors for PTSD in humans closely mirrored those results. In a 2000 study in the Journal of Consulting and Clinical Psychology, “lack of social support” was found to be around two times more reliable at predicting who got PTSD and who didn’t than the severity of the trauma itself. You could be mildly traumatized, in other words—on a par with, say, an ordinary rear-base deployment to Afghanistan — and experience long-term PTSD simply because of a lack of social support back home.
Anthropologist and psychiatrist Brandon Kohrt found a similar phenomenon in the villages of southern Nepal, where a civil war has been rumbling for years. Kohrt explained to me that there are two kinds of villages there: exclusively Hindu ones, which are extremely stratified, and mixed Buddhist/Hindu ones, which are far more open and cohesive. He said that child soldiers, both male and female, who go back to Hindu villages can remain traumatized for years, while those from mixed-religion villages tended to recover very quickly. “PTSD is a disorder of recovery, and if treatment only focuses on identifying symptoms, it pathologizes and alienates vets,” according to Kohrt. “But if the focus is on family and community, it puts them in a situation of collective healing.” [Continue reading…]
American culture has made a fetish out of the rights and aspirations of the individual — hence at our head we have gathered an ineffectual assembly of dunces in the United States Capitol. But as Yuval Noah Harari points out, our human potential expresses itself much less in what we accomplish individually than in what we do together.
Seventy thousand years ago, humans were insignificant animals. The most important thing to know about prehistoric humans is that they were unimportant. Their impact on the world was very small, less than that of jellyfish, woodpeckers or bumblebees.
Today, however, humans control this planet. How did we get from there to here? What was the secret of our success that turned us from insignificant apes minding their own business in a corner of Africa into the rulers of the world?
We often look for the difference between us and other animals on the individual level. We want to believe that there is something special about the human body or human brain that makes each individual human vastly superior to a dog, or a pig, or a chimpanzee. But the fact is that one-on-one, humans are embarrassingly similar to chimpanzees. If you were to place me and a chimpanzee together on a deserted island to see who survived better, I would definitely place my bets on the chimp.
The real difference between us and other animals is on the collective level. [Continue reading…]
Carl Zimmer writes: For centuries, archaeologists have reconstructed the early history of Europe by digging up ancient settlements and examining the items that their inhabitants left behind. More recently, researchers have been scrutinizing something even more revealing than pots, chariots and swords: DNA.
On Wednesday in the journal Nature, two teams of scientists — one based at the University of Copenhagen and one based at Harvard University — presented the largest studies to date of ancient European DNA, extracted from 170 skeletons found in countries from Spain to Russia. Both studies indicate that today’s Europeans descend from three groups who moved into Europe at different stages of history.
The first were hunter-gatherers who arrived in Europe some 45,000 years ago. Then came farmers who arrived from the Near East about 8,000 years ago.
Finally, a group of nomadic sheepherders from western Russia called the Yamnaya arrived about 4,500 years ago. The authors of the new studies also suggest that the Yamnaya language may have given rise to many of the languages spoken in Europe today. [Continue reading…]
Tom Jacobs writes: Since the discoveries of Darwin, evidence has gradually mounted refuting the notion that the natural world is the product of a deity or other outside designer. Yet this idea remains firmly lodged in the human brain.
Just how firmly is the subject of newly published research, which finds even self-proclaimed atheists instinctively think of natural phenomena as being purposefully created.
The findings “suggest that there is a deeply rooted natural tendency to view nature as designed,” writes a research team led by Elisa Järnefelt of Newman University. They also provide evidence that, in the researchers’ words, “religious non-belief is cognitively effortful.” [Continue reading…]
Heather Pringle writes: In a spacious, art-filled apartment in Brasília, 75-year-old Sydney Possuelo takes a seat near a large portrait of his younger self. On the canvas, Possuelo stares with calm assurance from the stern of an Amazon riverboat, every bit the famous sertanista, or Amazon frontiersman, that he once was. But on this late February morning, that confidence is nowhere to be seen. Possuelo, now sporting a beard neatly trimmed for city life, seethes with anger over the dangers now threatening the Amazon’s isolated tribespeople. “These are the last few groups of humans who are really free,” he says. “But we will kill them.”
For decades, Possuelo worked for Brazil’s National Indian Foundation (FUNAI), the federal agency responsible for the country’s indigenous peoples. In the 1970s and 1980s, he and other sertanistas made contact with isolated tribespeople so they could be moved off their land and into settlements. But Possuelo and others grew alarmed by the human toll. The newly contacted had no immunity to diseases carried by outsiders, and the flu virus, he recalls, “was like a suicide bomber,” stealing into a village unnoticed. Among some groups, 50% to 90% died (see sidebar, p. 1084). In 1987, Possuelo and fellow sertanistas met to try to stop this devastation.
In Brasília, a futuristic city whose central urban footprint evokes the shape of an airplane, the frontiersmen agreed that contact was inherently damaging to isolated tribespeople. They drew up a new action plan for FUNAI, based solidly on the principle of no contact unless groups faced extinction. They recommended mapping and legally recognizing the territories of isolated groups, and keeping out loggers, miners, and settlers. If contact proved unavoidable, protecting tribespeople’s health should be top priority.
The recommendations became FUNAI policy, and a model for other countries where isolated populations are emerging, such as neighboring Peru (see companion story, p. 1072). In remote regions, FUNAI has designated a dozen “protection fronts” — official front lines in the battle to defend isolated groups, each dotted with one or more frontier bases to track tribes and sound the alarm when outsiders invade. In an interview in February, FUNAI’s interim president, Flávio Chiarelli, told Science that his agency is “doing great” at protecting the country’s isolated tribes.
But some experts say that as the pace of economic activity in the Amazon accelerates, the protection system that was once the envy of South America is falling apart. [Continue reading…]
Jeff Wheelwright writes: I sat in my padded desk chair, hunched over, alternately entering notes on my computer and reading a book called The Story of the Human Body. It was the sort of book guaranteed to make me increasingly, uncomfortably aware of my own body. I squirmed to relieve an ache in my lower back. When I glanced out the window, the garden looked fuzzy. Where were my glasses? My toes felt hot and itchy: My athlete’s foot was flaring up again.
I returned to the book. “This chapter focuses on just three behaviors … that you are probably doing right now: wearing shoes, reading, and sitting.” OK, I was. What could be more normal?
According to the author, a human evolutionary biologist at Harvard named Daniel Lieberman, shoes, books and padded chairs are not normal at all. My body had good reason to complain because it wasn’t designed for these accessories. Too much sitting caused back pain. Too much focusing on books and computer screens at a young age fostered myopia. Enclosed, cushioned shoes could lead to foot problems, including bunions, fungus between the toes and plantar fasciitis, an inflammation of the tissue below weakened arches.
Those are small potatoes compared with obesity, Type 2 diabetes, osteoporosis, heart disease and many cancers also on the rise in the developed and developing parts of the world. These serious disorders share several characteristics: They’re chronic, noninfectious, aggravated by aging and strongly influenced by affluence and culture. Modern medicine has come up with treatments for them, but not solutions; the deaths and disabilities continue to climb.
An evolutionary perspective is critical to understanding the body’s pitfalls in a time of plenty, Lieberman suggests. [Continue reading…]
Yuval Noah Harari writes: Over the last decade, I have been writing a history of humankind, tracking down the transformation of our species from an insignificant African ape into the master of the planet. It was not easy to understand what turned Homo sapiens into an ecological serial killer; why men dominated women in most human societies; or why capitalism became the most successful religion ever. It wasn’t easy to address such questions because scholars have offered so many different and conflicting answers. In contrast, when it came to assessing the bottom line – whether thousands of years of inventions and discoveries have made us happier – it was surprising to realise that scholars have neglected even to ask the question. This is the largest lacuna in our understanding of history.
Though few scholars have studied the long-term history of happiness, almost everybody has some idea about it. One common preconception – often termed “the Whig view of history” – sees history as the triumphal march of progress. Each passing millennium witnessed new discoveries: agriculture, the wheel, writing, print, steam engines, antibiotics. Humans generally use newly found powers to alleviate miseries and fulfil aspirations. It follows that the exponential growth in human power must have resulted in an exponential growth in happiness. Modern people are happier than medieval people, and medieval people were happier than stone age people.
But this progressive view is highly controversial. Though few would dispute the fact that human power has been growing since the dawn of history, it is far less clear that power correlates with happiness. The advent of agriculture, for example, increased the collective power of humankind by several orders of magnitude. Yet it did not necessarily improve the lot of the individual. For millions of years, human bodies and minds were adapted to running after gazelles, climbing trees to pick apples, and sniffing here and there in search of mushrooms. Peasant life, in contrast, included long hours of agricultural drudgery: ploughing, weeding, harvesting and carrying water buckets from the river. Such a lifestyle was harmful to human backs, knees and joints, and numbing to the human mind.
In return for all this hard work, peasants usually had a worse diet than hunter-gatherers, and suffered more from malnutrition and starvation. Their crowded settlements became hotbeds for new infectious diseases, most of which originated in domesticated farm animals. Agriculture also opened the way for social stratification, exploitation and possibly patriarchy. From the viewpoint of individual happiness, the “agricultural revolution” was, in the words of the scientist Jared Diamond, “the worst mistake in the history of the human race”.
The case of the agricultural revolution is not a single aberration, however. The march of progress from the first Sumerian city-states to the empires of Assyria and Babylonia was accompanied by a steady deterioration in the social status and economic freedom of women. The European Renaissance, for all its marvellous discoveries and inventions, benefited few people outside the circle of male elites. The spread of European empires fostered the exchange of technologies, ideas and products, yet this was hardly good news for millions of Native Americans, Africans and Aboriginal Australians.
The point need not be elaborated further. Scholars have thrashed the Whig view of history so thoroughly that the only question left is: why do so many people still believe in it? [Continue reading…]
For such a large and culturally diverse place, Europe has surprisingly little genetic variety. Learning how and when the modern gene pool came together has been a long journey. But thanks to new technological advances, a picture is slowly emerging of repeated colonisation by peoples from the east with more efficient lifestyles.
In a new study, we have added a piece to the puzzle: the Y chromosomes of the majority of European men can be traced back to just three individuals living between 3,500 and 7,300 years ago. How their lineages came to dominate Europe makes for interesting speculation. One possibility could be that their DNA rode across Europe on a wave of new culture brought by nomadic people from the Steppe known as the Yamnaya.
Stone Age Europe
The first-known people to enter Europe were the Neanderthals – and though they have left some genetic legacy, it is later waves who account for the majority of modern European ancestry. The first “anatomically modern humans” arrived in the continent around 40,000 years ago. These were the Palaeolithic hunter-gatherers sometimes called the Cro-Magnons. They populated Europe quite sparsely and lived a lifestyle not very different from that of the Neanderthals they replaced.
Then something revolutionary happened in the Middle East – farming, which allowed for enormous population growth. We know that from around 8,000 years ago a wave of farming and population growth exploded into both Europe and South Asia. But what has been much less clear is the mechanism of this spread. How much was due to the children of the farmers moving into new territories, and how much was due to the neighbouring hunter-gatherers adopting this new way of life?
Smithsonian magazine: Approximately 3.3 million years ago someone began chipping away at a rock by the side of a river. Eventually, this chipping formed the rock into a tool used, perhaps, to prepare meat or crack nuts. And this technological feat occurred before humans even showed up on the evolutionary scene.
That’s the conclusion of an analysis published today in Nature of the oldest stone tools yet discovered. Unearthed in a dried-up riverbed in Kenya, the shards of scarred rock, including what appear to be early hammers and cutting instruments, predate the previous record holder by around 700,000 years. Though it’s unclear who made the tools, the find is the latest and most convincing in a string of evidence that toolmaking began before any members of the Homo genus walked the Earth.
“This discovery challenges the idea that the main characters that make us human — making stone tools, eating more meat, maybe using language — all evolved at once in a punctuated way, near the origins of the genus Homo,” says Jason Lewis, a paleoanthropologist at Rutgers University and co-author of the study. [Continue reading…]
Is the Earth now spinning through the “Age of Humans”? More than a few scientists think so. They’ve suggested, in fact, that we rename the current geological epoch (the Holocene, which began roughly 12,000 years ago) the “Anthropocene.” It’s a term first put into wide circulation by the Nobel Prize-winning atmospheric chemist Paul Crutzen in an article published in Nature in 2002. And it’s stirring up a good deal of debate, not only among geologists.
The idea is that we needed a new planetary marker to account for the scale of human changes to the Earth: extensive land transformation, mass extinctions, control of the nitrogen cycle, large-scale water diversion, and especially change of the atmosphere through the emission of greenhouse gases. Although naming geological epochs isn’t usually a controversial act, the Anthropocene proposal is radical because it means that what had been an environmental fixture against which people acted, the geological record, is now just another expression of the human presence.
It seems to be a particularly bitter pill to swallow for nature preservationists, heirs to the American tradition led by writers, scientists and activists such as John Muir, Aldo Leopold, David Brower, Rachel Carson and Edward Abbey. That’s because some have argued that the traditional focus on wilderness protection rests on a view of “pristine” nature that is simply no longer viable on a planet hurtling toward nine billion human inhabitants.
Given this situation, we felt the time was ripe to explore the impact of the Anthropocene on the idea and practice of nature preservation. Our plan was to create a salon, a kind of literary summit. But we wanted to cut to the chase: What does it mean to “save American nature” in the age of humans?
We invited a distinguished group of environmental writers – scientists, philosophers, historians, journalists, agency administrators and activists – to give it their best shot. The essays appear in the new collection, After Preservation: Saving American Nature in the Age of Humans.
Scott Atran recently addressed the UN Security Council’s Ministerial Debate on “The Role of Youth in Countering Violent Extremism and Promoting Peace.” This post is an adaptation of his remarks: I am an anthropologist. Anthropologists, as a group, study the diversity of human cultures to understand our commonalities and differences, and to use the knowledge of what is common to us all to help us bridge our differences. My research aims to help reduce violence between peoples, by first trying to understand thoughts and behaviors as different from my own as any I can imagine: such as suicide actions that kill masses of people innocent of direct harm to others. The key, as Margaret Mead taught me long ago, when I worked as her assistant at the American Museum of Natural History in New York, was to empathize with people, without always sympathizing: to participate in their lives to the extent you feel is morally possible. And then report.
I’ve spent much time observing, interviewing and carrying out systematic studies among people on six continents who are drawn to violent action for a group and its cause. Most recently, I worked with colleagues last month in Kirkuk, Iraq, among young men who had killed for ISIS, and with young adults in the banlieues of Paris and the barrios of Barcelona who seek to join it.
With some insights from social science research, I will try to outline a few conditions that may help steer such youth away from the path of violent extremism.
But first, who are these young people? None of the ISIS fighters we interviewed in Iraq had more than a primary school education; some had wives and young children. When asked “what is Islam?” they answered “my life.” They knew nothing of the Quran or Hadith, or of the early caliphs Omar and Othman, but had learned of Islam from Al Qaeda and ISIS propaganda teaching that Muslims like them were targeted for elimination unless they first eliminated the impure. This isn’t an outlandish proposition in their lived circumstances: they told of growing up after the fall of Saddam Hussein in a hellish world of constant guerrilla war, family deaths and dislocation, and of not even being able to leave their homes or temporary shelters for months on end. [Continue reading…]
Scientific American reports: Opera singers and dry air don’t get along. In fact, the best professional singers require humid settings to help them achieve the right pitch. “When your vocal cords are really dry, they’re a little less elastic,” says Caleb Everett, an anthropological linguist at the University of Miami. As a result, singers experience tiny variations in pitch, called jitter, as well as wavering volume—both of which contribute to rougher refrains.
If the amount of moisture in the air influences musical pitch, Everett wondered, has that translated into fewer tonal languages developing in arid locations? Tonal languages, such as Mandarin Chinese and Cherokee, rely on variations in pitch to differentiate meaning: the same syllable can signify a different word depending on whether it is spoken at a higher or lower pitch, or with a rising or falling tone.
In a survey of more than 3,700 languages, Everett and his collaborators found that those with complex tones do indeed occur less frequently in dry areas than they do in humid ones, even after accounting for the clustering of related languages. For instance, more than half of the hundreds of languages spoken in tropical sub-Saharan locations feature complex tones, whereas none of the two dozen languages in the Sahara do. Overall, only one in 30 complex tonal languages flourished in dry areas; one in three nontonal languages cropped up in those same regions. The results appeared in February in the Proceedings of the National Academy of Sciences USA. [Continue reading…]
Jedediah Purdy writes: As much as a scientific concept, the Anthropocene is a political and ethical gambit. Saying that we live in the Anthropocene is a way of saying that we cannot avoid responsibility for the world we are making. So far so good. The trouble starts when this charismatic, all-encompassing idea of the Anthropocene becomes an all-purpose projection screen and amplifier for one’s preferred version of ‘taking responsibility for the planet’.
Peter Kareiva, the controversial chief scientist of the Nature Conservancy, uses the theme ‘Conservation in the Anthropocene’ to trash environmentalism as philosophically naïve and politically backward. Kareiva urges conservationists to give up on wilderness and embrace what the writer Emma Marris calls the ‘rambunctious garden’. Specifically, Kareiva wants to rank ecosystems by the quality of ‘ecosystem services’ they provide for human beings instead of ‘pursuing the protection of biodiversity for biodiversity’s sake’. He wants a pro‑development stance that assumes that ‘nature is resilient rather than fragile’. He insists that: ‘Instead of scolding capitalism, conservationists should partner with corporations in a science-based effort to integrate the value of nature’s benefits into their operations and cultures.’ In other words, the end of nature is the signal to carry on with green-branded business as usual, and the business of business is business, as the Nature Conservancy’s partnerships with Dow, Monsanto, Coca-Cola, Pepsi, J P Morgan, Goldman Sachs and the mining giant Rio Tinto remind us.
Kareiva is a favourite of Andrew Revkin, the roving environmental maven of The New York Times Magazine, who touts him as a paragon of responsibility-taking, a leader among ‘scholars and doers who see that new models for thinking and acting are required in this time of the Anthropocene’. This pair and their friends at the Breakthrough Institute in California can be read as making a persistent effort to ‘rebrand’ environmentalism as humanitarian and development-friendly (and capture speaking and consultancy fees, which often seem to be the major ecosystem services of the Anthropocene). This is itself a branding strategy, an opportunity to slosh around old plonk in an ostentatiously shiny bottle. [Continue reading…]