Category Archives: Culture

The long history of a short form

Ryan Ruby writes: For a word that literally means definition, the aphorism is a rather indefinite genre. It bears a family resemblance to the fragment, the proverb, the maxim, the hypomnema, the epigram, the mantra, the parable, and the prose poem. Coined sometime between the fifth and third centuries BC as the title for one of the books of the Corpus Hippocraticum, the Aphorismi were originally a compendium of the latest medical knowledge. The penultimate aphorism, “In chronic disease an excessive flux from the bowels is bad,” is more representative of the collection’s contents than the first — “Life is short, art is long” — for which it is best known.

But in those six words lies a clue to the particular space aphorisms were supposed to define. Thanks to a semantic slippage between the Greek word techne and its English translation (via the Latin ars), the saying is often taken to mean that the works of human beings outlast their days. But in its original context, Hippocrates or his editors probably intended something more pragmatic: the craft of medicine takes a long time to learn, and physicians have a short time in which to learn it. Although what aphorisms have in common with the forms listed above is their brevity, what is delimited by the aphorism is not the number of words in which ideas are expressed but the scope of their inquiry. Unlike Hebrew proverbs, in which the beginning of wisdom is the fear of God, the classical aphorism is a secular genre concerned with the short span of time we are allotted on earth. Books of aphorisms are also therapeutic in nature, collections of practical wisdom through which we can rid ourselves of unnecessary suffering and achieve what Hippocrates’ contemporary Socrates called eudaimonia, the good life.

This is certainly what the Stoic philosopher Arrian had in mind when he whittled down the discourses of his master, Epictetus, into a handbook of aphorisms. The Enchiridion is composed of that mixture of propositional assertion and assertive imperative that is now a hallmark of the form. In it, Epictetus, a former slave, outlines the Stoic view that, while “some things are in our control,” most things are ruled by fate. The way to the good life is to bring what is up to us — our attitudes, judgments, and desires — into harmony with what is not up to us: what happens to our bodies, possessions, and reputations. If we accept that what does happen must happen, we will never be disappointed by vain hopes or sudden misfortunes. Our dispositions, not our destinies, are the real source of our unhappiness. [Continue reading…]


Animals are us

Tania Lombrozo writes: Researchers have studied how people think about humans in relation to the natural world, and how the way we reason about humans and other animals changes over the course of development and as a function of education and culture.

The findings from this body of work suggest that by age 5, Western children growing up in urban environments are anomalous in the extent to which they regard humans as central to the biological world. Much of the rest of the world — including 3-year-olds, 5-year-olds in rural environments and adults from indigenous populations in South America — is more inclined to think about humans as one animal species among others, at least when it comes to reasoning about the properties that human and non-human animals are likely to possess.

To illustrate, consider a study by Patricia Herrmann, Sandra Waxman and Douglas Medin published in the Proceedings of the National Academy of Sciences in 2010. In one experiment, 64 urban children, aged 3 or 5, were asked a series of questions that assessed their willingness to generalize an unknown property from one object to another. For instance, they might be told that people “have andro inside,” and would then have to guess whether it’s right or wrong to say that dogs “have andro inside.”

The findings with 5-year-olds replicated classic work in developmental psychology and suggested a strong “anthropocentric” bias: The children were more likely to generalize from humans to non-humans than the other way around, consistent with a privileged place for humans in the biological world. The 3-year-olds, by contrast, showed no signs of this bias: They generalized from humans to non-humans and from non-humans to humans in just the same way. These findings suggest that an anthropocentric perspective isn’t a necessary starting point for human reasoning about the biological world, but rather a perspective we acquire through experience.

So what happens between the ages of 3 and 5 to induce an anthropocentric bias?

Perhaps surprisingly, one influence seems to be anthropomorphism in storybooks. [Continue reading…]


‘Children today are less free than they have ever been’

Jenny Anderson writes: “Something in modern life is undermining mental health,” Jean Twenge, a professor of psychology at San Diego State University, wrote in a recent paper.

Specifically, something is undermining young people’s mental health, especially girls.

In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely to see a professional for mental health issues as those in the 1980s; more teens struggled to remember things in 2010-2012 compared to the earlier period; and 73% more reported trouble sleeping compared to their peers in the 1980s. These so-called “somatic” or “of-the-body” symptoms strongly predict depression.

“It indicates a lot of suffering,” Twenge told Quartz.

It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. While the number of kids who reported feeling depressed spiked in the 1980s and 1990s, it started to fall after 2008. It has started rising again.

Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and those aged 6-18 are seeking more mental health services, and more medication.

The trend is not a uniquely American phenomenon: In the UK, the number of teenagers (15-16) with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (those in Poland and Macedonia were the only ones who were more unhappy).

“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.

Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.

Twenge has observed a notable shift away from internal, or intrinsic goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.

Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.

“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.

“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]


The time has come for a ‘Sexual Spring’ in the Arab world

Kacem El Ghazzali writes: When we say that nowadays to call for sexual freedom in Arab and Muslim societies is more dangerous than the demand to topple monarchies or dictatorial regimes, we are not playing with metaphor or attempting to gain sympathy. We are stating a bitter and painful fact of the reality in which we are living.

In Arab and Muslim milieus, sex is considered a means and not an end, hedged by many prickly restrictions that make it an objectionable matter and synonymous with sin. Its function within marriage is confined to procreation and nothing else, and all sexual activity outside the institution of marriage is banned legally and rejected socially. Innocent children born out of wedlock are socially rejected and considered foundlings.

This situation cannot be said to be characteristic of Arab societies only, but we experience these miseries in far darker and more intense ways than in other countries. This is especially so because of the dominance of machismo, which considers a man’s sexual adventures as heroics worthy of pride, while a woman who dares to give in to her sexual desires is destined to be killed — or at best beaten and expelled from home — because she has brought dishonor upon her family. [Continue reading…]


It’s time to reinstate the forgotten ideal of the commons

Antonia Malchik writes: The ranch my mother was born on was not built solely by her family’s labour. It relied on water aquifers deep beneath the surface, the health of soil on plains and hills beyond their borders, on hundreds – perhaps thousands – of years of care by the Blackfoot tribe whose land it should have remained, the weather over which they had no control, the sun, seeds, and a community who knew in their bones that nobody could do this alone. These things comprised an ecosystem that was vital to their survival, and the same holds true today. These are our shared natural resources, or what was once known as ‘the commons’.

We live on and in the commons, even if we don’t recognise it as such. Every time we take a breath, we’re drawing from the commons. Every time we walk down a road we’re using the commons. Every time we sit in the sunshine or shelter from the rain, listen to birdsong or shut our windows against the stench from a nearby oil refinery, we are engaging with the commons. But we have forgotten the critical role that the commons play in our existence. The commons make life possible. Beyond that, they make private property possible. When the commons become degraded or destroyed, enjoyment and use of private property become untenable. A Montana rancher could own ten thousand acres and still be dependent on the health of the commons. Neither a gated community nor high-rise penthouse apartments can close a human being off from the wider world that we all rely on. [Continue reading…]


Islam is reshaping Europe

Robert Kaplan writes: Orientalism, through which one culture appropriated and dominated another, is slowly evaporating in a world of cosmopolitan interactions and comparative studies, as [Edward] Said intuited it might. Europe has responded by artificially reconstructing national-cultural identities on the extreme right and left, to counter the threat from the civilization it once dominated.

Although the idea of an end to history — with all its ethnic and territorial disputes — turns out to have been a fantasy, this realization is no excuse for a retreat into nationalism. The cultural purity that Europe craves in the face of the Muslim-refugee influx is simply impossible in a world of increasing human interactions.

“The West,” if it does have a meaning beyond geography, manifests a spirit of ever more inclusive liberalism. Just as in the 19th century there was no going back to feudalism, there is no going back now to nationalism, not without courting disaster. [Continue reading…]


Technology is not ruining our kids. Parents (and their technology) are ruining them

Jenny Anderson writes: Many of us worry what technology is doing to our kids. A cascade of reports show that their addiction to iAnything is diminishing empathy, increasing bullying, and robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.

Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.

A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”

Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.

Unfortunately, it seems we parents are the solution. (Newsflash, kids aren’t going to give up their devices because they are worried about how it may influence their future ability to empathize.)

That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]


Culture without borders: The history of culture is the history of cultural appropriation

Kenan Malik writes: Cultural appropriation is, in the words of Susan Scafidi, professor of law at Fordham University, and author of Who Owns Culture? Appropriation and Authenticity in American Law, “Taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission”. This can include the “unauthorised use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”

But what is it for knowledge or an object to “belong” to a culture? And who gives permission for someone from another culture to use such knowledge or forms?

The idea that the world could be divided into distinct cultures, and that every culture belonged to a particular people, has its roots in late 18th-century Europe.

The Romantic movement, which developed in part in opposition to the rationalism of the Enlightenment, celebrated cultural differences and insisted on the importance of “authentic” ways of being.

For Johann Gottfried Herder, the German philosopher who best articulated the Romantic notion of culture, what made each people – or “volk” – unique was its particular language, history and modes of living. The unique nature of each volk was expressed through its “volksgeist” – the unchanging spirit of a people refined through history.

Herder was no reactionary – he was an important champion of equality – but his ideas about culture were adopted by reactionary thinkers. Those ideas became central to racial thinking – the notion of the volksgeist was transformed into the concept of racial make-up – and fuelled the belief that non-Western societies were “backward” because of their “backward” cultures.

Radicals challenging racism and colonialism rejected the Romantic view of culture, adopting instead a universalist perspective. From the struggle against slavery to the anti-colonial movements, the aim was not to protect one’s own special culture but to create a more universal culture in which all could participate on equal terms.

In recent decades, however, the universalist viewpoint has eroded, largely as many of the social movements that embodied that viewpoint have disintegrated. The social space vacated by that disintegration became filled by identity politics.

As the broader struggles for social transformation have faded, people have tended to retreat into their particular faiths or cultures, and to embrace more parochial forms of identity. In this process, the old cultural arguments of the racists have returned, but now rebranded as “antiracist”.

But how does creating gated cultures, and preventing others from trespassing upon one’s culture without permission, challenge racism or promote social justice? [Continue reading…]


The strange history of secularism twists debate about British Muslim attitudes

By Humeira Iqtidar, King’s College London

Governments in Britain have tended to treat Muslim citizens much like colonial administrations treated their subjects. Intermediaries – tribal leaders or religious figures – are found to establish communication between the empire and its people. One positive thing about a recent ICM poll of British Muslims is that it offers an alternative. The survey, carried out for a Channel 4 documentary, was never going to be able to reflect the complexity of British Muslim life accurately, but it does signal a shift by engaging directly with Muslim citizens.

How poll data is used is one way to test how colonialism’s legacy might linger on. The Daily Mail chose for its headline the quote: “Muslims are not like us and we should just accept that they will not integrate …” while Sky News highlighted that: “Half of British Muslims want homosexuality banned.”

Few media outlets rushed to use the headline that “86% of Muslims feel strong affiliation with UK, higher than the national average”, although this too is one of the findings from the survey. It is an “us and them” framework that fails to spark debate about who “we” might be and why “they”, with all their differences, might need greater integration with us, as the report has suggested.

We don’t have space here to discuss how the category Muslim may be broken up across class, regional or ethnic background. Nor will we get into comparisons with others: whether, for instance, British Catholics, or for that matter, members of the Conservative Party, might have similar sentiments towards homosexuality.

Continue reading


What I learned from tickling apes

Frans de Waal writes: Tickling a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself.

The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter.

Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal.

The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.” Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners.

Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. [Continue reading…]


How Cervantes and Shakespeare wrote the modern literary rule book

Salman Rushdie writes: As we honour the four hundredth anniversaries of the deaths of William Shakespeare and Miguel de Cervantes Saavedra, it may be worth noting that while it’s generally accepted that the two giants died on the same date, 23 April 1616, it actually wasn’t the same day. By 1616 Spain had moved on to using the Gregorian calendar, while England still used the Julian, and was 11 days behind. (England clung to the old ­Julian dating system until 1752, and when the change finally came, there were riots and, it’s said, mobs in the streets shouting, “Give us back our 11 days!”) Both the coincidence of the dates and the difference in the calendars would, one suspects, have delighted the playful, erudite sensibilities of the two fathers of modern literature.

We don’t know if they were aware of each other, but they had a good deal in common, beginning right there in the “don’t know” zone, because they are both men of mystery; there are missing years in the record and, even more tellingly, missing documents. Neither man left behind much personal material. Very little to nothing in the way of letters, work diaries, abandoned drafts; just the colossal, completed oeuvres. “The rest is silence.” Consequently, both men have been prey to the kind of idiot theories that seek to dispute their authorship.

A cursory internet search “reveals”, for example, that not only did Francis Bacon write Shakespeare’s works, he wrote Don Quixote as well. (My favourite crazy Shakespeare theory is that his plays were not written by him but by someone else of the same name.) And of course Cervantes faced a challenge to his authorship in his own lifetime, when a certain pseudonymous Alonso Fernández de Avellaneda, whose identity is also uncertain, published his fake sequel to Don Quixote and goaded Cervantes into writing the real Book II, whose characters are aware of the plagiarist Avellaneda and hold him in much contempt. [Continue reading…]


The craving for public squares

Michael Kimmelman writes: Squares have defined urban living since the dawn of democracy, from which they are inseparable. The public square has always been synonymous with a society that acknowledges public life and a life in public, which is to say a society distinguishing the individual from the state. There were, strictly speaking, no public squares in ancient Egypt or India or Mesopotamia. There were courts outside temples and royal houses, and some wide processional streets.

By the sixth century BC, the agora in Athens was a civic center, and with the rise of democracy, became a center for democracy’s institutions, the heart of public life. In ancient Greek, the word “agora” is hard to translate. In Homer it could imply a “gathering” or “assembly”; by the time of Thucydides it had come to connote the public center of a city, the place around which the rest of the city was arranged, where business and politics were conducted in public — the place without which Greeks did not really regard a town or city as a town or city at all. Rather, such a place was, as the second-century writer Pausanias roughly put it, just a sorry assortment of houses and ancient shrines.

The agora announced the town as a polis. Agoras grew in significance during the Classical and Hellenistic years, physical expressions of civic order and life, with their temples and fishmongers and bankers at money-changing tables and merchants selling oil and wine and pottery. Stoas, or colonnades, surrounded the typical agora, and sometimes trees provided shade. People who didn’t like cities, and disliked democracy in its messiness, complained that agoras mixed religious and sacrilegious life, commerce, politics, and theater. But of course that was also their attraction and significance. The agora symbolized civil justice; it was organic, changeable, urbane. Even as government moved indoors and the agora evolved over time into the Roman forum, a grander, more formal place, the notion of the public square as the soul of urban life remained, for thousands of years, critical to the self-identity of the state.

I don’t think it’s coincidental that early in 2011 the Egyptian revolution centered around Tahrir Square, or that the Occupy Movement later that same year, partly inspired by the Arab Spring, expressed itself by taking over squares like Taksim in Istanbul, the Plaça de Catalunya in Barcelona, and Zuccotti Park in Lower Manhattan. And I don’t think it’s coincidental that the strangers who came together at places like Zuccotti and Taksim all formed pop-up towns on these sites, producing in miniature form (at least temporarily) what they imagined to be the outlines of a city, with distinct spaces designated for legal services, libraries, medical stations, media centers, kitchens serving free food, and general stores handing out free clothing. [Continue reading…]


How Shakespeare lives now

Stephen Greenblatt writes: A few years ago, during a merciful remission in the bloodshed and mayhem that has for so many years afflicted Afghanistan, a young Afghan poet, Qais Akbar Omar, had an idea. It was, he brooded, not only lives and livelihood that had been ruthlessly attacked by the Taliban, it was also culture. The international emblem of that cultural assault was the dynamiting of the Bamiyan Buddhas, but the damage extended to painting, music, dance, fiction, film, and poetry. It extended as well to the subtle web of relations that link one culture to another across boundaries and make us, each in our provincial worlds, feel that we are part of a larger humanity. This web is not only a contemporary phenomenon, the result of modern technology; it is as old as culture itself, and it has been particularly dense and vital in Afghanistan with its ancient trade routes and its endless succession of would-be conquerors.

Omar thought that the time was ripe to mark the restoration of civil society and repair some of the cultural damage. He wanted to stage a play with both men and women actors performing in public in an old garden in Kabul. He chose a Shakespeare play. No doubt the choice had something to do with the old imperial presence of the British in Afghanistan, but it was not only this particular history that was at work. Shakespeare is the embodiment worldwide of a creative achievement that does not remain within narrow boundaries of the nation-state or lend itself to the secure possession of a particular faction or speak only for this or that chosen group. He is the antithesis of intolerant provinciality and fanaticism. He could make with effortless grace the leap from Stratford to Kabul, from English to Dari.

Omar did not wish to put on a tragedy; his country, he thought, had suffered through quite enough tragedy of its own. Considering possible comedies, he shied away from those that involved cross-dressing. It was risky enough simply to have men and women perform together on stage. In the end he chose Love’s Labour’s Lost, a comedy that arranged the sexes in distinct male and female groups, had relatively few openly transgressive or explicitly erotic moments, and decorously deferred the final consummation of desire into an unstaged future. As a poet, Omar was charmed by the play’s gorgeous language, language that he felt could be rendered successfully in Dari.

The complex story of the mounting of the play is told in semifictionalized form in a 2015 book Omar coauthored with Stephen Landrigan, A Night in the Emperor’s Garden. Measured by the excitement it generated, this production of Love’s Labour’s Lost was a great success. The overflow crowds on the opening night gave way to ever-larger crowds clamoring to get in, along with worldwide press coverage.

But the attention came at a high price. The Taliban took note of Shakespeare in Kabul and what it signified. In the wake of the production, virtually everyone involved in it began to receive menacing messages. Spouses, children, and the extended families of the actors were not exempt from harassment and warnings. The threats were not idle. The husband of one of the performers answered a loud knock on the door one night and did not return. His mutilated body was found the next morning.

What had seemed like a vigorous cultural renaissance in Afghanistan quickly faded and died. In the wake of the resurgence of the Taliban, Qais Akbar Omar and all the others who had had the temerity to mount Shakespeare’s delicious comedy of love were in terrible trouble. They are now, every one of them, in exile in different parts of the world.

Love’s labors lost indeed. But the subtitle of Omar’s account—“A True Story of Hope and Resilience in Afghanistan”—is not or at least not only ironic. The humane, inexhaustible imaginative enterprise that Shakespeare launched more than four hundred years ago in one small corner of the world is more powerful than all the oppressive forces that can be gathered against it. [Continue reading…]


Technology, the faux equalizer

Adrienne LaFrance writes: Just over a century ago, an electric company in Minnesota took out a full-page newspaper advertisement and listed 1,000 uses for electricity.

Bakers could get ice-cream freezers and waffle irons! Hat makers could put up electric signs! Paper-box manufacturers could use glue pots and fans! Then there were the at-home uses: decorative lights, corn poppers, curling irons, foot warmers, massage machines, carpet sweepers, sewing machines, and milk warmers all made the list. “Make electricity cut your housework in two,” the advertisement said.

This has long been the promise of new technology: that it will make your work easier, which will make your life better. The idea is that the arc of technology bends toward social progress. This is practically the mantra of Silicon Valley, so it’s not surprising that Google’s CEO, Sundar Pichai, seems similarly teleological in his views. [Continue reading…]


Why science and religion aren’t as opposed as you might think

By Stephen Jones, Newman University and Carola Leicht, University of Kent

The debate about science and religion is usually viewed as a competition between worldviews. Differing opinions on whether the two subjects can comfortably co-exist – even among scientists – are pitted against each other in a battle for supremacy.

For some, like the late paleontologist Stephen Jay Gould, science and religion represent two separate areas of enquiry, asking and answering different questions without overlap. Others, such as the biologist Richard Dawkins – and perhaps the majority of the public – see the two as fundamentally opposed belief systems.

But another way to look at the subject is to consider why people believe what they do. When we do this, we discover that the supposed conflict between science and religion is nowhere near as clear cut as some might assume.

Continue reading


What we could learn from bonobos

Cari Romm writes: In a lot of ways, we have more in common with chimpanzees than we do with bonobos. Both species of ape are considered humans’ genetically closest living relatives, but chimpanzees live in patriarchal societies, start wars with their neighbors, and, as a paper published today in Proceedings of the National Academy of Sciences put it, “do not take kindly to strangers.”

By contrast, bonobos, which form female-dominated societies, have no problem welcoming outsiders into the fold: They mate, share food, and readily form bonds with strangers. They’re also great at defusing conflicts before they escalate — when bonobos stumble upon a new feeding ground, for example, they tend to celebrate with group sex before eating, a habit researchers believe is meant to relieve tension that could otherwise translate into competition for food.

We do share some things with the warmer, fuzzier contingent of our ape family tree: In 2013, for example, researchers from Emory University found strong similarities between the emotional development of young bonobos and that of human children. But in the recent PNAS paper, a team of researchers from the Netherlands found one more difference: Where humans are primed to pay more attention to threats, bonobos are more captivated by examples of cooperation. [Continue reading…]


The secret of our evolutionary success is faith


Brian Gallagher writes: The staunch atheist and essayist Christopher Hitchens once said that “the most overrated of the virtues is faith.” It’s a reasonable conclusion if you believe, as the astrophysicist Carl Sagan did, that “extraordinary claims require extraordinary evidence.” To believe something without evidence — or have faith — is, in their view, something to avoid (and, when called for, to mock).

Yet it was arguably faith — rather than reason — that was instrumental to our ancestors’ survival. That’s just one of the many intriguing and paradoxical claims that Joseph Henrich, an evolutionary anthropologist at Harvard University, defends in his new book, The Secret of Our Success: How Culture Is Driving Human Evolution, Domesticating Our Species, and Making Us Smarter. His central thesis, reiterated confidently, is that natural selection — the mechanism of biological evolution — is not the “only process capable of creating complex adaptations.” Cultural evolution, he says, is quite capable of generating “complex adaptive products” essential to our survival, which no one designed or understood “before they emerged.”

Consider, for example, the art of hunting, a complex adaptive product that Henrich unpacks in a section titled “Divination and Game Theory.” To decide where to go looking for caribou, the hunters of the Naskapi tribe, in Labrador, Canada, would not do something most would consider common sense: Go to the spot where you last killed some. That tactic would be ineffective because the caribou know to avoid places where their comrades were last slain. Of course, the Naskapi don’t realize this; the reason they don’t go to the spot of their last kill is that they rely on the result of a ritual to point the way instead. [Continue reading…]


Mysterious chimpanzee behaviour may be evidence of ‘sacred’ rituals

By Laura Kehoe, Humboldt University of Berlin

I trampled clumsily through the dense undergrowth, attempting in vain to go a full five minutes without getting snarled in the thorns that threatened my every move. It was my first field mission in the savannahs of the Republic of Guinea. The aim was to record and understand a group of wild chimpanzees who had never been studied before. These chimps are not lucky enough to enjoy the comforts of a protected area, but instead carve out their existence in the patches of forests between farms and villages.

We paused at a clearing in the bush. I let out a sigh of relief that no thorns appeared to be within reach, but why had we stopped? I made my way to the front of the group to ask the chief of the village and our legendary guide, Mamadou Alioh Bah. He told me he had found something interesting – some innocuous markings on a tree trunk. Something that most of us wouldn’t even have noticed in the complex and messy environment of a savannah had stopped him in his tracks. Some in our group of six suggested that wild pigs had made these marks while scratching up against the tree trunk; others suggested it was teenagers messing around.

But Alioh had a hunch – and when a man who can find a single fallen chimp hair on the forest floor, and who can spot chimps kilometres away with his naked eye better than you can with expensive binoculars, has a hunch, you listen to that hunch. We set up a camera trap in the hope that whatever made these marks would come back and do it again, but this time we would catch it all on film.

Continue reading
