Category Archives: Culture

Sino-Tibetan populations shed light on human cooperation

By Ruth Mace, UCL

One of the big questions in anthropology is why humans, unlike most animals, cooperate with those we are not closely related to. Exactly what has driven this behaviour is not well understood. Anthropologists suspect it could be down to the fact that women have usually left their homes after marriage to go and live with their husband’s family. This creates links between distant families, which may explain our tendency to cooperate beyond our own households.

Now our study on the Tibetan borderlands of China, published in Nature Communications, shows that it is indeed the case that cooperation is greater in populations where females disperse for marriage.

A natural experiment in social structure

There are many different theories about the link between dispersal, kinship and cooperation, and these are what we wanted to test. Anthropologists believe that dispersal leads to cooperation through links between families, and some evolutionary models predict that when nobody moves, residents compete for the same resources, producing greater conflict between kin. But there are also models that suggest the opposite is true – that if nobody moves, neighbours are more likely to be related, leading to more cooperation in the neighbourhood.

Continue reading

Humans are natural polymaths, at our best when we turn our minds to many things

Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.

We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?

The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading…]

Technology is implicated in an assault on empathy

Sherry Turkle writes: Studies of conversation both in the laboratory and in natural settings show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other. Even a silent phone disconnects us.

In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies that were conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000.

Across generations, technology is implicated in this assault on empathy. We’ve gotten used to being connected all the time, but we have found ways around conversation — at least from conversation that is open-ended and spontaneous, in which we play with ideas and allow ourselves to be fully present and vulnerable. But it is in this type of conversation — where we learn to make eye contact, to become aware of another person’s posture and tone, to comfort one another and respectfully challenge one another — that empathy and intimacy flourish. In these conversations, we learn who we are.

Of course, we can find empathic conversations today, but the trend line is clear. It’s not only that we turn away from talking face to face to chat online. It’s that we don’t allow these conversations to happen in the first place because we keep our phones in the landscape. [Continue reading…]

Bible Belt atheist

Jason Cohn and Camille Servan-Schreiber: Growing up in Los Angeles and Paris, we both were raised secular and embraced atheism early and easily. It’s not that we didn’t ponder life’s mysteries; it’s just that after we reasoned away our religious questions, we stopped worrying about them and moved on. When we learned about the former pastor Jerry DeWitt’s struggles with being an “outed” atheist in rural Louisiana, we realized for the first time just how difficult being an atheist can be in some communities, where religion is woven deeply into the social fabric. [Continue reading…]

Paleogenetics is helping to solve the great mystery of prehistory: How did humans spread out over the earth?

Jacob Mikanowski writes: Most of human history is prehistory. Of the 200,000 or more years that humans have spent on Earth, only a tiny fraction have been recorded in writing. Even in our own little sliver of geologic time, the 12,000 years of the Holocene, whose warm weather and relatively stable climate incubated the birth of agriculture, cities, states, and most of the other hallmarks of civilisation, writing has been more the exception than the rule.

Professional historians can’t help but pity their colleagues on the prehistoric side of the fence. Historians are accustomed to drawing on vast archives, but archaeologists must assemble and interpret stories from scant material remains. In the annals of prehistory, cultures are designated according to modes of burial such as ‘Single Grave’, or after styles of arrowhead, such as ‘Western Stemmed Point’. Whole peoples are reduced to styles of pottery, such as Pitted Ware, Corded Ware or Funnel Beaker, all of them spread across the map in confusing, amoeba-like blobs.

In recent years, archaeologists have become reluctant to infer too much from assemblages of ceramics, weapons and grave goods. For at least a generation, they have been drilled on the mantra that ‘pots are not people’. Material culture is not a proxy for identity. Artefacts recovered from a dig can provide a wealth of information about a people’s mode of subsistence, funeral rites and trade contacts, but they are not a reliable guide to their language or ethnicity – or their patterns of migration.

Before the Second World War, prehistory was seen as a series of invasions, with proto-Celts and Indo-Aryans swooping down on unsuspecting swaths of Europe and Asia like so many Vikings, while megalith builders wandered between continents in indecisive meanders. After the Second World War, this view was replaced by the processual school, which attributed cultural changes to internal adaptations. Ideas and technologies might travel, but people by and large stayed put. Today, however, migration is making a comeback.

Much of this shift has to do with the introduction of powerful new techniques for studying ancient DNA. The past five years have seen a revolution in the availability and scope of genetic testing that can be performed on prehistoric human and animal remains. Ancient DNA is tricky to work with. Usually it’s degraded, chemically altered and cut into millions of short fragments. But recent advances in sequencing technology have made it possible to sequence whole genomes from samples reaching back thousands, and tens of thousands, of years. Whole-genome sequencing yields orders of magnitude more data than organelle-based testing, and allows geneticists to make detailed comparisons between individuals and populations. Those comparisons are now illuminating new branches of the human family tree. [Continue reading…]

A Flemish family care system

Mike Jay writes: Half an hour on the slow train from Antwerp, surrounded by flat, sparsely populated farmlands, Geel (pronounced, roughly, ‘Hyale’) strikes the visitor as a quiet, tidy but otherwise unremarkable Belgian market town. Yet its story is unique. For more than 700 years its inhabitants have taken the mentally ill and disabled into their homes as guests or ‘boarders’. At times, these guests have numbered in the thousands, and arrived from all over Europe. There are several hundred in residence today, sharing their lives with their host families for years, decades or even a lifetime. One boarder recently celebrated 50 years in the Flemish town, arranging a surprise party at the family home. Friends and neighbours were joined by the mayor and a full brass band.

Among the people of Geel, the term ‘mentally ill’ is never heard: even words such as ‘psychiatric’ and ‘patient’ are carefully hedged with finger-waggling and scare quotes. The family care system, as it’s known, is resolutely non-medical. When boarders meet their new families, they do so, as they always have, without a backstory or clinical diagnosis. If a word is needed to describe them, it’s often a positive one such as ‘special’, or at worst, ‘different’. This might in fact be more accurate than ‘mentally ill’, since the boarders have always included some who would today be diagnosed with learning difficulties or special needs. But the most common collective term is simply ‘boarders’, which defines them at the most pragmatic level by their social, not mental, condition. These are people who, whatever their diagnosis, have come here because they’re unable to cope on their own, and because they have no family or friends who can look after them.

The origins of the Geel story lie in the 13th century, in the martyrdom of Saint Dymphna, a legendary seventh-century Irish princess whose pagan father went mad with grief after the death of his Christian wife and demanded that Dymphna marry him. To escape the king’s incestuous passion, Dymphna fled to Europe and holed up in the marshy flatlands of Flanders. Her father finally tracked her down in Geel, and when she refused him once more, he beheaded her. Over time, she became revered as a saint with powers of intercession for the mentally afflicted, and her shrine attracted pilgrims and tales of miraculous cures. [Continue reading…]

Why futurism has a cultural blindspot

Tom Vanderbilt writes: In early 1999, during halftime of a University of Washington basketball game, a time capsule from 1927 was opened. Among the contents of this portal to the past were some yellowing newspapers, a Mercury dime, a student handbook, and a building permit. The crowd promptly erupted into boos. One student declared the items “dumb.”

Such disappointment in time capsules seems to be endemic, suggests William E. Jarvis in his book Time Capsules: A Cultural History. A headline from The Onion, he notes, sums it up: “Newly unearthed time capsule just full of useless old crap.” Time capsules, after all, exude a kind of pathos: They show us that the future was not quite as advanced as we thought it would be, nor did it come as quickly. The past, meanwhile, turns out not to be as radically distinct as we thought.

In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.

These observations apply neatly to technology. We don’t have the personal flying cars we predicted we would. Coal, notes the historian David Edgerton in his book The Shock of the Old, was a bigger source of power at the dawn of the 21st century than in sooty 1900; steam was more significant in 1900 than 1800.

But when it comes to culture we tend to believe not that the future will be very different than the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?

Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,” people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.” [Continue reading…]

The politics of human evolution

Candida Moss writes: On Thursday morning The New York Times ran a high-profile story about the discovery of a new human ancestor species — Homo naledi — in the Rising Star cave in South Africa. The discovery, announced by Professor Lee Berger, was monumental because the evidence for Homo naledi was discovered in a burial chamber. Concern for burial is usually seen as a distinctive characteristic of humankind, so the possibility that this new non-human hominid species was “deliberately disposing of its dead” was especially exciting.

To anthropologists the article was not only newsworthy but also humorous, for the Times illustrated the piece with a photograph of Australopithecus africanus, a species already well known. This howler of a mistake (at least to self-identified science nerds) was also somewhat understandable, because the differences between the two skulls are sufficiently subtle that a lay viewer can easily mistake one for the other. In fact, some have pointed to that similarity and wondered (while acknowledging the importance of the discovery) whether it is indeed a “new species.” And that gets to the deeper issue: What and who were our ancestors?

It might seem as if the answer to this question is simply a matter of biology, but in his new book Tales of the Ex-Apes: How We Think About Human Evolution, anthropologist Jonathan Marks argues that the story we tell about our origins, the study of our evolutionary tree, has cultural roots. Evolution isn’t just a question of biology, he argues; it’s also a question of mythology. Our scientific facts, he says, are the product of bioculture and biopolitics. [Continue reading…]

Guns, germs, and steal

We have all been raised to believe that civilization is, in large part, sustained by law and order. Without complex social institutions and some form of governance, we would be at the mercy of the law of the jungle — so the argument goes.

But there is a basic flaw in this Hobbesian view of a collective human need to tame the savagery in our nature.

For human beings to be vulnerable to the selfish drives of those around them, they generally need to possess things that are worth stealing. For things to be worth stealing, they must have durable value. People who own nothing have little need to worry about thieves.

While Jared Diamond has argued that civilization arose in regions where agrarian societies could accumulate food surpluses, new research suggests that the value of cereal crops did not derive simply from the fact that they could be stored, but rather from the fact that, having been stored, they could subsequently be stolen or confiscated.

Joram Mayshar, Omer Moav, Zvika Neeman, and Luigi Pascali write: In a recent paper (Mayshar et al. 2015), we contend that fiscal capacity and viable state institutions are conditioned to a major extent by geography. Thus, like Diamond, we argue that geography matters a great deal. But in contrast to Diamond, and against conventional opinion, we contend that it is not high farming productivity and the availability of food surplus that accounts for the economic success of Eurasia.

  • We propose an alternative mechanism by which environmental factors imply the appropriability of crops and thereby the emergence of complex social institutions.

To understand why surplus is neither necessary nor sufficient for the emergence of hierarchy, consider a hypothetical community of farmers who cultivate cassava (a major source of calories in sub-Saharan Africa, and the main crop cultivated in Nigeria), and assume that the annual output is well above subsistence. Cassava is a perennial root that is highly perishable upon harvest. Since this crop rots shortly after harvest, it isn’t stored and it is thus difficult to steal or confiscate. As a result, the assumed available surplus would not facilitate the emergence of a non-food producing elite, and may be expected to lead to a population increase.

Consider now another hypothetical farming community that grows a cereal grain – such as wheat, rice or maize – yet with an annual produce that just meets each family’s subsistence needs, without any surplus. Since the grain has to be harvested within a short period and then stored until the next harvest, a visiting robber or tax collector could readily confiscate part of the stored produce. Such ongoing confiscation may be expected to lead to a downward adjustment in population density, but it will nevertheless facilitate the emergence of a non-producing elite, even though there is no surplus.

This simple scenario shows that surplus isn’t a precondition for taxation. It also illustrates our alternative theory that the transition to agriculture enabled hierarchy to emerge only where the cultivated crops were vulnerable to appropriation.

  • In particular, we contend that the Neolithic emergence of fiscal capacity and hierarchy was conditioned on the cultivation of appropriable cereals as the staple crops, in contrast to less appropriable staples such as roots and tubers.

According to this theory, complex hierarchy did not emerge among hunter-gatherers because hunter-gatherers essentially live from hand-to-mouth, with little that can be expropriated from them to feed a would-be elite. [Continue reading…]

Is there anything wrong with men who cry?

Sandra Newman writes: One of our most firmly entrenched ideas of masculinity is that men don’t cry. Although he might shed a discreet tear at a funeral, and it’s acceptable for him to well up when he slams his fingers in a car door, a real man is expected to quickly regain control. Sobbing openly is strictly for girls.

This isn’t just a social expectation; it’s a scientific fact. All the research to date finds that women cry significantly more than men. A meta-study by the German Society of Ophthalmology in 2009 found that women weep, on average, five times as often, and almost twice as long per episode. The discrepancy is such a commonplace, we tend to assume it’s biologically hard-wired; that, whether you like it or not, this is one gender difference that isn’t going away.

But actually, the gender gap in crying seems to be a recent development. Historical and literary evidence suggests that, in the past, not only did men cry in public, but no one saw it as feminine or shameful. In fact, male weeping was regarded as normal in almost every part of the world for most of recorded history. [Continue reading…]

The death of culture

In a review of Notes On The Death Of Culture, Anne Haverty writes: We may not be living in the worst of times, although a case might very well be made for it, but anyone with a thought in their head would be entitled to say that we’re living in the stupidest. Mario Vargas Llosa, the Nobel Prize-winning novelist, certainly believes we are. In this series of coruscating and passionate essays on the state of culture he argues that we have, en masse, capitulated to idiocy. And it is leading us to melancholy and despair.

This is a book of mourning. What Vargas Llosa writes is a lament for how things used to be and how they are now in all aspects of life from the political to the spiritual. Like TS Eliot in his essay Notes Towards the Definition of Culture, written in 1948, he takes the concept of culture in the general sense as a shared sensibility, a way of life.

Eliot too saw culture decaying around him and foresaw a time in which there would be no culture. This time, Vargas Llosa argues, is ours. Eliot has since been under attack for what his critics often describe as his elitist attitudes – as well as much else – and Vargas Llosa will probably also be tarred with the same brush for his pains.

But we must be grateful to him for describing in a relatively orderly manner the chaos of hypocrisy and emptiness into which our globalised culture has plunged and to which we seem to have little option but to subscribe.

It’s not easy, however, to be orderly on such an all-encompassing and sensitive subject as the way we live now. On some aspects, such as the art business, Vargas Llosa practically foams at the mouth. The art world is “rotten to the core”, a world in which artists cynically contrive “cheap stunts”. Stars like Damien Hirst are purveyors of “con-tricks”, and their “boring, farcical and bleak” productions are aided by “half-witted critics”.

We have abandoned the former minority culture, which was truth-seeking, profound, quiet and subtle, in favour of mainstream or mass entertainment, which has to be accessible – and how brave if foolhardy of anyone these days to cast aspersions on accessibility – as well as sensation-loving and frivolous.

Value-free, this kind of culture is essentially valueless. [Continue reading…]

The difference between Americans who do or don’t believe in evolution

Dan Kahan writes: It’s well established that there is no meaningful correlation between what a person says he or she “believes” about evolution and having the rudimentary understanding of natural selection, random mutation, and genetic variance necessary to pass a high school biology exam (Bishop & Anderson 1990; Shtulman 2006).

There is a correlation between “belief” in evolution and possession of the kinds of substantive knowledge and reasoning skills essential to science comprehension generally.

But what the correlation is depends on religiosity: a relatively nonreligious person is more likely to say he or she “believes in” evolution, but a relatively religious person less likely to do so, as their science comprehension capacity goes up (Kahan 2015).

That’s what “belief in” evolution of the sort measured in a survey item signifies: who one is, not what one knows.

Americans don’t disagree about evolution because they have different understandings of or commitments to science. They disagree because they subscribe to competing cultural worldviews that invest positions on evolution with identity-expressive significance. [Continue reading…]

The coddling of the American mind

Greg Lukianoff and Jonathan Haidt write: Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law — or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia — and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.

Two terms have risen quickly from obscurity into common campus parlance. Microaggressions are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless. For example, by some campus guidelines, it is a microaggression to ask an Asian American or Latino American “Where were you born?,” because this implies that he or she is not a real American. Trigger warnings are alerts that professors are expected to issue if something in a course might cause a strong emotional response. For example, some students have called for warnings that Chinua Achebe’s Things Fall Apart describes racial violence and that F. Scott Fitzgerald’s The Great Gatsby portrays misogyny and physical abuse, so that students who have been previously victimized by racism or domestic violence can choose to avoid these works, which they believe might “trigger” a recurrence of past trauma.

Some recent campus actions border on the surreal. In April, at Brandeis University, the Asian American student association sought to raise awareness of microaggressions against Asians through an installation on the steps of an academic hall. The installation gave examples of microaggressions such as “Aren’t you supposed to be good at math?” and “I’m colorblind! I don’t see race.” But a backlash arose among other Asian American students, who felt that the display itself was a microaggression. The association removed the installation, and its president wrote an e-mail to the entire student body apologizing to anyone who was “triggered or hurt by the content of the microaggressions.” [Continue reading…]

Recognizing whales and dolphins as cultural beings

Barbara J. King writes: The idea that our oceans teem with cultural animals — and have for millions of years — is the central conclusion of a new book by two whale scientists. And it’s a convincing one.

Whales and dolphins, as they forage for food and interact with each other in their social units, may learn specific ways of doing things from their mothers or their pod-mates.

Certain killer whales (orcas), for example, learn to hunt communally with such precision that they cause waves to wash seals — of only certain species, because other seals are rejected as prey — off their ice floes and into the sea. And the complex patterned songs of humpback whales evolve so quickly over time and space that only learning can explain it.

“The song being sung at any location can change dramatically into an entirely new form, with new units, new phrases, and new themes within less than a year,” write authors Hal Whitehead and Luke Rendell in their book The Cultural Lives of Whales and Dolphins. “A revolution, rather than an evolution.”

The two scientists, who have been studying sperm whales for a collective half century, offer this working definition of culture: Behavior that is shared by some identifiable group such as a family, community or population, and that is acquired by learning from others.

In order for culture to be ruled in as the primary explanation for some behavior, then, genetics and features of the habitat in which the marine mammals live should be ruled out. [Continue reading…]

Marks of ownership

Matthew Battles writes: In ancient Greece, writing arose among traders and artisans doing business in the markets with foreigners and visitors from other cities. Their alphabet emerged not in scribal colleges or the king’s halls, nor was it brought by conquerors, but instead came ashore in the freewheeling, acquisitive, materialistic atmosphere of the agora, the Greek marketplace that also birthed democracy and the public sphere.

The Phoenician letters, transformed by Greeks into the alphabet, share an origin with the Hebrew characters. They crossed the Aegean Sea with trade that flourished between the Greek peninsula and the Canaanite mainland in the ninth century BC. The first alphabetic inscriptions in Greek appear on goods—keepsake vases, containers for oil and olives. The likely earliest such inscription extant, the “Dipylon inscription,” is on a wine jug; it reads something like this: “Whichever dancer dances most fleetly, he shall get me [this vessel]” — a trophy cup. The so-called Cup of Nestor, a clay vessel dating from the eighth century BC, bears an inscription that begins “Nestor’s cup am I, good to drink from.” For the next couple of centuries, Greek letters are used mostly to inscribe dedications — indexing acquisition and ownership in a society where property was the basis of participation in the lettered public sphere.

This was a society of freeborn traders and artisans, a culture that prized beauty, expressiveness, and originality — the perfect environment for the kind of flourishing public space writing seems everywhere to wish to build. And yet the magisterium of writing grows slowly in ancient Greece. Centuries pass before the first texts appear. [Continue reading…]

ISIS is selling looted art online for needed cash

Bloomberg reports: The WhatsApp message appeared on his iPhone: photos of an ancient Mesopotamian vase worth $250,000, part of a highly valued set that is waiting to be extracted.

The recipient, Amr Al Azm, replied that he was interested. How to proceed? A message from a different account followed. The vase could be smuggled through Lebanon.

Al Azm, an anthropology professor in Ohio, was faking it, as he does when photos of looted antiquities are sent to him in the belief that he is a collector or dealer. He is a detective — self-appointed — hoping to save some of mankind’s rarest and most vulnerable artifacts by tracking the burgeoning antiquities trade of Islamic State in Iraq and Syria. [Continue reading…]

PTSD calls for collective healing

Sebastian Junger writes: In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone — a figure that rose to 95 percent among families considered “well-educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to trigger fears that make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.

More broadly, in most human societies, almost nobody sleeps alone. Sleeping in family groups of one sort or another has been the norm throughout human history and is still commonplace in most of the world. Again, Northern European societies are among the few where people sleep alone or with a partner in a private room. When I was with American soldiers at a remote outpost in Afghanistan, we slept in narrow plywood huts where I could reach out and touch three other men from where I slept. They snored, they talked, they got up in the middle of the night to use the piss tubes, but we felt safe because we were in a group. The Taliban attacked the position regularly, and the most determined attacks often came at dawn. Another unit in a nearby valley was almost overrun and took 50 percent casualties in just such an attack. And yet I slept better surrounded by those noisy, snoring men than I ever did camping alone in the woods of New England.

Many soldiers will tell you that one of the hardest things about coming home is learning to sleep without the security of a group of heavily armed men around them. In that sense, being in a war zone with your platoon feels safer than being in an American suburb by yourself. I know a vet who felt so threatened at home that he would get up in the middle of the night to build fighting positions out of the living-room furniture. This is a radically different experience from what warriors in other societies go through, such as the Yanomami, of the Orinoco and Amazon Basins, who go to war with their entire age cohort and return to face, together, whatever the psychological consequences may be. As one anthropologist pointed out to me, trauma is usually a group experience, so trauma recovery should be a group experience as well. But in our society it’s not.

“Our whole approach to mental health has been hijacked by pharmaceutical logic,” I was told by Gary Barker, an anthropologist whose group, Promundo, is dedicated to understanding and preventing violence. “PTSD is a crisis of connection and disruption, not an illness that you carry within you.”

This individualizing of mental health is not just an American problem, or a veteran problem; it affects everybody. A British anthropologist named Bill West told me that the extreme poverty of the 1930s and the collective trauma of the Blitz served to unify an entire generation of English people. “I link the experience of the Blitz to voting in the Labour Party in 1945, and the establishing of the National Health Service and a strong welfare state,” he said. “Those policies were supported well into the 60s by all political parties. That kind of cultural cohesiveness, along with Christianity, was very helpful after the war. It’s an open question whether people’s problems are located in the individual. If enough people in society are sick, you have to wonder whether it isn’t actually society that’s sick.”

Ideally, we would compare hunter-gatherer society to post-industrial society to see which one copes better with PTSD. When the Sioux, Cheyenne, and Arapaho fighters returned to their camps after annihilating Custer and his regiment at Little Bighorn, for example, were they traumatized and alienated by the experience — or did they fit right back into society? There is no way to know for sure, but less direct comparisons can still illuminate how cohesiveness affects trauma. In experiments with lab rats, for example, a subject that is traumatized — but not injured — after an attack by a larger rat usually recovers within 48 hours unless it is kept in isolation, according to data published in 2005 in Neuroscience & Biobehavioral Reviews. The ones that are kept apart from other rats are the only ones that develop long-term traumatic symptoms. And a study of risk factors for PTSD in humans closely mirrored those results. In a 2000 study in the Journal of Consulting and Clinical Psychology, “lack of social support” was found to be around two times more reliable at predicting who got PTSD and who didn’t than the severity of the trauma itself. You could be mildly traumatized, in other words — on a par with, say, an ordinary rear-base deployment to Afghanistan — and experience long-term PTSD simply because of a lack of social support back home.

Anthropologist and psychiatrist Brandon Kohrt found a similar phenomenon in the villages of southern Nepal, where a civil war has been rumbling for years. Kohrt explained to me that there are two kinds of villages there: exclusively Hindu ones, which are extremely stratified, and mixed Buddhist/Hindu ones, which are far more open and cohesive. He said that child soldiers, both male and female, who go back to Hindu villages can remain traumatized for years, while those from mixed-religion villages tended to recover very quickly. “PTSD is a disorder of recovery, and if treatment only focuses on identifying symptoms, it pathologizes and alienates vets,” according to Kohrt. “But if the focus is on family and community, it puts them in a situation of collective healing.” [Continue reading…]
