Why we think the very first farmers were small groups with property rights

By Elizabeth Gallagher, UCL

For 95% of the history of modern humans, we were exclusively hunter-gatherers. Then, suddenly, about 12,000 years ago, something happened that revolutionised the way humans lived and enabled the complex societies we have today: farming.

But what triggered this revolution? Understanding it is incredibly challenging: because it occurred so far in the past, there are many factors to consider. However, by simulating the past using a complex computational model, we found that the switch from foraging to farming most likely began with very small groups of people who recognised property rights.
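The article gives only a verbal outline of the model, but the core logic lends itself to a sketch. Below is a minimal agent-based toy in that spirit — every parameter, payoff and probability is a hypothetical stand-in, not the authors' actual model:

```python
import random

# Toy agent-based sketch (all parameters hypothetical): groups repeatedly
# choose between foraging and farming. Early farming pays less than
# foraging, but property rights determine how much of the harvest a
# farming group actually keeps, and small groups hold a new norm
# together more easily.

FORAGE_RETURN = 1.0   # baseline payoff from foraging
FARM_RETURN = 0.9     # early farming pays less (more work, poorer health)
LEARNING = 0.002      # per-generation skill gain while farming
EXPERIMENT = 0.01     # chance per generation that a foraging group tries farming

def simulate(group_size, property_rights, generations=500):
    farming, skill, payoff = False, 0.0, 0.0
    for _ in range(generations):
        kept = 1.0 if property_rights else 0.6   # share of harvest retained
        payoff = (FARM_RETURN + skill) * kept
        if farming:
            skill += LEARNING
            # Groups drift back to foraging while farming pays worse;
            # larger groups abandon the experiment faster (assumption).
            if payoff < FORAGE_RETURN and random.random() < group_size / 100:
                farming = False
        elif random.random() < EXPERIMENT:
            farming = True
    # Count only groups whose farming now genuinely out-yields foraging.
    return farming and payoff >= FORAGE_RETURN

for size in (5, 50):
    for rights in (True, False):
        wins = sum(simulate(size, rights) for _ in range(1000))
        print(f"group={size:2d} rights={rights!s:5}: {wins / 10:.1f}% established")
```

Across many runs, only the small groups with property rights reliably survive the lean early generations long enough for accumulated skill to beat foraging — which is the shape of the result the article describes.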

Farming: an unlikely choice

It may seem obvious why we switched from foraging to farming: it made it possible to stay in one place, feed larger populations, have greater food security and build increasingly complex societies, political structures, economies and technologies. However, these advantages took time to develop and our early farmer ancestors would not have seen these coming.

Indeed, archaeological research suggests that when farming began it was not a particularly attractive lifestyle. It involved more work, a decrease in the quality of nutrition and health, an increase in disease and infection, and greater challenges in defending resources. For a hunter-gatherer at the cusp of the “agricultural revolution”, a switch to farming wasn’t the obvious choice.

[Read more…]


The strange persistence of first languages

Julie Sedivy writes: Like a household that welcomes a new child, a single mind can’t admit a new language without some impact on other languages already residing there. Languages can co-exist, but they tussle, as do siblings, over mental resources and attention. When a bilingual person tries to articulate a thought in one language, words and grammatical structures from the other language often clamor in the background, jostling for attention. The subconscious effort of suppressing this competition can slow the retrieval of words—and if the background language elbows its way to the forefront, the speaker may resort to code-switching, plunking down a word from one language into the sentence frame of another.

Meanwhile, the weaker language is more likely to become swamped; when resources are scarce, as they are during mental exhaustion, the disadvantaged language may become nearly impossible to summon. Over time, neglecting an earlier language makes it harder and harder for it to compete for access.

According to a 2004 survey conducted in the Los Angeles metropolitan area, fewer than half of people belonging to Generation 1.5 — immigrants who arrive before their teenage years — claimed to speak the language they were born into “very well.” A 2006 study of immigrant languages in Southern California forecast that even among Mexican Americans, the slowest group to assimilate within Southern California, new arrivals would live to hear only 5 out of every 100 of their great-grandchildren speak fluent Spanish.

When a childhood language decays, so does the ability to reach far back into your own private history. Language is memory’s receptacle. It has Proustian powers. Just as smells are known to trigger vivid memories of past experiences, language is so entangled with our experiences that inhabiting a specific language helps surface submerged events or interactions that are associated with it. [Continue reading…]


Sino-Tibetan populations shed light on human cooperation

By Ruth Mace, UCL

One of the big questions in anthropology is why humans, unlike most animals, cooperate with those we are not closely related to. Exactly what has driven this behaviour is not well understood. Anthropologists suspect it could be down to the fact that women have usually left their homes after marriage to go and live with their husband’s family. This creates links between distant families, which may explain our tendency to cooperate beyond our own households.

Now our study on the Tibetan borderlands of China, published in Nature Communications, shows that it is indeed the case that cooperation is greater in populations where females disperse for marriage.

A natural experiment in social structure

There are many different theories about the link between dispersal, kinship and cooperation, and these are what we wanted to test. Anthropologists believe that dispersal leads to cooperation through links between families, and some evolutionary models predict that when nobody moves, residents compete for the same resources, producing greater conflict between kin. But there are also models that suggest the opposite is true – that if nobody moves, neighbours are more likely to be related, leading to more cooperation in the neighbourhood.

[Read more…]


Humans are natural polymaths, at our best when we turn our minds to many things

Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.

We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?

The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading…]


Technology is implicated in an assault on empathy

Sherry Turkle writes: Studies of conversation both in the laboratory and in natural settings show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other. Even a silent phone disconnects us.

In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies that were conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000.

Across generations, technology is implicated in this assault on empathy. We’ve gotten used to being connected all the time, but we have found ways around conversation — at least from conversation that is open-ended and spontaneous, in which we play with ideas and allow ourselves to be fully present and vulnerable. But it is in this type of conversation — where we learn to make eye contact, to become aware of another person’s posture and tone, to comfort one another and respectfully challenge one another — that empathy and intimacy flourish. In these conversations, we learn who we are.

Of course, we can find empathic conversations today, but the trend line is clear. It’s not only that we turn away from talking face to face to chat online. It’s that we don’t allow these conversations to happen in the first place because we keep our phones in the landscape. [Continue reading…]


Bible Belt atheist

Jason Cohn and Camille Servan-Schreiber: Growing up in Los Angeles and Paris, we both were raised secular and embraced atheism early and easily. It’s not that we didn’t ponder life’s mysteries; it’s just that after we reasoned away our religious questions, we stopped worrying about them and moved on. When we learned about the former pastor Jerry DeWitt’s struggles with being an “outed” atheist in rural Louisiana, we realized for the first time just how difficult being an atheist can be in some communities, where religion is woven deeply into the social fabric. [Continue reading…]


Paleogenetics is helping to solve the great mystery of prehistory: How did humans spread out over the earth?

Jacob Mikanowski writes: Most of human history is prehistory. Of the 200,000 or more years that humans have spent on Earth, only a tiny fraction have been recorded in writing. Even in our own little sliver of geologic time, the 12,000 years of the Holocene, whose warm weather and relatively stable climate incubated the birth of agriculture, cities, states, and most of the other hallmarks of civilisation, writing has been more the exception than the rule.

Professional historians can’t help but pity their colleagues on the prehistoric side of the fence. Historians are accustomed to drawing on vast archives, but archaeologists must assemble and interpret stories from scant material remains. In the annals of prehistory, cultures are designated according to modes of burial such as ‘Single Grave’, or after styles of arrowhead, such as ‘Western Stemmed Point’. Whole peoples are reduced to styles of pottery, such as Pitted Ware, Corded Ware or Funnel Beaker, all of them spread across the map in confusing, amoeba-like blobs.

In recent years, archaeologists have become reluctant to infer too much from assemblages of ceramics, weapons and grave goods. For at least a generation, they have been drilled on the mantra that ‘pots are not people’. Material culture is not a proxy for identity. Artefacts recovered from a dig can provide a wealth of information about a people’s mode of subsistence, funeral rites and trade contacts, but they are not a reliable guide to their language or ethnicity – or their patterns of migration.

Before the Second World War, prehistory was seen as a series of invasions, with proto-Celts and Indo-Aryans swooping down on unsuspecting swaths of Europe and Asia like so many Vikings, while megalith builders wandered between continents in indecisive meanders. After the Second World War, this view was replaced by the processual school, which attributed cultural changes to internal adaptations. Ideas and technologies might travel, but people by and large stayed put. Today, however, migration is making a comeback.

Much of this shift has to do with the introduction of powerful new techniques for studying ancient DNA. The past five years have seen a revolution in the availability and scope of genetic testing that can be performed on prehistoric human and animal remains. Ancient DNA is tricky to work with. Usually it’s degraded, chemically altered and cut into millions of short fragments. But recent advances in sequencing technology have made it possible to sequence whole genomes from samples reaching back thousands, and tens of thousands, of years. Whole-genome sequencing yields orders of magnitude more data than organelle-based testing, and allows geneticists to make detailed comparisons between individuals and populations. Those comparisons are now illuminating new branches of the human family tree. [Continue reading…]
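To put "orders of magnitude" in concrete terms: earlier ancient-DNA work typically targeted the mitochondrial genome, and a back-of-the-envelope comparison against a whole nuclear genome (standard reference figures, nothing taken from this article) looks like this:

```python
# Back-of-the-envelope scale comparison, using standard reference figures:
# the human nuclear genome is roughly 3.2 billion base pairs, while the
# mitochondrial genome (the organelle DNA earlier studies targeted)
# is about 16,569 base pairs.
NUCLEAR_BP = 3_200_000_000
MITO_BP = 16_569

print(f"whole genome / mtDNA = {NUCLEAR_BP / MITO_BP:,.0f}x")  # ~193,000x
```

Roughly five orders of magnitude — the difference between tracing a single maternal lineage and comparing whole individuals and populations locus by locus.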


A Flemish family care system

Mike Jay writes: Half an hour on the slow train from Antwerp, surrounded by flat, sparsely populated farmlands, Geel (pronounced, roughly, ‘Hyale’) strikes the visitor as a quiet, tidy but otherwise unremarkable Belgian market town. Yet its story is unique. For more than 700 years its inhabitants have taken the mentally ill and disabled into their homes as guests or ‘boarders’. At times, these guests have numbered in the thousands, and arrived from all over Europe. There are several hundred in residence today, sharing their lives with their host families for years, decades or even a lifetime. One boarder recently celebrated 50 years in the Flemish town, arranging a surprise party at the family home. Friends and neighbours were joined by the mayor and a full brass band.

Among the people of Geel, the term ‘mentally ill’ is never heard: even words such as ‘psychiatric’ and ‘patient’ are carefully hedged with finger-waggling and scare quotes. The family care system, as it’s known, is resolutely non-medical. When boarders meet their new families, they do so, as they always have, without a backstory or clinical diagnosis. If a word is needed to describe them, it’s often a positive one such as ‘special’, or at worst, ‘different’. This might in fact be more accurate than ‘mentally ill’, since the boarders have always included some who would today be diagnosed with learning difficulties or special needs. But the most common collective term is simply ‘boarders’, which defines them at the most pragmatic level by their social, not mental, condition. These are people who, whatever their diagnosis, have come here because they’re unable to cope on their own, and because they have no family or friends who can look after them.

The origins of the Geel story lie in the 13th century, in the martyrdom of Saint Dymphna, a legendary seventh-century Irish princess whose pagan father went mad with grief after the death of his Christian wife and demanded that Dymphna marry him. To escape the king’s incestuous passion, Dymphna fled to Europe and holed up in the marshy flatlands of Flanders. Her father finally tracked her down in Geel, and when she refused him once more, he beheaded her. Over time, she became revered as a saint with powers of intercession for the mentally afflicted, and her shrine attracted pilgrims and tales of miraculous cures. [Continue reading…]


Why futurism has a cultural blindspot

Tom Vanderbilt writes: In early 1999, during the halftime of a University of Washington basketball game, a time capsule from 1927 was opened. Among the contents of this portal to the past were some yellowing newspapers, a Mercury dime, a student handbook, and a building permit. The crowd promptly erupted into boos. One student declared the items “dumb.”

Such disappointment in time capsules seems to be endemic, suggests William E. Jarvis in his book Time Capsules: A Cultural History. A headline from The Onion, he notes, sums it up: “Newly unearthed time capsule just full of useless old crap.” Time capsules, after all, exude a kind of pathos: They show us that the future was not quite as advanced as we thought it would be, nor did it come as quickly. The past, meanwhile, turns out not to be as radically distinct as we thought.

In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.

These observations apply neatly to technology. We don’t have the personal flying cars we predicted we would. Coal, notes the historian David Edgerton in his book The Shock of the Old, was a bigger source of power at the dawn of the 21st century than in sooty 1900; steam was more significant in 1900 than 1800.

But when it comes to culture we tend to believe not that the future will be very different than the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?

Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,” people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.” [Continue reading…]


The politics of human evolution

Candida Moss writes: On Thursday morning The New York Times ran a high-profile story about the discovery of a new human ancestor species — Homo naledi — in the Rising Star cave in South Africa. The discovery, announced by Professor Lee Berger, was monumental because the evidence for Homo naledi was discovered in a burial chamber. Concern for burial is usually seen as a distinctive characteristic of humankind, so the possibility that this new non-human hominid species was “deliberately disposing of its dead” was especially exciting.

To anthropologists the article was not only newsworthy but also humorous, for the Times illustrated the piece with a photograph of Australopithecus africanus, a species already well known. This howler of a mistake (at least to self-identified science nerds) was also somewhat understandable, because the differences between the two skulls are sufficiently subtle that a lay viewer can easily mistake one for the other. In fact, some have pointed to that similarity and wondered (while acknowledging the importance of the discovery) whether it is indeed a “new species.” And that gets to the deeper issue: What and who were our ancestors?

It might seem as if the answer to this question is simply a question of biology, but in his new book Tales of the Ex-Apes: How We Think About Human Evolution, the anthropologist Jonathan Marks argues that the story we tell about our origins, the study of our evolutionary tree, has cultural roots. Evolution isn’t just a question of biology, he argues; it’s also a question of mythology. Our scientific facts, he says, are the product of bioculture and biopolitics. [Continue reading…]


Guns, germs, and steal

We have all been raised to believe that civilization is, in large part, sustained by law and order. Without complex social institutions and some form of governance, we would be at the mercy of the law of the jungle — so the argument goes.

But there is a basic flaw in this Hobbesian view of a collective human need to tame the savagery in our nature.

For human beings to be vulnerable to the selfish drives of those around them, they generally need to possess things that are worth stealing. For things to be worth stealing, they must have durable value. People who own nothing have little need to worry about thieves.

While Jared Diamond has argued that civilization arose in regions where agrarian societies could accumulate food surpluses, new research suggests that the value of cereal crops did not derive simply from the fact that they could be stored, but rather from the fact that, having been stored, they could subsequently be stolen or confiscated.

Joram Mayshar, Omer Moav, Zvika Neeman, and Luigi Pascali write: In a recent paper (Mayshar et al. 2015), we contend that fiscal capacity and viable state institutions are conditioned to a major extent by geography. Thus, like Diamond, we argue that geography matters a great deal. But in contrast to Diamond, and against conventional opinion, we contend that it is not high farming productivity and the availability of food surplus that accounts for the economic success of Eurasia.

  • We propose an alternative mechanism by which environmental factors imply the appropriability of crops and thereby the emergence of complex social institutions.

To understand why surplus is neither necessary nor sufficient for the emergence of hierarchy, consider a hypothetical community of farmers who cultivate cassava (a major source of calories in sub-Saharan Africa, and the main crop cultivated in Nigeria), and assume that the annual output is well above subsistence. Cassava is a perennial root that is highly perishable upon harvest. Since this crop rots shortly after harvest, it isn’t stored and is thus difficult to steal or confiscate. As a result, the assumed available surplus would not facilitate the emergence of a non-food-producing elite, and may instead be expected to lead to a population increase.

Consider now another hypothetical farming community that grows a cereal grain – such as wheat, rice or maize – yet with an annual produce that just meets each family’s subsistence needs, without any surplus. Since the grain has to be harvested within a short period and then stored until the next harvest, a visiting robber or tax collector could readily confiscate part of the stored produce. Such ongoing confiscation may be expected to lead to a downward adjustment in population density, but it will nevertheless facilitate the emergence of a non-producing elite, even though there was no surplus.

This simple scenario shows that surplus isn’t a precondition for taxation. It also illustrates our alternative theory that the transition to agriculture enabled hierarchy to emerge only where the cultivated crops were vulnerable to appropriation.
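The contrast between the two villages reduces to a one-line function. Here is a toy sketch (every number is invented for illustration):

```python
# Toy illustration (all numbers invented): what one visit by a robber or
# tax collector can seize depends on the stored fraction of output,
# not on the size of the surplus.
def collectable(output, surplus, stored_fraction):
    return (output + surplus) * stored_fraction

# Cassava village: large surplus, but roots rot soon after harvest,
# so nothing sits in storage.
print(collectable(output=100, surplus=50, stored_fraction=0.0))  # 0.0

# Cereal village: no surplus at all, but the whole harvest is in a granary.
print(collectable(output=100, surplus=0, stored_fraction=0.9))   # 90.0
```

Ninety seizable units with no surplus, versus zero seizable units with a large surplus: that is the sense in which appropriability, not surplus, is what a would-be elite can live on.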

  • In particular, we contend that the Neolithic emergence of fiscal capacity and hierarchy was conditioned on the cultivation of appropriable cereals as the staple crops, in contrast to less appropriable staples such as roots and tubers.

According to this theory, complex hierarchy did not emerge among hunter-gatherers because hunter-gatherers essentially live from hand-to-mouth, with little that can be expropriated from them to feed a would-be elite. [Continue reading…]


Is there anything wrong with men who cry?

Sandra Newman writes: One of our most firmly entrenched ideas of masculinity is that men don’t cry. Although he might shed a discreet tear at a funeral, and it’s acceptable for him to well up when he slams his fingers in a car door, a real man is expected to quickly regain control. Sobbing openly is strictly for girls.

This isn’t just a social expectation; it’s a scientific fact. All the research to date finds that women cry significantly more than men. A meta-study by the German Society of Ophthalmology in 2009 found that women weep, on average, five times as often, and almost twice as long per episode. The discrepancy is such a commonplace, we tend to assume it’s biologically hard-wired; that, whether you like it or not, this is one gender difference that isn’t going away.

But actually, the gender gap in crying seems to be a recent development. Historical and literary evidence suggests that, in the past, not only did men cry in public, but no one saw it as feminine or shameful. In fact, male weeping was regarded as normal in almost every part of the world for most of recorded history. [Continue reading…]


Video: Slavoj Zizek — political correctness solidifies hatred; it doesn’t work


The death of culture

In a review of Notes On The Death Of Culture, Anne Haverty writes: We may not be living in the worst of times, although a case might very well be made for it, but anyone with a thought in their head would be entitled to say that we’re living in the stupidest. Mario Vargas Llosa, the Nobel Prize-winning novelist, certainly believes we are. In this series of coruscating and passionate essays on the state of culture he argues that we have, en masse, capitulated to idiocy. And it is leading us to melancholy and despair.

This is a book of mourning. What Vargas Llosa writes is a lament for how things used to be and how they are now in all aspects of life from the political to the spiritual. Like TS Eliot in his essay Notes Towards the Definition of Culture, written in 1948, he takes the concept of culture in the general sense as a shared sensibility, a way of life.

Eliot too saw culture decaying around him and foresaw a time in which there would be no culture. This time, Vargas Llosa argues, is ours. Eliot has since been under attack for what his critics often describe as his elitist attitudes – as well as much else – and Vargas Llosa will probably also be tarred with the same brush for his pains.

But we must be grateful to him for describing in a relatively orderly manner the chaos of hypocrisy and emptiness into which our globalised culture has plunged and to which we seem to have little option but to subscribe.

It’s not easy, however, to be orderly on such an all-encompassing and sensitive subject as the way we live now. On some aspects, such as the art business, Vargas Llosa practically foams at the mouth. The art world is “rotten to the core”, a world in which artists cynically contrive “cheap stunts”. Stars like Damien Hirst are purveyors of “con-tricks”, and their “boring, farcical and bleak” productions are aided by “half-witted critics”.

We have abandoned the former minority culture, which was truth-seeking, profound, quiet and subtle, in favour of mainstream or mass entertainment, which has to be accessible – and how brave, if foolhardy, of anyone these days to cast aspersions on accessibility – as well as sensation-loving and frivolous.

Value-free, this kind of culture is essentially valueless. [Continue reading…]


The difference between Americans who do or don’t believe in evolution

Dan Kahan writes: It’s well established that there is no meaningful correlation between what a person says he or she “believes” about evolution and having the rudimentary understanding of natural selection, random mutation, and genetic variance necessary to pass a high school biology exam (Bishop & Anderson 1990; Shtulman 2006).

There is a correlation between “belief” in evolution and possession of the kinds of substantive knowledge and reasoning skills essential to science comprehension generally.

But what the correlation is depends on religiosity: as science comprehension capacity goes up, a relatively nonreligious person becomes more likely to say he or she “believes in” evolution, while a relatively religious person becomes less likely to do so (Kahan 2015).

That’s what “belief in” evolution of the sort measured in a survey item signifies: who one is, not what one knows.

Americans don’t disagree about evolution because they have different understandings of or commitments to science. They disagree because they subscribe to competing cultural worldviews that invest positions on evolution with identity-expressive significance. [Continue reading…]