Gary Slutkin: Let’s treat violence like a contagious disease

Mali’s irrepressible musical spirit resounds after jihadi-imposed silence

The Guardian reports: In the courtyard of a colonial villa in Bamako, four young men crouch around a tiny camping stove. The Malian tradition of simmering tea for hours is as old as the ancient trade routes crossing the Sahara desert. There is even a saying behind the practice, says Aliou Touré, a singer in the Malian band Songhoy Blues.

“Here in Mali we say that the first cup is bitter like life, the second is sweet like love and the third is soft like the breath of a dying man,” he says.

Songhoy Blues are one of the latest musical acts to emerge from the west African country that has produced artists such as Salif Keita and Toumani Diabaté – both multiple Grammy winners – Tinariwen, Ali Farka Touré, Bassekou Kouyaté, and Rokia Traoré.

The band is one of a dozen acts at this week’s Bamako acoustic festival, the first major music festival in the capital since 2012, when Islamist extremists seized northern Mali and imposed their hardline interpretation of sharia law that, among other things, banned music. [Continue reading…]

Ancient societies were far more advanced than we commonly assume

Pacific Standard reports: Trapezoids are, oddly enough, fundamental to modern science. When European scientists used them to simplify certain astronomical calculations in the 14th century, it was an important first step toward calculus—the mathematics Isaac Newton and Gottfried Leibniz developed to understand the physics of astronomical objects like planets. In other words, trapezoids are important, and we’ve known this for nearly 700 years.

Well, the Babylonians knew all of that 14 centuries earlier, according to new research published in Science, proving once again that ancient societies were way more advanced than we’d like to think. [Continue reading…]
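The Babylonian technique reported in Science computed a planet's displacement as the area of a trapezoid under its time-velocity graph, the same averaging idea behind the modern trapezoid rule. A minimal numerical sketch of that idea (the velocity figures below are hypothetical, chosen only to illustrate the method, not taken from the tablets):

```python
def trapezoid_distance(v_start, v_end, days):
    """Distance covered under a linearly changing velocity:
    the area of a trapezoid, i.e. average velocity times elapsed time."""
    return (v_start + v_end) / 2 * days

# Hypothetical figures: a planet slowing from 12 to 9.5
# arc-minutes per day over 60 days.
print(trapezoid_distance(12.0, 9.5, 60))  # 645.0 arc-minutes
```

Summing many such trapezoids over ever-shorter intervals is exactly the limiting step that later became the integral, which is why historians treat the trapezoid procedure as a precursor to calculus.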

The new atheists’ faith in demons

John Gray writes: An American scientist visiting the home of Niels Bohr, the Nobel Prize-winning Danish physicist and refugee from Nazism who was a leading figure in the Manhattan Project, which produced the atomic bomb, was surprised to discover a horseshoe hanging over Bohr’s desk: “Surely you don’t believe the horseshoe will bring you good luck, Professor Bohr?” he asked. “After all, as a scientist . . .”

Bohr laughed. “I believe no such thing, my good friend. Not at all. I am scarcely likely to believe such foolish nonsense. However, I am told that a horseshoe will bring one good luck whether you believe it or not.”

Dominic Johnson, who tells this story, acknowledges that Bohr might have been joking. But the physicist’s response captured an important truth. Human beings never cease looking for a pattern in events that transcends the workings of cause and effect. No matter how much they may think their view of the world has been shaped by science, they cannot avoid thinking and acting as if their lives are subject to some kind of non-human oversight. As Johnson puts it, “Humans the world over find themselves, consciously or subconsciously, believing that we live in a just world or a moral universe, where people are supposed to get what they deserve. Our brains are wired such that we cannot help but search for meaning in the randomness of life.”

An evolutionary biologist trained at Oxford who also holds a doctorate in political science, Johnson believes that the need to find a more-than-natural meaning in natural events is universal – “a ubiquitous phenomenon of human nature” – and performs a vital role in maintaining order in society. Extending far beyond cultures shaped by monotheism, it “spans cultures across the globe and every historical period, from indigenous tribal societies . . . to modern world religions – and includes atheists, too”.

Reward and punishment may not emanate from a single omnipotent deity, as imagined in Western societies. Justice may be dispensed by a vast unseen army of gods, angels, demons and ghosts, or else by an impersonal cosmic process that rewards good deeds and punishes wrongdoing, as in the Hindu and Buddhist conception of karma. But some kind of moral order beyond any human agency seems to be demanded by the human mind, and this sense that our actions are overseen and judged from beyond the natural world serves a definite evolutionary role. Belief in supernatural reward and punishment promotes social co-operation in a way nothing else can match. The belief that we live under some kind of supernatural guidance is not a relic of superstition that might some day be left behind but an evolutionary adaptation that goes with being human.

It’s a conclusion that is anathema to the current generation of atheists – Richard Dawkins, Daniel Dennett, Sam Harris and others – for whom religion is a poisonous concoction of lies and delusion. These “new atheists” are simple souls. In their view, which derives from rationalist philosophy and not from evolutionary theory, the human mind is a faculty that seeks an accurate representation of the world. This leaves them with something of a problem. Why are most human beings, everywhere and at all times, so wedded to some version of religion? It can only be that their minds have been deformed by malignant priests and devilish power elites. Atheists have always been drawn to demonology of this kind; otherwise, they cannot account for the ­persistence of the beliefs they denounce as poisonously irrational. The inveterate human inclination to religion is, in effect, the atheist problem of evil. [Continue reading…]

Evidence of a prehistoric massacre extends the history of warfare

University of Cambridge: Skeletal remains of a group of foragers massacred around 10,000 years ago on the shores of a lagoon are unique evidence of a violent encounter between clashing groups of ancient hunter-gatherers, and suggest the “presence of warfare” in late Stone Age foraging societies.

The fossilised bones of a group of prehistoric hunter-gatherers who were massacred around 10,000 years ago have been unearthed 30km west of Lake Turkana, Kenya, at a place called Nataruk.

Researchers from Cambridge University’s Leverhulme Centre for Human Evolutionary Studies (LCHES) found the partial remains of 27 individuals, including at least eight women and six children.

Twelve skeletons were in a relatively complete state, and ten of these showed clear signs of a violent death, including extreme blunt-force trauma to crania and cheekbones, broken hands, knees and ribs, arrow lesions to the neck, and stone projectile tips lodged in the skull and thorax of two men.

Several of the skeletons were found face down; most had severe cranial fractures. Among the in situ skeletons, at least five showed “sharp-force trauma”, some suggestive of arrow wounds. Four were discovered in a position indicating their hands had probably been bound, including a woman in the last stages of pregnancy. Foetal bones were uncovered.

The bodies were not buried. Some had fallen into a lagoon that has long since dried; the bones were preserved in sediment.

The findings suggest these hunter-gatherers, perhaps members of an extended family, were attacked and killed by a rival group of prehistoric foragers. Researchers believe it is the earliest scientifically dated evidence of human conflict – an ancient precursor to what we call warfare.

[Read more…]

Migrant communities think more like non-migrants after just one generation, study suggests

By Alex Mesoudi, University of Exeter and Kesson Magid, Durham University

A common fear among the general public in many Western countries is that immigrants have ways of thinking or social values that are fundamentally different from their own, and that these differences prevent immigrants from integrating into Western societies.

Our new research, published in PLOS ONE, suggests such fears are misplaced.

We found that British Bangladeshi migrants in East London shifted towards the thinking styles of the wider non-migrant population in just a single generation. Our study also provides insights into how and why people from different parts of the world think and reason differently.

We were motivated by recent findings in the field of “cultural psychology” that show striking variations between cultures in what had long been assumed to be universal ways of thinking and reasoning.

Here’s an example. In the 1990s, Nick Leeson infamously made unauthorised speculative trades that eventually brought down Barings Bank, Britain’s oldest merchant bank. How would you explain Leeson’s actions? Cultural psychologists have found variations between cultures in how people answer this question.

People from the West, it is suggested, would typically say that Leeson was greedy or dishonest. Psychologists call these “dispositional” explanations, which involve intrinsic aspects of people’s personality or character.

But people from East Asia might explain Leeson’s actions as resulting from a corrosive banking system that lacks proper checks, and which values profits above all else. These are typical “situational explanations”, which refer to the external situation or context.

In 2010, the psychologists Joseph Henrich, Steven Heine and Ara Norenzayan collected many examples of cultural variation like this, including variation in whether people punish cheats, what people consider to be moral and immoral, reactions to aggression, and susceptibility to perceptual illusions.

They coined the acronym WEIRD to describe people from Western, Educated, Industrialised, Rich, Democratic countries. Although the vast majority of psychology experiments are conducted on WEIRD people, such people are far from representative of our species as a whole.

[Read more…]

Forgotten excrementitious humours of the third concoction shed by the English

Erica Wagner writes: Helen Maria Williams observed the French Revolution at first hand. A poet, essayist and novelist known for her support of radical causes, she entertained the likes of Thomas Paine and Mary Wollstonecraft in her salons. Among the things she perceived, in her accounts of political turmoil across the English Channel, were differences in national character when it came to expressing emotion.

“You will see Frenchmen bathed in tears at a tragedy,” she wrote in 1792. “An Englishman has quite as much sensibility to a generous or tender sentiment; but he thinks it would be unmanly to weep; and, though half choaked with emotion, he scorns to be overcome, contrives to gain the victory over his feelings, and throws into his countenance as much apathy as he can well wish.”

And so you would be forgiven for thinking that the stiff upper lip – the complete refusal of lachrymosity, no matter what disaster befalls us – has been paralysing the faces of Britishers since Stonehenge was raised on Salisbury Plain. But, as Thomas Dixon shows in his erudite and entertaining book Weeping Britannia, you would be wrong. Once upon a time and not so very long ago, this nation was given to paroxysms of sobbing at almost any opportunity. Dixon, a historian of emotions, philosophy, science and religion (phew!) at Queen Mary, University of London, asks what dried our tears and wonders whether the death of Diana, Princess of Wales, in 1997 unlocked the floodgates again.

Both he and Tiffany Watt Smith, in The Book of Human Emotions, offer a reminder that “emotion” is a pretty novel idea. [Continue reading…]

America’s national eating disorder

Julia Belluz spoke to journalist and food advocate Michael Pollan: Julia Belluz: Other countries seem to be doing a much better job of advising people on how to eat, like Brazil. It gives people simple advice — cook at home more often, eat more fruits and vegetables, and eat less meat.

Michael Pollan: That’s an interesting case. The Brazilians have tried to revolutionize the whole concept of dietary guidelines and get away from talking about nutrients completely. They talk not only about foods and science but food culture.

They have recommendations about how you eat, not just what you eat, so for example they encourage Brazilians to eat with other people. What does this have to do with health? It turns out it has everything to do with health. We know that snacking and eating alone are destructive to health.

Julia Belluz: What would it take to see guidelines like Brazil’s in the US?

Michael Pollan: It would take viewing food through a different lens. The mindset that produced those Brazilian guidelines is influenced as much by culture as by science — a very foreign idea to us. But food is not just about science. The assumption built into the process here is that it’s strictly a scientific process, a matter of fuel. Eating is essentially a negotiation between the eater and a bunch of chemicals out there. That’s a mistake.

We also have a very powerful food industry that cares deeply about what the government tells the public about food. They don’t want anyone else to be talking to the public about food in a way that might contradict their own messages. So they’re in there lobbying. When the government is deciding about the guidelines for school lunches, industry is in the room, making sure the potato doesn’t get tossed and gets the same respect accorded to vegetables. [Continue reading…]

Why wealth hasn’t brought health: The body isn’t built to be an exclusive neighborhood

Matt Ridley writes: As Stewart Brand acutely says, most of the things that dominate the news are not really new: love, scandal, crime, and war come round again and again. Only science and invention deliver truly new stuff, like double helixes and search engines. In this respect, the new news from recent science that most intrigues me is that we may have a way to explain why certain diseases are getting worse as we get richer. We are defeating infectious diseases, slowing or managing many diseases of ageing like heart disease and cancer, but we are faced with a growing epidemic of allergy, auto-immunity, and things like autism. Some of it is due to more diagnosis, some of it is no doubt hypochondria, but there does seem to be a real increase in these kinds of problems.

Take hay fever. It is plainly a modern disease, far more common in urban, middle-class people than it used to be in peasants in the past, or still is in subsistence farmers in Africa today. There’s really good timeline data on this, chronicling the appearance of allergies as civilization advances, province by province or village by village. And there’s really good evidence that what causes this is the suppression of parasites. You can see this happen in eastern Europe and in Africa in real time: get rid of worms and a few years later children start getting hay fever. Moises Velasquez-Manoff chronicles this in glorious detail in his fine book An Epidemic of Absence.

This makes perfect sense. In the arms race with parasites, immune systems evolved to “expect” to be down-regulated by parasites, so they over-react in their absence. A good balance is reached when parasites are present to down-regulate the immune system; without them, it turns rogue. [Continue reading…]

Nina Jablonski writes: The taxonomic diversity and census of our resident bacteria are more than just subjects of scientific curiosity; they matter greatly to our health. The normal bacteria on our skin, for instance, are essential to maintaining the integrity of the skin’s barrier functions. Many diseases, from psoriasis to obesity, inflammatory bowel disease, some cancers, and even cardiovascular disease, are associated with shifts in our microbiota.

While it’s too early to tell if the changing bacteria are the cause or the result of these problems, the discovery of robust associations between bacterial profiles and disease states opens the door for new treatments and targeted preventive measures. The body’s microbiota also affects and is affected by the body’s epigenome, the chemical factors influencing gene expression. Thus, the bugs on us and in us are controlling the normal action of genes in the cells of our bodies, and changes in the proportions or overall numbers of bacteria affect how our cells work and respond to stress.

Let’s stop thinking about our bodies as temples of sinew and cerebrum, and instead as evolving and sloshing ecosystems full of bacteria, which are regulating our health in more ways than we could ever imagine. As we learn more about our single-celled companions in the coming years, we will take probiotics for curing acute and chronic diseases, we’ll undertake affirmative action to maintain diversity of our gut microflora as we age, and we’ll receive prescriptions for increasingly narrow-spectrum antibiotics to exterminate only the nastiest of the nasties when we have a serious acute infection. Hand sanitizers and colon cleansing will probably be with us for some time, but it’s best just to get used to it now: Bugs R us. [Continue reading…]

Religious ownership and cultural appropriation

If there’s one thing I’m grateful to Donald Trump for expressing, it’s his contempt for political correctness.

There is no value in sensitivity if it merely serves to mask bigotry.

Discourse from which, for instance, racism has been thoroughly expunged turns out to be discourse in which racists can freely participate with much less risk of being challenged.

Trump is being disingenuous, of course, in claiming that he disavows political correctness, because it’s actually an indispensable tool in his art of deceit. He panders to and expresses his own Islamophobia when saying he’d stop Muslims entering the U.S. but at the same time, postures as Muslim-friendly by claiming he loves Muslims.

Political correctness is no substitute for honesty. Indeed, this has been its most corrosive effect: that it inhibits people from honestly expressing their views and exposing their prejudices.

Richard Falk picks up this theme when noting that in the U.S., in the name of being politically correct and culturally sensitive, many Americans avoid referring to Christmas in deference to non-Christians who don’t celebrate this holy day. Falk, however, gladly appropriates the word and in this celebration sees universal meaning.

As a Jew in America I feel the tensions of conflicting identities. I believe, above all, that while exhibiting empathy to all those who have been victimized by tribally imposed norms, we need to rise above such provincialism (whether ethnic or nationalistic) and interrogate our own tribal and ‘patriotic’ roots. In this time of deep ecological alienation, when the very fate of the species has become precarious, we need to think, act, and feel as humans and more than this, as empathetic humans responsible for the failed stewardship of the planet. It is here that God or ‘the force’ can provide a revolutionary comfort zone in which we reach out beyond ourselves to touch all that is ‘other,’ whether such otherness is religious, ethnic, or gendered, and learning from Buddhism, reach out beyond the human to exhibit protective compassion toward non-human animate dimensions of our wider experience and reality. It is this kind of radical reworking of identity and worldview that captures what ‘the Christmas spirit’ means to me beyond the enjoyment of holiday cheer.

From this vantage point, the birth of Jesus can be narrated with this universalizing voice. The star of Bethlehem as an ultimate source of guidance and the three wise kings, the Magi, who traveled far to pay homage to this sacred child can in our time bestow the wisdom of pilgrimage, renewal, and transformation that will alone enable the human future to grasp the radical wisdom of St. Augustine’s transformative: “Love one another and do what thou wilt.” Put presciently in a poem by W.H. Auden, “We must love one another or die.”

I referred to Falk as appropriating Christmas, aware that there are Christians who might object to a Jew saying what Christmas means (even though Jesus was Jewish), and in order to raise a wider issue that seems to be popping up with unfortunate frequency: cultural appropriation, one of the bastard children of political correctness.

Cultural appropriation is a phrase that can usefully be applied in limited and rather obvious ways.

In 1992, when Hindu nationalists destroyed the 16th-century Babri Masjid mosque in Ayodhya, Uttar Pradesh, India, they did so in the name of reclaiming the site of the birthplace of the god Rama. This came 464 years after Muslim invaders had apparently destroyed a Hindu temple at this site. In the intervening period, Hindus and Muslims had both worshiped at the same location.

When conflicting parties with differing cultural identities each claim to own the same place and then alternately snatch it from one another, this can reasonably be described as cultural appropriation.

While each advocates its claim in the name of one divine authority or another, all are saying exactly the same thing: this is mine; it’s not yours.

But when someone in Los Angeles practices yoga in an effort to fine-tune a perfect body, does this degrade Indian culture? Have they claimed as their own something that belongs to someone else? Not really.

The proliferation of yoga studios across America and the secularization of yoga as a form of fitness training have done nothing to close off opportunities for people to explore yoga as a spiritual discipline or learn about its roots in Indian culture. Indeed, yoga has travelled so easily not because it got stolen by cultural plunderers, but because it comes out of a culture that fosters a universal sense of belonging.

As Michelle Goldberg writes:

India is a country of dizzying dynamism, one that has always eagerly absorbed elements from other cultures into its own while proudly sharing the best of its own culture with the world. “All humanity’s greatest is mine,” wrote poet Rabindranath Tagore, who won the 1913 Nobel Prize in Literature. “The infinite personality of man (as the Upanishads say) can only come from the magnificent harmony of all human races. My prayer is that India may represent the co-operation of all the peoples of the world.” Tagore — who, incidentally, wrote India’s national anthem — founded a university whose motto translates to, “Where the whole world meets in a single nest.”

This is the essence of cosmopolitanism. Obviously, power plays a role in the way cultures develop. Symbols and practices can be wrenched from their traditional contexts and used in ways that are disrespectful. When privileged American kids party while wearing Native American headdresses, it looks like they’re donning the spoils of a long-ago war. But the way that some contemporary activists would silo different cultures — as if anything that travels from outside the West is too fragile to survive a collision with raucous mixed-up modernity — is provincialism masquerading as sensitivity. There’s no such thing as cultural purity, and searching for it never leads anywhere good.

Across the globe, many cultures are under threat — languages are being forgotten and indigenous wisdom lost. But the idea that a culture can be protected behind barriers of insulation treats culture as a static, preservable entity. If it has arrived at such a condition, it is most likely already dead.

An endangered language can only be protected by being taught, spoken, and shared. It either grows or withers. Likewise and more broadly, cultures are fertilized at their margins where the familiar and unfamiliar interact, thereby generating new cultural forms.

What threatens culture more than anything else is the commercially driven shift away from cultural creation to cultural consumption.

To the extent that culture is something we passively absorb rather than actively construct, the infinitely varied vantage points from which we each see the world will get overshadowed by whatever forms can be most easily reproduced and massively distributed.

Culture is what we make it. It cannot be kept alive in empty vessels.

The holly and the ivy: How pagan practices found their way into Christmas

By Peter Glaves, Northumbria University, Newcastle

Every year, almost without thinking about it, we incorporate certain plant species into our Christmas celebrations. The most obvious is the Christmas tree, linked historically in England to Prince Albert – but its use in British homes goes back to at least 1761, when Charlotte, wife of George III, put up a tree at the royal court.

(It’s probably worth noting here that the first artificial-brush Christmas tree was produced using the same machinery that was originally designed to produce toilet brushes.)

Three other plants are intimately associated with Christmas: holly, ivy and mistletoe – and in all cases their ecology is closely linked to their cultural uses.

[Read more…]

Connections aren’t conversations – while technology enables, it can also interfere

By Daria Kuss, Nottingham Trent University

A prisoner in the US was recently released after 44 years of incarceration for the attempted murder of a police officer. Emerging onto the streets of New York City, Otis Johnson, now 69, found himself bewildered by the world before him. Seeing people apparently talking to themselves on the street, futuristic headphones dangling from their ears, reminded him of CIA agents. People barely paid attention to their surroundings, and instead studied their smartphones while crossing the street, engrossed in their own personal bubbles.

Technology had delivered Johnson a massive culture shock, the shock of a world where technology has quickly changed the way we live and the way we relate to one another.

In 2011 Sherry Turkle, a clinical psychologist and professor at the Massachusetts Institute of Technology, published Alone Together, in which she questioned the extent to which social media is bringing people together. Following decades of research on the profound impact of modern technology on human relationships, Turkle concluded that with the omnipresence of technology “we’re moving from conversation to connection”.

Connection, it seems, denotes a very different quality of social interaction in comparison to conversation, as it refers to continuous streams of little titbits of information, such as those neatly packaged into 140 characters on Twitter.

Conversation, on the other hand, refers to listening and empathic understanding, actively attending to another person, rather than fleetingly commenting on their status updates online while simultaneously talking on the phone, doing the laundry, or preparing the children’s dinner.

[Read more…]

When languages die, we lose a part of who we are

By Anouschka Foltz, Bangor University

The 2015 Paris Climate Conference (COP21) is in full swing and climate change is again on everyone’s mind. It conjures up images of melting glaciers, rising sea levels, droughts, flooding, threatened habitats, endangered species, and displaced people. We know it threatens biodiversity, but what about linguistic diversity?

Humans are the only species on the planet whose communication system exhibits enormous diversity. And linguistic diversity is crucial for understanding our capacity for language. An increase in climate-change related natural disasters may affect linguistic diversity. A good example is Vanuatu, an island state in the Pacific, with quite a dramatic recent rise in sea levels.

There are over 7,000 languages spoken in the world today. These languages exhibit enormous diversity, from the number of distinctive sounds (there are languages with as few as 11 different sounds and as many as 118) to the vast range of possible word orders, structures and concepts that languages use to convey meaning. Every absolute that linguists have posited has been challenged, and linguists are busy debating if there is anything at all that is common to all languages in the world or anything at all that does not exist in the languages of the world. Sign languages show us that languages do not even need to be spoken. This diversity is evidence of the enormous flexibility and plasticity of the human brain and its capacity for communication.

Studying diverse languages gives us invaluable insights into human cognition. But language diversity is at risk. Languages are dying every year. Often a language’s death is recorded when the last known speaker dies, and about 35% of languages in the world are currently losing speakers or are more seriously endangered. Most of these have never been recorded and so will be lost forever. Linguists estimate that about 50% of the languages spoken today will disappear in the next 100 years. Some even argue that up to 90% of today’s languages will have disappeared by 2115.

[Read more…]

Millet: The missing piece in the puzzle of prehistoric humans’ transition from hunter-gatherers to farmers

New research shows a cereal familiar today as birdseed was carried across Eurasia by ancient shepherds and herders, laying the foundation, in combination with the new crops they encountered, for ‘multi-crop’ agriculture and the rise of settled societies. Archaeologists say ‘forgotten’ millet has a role to play in modern crop diversity and today’s food security debate.

The domestication of the small-seeded cereal millet in North China around 10,000 years ago created the perfect crop to bridge the gap between nomadic hunter-gathering and organised agriculture in Neolithic Eurasia, and may offer solutions to modern food security, according to new research.

Now largely forgotten in the West, this hardy grain was ideal for ancient shepherds and herders, who carried it right across Eurasia, where it was mixed with crops such as wheat and barley. This gave rise to ‘multi-cropping’, which in turn sowed the seeds of complex urban societies, say archaeologists.

A team from the UK, USA and China has traced the spread of the domesticated grain from North China and Inner Mongolia into Europe through a “hilly corridor” along the foothills of Eurasia. Millet favours uphill locations, doesn’t require much water, and has a short growing season: it can be harvested 45 days after planting, compared with 100 days for rice, allowing a very mobile form of cultivation.

Nomadic tribes were able to combine growing crops of millet with hunting and foraging as they travelled across the continent between 2500 and 1600 BC. Millet was eventually mixed with other crops in emerging populations to create ‘multi-crop’ diversity, which extended growing seasons and provided our ancient ancestors with food security.

Managing different crops in different locations, along with the water resources they required, demanded elaborate social contracts and drove the rise of more settled, stratified communities and eventually of complex ‘urban’ human societies.

Researchers say we need to learn from the earliest farmers when thinking about feeding today’s populations, and millet may have a role to play in protecting against modern crop failure and famine.

[Read more…]

You can teach philosophy while you’re teaching farming, but you can’t teach farming while you’re teaching philosophy
