Humans and animals — the power and limitations of language

Stassa Edwards writes: In his Apology for Raymond Sebond (1576), Michel de Montaigne ascribed animals’ silence to man’s own wilful arrogance. The French essayist argued that animals could speak, that they were in possession of rich consciousness, but that man wouldn’t condescend to listen. ‘It is through the vanity of the same imagination that [man] equates himself with God,’ Montaigne wrote, ‘that he attributes divine attributes for himself, picks himself out and separates himself from the crowd of other creatures.’ Montaigne asked: ‘When I play with my cat, who knows if she is making more of a pastime of me than I of her?’

Montaigne’s question is as playful as his cat. Apology is not meant to answer the age-old question, but rather to provoke; to tap into an unending inquiry about the reasoning of animals. Perhaps, Montaigne implies, we simply misunderstand the foreign language of animals, and the ignorance is not theirs, but ours.

Montaigne’s position was a radical one – the idea that animals could actually speak to humans was decidedly anti-anthropocentric – and when he looked around for like-minded thinkers, he found himself one solitary essayist. But if Montaigne was a 16th-century loner, then he could appeal to the Classics. Apology is littered with references to Pliny and a particular appeal to Plato’s account of the Golden Age under Saturn. But even there, Montaigne had little to work with. Aristotle had argued that animals lacked logos (meaning, literally, ‘word’ but also ‘reason’) and, therefore, had no sense of the philosophical world inhabited and animated by humans. And a few decades after Montaigne, the French philosopher René Descartes delivered the final blow, arguing that the uniqueness of man stems from his ownership of reason, which animals are incapable of possessing, and which grants him dominion over them.

Everyone knows what it’s like to forget someone’s name. It could be the name of a celebrity and the need to remember might be non-existent, and yet, as though finding this name might be an antidote to looming senility, it’s hard to let go of such a compulsion until it is satisfied.

From infancy we are taught that success in life requires an unceasing commitment to colonize the world with language. To be lost for words is to be left out.

Without the ability to speak or understand, we would lose our most vital connection with the rest of humanity.

Montaigne understood that it was a human conceit to imagine that among all creatures, we were the only ones endowed with the capacity to communicate:

Can there be a more formall and better ordained policie, divided into so severall charges and offices, more constantly entertained, and better maintained, than that of Bees? Shall we imagine their so orderly disposing of their actions, and managing of their vocations, have so proportioned and formall a conduct without discourse, reason, and forecast?

What Montaigne logically inferred in the 1500s, science would confirm centuries later.

While Stassa Edwards enumerates the many expressions of a human desire for animals to speak, my sense is that behind this desire there is an intuition about the limitations of language: that our mute companions often see more because they can say less.

We view language as a prism that allows us to perceive order in the world, and yet this facility in representation is so successful and elegantly structured that most of the time we see the representations much more clearly than we see the world.

Our ability to describe and analyze the world has never been more advanced than it is today, and yet for millennia, humans have observed that animals seem to be able to do something that we cannot: anticipate earthquakes.

Perhaps our word-constructed world only holds together on condition that our senses remain dull.

The world we imagine we can describe, quantify, and control, is in truth a world we barely understand.

A conversation with Adam Curtis

Jon Ronson writes: I’ve known Adam Curtis for nearly 20 years. We’re friends. We see movies together, and once even went to Romania on a mini-break to attend an auction of Nicolae Ceausescu’s belongings. But it would be wrong to characterise our friendship as frivolous. Most of the time when we’re together I’m just intensely cross-questioning him about some new book idea I have.

Sometimes Adam will say something that seems baffling and wrong at the time, but makes perfect sense a few years later. I could give you lots of examples, but here’s one: I’m about to publish a book – So You’ve Been Publicly Shamed – about how social media is evolving into a cold and conservative place, a giant echo chamber where what we believe is constantly reinforced by people who believe the same thing, and when people step out of line in the smallest ways we destroy them. Adam was warning me about Twitter’s propensity to turn this way six years ago, when it was still a Garden of Eden. Sometimes talking to Adam feels like finding the results of some horse race of the future, where the long-shot horse wins.

I suppose it’s no surprise that Adam would notice this stuff about social media so early on. It’s what his films are almost always about – power and social control. However, people don’t only enjoy them for the subject matter, but for how they look, too – his wonderful, strange use of archive.

His new film, Bitter Lake, is his most experimental yet. And I think it’s his best. It’s still journalism: it’s about our relationship with Afghanistan, and how we don’t know what to do, and so we just repeat the mistakes of the past. But he’s allowed his use of archive to blossom crazily. Fifty percent of the film has no commentary. Instead, he’s created this dreamlike, fantastical collage from historical footage and raw, unedited news footage. Sometimes it’s just a shot of a man walking down a road in some Afghan town, and you don’t know why he’s chosen it, and then something happens and you think, ‘Ah!’ (Or, more often, ‘Oh God.’) It might be something small and odd. Or it might be something huge and terrible.

Nightmarish things happen in Bitter Lake. There are shots of people dying. It’s a film that could never be on TV. It’s too disturbing. And it’s too long as well – nearly two and a half hours. And so he’s putting it straight onto BBC iPlayer. I think, with this film, he’s invented a whole new way of telling a nonfiction story.

VICE asked the two of us to have an email conversation about his work. We started just before Christmas, and carried on until after the New Year. [Continue reading…]

Satire shouldn’t promote ignorance

Alex Andreou writes: In the wake of recent attacks in France, a rule of thumb appears to be emerging: of course we should be free to mock Islam, but we should do it with respect. This might seem irreconcilable, but in practice it is perfectly achievable.

Satire has been a tool for expanding the boundaries of free expression since Aristophanes. It does so most effectively by being hyper-aware of those boundaries, not ignorant of them. When it is done with the sole intention to offend, it creates disharmony. When the intention is to entertain and challenge, the effect is quite the opposite.

Recently I played Arshad – a sort of cuddly version of Abu Hamza – in David Baddiel’s musical rendering of The Infidel: a farce in which a British Muslim discovers he is adopted and is actually Jewish, on the eve of his son’s nuptials to a fundamentalist’s daughter. The entire cast and creative team were obsessively attentive to religious detail, both Muslim and Jewish. Precisely how do women tie the niqab? What is the correct pronunciation and meaning of HaMotzi? With which hand would a Muslim hold the Qur’an, and how? Which way is the tallit worn, and why? Hours of research and discussion.

Backstage, after a particular scene in which we did a stylised cipher based on morning prayers, we folded our prayer mats carefully and put them away respectfully. They were just props, so why did it matter? Because they looked like prayer mats and seeing them discarded grated on members of the team who came from a Muslim background – even if they were not religious. Such instincts are deeply ingrained.

All this may seem precious, especially when one is about to launch into a ska musical number entitled Put a Fatwa on It, but it is not. The point is artistic control. You want to challenge an audience in precisely the way you intended – not because you are eating with the wrong hand. One is not careful out of a fear to offend, but out of a fear to offend randomly. Just because something is a legitimate target does not mean that one should have a go at it with a rocket launcher. Rockets inflict collateral damage. [Continue reading…]

Trying to read scrolls that can’t be read

The Economist: In 1752 Camillo Paderni, an artist who had been put in charge of the growing pile of antiquities being dug up at Herculaneum, a seaside town near Naples, wrote to a certain Dr Mead, who then wrote to the Royal Society in London reporting that “there were found many volumes of papyrus but turned to a sort of charcoal, and so brittle, that being touched, it fell to ashes. Yet by His Majesty’s orders he made many trials to open them, but all to no purpose; excepting some scraps containing some words.”

The excavation at Herculaneum — which, like nearby Pompeii, was buried in 79AD under ash from Mount Vesuvius — had uncovered a literary time capsule. What came to be called the Villa of the Papyri contained a library of perhaps 2,000 books, the only such collection known to have been preserved from antiquity.

Actually reading these scrolls has, however, proved both tricky and destructive — until now. A paper just published in Nature Communications by Vito Mocella of the Institute for Microelectronics and Microsystems, in Naples, describes a way to decipher them without unrolling them.

Don’t restart Europe’s wars of religion

Pankaj Mishra writes: On Jan. 7, the day jihadists attacked the satirical weekly Charlie Hebdo and a Jewish supermarket in France, I was in a small village in Anatolia, Turkey. I had barely registered the horrifying news when a friend forwarded me a tweet from New York Times columnist Roger Cohen. “The entire free world,” it read, “should respond, ruthlessly.”

For a few seconds I was pulled back into the Cold War when Turkey, a NATO member, was technically part of the “free world.” Even back then the category was porous: Ronald Reagan included in it the jihadists fighting the Soviet army in Afghanistan.

The words seem more anachronistic a quarter century later. Our complex and often bewildering political landscape is only superficially similar to the world we knew then. Devout Anatolian masses rising from poverty have transformed Turkey politically and economically. I did not dare show Charlie Hebdo’s cartoons to the local villagers who pass my house several times every day en route to the mosque next door, let alone argue that the magazine had the right to publish them.

There is no disagreement, except from fanatics, about the viciousness of the murderers, and the need to bring their associates to justice. But the aftermath of the attacks revealed strikingly different ways of looking at the broader issues around them: Our views on free speech, secularism, and the nature of religious hurt turn out to be shaped by particular historical and socioeconomic circumstances. [Continue reading…]

Submission

I am powerless and my life is out of control.
I believe a higher power can restore my sanity.
I submit to the will of God, the only power that can guide my life.

OK. I neither believe in God nor am I an alcoholic, but I based the lines above on the first three steps of the twelve-step program created by Alcoholics Anonymous just to convey the fact that submission to the will of God is a practice (or aspiration) that shapes the lives of millions of Americans — people who might not necessarily describe themselves as religious.

Soumission (Submission) is the title of Michel Houellebecq’s new novel — a book which even before its release this week and before the Charlie Hebdo shootings took place, had stirred a huge amount of controversy in France since it depicts a not-too-distant future in which the French submit to Islamic rule.

Given that premise, it’s not hard to see why Houellebecq is being accused of pandering to the fears of the far right — of those who believe in the National Front’s slogan, “France for the French.” But while Houellebecq’s appetite for controversy is undeniable, he says he’s neither trying to defend secularism nor fuel Islamophobia.

In an interview with The Paris Review, Houellebecq says that he thought he was an atheist but was really an agnostic.

Usually that word serves as a screen for atheism but not, I think, in my case. When, in the light of what I know, I reexamine the question whether there is a creator, a cosmic order, that kind of thing, I realize that I don’t actually have an answer.

The Economist summarizes Soumission in this way:

The novel, which has not yet been translated into English, is narrated by François, a literature professor at the Sorbonne, who drifts between casual sex and microwaved ready-made meals in a state of wry detachment and ennui. Then, in an imaginary France of 2022, a political earthquake shakes him out of his torpor. The two mainstream parties, on the left and the right, are eliminated in the first round of a presidential election. This leaves French voters with the choice between Marine Le Pen’s populist National Front—and the Muslim Fraternity, a new party led by Mohammed Ben Abbes. Thanks to an anti-Le Pen front, Mr Ben Abbes is elected and thus begins Muslim rule.

After a period of disorder, France returns to a strange calm under its apparently moderate new Muslim president; and François, who fled briefly, returns to Paris. But the city, and his university, are unrecognisable. More women are veiled, and give up work to look after their menfolk (helping to bring down France’s unemployment rate). Polygamy is made legal. France embarks on a geopolitical project to merge Europe with Muslim Mediterranean states. Saudi Arabia has poured petrodollars into better pay for professors and posh apartments on the city’s left bank. And his own university has been rebranded the Islamic University of Paris-Sorbonne. Will François, an atheist, resist, or flee the new regime or compromise with it?

While this sounds like a graphic representation of Islamophobic fears prevalent not only in France but across much of Europe, Houellebecq says:

I tried to put myself in the place of a Muslim, and I realized that, in reality, they are in a totally schizophrenic situation. Because overall Muslims aren’t interested in economic issues, their big issues are what we nowadays call societal issues. On these issues, obviously, they are very far from the left and even further from the Green Party. Just think of gay marriage and you’ll see what I mean, but the same is true across the board. And one doesn’t really see why they’d vote for the right, much less for the extreme right, which utterly rejects them. So if a Muslim wants to vote, what’s he supposed to do? The truth is, he’s in an impossible situation. He has no representation whatsoever.

I think there is a real need for God and that the return of religion is not a slogan but a reality, and that it is very much on the rise.

That hypothesis is central to the book, but we know that it has been discredited for many years by numerous researchers, who have shown that we are actually witnessing a progressive secularization of Islam, and that violence and radicalism should be understood as the death throes of Islamism. That is the argument made by Olivier Roy, and many other people who have worked on this question for more than twenty years.

This is not what I have observed, although in North and South America, Islam has benefited less than the evangelicals. This is not a French phenomenon, it’s almost global. I don’t know about Asia, but the case of Africa is interesting because there you have the two great religious powers on the rise — evangelical Christianity and Islam. I remain in many ways a Comtean, and I don’t believe that a society can survive without religion.

[I]n your book you describe, in a very blurry and vague way, various world events, and yet the reader never knows quite what these are. This takes us into the realm of fantasy, doesn’t it, into the politics of fear.

Yes, perhaps. Yes, the book has a scary side. I use scare tactics.

Like imagining the prospect of Islam taking over the country?

Actually, it’s not clear what we are meant to be afraid of, nativists or Muslims. I leave that unresolved.

Have you asked yourself what the effect might be of a novel based on such a hypothesis?

None. No effect whatsoever.

You don’t think it will help reinforce the image of France that I just described, in which Islam hangs overhead like the sword of Damocles, like the most frightening thing of all?

In any case, that’s pretty much all the media talks about, they couldn’t talk about it more. It would be impossible to talk about it more than they already do, so my book won’t have any effect.

Doesn’t it make you want to write about something else so as not to join the pack?

No, part of my work is to talk about what everyone is talking about, objectively. I belong to my own time.

[Y]our book describes the replacement of the Catholic religion by Islam.

No. My book describes the destruction of the philosophy handed down by the Enlightenment, which no longer makes sense to anyone, or to very few people. Catholicism, by contrast, is doing rather well. I would maintain that an alliance between Catholics and Muslims is possible. We’ve seen it happen before, it could happen again.

You who have become an agnostic, you can look on cheerfully and watch the destruction of Enlightenment philosophy?

Yes. It has to happen sometime and it might as well be now. In this sense, too, I am a Comtean. We are in what he calls the metaphysical stage, which began in the Middle Ages and whose whole point was to destroy the phase that preceded it. In itself, it can produce nothing, just emptiness and unhappiness. So yes, I am hostile to Enlightenment philosophy, I need to make that perfectly clear.

[I]f Catholicism doesn’t work, that’s because it’s already run its course, it seems to belong to the past, it has defeated itself. Islam is an image of the future. Why has the idea of the Nation stalled out? Because it’s been abused too long.

Some might be surprised that you chose to go in this direction when your last book was greeted as such a triumph that it silenced your critics.

The true answer is that, frankly, I didn’t choose. The book started with a conversion to Catholicism that should have taken place but didn’t.

Isn’t there something despairing about this gesture, which you didn’t really choose?

The despair comes from saying good-bye to a civilization, however ancient. But in the end the Koran turns out to be much better than I thought, now that I’ve reread it — or rather, read it. The most obvious conclusion is that the jihadists are bad Muslims. Obviously, as with all religious texts, there is room for interpretation, but an honest reading will conclude that a holy war of aggression is not generally sanctioned, prayer alone is valid. So you might say I’ve changed my opinion. That’s why I don’t feel that I’m writing out of fear.

In its crudest expressions, the Clash of Cultures discourse presents a Christian West threatened by Islam, but many of those who reject this narrative use one that is no less polarizing. It presents secular moderates challenged by Islamic extremists — it’s still Religion vs. The Enlightenment, superstition vs. reason.

Much as the West promotes the idea of religious freedom in the context of civil liberties, religion is meant to be a private affair that doesn’t intrude into the social sphere outside the carefully circumscribed territories of church, temple, and mosque. We expect religious freedom to be coupled with religious restraint.

The real struggle, it seems to me, is not ultimately philosophical and theological — it’s not about the existence or non-existence of God. It’s about values.

What counts are not the values that serve as emblems of identity (often wrapped around nationalism), but those that guide individual action and shape society.

We profess values which are libertarian and egalitarian and yet have created societies in which the guiding values are those of materialism, competition, and personal autonomy — values that are all socially corrosive.

Society is relentlessly being atomized, reduced to a social unit of one, captured in the lonely image of the selfie. This is what we’ve been sold and what we’ve bought, but I don’t think it’s what we want.

Spellbound by technological progress, we have neither expected nor demanded that material advances should lead to social advances — that better equipped societies should also be better functioning, happier, more caring societies.

What the false promise of materially sustained, individual autonomy has created is the expectation that the more control we possess over life, the better it will get. We imagine that we must either be in control or fall under control.

From this vantage point, the concept of submission provokes fears of domination, and yet all it really means is to come into alignment with the way things are.

Where religion intrudes and so often fails is through the forcible imposition of rigid representations of such an alignment. But submission itself means seeing we belong to life — something that cannot be possessed or controlled.

Friluftsliv, shinrin-yoku, hygge, wabi-sabi, kaizen, gemütlichkeit, and jugaad?

Starre Vartan writes about cultural concepts most of us have never heard of: Friluftsliv translates directly from Norwegian as “free air life,” which doesn’t quite do it justice. Coined relatively recently, in 1859, it is the concept that being outside is good for human beings’ mind and spirit. “It is a term in Norway that is used often to describe a way of life that is spent exploring and appreciating nature,” Anna Stoltenberg, culture coordinator for Sons of Norway, a U.S.-based Norwegian heritage group, told MNN. Other than that, it’s not a strict definition: it can include sleeping outside, hiking, taking photographs or meditating, playing or dancing outside, for adults or kids. It doesn’t require any special equipment, includes all four seasons, and needn’t cost much money. Practicing friluftsliv could be as simple as making a commitment to walking in a natural area five days a week, or doing a day-long hike once a month.

Shinrin-yoku is a Japanese term that means “forest bathing” and unlike the Norwegian translation above, this one seems a perfect language fit (though a pretty similar idea). The idea is that spending time in the forest and natural areas is good preventative medicine, since it lowers stress, which causes or exacerbates some of our most intractable health issues. As MNN’s Catie Leary details, this isn’t just a nice idea — there’s science behind it: “The ‘magic’ behind forest bathing boils down to the naturally produced allelochemic substances known as phytoncides, which are kind of like pheromones for plants. Their job is to help ward off pesky insects and slow the growth of fungi and bacteria. When humans are exposed to phytoncides, these chemicals are scientifically proven to lower blood pressure, relieve stress and boost the growth of cancer-fighting white blood cells. Some common examples of plants that give off phytoncides include garlic, onion, pine, tea tree and oak, which makes sense considering their potent aromas.” [Continue reading…]

Neil Postman: The man who predicted Fox News, the internet, Stephen Colbert and reality TV

Scott Timberg writes: These days, even the kind of educated person who might have once disdained TV and scorned electronic gadgets debates plot turns from “Game of Thrones” and carries an app-laden iPhone. The few left concerned about the effects of the Internet are dismissed as Luddites or killjoys who are on the wrong side of history. A new kind of consensus has shaped up as Steve Jobs becomes the new John Lennon, Amanda Palmer the new Liz Phair, and Elon Musk’s rebel cool graces magazine covers. Conservatives praise Silicon Valley for its entrepreneurial energy; a Democratic president steers millions of dollars of funding to Amazon.

It seems like a funny era for the work of a cautionary social critic, one often dubious about the wonders of technology – including television — whose most famous book came out three decades ago. But the neoliberal post-industrial world now looks chillingly like the one Neil Postman foresaw in books like “Amusing Ourselves to Death” and “Technopoly: The Surrender of Culture to Technology.” And the people asking the important questions about where American society is going are taking a page from him.

“Amusing Ourselves” didn’t argue that regular TV shows were bad or dangerous. It insisted instead that the medium would reshape every other sphere with which it engaged: By using the methods of entertainment, TV would trivialize what the book jacket calls “politics, education, religion, and journalism.”

“It just blew me away,” says D.C.-based politics writer Matt Bai, who read the 1985 book “Amusing Ourselves to Death” while trying to figure out how the press and media became obsessed with superficiality beginning in the ‘80s. “So much of what I’d been thinking about was pioneered so many years before,” says Bai – whose recent book, “All the Truth Is Out: The Week Politics Went Tabloid,” looks at the 1987 Gary Hart sex scandal that effectively ended the politician’s career. “It struck me as incredibly relevant … And the more I reported the book, the more relevant it became.”

Bai isn’t alone. While he’s hardly a household name, Postman has become an important guide to the world of the Internet though most of his work was written before its advent. Astra Taylor, a documentary filmmaker and Occupy activist, turned to his books while she was plotting out what became “The People’s Platform: Taking Back Power and Culture in the Digital Age.” Douglas Rushkoff — a media theorist whose book “Present Shock: When Everything Happens Now,” is one of the most lucid guides to our bewildering age — is indebted to his work. Michael Harris’ recent “The End of Absence” is as well. And Jaron Lanier, the virtual-reality inventor and author (“Who Owns the Future?”) who’s simultaneously critic and tech-world insider, sees Postman as an essential figure whose work becomes more crucial every year.

“There’s this kind of dialogue around technology where people dump on each other for ‘not getting it,’” Lanier says. “Postman does not seem to be vulnerable to that accusation: He was old-fashioned but he really transcended that. I don’t remember him saying, ‘When I was a kid, things were better.’ He called on fundamental arguments in very broad terms – the broad arc of human history and ethics.” [Continue reading…]

Why has progress stalled?

Michael Hanlon writes: We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.

The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.

Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.

There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.

Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. [Continue reading…]

E.O. Wilson talks about the threat to Earth’s biodiversity

How civilization has given humans brittle bones

Nicholas St. Fleur writes: Somewhere in a dense forest of ash and elm trees, a hunter readies his spear for the kill. He hurls his stone-tipped weapon at his prey, an unsuspecting white-tailed deer he has tracked since morning. The crude projectile pierces the animal’s hide, killing it and giving the hunter food to bring back to his family many miles away. Such was survival circa 5,000 B.C. in ancient North America.

But today, the average person barely has to lift a finger, let alone throw a spear to quell their appetite. The next meal is a mere online order away. And according to anthropologists, this convenient, sedentary way of life is making bones weak. Ahead, there’s a future of fractures, breaks, and osteoporosis. But for some anthropologists, the key to preventing aches in bones is a better understanding of the skeletons of our hunter-gatherer ancestors.

“Over the vast majority of human prehistory, our ancestors engaged in far more activity over longer distances than we do today,” said Brian Richmond, an anthropologist from the American Museum of Natural History in New York, in a statement. “We cannot fully understand human health today without knowing how our bodies evolved to work in the past, so it is important to understand how our skeletons evolved within the context of those high levels of activity.”

For thousands of years, Native American hunter-gatherers trekked on strenuous ventures for food. And for those same thousands of years, dense skeletons supported their movements. But about 6,000 years later, with the advent of agriculture, the bones and joints of Native Americans became less rigid and more fragile. Similar transitions occurred across the world as populations shifted from foraging to farming, according to two new papers published Monday in the Proceedings of the National Academies of Sciences. [Continue reading…]

UN cites humanity’s immeasurable loss in Syria’s war

AFP reports: Nearly 300 sites of incalculable value for Syria and human history have been destroyed, damaged or looted in almost four years of war, the U.N. said Tuesday, citing “alarming” satellite evidence.

From prehistoric settlements and ancient markets to world-famous mosques and Crusader castles, Syria is home to countless treasures.

However, since the country’s brutal war erupted in 2011, heritage sites have been plundered by all sides – regime loyalists, anti-government rebels, jihadi fighters and even desperate residents.

After a major survey, the United Nations said that detailed analysis of satellite images from several hundred sites had unearthed the full scale of the damage. [Continue reading…]


Co-operation

Patrick Bateson writes: I am disturbed by the way we have created a social environment in which so much emphasis is laid on competition – on forging ahead while trampling on others. The ideal of social cooperation has come to be treated as high-sounding flabbiness, while individual selfishness is regarded as the natural and sole basis for a realistic approach to life. The image of the struggle for existence lies at the back of it, seriously distorting the view we have of ourselves and wrecking mutual trust.

The fashionable philosophy of individualism draws its respectability in part from an appeal to biology and specifically to the Darwinian theory of evolution by natural selection. Now, Darwin’s theory remains the most powerful explanation for the way that each plant and animal evolved so that it is exquisitely adapted to its environment. The theory works just as well for behaviour as it does for anatomy. Individual animals differ in the way they behave. Those that behave in a manner that is better suited to the conditions in which they live are more likely to survive. Finally, if their descendants resemble them in terms of behaviour, then in the course of evolution, the better adapted forms of behaviour will replace those that are not so effective in keeping the individual alive.

It is the Darwinian concept of differential survival that has been picked up and used so insistently in political rhetoric. Biology is thought to be all about competition – and that supposedly means constant struggle. This emphasis has had an insidious effect on the public mind and has encouraged the belief in individual selfishness and in confrontation. Competition is now widely seen as the mainspring of human activity, at least in Western countries. Excellence in the universities and in the arts is thought to be driven by the same ruthless process that supposedly works so well on the sports field or in the marketplace, and they all have a lot in common with what supposedly happens in the jungle. The image of selfish genes competing with each other in the course of evolution has fused imperceptibly with the notion of selfish individuals competing with each other in the course of their lifetimes. Individuals only thrive by winning. The argument has become so much a part of conventional belief that it is hard at first to see what is wrong with it.

To put it bluntly, thought has been led seriously astray by the rhetoric. [Continue reading…]


Slavoj Žižek: What is freedom today?


How broken sleep can unleash creativity

Karen Emslie writes: It is 4.18am. In the fireplace, where logs burned, there are now orange lumps that will soon be ash. Orion the Hunter is above the hill. Taurus, a sparkling V, is directly overhead, pointing to the Seven Sisters. Sirius, one of Orion’s heel dogs, is pumping red-blue-violet, like a galactic disco ball. As the night moves on, the old dog will set into the hill.

It is 4.18am and I am awake. Such early waking is often viewed as a disorder, a glitch in the body’s natural rhythm – a sign of depression or anxiety. It is true that when I wake at 4am I have a whirring mind. And, even though I am a happy person, if I lie in the dark my thoughts veer towards worry. I have found it better to get up than to lie in bed teetering on the edge of nocturnal lunacy.

If I write in these small hours, black thoughts become clear and colourful. They form themselves into words and sentences, hook one to the next – like elephants walking trunk to tail. My brain works differently at this time of night; I can only write, I cannot edit. I can only add, I cannot take away. I need my day-brain for finesse. I will work for several hours and then go back to bed.

All humans, animals, insects and birds have clocks inside, biological devices controlled by genes, proteins and molecular cascades. These inner clocks are connected to the ceaseless yet varying cycle of light and dark caused by the rotation and tilt of our planet. They drive primal physiological, neural and behavioural systems according to a roughly 24-hour cycle, otherwise known as our circadian rhythm, affecting our moods, desires, appetites, sleep patterns, and sense of the passage of time.

The Romans, Greeks and Incas woke up without iPhone alarms or digital radio clocks. Nature was their timekeeper: the rise of the sun, the dawn chorus, the needs of the field or livestock. Sundials and hourglasses recorded the passage of time until the 14th century when the first mechanical clocks were erected on churches and monasteries. By the 1800s, mechanical timepieces were widely worn on neck chains, wrists or lapels; appointments could be made and meal- or bed-times set.

Societies built around industrialisation and clock-time brought with them urgency and the concept of being ‘on time’ or having ‘wasted time’. Clock-time became increasingly out of synch with natural time, yet light and dark still dictated our working day and social structures.

Then, in the late 19th century, everything changed. [Continue reading…]


Societies in harsh environments more likely to believe in moralizing high gods

EurekAlert!: Just as physical adaptations help populations prosper in inhospitable habitats, belief in moralizing, high gods might be similarly advantageous for human cultures in poorer environments. A new study from the National Evolutionary Synthesis Center (NESCent) suggests that societies with less access to food and water are more likely to believe in these types of deities.

“When life is tough or when it’s uncertain, people believe in big gods,” says Russell Gray, a professor at the University of Auckland and a founding director of the Max Planck Institute for the Science of Human History in Jena, Germany. “Prosocial behavior maybe helps people do well in harsh or unpredictable environments.”

Gray and his coauthors found a strong correlation between belief in high gods who enforce a moral code and other societal characteristics. Political complexity (namely, a social hierarchy beyond the local community) and the practice of animal husbandry were both strongly associated with a belief in moralizing gods.

The emergence of religion has long been explained as a result of either cultural or environmental factors, but not both. The new findings imply that complex practices and characteristics thought to be exclusive to humans arise from a medley of ecological, historical, and cultural variables.

“When researchers discuss the forces that shaped human history, there is considerable disagreement as to whether our behavior is primarily determined by culture or by the environment,” says primary author Carlos Botero, a researcher at the Initiative for Biological Complexity at North Carolina State University. “We wanted to throw away all preconceived notions regarding these processes and look at all the potential drivers together to see how different aspects of the human experience may have contributed to the behavioral patterns we see today.” [Continue reading…]


Gossip makes human society possible

Julie Beck writes: While gossiping is a behavior that has long been frowned upon, perhaps no one has frowned quite so intensely as the 16th- and 17th-century British. Back then, gossips, or “scolds,” were sometimes forced to wear a menacing iron cage on their heads, called the “branks” or “scold’s bridle.” These masks purportedly had iron spikes or bits that went in the mouth and prevented the wearer from speaking. (And of course, this ghastly punishment seems to have been reserved mostly for women who were talking too much.)

Today, people who gossip are still not very well-liked, though we tend to resist the urge to cage their heads. Progress. And yet the reflexive distaste people feel for gossip and those who gossip in general is often nowhere to be found when people find themselves actually faced with a juicy morsel about someone they know. Social topics—personal relationships, likes and dislikes, anecdotes about social activities—made up about two-thirds of all conversations in analyses done by evolutionary psychologist Robin Dunbar. The remaining one-third of conversation time, not spent talking about other people, was devoted to discussing everything else: sports, music, politics, etc.

“Language in freely forming natural conversations is principally used for the exchange of social information,” Dunbar writes. “That such topics are so overwhelmingly important to us suggests that this is a primary function of language.” He even goes so far as to say: “Gossip is what makes human society as we know it possible.”

In recent years, research on the positive effects of gossip has proliferated. Rather than just a means to humiliate people and make them cry in the bathroom, gossip is now being considered by scientists as a way to learn about cultural norms, bond with others, promote cooperation, and even, as one recent study found, allow individuals to gauge their own success and social standing. [Continue reading…]


Slaves of productivity

Quinn Norton writes: We dream now of making Every Moment Count, of achieving flow and never leaving, creating one project that must be better than the last, of working harder and smarter. We multitask, we update, and we conflate status with long hours worked in no-paid-overtime systems for the nebulous and fantastic status of being Too Important to have Time to Ourselves, time to waste. But this incarnation of the American dream is all about doing, and nothing about doing anything good, or even thinking about what one is doing beyond how to do more of it more efficiently. It is not even the surrender to hedonism and debauchery or greed our literary dreams have recorded before. It is a surrender to nothing, to a nothingness of lived accounting.

This moment’s goal of productivity, with its all-consuming practice and unattainable horizon, is perfect for our current corporate world. Productivity never asks what it builds, just how much of it can be piled up before we leave or die. It is irrelevant to pleasure. It’s agnostic about the fate of humanity. It’s not even selfish, because production negates the self. Self can only be a denominator, holding up a dividing bar like a caryatid trying to hold up a stone roof.

I am sure this started with the Industrial Revolution, but what has swept through this generation is more recent. This idea of productivity started in the 1980s, with the lionizing of the hardworking greedy. There’s a critique of late capitalism to be had for sure, but what really devastated my generation was the spiritual malaise inherent in Taylorism’s perfectly mechanized human labor. But Taylor had never seen a robot or a computer perfect his methods of being human. By the 1980s, we had. In the age of robots we reinvented the idea of being robots ourselves. We wanted to program our minds and bodies and have them obey clocks and routines. In this age of the human robot, of the materialist mind, being efficient took the pre-eminent spot, beyond goodness or power or wisdom or even cruel greed. [Continue reading…]
