Abby Rabinowitz writes: On April 11, 2012, Zeddie Little appeared on Good Morning America, wearing the radiant, slightly perplexed smile of one enjoying instant fame. About a week earlier, Little had been a normal, if handsome, 25-year-old trying to make it in public relations. Then on March 31, he was photographed amid a crowd of runners in a South Carolina race by a stranger, Will King, who posted the image to a social networking website, Reddit. Dubbed “Ridiculously Photogenic Guy,” Little’s picture circulated on Facebook, Twitter, and Tumblr, accruing likes, comments, and captions (“Picture gets put up as employee of the month/for a company he doesn’t work for”). It spawned spinoffs (Ridiculously Photogenic Dog, Prisoner, and Syrian Rebel) and leapt to the mainstream media. At a high point, ABC Morning News reported that a Google search for “Zeddie Little” yielded 59 million hits.
Why the sudden fame? The truth is that Little hadn’t become famous: His meme had. According to website Know Your Meme, which documents viral Internet phenomena, a meme is “a piece of content or an idea that’s passed from person to person, changing and evolving along the way.” Ridiculously Photogenic Guy is a kind of Internet meme represented by LOL cats: that is, a photograph, video, or cartoon, often overlaid with a snarky message, perfect for incubating in the bored, fertile minds of cubicle workers and college students. In an age where politicians campaign through social media and viral marketers ponder the appeal of sneezing baby pandas, memes are more important than ever—however trivial they may seem.
But trawling the Internet, I found a strange paradox: While memes were everywhere, serious meme theory was almost nowhere. Richard Dawkins, the famous evolutionary biologist who coined the word “meme” in his classic 1976 book, The Selfish Gene, seemed bent on disowning the Internet variety, calling it a “hijacking” of the original term. The peer-reviewed Journal of Memetics folded in 2005. “The term has moved away from its theoretical beginnings, and a lot of people don’t know or care about its theoretical use,” philosopher and meme theorist Daniel Dennett told me. What has happened to the idea of the meme, and what does that evolution reveal about its usefulness as a concept? [Continue reading…]
Can a dying language be saved?
Judith Thurman writes: It is a singular fate to be the last of one’s kind. That is the fate of the men and women, nearly all of them elderly, who are — like Marie Wilcox, of California; Gyani Maiya Sen, of Nepal; Verdena Parker, of Oregon; and Charlie Mungulda, of Australia — the last known speakers of a language: Wukchumni, Kusunda, Hupa, and Amurdag, respectively. But a few years ago, in Chile, I met Joubert Yanten Gomez, who told me he was “the world’s only speaker of Selk’nam.” He was twenty-one.
Yanten Gomez, who uses the tribal name Keyuk, grew up modestly, in Santiago. His father, Blas Yanten, is a woodworker, and his mother, Ivonne Gomez Castro, practices traditional medicine. As a young girl, she was mocked at school for her mestizo looks, so she hesitated to tell her children — Keyuk and an older sister — about their ancestry. They hadn’t known that their maternal relatives descended from the Selk’nam, a nomadic tribe of unknown origin that settled in Tierra del Fuego. The first Europeans to encounter the Selk’nam, in the sixteenth century, were astonished by their height and their hardiness — they braved the frigid climate by coating their bodies with whale fat. The tribe lived mostly undisturbed until the late eighteen-hundreds, when an influx of sheep ranchers and gold prospectors who coveted their land put bounties on their heads. (One hunter boasted that he had received a pound sterling per corpse, redeemable with a pair of ears.) The survivors of the Selk’nam Genocide, as it is called — a population of about four thousand was reduced to some three hundred — were resettled on reservations run by missionaries. The last known fluent speaker of the language, Angela Loij, a laundress and farmer, died forty years ago.
Many children are natural mimics, but Keyuk could imitate speech like a mynah. His father, who is white, had spent part of his childhood in the Arauco region, which is home to the Mapuche, Chile’s largest native community, and he taught Keyuk their language, Mapudungun. The boy, a bookworm and an A student, easily became fluent. A third-grade research project impassioned him about indigenous peoples, and Ivonne, who descends from a line of shamans, took this as a sign that his ancestors were speaking through him. When she told him of their heritage, Keyuk vowed that he would master Selk’nam and also, eventually, Yagán — the nearly extinct language of a neighboring people in the far south — reckoning that he could pass them down to his children and perhaps reseed the languages among the tribes’ descendants. At fourteen, he travelled with his father to Puerto Williams, a town in Chile’s Antarctic province that calls itself “the world’s southernmost city,” to meet Cristina Calderón, the last native Yagán speaker. She subsequently tutored him by phone. [Continue reading…]
A phony populism is denying Americans the joys of serious thought
Steve Wasserman writes: The vast canvas afforded by the Internet has done little to encourage thoughtful and serious criticism. Mostly it has provided a vast Democracy Wall on which any crackpot can post his or her manifesto. Bloggers bloviate and insults abound. Discourse coarsens. Information is abundant, wisdom scarce. It is a striking irony, as Leon Wieseltier has noted, that with the arrival of the Internet, “a medium of communication with no limitations of physical space, everything on it has to be in six hundred words.” The Internet, he said, is the first means of communication invented by humankind that privileges one’s first thoughts as one’s best thoughts. And he rightly observed that if “value is a function of scarcity,” then “what is most scarce in our culture is long, thoughtful, patient, deliberate analysis of questions that do not have obvious or easy answers.” Time is required to think through difficult questions. Patience is a condition of genuine intellection. The thinking mind, the creating mind, said Wieseltier, should not be rushed. “And where the mind is rushed and made frenetic, neither thought nor creativity will ensue. What you will most likely get is conformity and banality. Writing is not typed talking.”
The fundamental idea at stake in the criticism of culture generally is the self-image of society: how it reasons with itself, describes itself, imagines itself. Nothing in the excitements made possible by the digital revolution banishes the need for the rigor such self-reckoning requires. It is, as Wieseltier says, the obligation of cultural criticism to bear down on what matters. [Continue reading…]
Steven Pinker is wrong about violence and war
In an essay challenging Steven Pinker’s thesis that the world is becoming progressively more peaceful, John Gray writes: While it is true that war has changed, it has not become less destructive. Rather than a contest between well-organised states that can at some point negotiate peace, it is now more often a many-sided conflict in fractured or collapsed states that no one has the power to end. The protagonists are armed irregulars, some of them killing and being killed for the sake of an idea or faith, others from fear or a desire for revenge and yet others from the world’s swelling armies of mercenaries, who fight for profit. For all of them, attacks on civilian populations have become normal. The ferocious conflict in Syria, in which methodical starvation and the systematic destruction of urban environments are deployed as strategies, is an example of this type of warfare.
It may be true that the modern state’s monopoly of force has led, in some contexts, to declining rates of violent death. But it is also true that the power of the modern state has been used for purposes of mass killing, and one should not pass too quickly over victims of state terror. With increasing historical knowledge it has become clear that the “Holocaust-by-bullets” – the mass shootings of Jews, mostly in the Soviet Union, during the second world war – was perpetrated on an even larger scale than previously realised. Soviet agricultural collectivisation incurred millions of foreseeable deaths, mainly as a result of starvation, with deportation to uninhabitable regions, life-threatening conditions in the Gulag and military-style operations against recalcitrant villages also playing an important role. Peacetime deaths due to internal repression under the Mao regime have been estimated to be around 70 million. Along with fatalities caused by state terror were unnumbered millions whose lives were irreparably broken and shortened. How these casualties fit into the scheme of declining violence is unclear. Pinker goes so far as to suggest that the 20th-century Hemoclysm [the tide of 20th-century mass murder in which Pinker includes the Holocaust] might have been a gigantic statistical fluke, and cautions that any history of the last century that represents it as having been especially violent may be “apt to exaggerate the narrative coherence of this history” (the italics are Pinker’s). However, there is an equal or greater risk in abandoning a coherent and truthful narrative of the violence of the last century for the sake of a spurious quantitative precision.
Estimating the numbers of those who die from violence involves complex questions of cause and effect, which cannot always be separated from moral judgments. There are many kinds of lethal force that do not produce immediate death. Are those who die of hunger or disease during war or its aftermath counted among the casualties? Do refugees whose lives are cut short appear in the count? Where torture is used in war, will its victims figure in the calculus if they succumb years later from the physical and mental damage that has been inflicted on them? Do infants who are born to brief and painful lives as a result of exposure to Agent Orange or depleted uranium find a place in the roll call of the dead? If women who have been raped as part of a military strategy of sexual violence die before their time, will their passing feature in the statistical tables?
While the seeming exactitude of statistics may be compelling, much of the human cost of war is incalculable. Deaths by violence are not all equal. It is terrible to die as a conscript in the trenches or a civilian in an aerial bombing campaign, but to perish from overwork, beating or cold in a labour camp can be a greater evil. It is worse still to be killed as part of a systematic campaign of extermination as happened to those who were consigned to death camps such as Treblinka. Disregarding these distinctions, the statistics presented by those who celebrate the arrival of the Long Peace are morally dubious if not meaningless. [Continue reading…]
A deficit in patience produces the illusion of a shortage of time
Chelsea Wald writes: Not long ago I diagnosed myself with the recently identified condition of sidewalk rage. It’s most pronounced when it comes to a certain friend who is a slow walker. Last month, as we sashayed our way to dinner, I found myself biting my tongue, thinking, I have to stop going places with her if I ever want to … get there!
You too can measure yourself on the “Pedestrian Aggressiveness Syndrome Scale,” a tool developed by University of Hawaii psychologist Leon James. While walking in a crowd, do you find yourself “acting in a hostile manner (staring, presenting a mean face, moving closer or faster than expected)” and “enjoying thoughts of violence?”
Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines — they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age.
Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin and yang balance, a finely tuned internal timer that tells us when we’ve waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.
“Why are we impatient? It’s a heritage from our evolution,” says Marc Wittmann, a psychologist at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. Impatience made sure we didn’t die from spending too long on a single unrewarding activity. It gave us the impulse to act.
But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough — or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay. [Continue reading…]
The mythical secular civilization promoted by evangelical atheists
John Gray writes: Considering the alternatives that are on offer, liberal societies are well worth defending. But there is no reason for thinking these societies are the beginning of a species-wide secular civilisation of the kind of which evangelical atheists dream.
In ancient Greece and Rome, religion was not separate from the rest of human activity. Christianity was less tolerant than these pagan societies, but without it the secular societies of modern times would hardly have been possible. By adopting the distinction between what is owed to Caesar and what to God, Paul and Augustine – who turned the teaching of Jesus into a universal creed – opened the way for societies in which religion was no longer coextensive with life. Secular regimes come in many shapes, some liberal, others tyrannical. Some aim for a separation of church and state as in the US and France, while others – such as the Ataturkist regime that until recently ruled in Turkey – assert state control over religion. Whatever its form, a secular state is no guarantee of a secular culture. Britain has an established church, but despite that fact – or more likely because of it – religion has a smaller role in politics than in America and is less publicly divisive than it is in France.
There is no sign anywhere of religion fading away, but by no means all atheists have thought the disappearance of religion possible or desirable. Some of the most prominent – including the early 19th-century poet and philosopher Giacomo Leopardi, the philosopher Arthur Schopenhauer, the Austro-Hungarian philosopher and novelist Fritz Mauthner (who published a four-volume history of atheism in the early 1920s) and Sigmund Freud, to name a few – were all atheists who accepted the human value of religion. One thing these atheists had in common was a refreshing indifference to questions of belief. Mauthner – who is remembered today chiefly because of a dismissive one-line mention in Wittgenstein’s Tractatus – suggested that belief and unbelief were both expressions of a superstitious faith in language. For him, “humanity” was an apparition which melts away along with the departing Deity. Atheism was an experiment in living without taking human concepts as realities. Intriguingly, Mauthner saw parallels between this radical atheism and the tradition of negative theology in which nothing can be affirmed of God, and described the heretical medieval Christian mystic Meister Eckhart as being an atheist in this sense.
Above all, these unevangelical atheists accepted that religion is definitively human. Though not all human beings may attach great importance to them, every society contains practices that are recognisably religious. Why should religion be universal in this way? For atheist missionaries this is a decidedly awkward question. Invariably they claim to be followers of Darwin. Yet they never ask what evolutionary function this species-wide phenomenon serves. There is an irresolvable contradiction between viewing religion naturalistically – as a human adaptation to living in the world – and condemning it as a tissue of error and illusion. What if the upshot of scientific inquiry is that a need for illusion is built into the human mind? If religions are natural for humans and give value to their lives, why spend your life trying to persuade others to give them up?
The answer that will be given is that religion is implicated in many human evils. Of course this is true. Among other things, Christianity brought with it a type of sexual repression unknown in pagan times. Other religions have their own distinctive flaws. But the fault is not with religion, any more than science is to blame for the proliferation of weapons of mass destruction or medicine and psychology for the refinement of techniques of torture. The fault is in the intractable human animal. Like religion at its worst, contemporary atheism feeds the fantasy that human life can be remade by a conversion experience – in this case, conversion to unbelief. [Continue reading…]
Conversion can be thought of as an example of the miracle of neuroplasticity: that beliefs, firmly held, can, in the right circumstances, suddenly be upturned such that the world thereafter is perceived in a radically different way.
That transition is usually described in terms of a bridge that leads from weak faith, no faith, or false faith, to conviction, but as Gray points out, that bridge could also be imagined to be traversable in the opposite direction.
The mistake that all evangelicals make (be they religious evangelicals or new atheists) is to imagine that they have the right and ability to march others across this bridge.
Real conversion, by its nature, cannot be coercive, since it entails some kind of discovery and no one discovers anything under pressure from others.
In a world that remains predominantly religious, the new atheists have ostensibly embarked on a mission of staggering proportions in their effort to purge humanity of its unreasonable superstitions.
This could be viewed as a heroically ambitious undertaking, but there seem to be plenty of reasons not to see it that way.
If the new atheists genuinely hope to persuade religious believers to see the error of their ways, how can they make any progress if they start out by viewing their prospective converts with contempt?
When was it ever the first step in a genuine process of persuasion to start with the assumption that the person you are addressing is a fool?
As much as the new atheists may appear to be possessed by evangelical fervor, their appetite to condemn religion sometimes mirrors the religious fanaticism that condemns apostates.
“Some propositions are so dangerous that it may even be ethical to kill people for believing them,” writes Sam Harris in apparent agreement with the leaders of ISIS. Their only disagreement is over which propositions warrant a death sentence.
Still, much as the new atheists are often guilty of evangelical errors, I seriously doubt that their mission truly is to mount a challenge against the reign of religion.
On the contrary, I think their mission seems to have less to do with changing the world than it has with preaching to the converted. It’s about selling books, going on speaking tours, appearing on TV, amassing followers on Twitter, and doing everything else it takes to carve out a profitable cultural niche.
Who would have thought that it’s possible to pursue a career as a professional atheist? Sam Harris has, and I’m sure he has been rewarded handsomely and his success will continue, irrespective of the fate of religion.
The mystery of flying kicks
Humans and animals — the power and limitations of language
Stassa Edwards writes: In his Apology for Raymond Sebond (1576), Michel de Montaigne ascribed animals’ silence to man’s own wilful arrogance. The French essayist argued that animals could speak, that they were in possession of rich consciousness, but that man wouldn’t condescend to listen. ‘It is through the vanity of the same imagination that [man] equates himself with God,’ Montaigne wrote, ‘that he attributes divine attributes for himself, picks himself out and separates himself from the crowd of other creatures.’ Montaigne asked: ‘When I play with my cat, who knows if she is making more of a pastime of me than I of her?’
Montaigne’s question is as playful as his cat. Apology is not meant to answer the age-old question, but rather to provoke; to tap into an unending inquiry about the reasoning of animals. Perhaps, Montaigne implies, we simply misunderstand the foreign language of animals, and the ignorance is not theirs, but ours.
Montaigne’s position was a radical one – the idea that animals could actually speak to humans was decidedly anti-anthropocentric – and when he looked around for like-minded thinkers, he found himself a solitary essayist. But if Montaigne was a 16th-century loner, then he could appeal to the Classics. Apology is littered with references to Pliny and a particular appeal to Plato’s account of the Golden Age under Saturn. But even there, Montaigne had little to work with. Aristotle had argued that animals lacked logos (meaning, literally, ‘word’ but also ‘reason’) and, therefore, had no sense of the philosophical world inhabited and animated by humans. And a few decades after Montaigne, the French philosopher René Descartes delivered the final blow, arguing that the uniqueness of man stems from his ownership of reason, which animals are incapable of possessing, and which grants him dominion over them.
Everyone knows what it’s like to forget someone’s name. It could be the name of a celebrity and the need to remember might be non-existent, and yet, as though finding this name might be an antidote to looming senility, it’s hard to let go of such a compulsion until it is satisfied.
From infancy we are taught that success in life requires an unceasing commitment to colonize the world with language. To be lost for words is to be left out.
Without the ability to speak or understand, we would lose our most vital connection with the rest of humanity.
Montaigne understood that it was a human conceit to imagine that among all creatures, we were the only ones endowed with the capacity to communicate:
Can there be a more formall and better ordained policie, divided into so severall charges and offices, more constantly entertained, and better maintained, than that of Bees? Shall we imagine their so orderly disposing of their actions, and managing of their vocations, have so proportioned and formall a conduct without discourse, reason, and forecast?
What Montaigne logically inferred in the 1500s, science would confirm centuries later.
While Stassa Edwards enumerates the many expressions of a human desire for animals to speak, my sense is that behind this desire there is an intuition about the limitations of language: that our mute companions often see more because they can say less.
We view language as a prism that allows us to perceive order in the world, and yet this facility in representation is so successful and elegantly structured that most of the time we see the representations much more clearly than we see the world.
Our ability to describe and analyze the world has never been more advanced than it is today and yet for millennia, humans have observed that animals seem to be able to do something that we cannot: anticipate earthquakes.
Perhaps our word-constructed world only holds together on condition that our senses remain dull.
The world we imagine we can describe, quantify, and control is in truth a world we barely understand.
A conversation with Adam Curtis
Jon Ronson writes: I’ve known Adam Curtis for nearly 20 years. We’re friends. We see movies together, and once even went to Romania on a mini-break to attend an auction of Nicolae Ceausescu’s belongings. But it would be wrong to characterise our friendship as frivolous. Most of the time when we’re together I’m just intensely cross-questioning him about some new book idea I have.
Sometimes Adam will say something that seems baffling and wrong at the time, but makes perfect sense a few years later. I could give you lots of examples, but here’s one: I’m about to publish a book – So You’ve Been Publicly Shamed – about how social media is evolving into a cold and conservative place, a giant echo chamber where what we believe is constantly reinforced by people who believe the same thing, and when people step out of line in the smallest ways we destroy them. Adam was warning me about Twitter’s propensity to turn this way six years ago, when it was still a Garden of Eden. Sometimes talking to Adam feels like finding the results of some horse race of the future, where the long-shot horse wins.
I suppose it’s no surprise that Adam would notice this stuff about social media so early on. It’s what his films are almost always about – power and social control. However, people don’t only enjoy them for the subject matter, but for how they look, too – his wonderful, strange use of archive.
His new film, Bitter Lake, is his most experimental yet. And I think it’s his best. It’s still journalism: it’s about our relationship with Afghanistan, and how we don’t know what to do, and so we just repeat the mistakes of the past. But he’s allowed his use of archive to blossom crazily. Fifty percent of the film has no commentary. Instead, he’s created this dreamlike, fantastical collage from historical footage and raw, unedited news footage. Sometimes it’s just a shot of a man walking down a road in some Afghan town, and you don’t know why he’s chosen it, and then something happens and you think, ‘Ah!’ (Or, more often, ‘Oh God.’) It might be something small and odd. Or it might be something huge and terrible.
Nightmarish things happen in Bitter Lake. There are shots of people dying. It’s a film that could never be on TV. It’s too disturbing. And it’s too long as well – nearly two and a half hours. And so he’s putting it straight onto BBC iPlayer. I think, with this film, he’s invented a whole new way of telling a nonfiction story.
VICE asked the two of us to have an email conversation about his work. We started just before Christmas, and carried on until after the New Year. [Continue reading…]
Satire shouldn’t promote ignorance
Alex Andreou writes: In the wake of recent attacks in France, a rule of thumb appears to be emerging: of course we should be free to mock Islam, but we should do it with respect. This might seem irreconcilable, but in practice it is perfectly achievable.
Satire has been a tool for expanding the boundaries of free expression since Aristophanes. It does so most effectively by being hyper-aware of those boundaries, not ignorant of them. When it is done with the sole intention to offend it creates disharmony. When the intention is to entertain and challenge, the effect is quite the opposite.
Recently I played Arshad – a sort of cuddly version of Abu Hamza – in David Baddiel’s musical rendering of The Infidel: a farce in which a British Muslim discovers he is adopted and is actually Jewish, on the eve of his son’s nuptials to a fundamentalist’s daughter. The entire cast and creative team were obsessively attentive to religious detail, both Muslim and Jewish. Precisely how do women tie the niqab? What is the correct pronunciation and meaning of HaMotzi? With which hand would a Muslim hold the Qur’an, and how? Which way is the tallit worn, and why? Hours of research and discussion.
Backstage, after a particular scene in which we did a stylised cipher based on morning prayers, we folded our prayer mats carefully and put them away respectfully. They were just props, so why did it matter? Because they looked like prayer mats and seeing them discarded grated on members of the team who came from a Muslim background – even if they were not religious. Such instincts are deeply ingrained.
All this may seem precious, especially when one is about to launch into a ska musical number entitled Put a Fatwa on It, but it is not. The point is artistic control. You want to challenge an audience in precisely the way you intended – not because you are eating with the wrong hand. One is not careful out of a fear to offend, but out of a fear to offend randomly. Just because something is a legitimate target does not mean that one should have a go at it with a rocket launcher. Rockets inflict collateral damage. [Continue reading…]
Trying to read scrolls that can’t be read
The Economist: In 1752 Camillo Paderni, an artist who had been put in charge of the growing pile of antiquities being dug up at Herculaneum, a seaside town near Naples, wrote to a certain Dr Mead, who then wrote to the Royal Society in London reporting that “there were found many volumes of papyrus but turned to a sort of charcoal, and so brittle, that being touched, it fell to ashes. Yet by His Majesty’s orders he made many trials to open them, but all to no purpose; excepting some scraps containing some words.”
The excavation at Herculaneum — which, like nearby Pompeii, was buried in 79AD under ash from Mount Vesuvius — had uncovered a literary time capsule. What came to be called the Villa of the Papyri contained a library of perhaps 2,000 books, the only such collection known to have been preserved from antiquity.
Actually reading these scrolls has, however, proved both tricky and destructive — until now. For a paper just published in Nature Communications, by Vito Mocella of the Institute for Microelectronics and Microsystems, in Naples, describes a way to decipher them without unrolling them.
Don’t restart Europe’s wars of religion
Pankaj Mishra writes: On Jan. 7, the day jihadists attacked the satirical weekly Charlie Hebdo and a Jewish supermarket in France, I was in a small village in Anatolia, Turkey. I had barely registered the horrifying news when a friend forwarded me a tweet from New York Times columnist Roger Cohen. “The entire free world,” it read, “should respond, ruthlessly.”
For a few seconds I was pulled back into the Cold War when Turkey, a NATO member, was technically part of the “free world.” Even back then the category was porous: Ronald Reagan included in it the jihadists fighting the Soviet army in Afghanistan.
The words seem more anachronistic a quarter century later. Our complex and often bewildering political landscape is only superficially similar to the world we knew then. Devout Anatolian masses rising from poverty have transformed Turkey politically and economically. I did not dare show Charlie Hebdo’s cartoons to the local villagers who pass my house several times every day en route to the mosque next door, let alone argue that the magazine had the right to publish them.
There is no disagreement, except from fanatics, about the viciousness of the murderers, and the need to bring their associates to justice. But the aftermath of the attacks revealed strikingly different ways of looking at the broader issues around them: Our views on free speech, secularism, and the nature of religious hurt turn out to be shaped by particular historical and socioeconomic circumstances. [Continue reading…]
Submission
I am powerless and my life is out of control.
I believe a higher power can restore my sanity.
I submit to the will of God, the only power that can guide my life.
OK. I neither believe in God nor am I an alcoholic, but I based the lines above on the first three steps of the twelve-step program created by Alcoholics Anonymous just to convey the fact that submission to the will of God is a practice (or aspiration) that shapes the lives of millions of Americans — people who might not necessarily describe themselves as religious.
Soumission (Submission) is the title of Michel Houellebecq’s new novel — a book which, even before its release this week and before the Charlie Hebdo shootings took place, had stirred a huge amount of controversy in France, since it depicts a not-too-distant future in which the French submit to Islamic rule.
Given that premise, it’s not hard to see why Houellebecq is being accused of pandering to the fears of the far right — of those who believe in the National Front’s slogan, “France for the French.” But while Houellebecq’s appetite for controversy is undeniable, he says he’s neither trying to defend secularism nor fuel Islamophobia.
In an interview with The Paris Review, Houellebecq says that he thought he was an atheist but was really an agnostic.
Usually that word serves as a screen for atheism but not, I think, in my case. When, in the light of what I know, I reexamine the question whether there is a creator, a cosmic order, that kind of thing, I realize that I don’t actually have an answer.
The Economist summarizes Soumission in this way:
The novel, which has not yet been translated into English, is narrated by François, a literature professor at the Sorbonne, who drifts between casual sex and microwaved ready-made meals in a state of wry detachment and ennui. Then, in an imaginary France of 2022, a political earthquake shakes him out of his torpor. The two mainstream parties, on the left and the right, are eliminated in the first round of a presidential election. This leaves French voters with the choice between Marine Le Pen’s populist National Front—and the Muslim Fraternity, a new party led by Mohammed Ben Abbes. Thanks to an anti-Le Pen front, Mr Ben Abbes is elected and thus begins Muslim rule.
After a period of disorder, France returns to a strange calm under its apparently moderate new Muslim president; and François, who fled briefly, returns to Paris. But the city, and his university, are unrecognisable. More women are veiled, and give up work to look after their menfolk (helping to bring down France’s unemployment rate). Polygamy is made legal. France embarks on a geopolitical project to merge Europe with Muslim Mediterranean states. Saudi Arabia has poured petrodollars into better pay for professors and posh apartments on the city’s left bank. And his own university has been rebranded the Islamic University of Paris-Sorbonne. Will François, an atheist, resist, flee the new regime, or compromise with it?
While this sounds like a graphic representation of Islamophobic fears prevalent not only in France but across much of Europe, Houellebecq says:
I tried to put myself in the place of a Muslim, and I realized that, in reality, they are in a totally schizophrenic situation. Because overall Muslims aren’t interested in economic issues, their big issues are what we nowadays call societal issues. On these issues, obviously, they are very far from the left and even further from the Green Party. Just think of gay marriage and you’ll see what I mean, but the same is true across the board. And one doesn’t really see why they’d vote for the right, much less for the extreme right, which utterly rejects them. So if a Muslim wants to vote, what’s he supposed to do? The truth is, he’s in an impossible situation. He has no representation whatsoever.
I think there is a real need for God and that the return of religion is not a slogan but a reality, and that it is very much on the rise.
That hypothesis is central to the book, but we know that it has been discredited for many years by numerous researchers, who have shown that we are actually witnessing a progressive secularization of Islam, and that violence and radicalism should be understood as the death throes of Islamism. That is the argument made by Olivier Roy, and many other people who have worked on this question for more than twenty years.
This is not what I have observed, although in North and South America, Islam has benefited less than the evangelicals. This is not a French phenomenon, it’s almost global. I don’t know about Asia, but the case of Africa is interesting because there you have the two great religious powers on the rise — evangelical Christianity and Islam. I remain in many ways a Comtean, and I don’t believe that a society can survive without religion.
[I]n your book you describe, in a very blurry and vague way, various world events, and yet the reader never knows quite what these are. This takes us into the realm of fantasy, doesn’t it, into the politics of fear.
Yes, perhaps. Yes, the book has a scary side. I use scare tactics.
Like imagining the prospect of Islam taking over the country?
Actually, it’s not clear what we are meant to be afraid of, nativists or Muslims. I leave that unresolved.
Have you asked yourself what the effect might be of a novel based on such a hypothesis?
None. No effect whatsoever.
You don’t think it will help reinforce the image of France that I just described, in which Islam hangs overhead like the sword of Damocles, like the most frightening thing of all?
In any case, that’s pretty much all the media talks about, they couldn’t talk about it more. It would be impossible to talk about it more than they already do, so my book won’t have any effect.
Doesn’t it make you want to write about something else so as not to join the pack?
No, part of my work is to talk about what everyone is talking about, objectively. I belong to my own time.
[Y]our book describes the replacement of the Catholic religion by Islam.
No. My book describes the destruction of the philosophy handed down by the Enlightenment, which no longer makes sense to anyone, or to very few people. Catholicism, by contrast, is doing rather well. I would maintain that an alliance between Catholics and Muslims is possible. We’ve seen it happen before, it could happen again.
You who have become an agnostic, you can look on cheerfully and watch the destruction of Enlightenment philosophy?
Yes. It has to happen sometime and it might as well be now. In this sense, too, I am a Comtean. We are in what he calls the metaphysical stage, which began in the Middle Ages and whose whole point was to destroy the phase that preceded it. In itself, it can produce nothing, just emptiness and unhappiness. So yes, I am hostile to Enlightenment philosophy, I need to make that perfectly clear.
[I]f Catholicism doesn’t work, that’s because it’s already run its course, it seems to belong to the past, it has defeated itself. Islam is an image of the future. Why has the idea of the Nation stalled out? Because it’s been abused too long.
Some might be surprised that you chose to go in this direction when your last book was greeted as such a triumph that it silenced your critics.
The true answer is that, frankly, I didn’t choose. The book started with a conversion to Catholicism that should have taken place but didn’t.
Isn’t there something despairing about this gesture, which you didn’t really choose?
The despair comes from saying good-bye to a civilization, however ancient. But in the end the Koran turns out to be much better than I thought, now that I’ve reread it — or rather, read it. The most obvious conclusion is that the jihadists are bad Muslims. Obviously, as with all religious texts, there is room for interpretation, but an honest reading will conclude that a holy war of aggression is not generally sanctioned, prayer alone is valid. So you might say I’ve changed my opinion. That’s why I don’t feel that I’m writing out of fear.
In its crudest expressions, the Clash of Cultures discourse presents a Christian West threatened by Islam, but many of those who reject this narrative use one that is no less polarizing. It presents secular moderates challenged by Islamic extremists — it’s still Religion vs. The Enlightenment, superstition vs. reason.
Much as the West promotes the idea of religious freedom in the context of civil liberties, religion is meant to be a private affair that doesn’t intrude into the social sphere outside the carefully circumscribed territories of church, temple, and mosque. We expect religious freedom to be coupled with religious restraint.
The real struggle, it seems to me, is not ultimately philosophical and theological — it’s not about the existence or non-existence of God. It’s about values.
The values that count are not those that serve as emblems of identity (often wrapped around nationalism), but those that guide individual action and shape society.
We profess values which are libertarian and egalitarian and yet have created societies in which the guiding values are those of materialism, competition, and personal autonomy — values that are all socially corrosive.
Society is relentlessly being atomized, reduced to a social unit of one, captured in the lonely image of the selfie. This is what we’ve been sold and what we’ve bought, but I don’t think it’s what we want.
Spellbound by technological progress, we have neither expected nor demanded that material advances should lead to social advances — that better equipped societies should also be better functioning, happier, more caring societies.
What the false promise of materially sustained individual autonomy has created is the expectation that the more control we possess over life, the better it will get. We imagine that we must either be in control or fall under control.
From this vantage point, the concept of submission provokes fears of domination, and yet all it really means is to come into alignment with the way things are.
Where religion intrudes and so often fails is through the forcible imposition of rigid representations of such an alignment. But submission itself means seeing that we belong to life — something that cannot be possessed or controlled.
Friluftsliv, shinrin-yoku, hygge, wabi-sabi, kaizen, gemütlichkeit, and jugaad?
Starre Vartan writes about cultural concepts most of us have never heard of: Friluftsliv translates directly from Norwegian as “free air life,” which doesn’t quite do it justice. Coined relatively recently, in 1859, it is the concept that being outside is good for human beings’ mind and spirit. “It is a term in Norway that is used often to describe a way of life that is spent exploring and appreciating nature,” Anna Stoltenberg, culture coordinator for Sons of Norway, a U.S.-based Norwegian heritage group, told MNN. Other than that, it’s not a strict definition: it can include sleeping outside, hiking, taking photographs or meditating, playing or dancing outside, for adults or kids. It doesn’t require any special equipment, includes all four seasons, and needn’t cost much money. Practicing friluftsliv could be as simple as making a commitment to walking in a natural area five days a week, or doing a day-long hike once a month.
Shinrin-yoku is a Japanese term that means “forest bathing” and, unlike the Norwegian translation above, this one seems a perfect language fit (though a pretty similar idea). The idea is that spending time in the forest and natural areas is good preventative medicine, since it lowers stress, which causes or exacerbates some of our most intractable health issues. As MNN’s Catie Leary details, this isn’t just a nice idea — there’s science behind it: “The ‘magic’ behind forest bathing boils down to the naturally produced allelochemic substances known as phytoncides, which are kind of like pheromones for plants. Their job is to help ward off pesky insects and slow the growth of fungi and bacteria. When humans are exposed to phytoncides, these chemicals are scientifically proven to lower blood pressure, relieve stress and boost the growth of cancer-fighting white blood cells. Some common examples of plants that give off phytoncides include garlic, onion, pine, tea tree and oak, which makes sense considering their potent aromas.” [Continue reading…]
Neil Postman: The man who predicted Fox News, the internet, Stephen Colbert and reality TV
Scott Timberg writes: These days, even the kind of educated person who might have once disdained TV and scorned electronic gadgets debates plot turns from “Game of Thrones” and carries an app-laden iPhone. The few left concerned about the effects of the Internet are dismissed as Luddites or killjoys who are on the wrong side of history. A new kind of consensus has shaped up as Steve Jobs becomes the new John Lennon, Amanda Palmer the new Liz Phair, and Elon Musk’s rebel cool graces magazine covers. Conservatives praise Silicon Valley for its entrepreneurial energy; a Democratic president steers millions of dollars of funding to Amazon.
It seems like a funny era for the work of a cautionary social critic, one often dubious about the wonders of technology – including television — whose most famous book came out three decades ago. But the neoliberal post-industrial world now looks chillingly like the one Neil Postman foresaw in books like “Amusing Ourselves to Death” and “Technopoly: The Surrender of Culture to Technology.” And the people asking the important questions about where American society is going are taking a page from him.
“Amusing Ourselves” didn’t argue that regular TV shows were bad or dangerous. It insisted instead that the medium would reshape every other sphere with which it engaged: By using the methods of entertainment, TV would trivialize what the book jacket calls “politics, education, religion, and journalism.”
“It just blew me away,” says D.C.-based politics writer Matt Bai, who read the 1985 book “Amusing Ourselves to Death” while trying to figure out how the press and media became obsessed with superficiality beginning in the ‘80s. “So much of what I’d been thinking about was pioneered so many years before,” says Bai – whose recent book, “All the Truth Is Out: The Week Politics Went Tabloid,” looks at the 1987 Gary Hart sex scandal that effectively ended the politician’s career. “It struck me as incredibly relevant … And the more I reported the book, the more relevant it became.”
Bai isn’t alone. While he’s hardly a household name, Postman has become an important guide to the world of the Internet though most of his work was written before its advent. Astra Taylor, a documentary filmmaker and Occupy activist, turned to his books while she was plotting out what became “The People’s Platform: Taking Back Power and Culture in the Digital Age.” Douglas Rushkoff — a media theorist whose book “Present Shock: When Everything Happens Now,” is one of the most lucid guides to our bewildering age — is indebted to his work. Michael Harris’ recent “The End of Absence” is as well. And Jaron Lanier, the virtual-reality inventor and author (“Who Owns the Future?”) who’s simultaneously critic and tech-world insider, sees Postman as an essential figure whose work becomes more crucial every year.
“There’s this kind of dialogue around technology where people dump on each other for ‘not getting it,’” Lanier says. “Postman does not seem to be vulnerable to that accusation: He was old-fashioned but he really transcended that. I don’t remember him saying, ‘When I was a kid, things were better.’ He called on fundamental arguments in very broad terms – the broad arc of human history and ethics.” [Continue reading…]
Why has progress stalled?
Michael Hanlon writes: We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.
The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.
Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.
There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.
Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. [Continue reading…]
E.O. Wilson talks about the threat to Earth’s biodiversity
How civilization has given humans brittle bones
Nicholas St. Fleur writes: Somewhere in a dense forest of ash and elm trees, a hunter readies his spear for the kill. He hurls his stone-tipped weapon at his prey, an unsuspecting white-tailed deer he has tracked since morning. The crude projectile pierces the animal’s hide, killing it and giving the hunter food to bring back to his family many miles away. Such was survival circa 5,000 B.C. in ancient North America.
But today, the average person barely has to lift a finger, let alone throw a spear, to quell their appetite. The next meal is a mere online order away. And according to anthropologists, this convenient, sedentary way of life is making bones weak. Ahead, there’s a future of fractures, breaks, and osteoporosis. But for some anthropologists, the key to preventing aches in bones lies in better understanding the skeletons of our hunter-gatherer ancestors.
“Over the vast majority of human prehistory, our ancestors engaged in far more activity over longer distances than we do today,” said Brian Richmond, an anthropologist from the American Museum of Natural History in New York, in a statement. “We cannot fully understand human health today without knowing how our bodies evolved to work in the past, so it is important to understand how our skeletons evolved within the context of those high levels of activity.”
For thousands of years, Native American hunter-gatherers trekked on strenuous ventures for food. And for those same thousands of years, dense skeletons supported their movements. But about 6,000 years later, with the advent of agriculture, the bones and joints of Native Americans became less rigid and more fragile. Similar transitions occurred across the world as populations shifted from foraging to farming, according to two new papers published Monday in the Proceedings of the National Academy of Sciences. [Continue reading…]