Palmyra and its ancient ruins have fallen to ISIS

The New York Times reports: Islamic State militants swept into the desert city of Palmyra in central Syria on Wednesday, and by evening were in control of it, residents and Syrian state news media said, a victory that gives them another strategically important prize five days after the group seized the Iraqi city of Ramadi.

Palmyra has extra resonance, with its grand complex of 2,000-year-old colonnades and tombs, one of the world’s most magnificent remnants of antiquity, as well as the grimmer modern landmark of Tadmur Prison, where Syrian dissidents have languished over the decades.

But for the fighters on the ground, the city of 50,000 people is significant because it sits among gas fields and astride a network of roads across the country’s central desert. Palmyra’s vast unexcavated antiquities could also provide significant revenue through illegal trafficking.

Control of Palmyra gives the Islamic State command of roads leading from its strongholds in eastern Syria to Damascus and the other major cities of the populated west, as well as new links to western Iraq, the other half of its self-declared caliphate.

The advance, in which residents described soldiers and the police fleeing, wounded civilians unable to reach hospitals and museum workers hurrying to pack up antiquities, comes even as the United States is scrambling to come up with a response to the loss of Ramadi, the capital of Iraq’s Anbar Province.

The two successes, at opposite ends of a battlefield sprawling across two countries, showed the Islamic State’s ability to shake off setbacks and advance on multiple fronts, less than two months after it was driven from the Iraqi city of Tikrit — erasing any notion that the group had suffered a game-changing blow. [Continue reading…]

Prof Kevin Butcher writes: From modest beginnings in the 1st Century BC, Palmyra gradually rose to prominence under the aegis of Rome until, during the 3rd Century AD, the city’s rulers challenged Roman power and created an empire of their own that stretched from Turkey to Egypt.

The story of its Queen Zenobia, who fought against the Roman Emperor Aurelian, is well known; less well known is that Palmyra also fought another empire: that of the Sasanian Persians.

In the middle of the third century, when the Sasanians invaded the Roman Empire and captured the Emperor Valerian, it was the Palmyrenes who defeated them and drove them back across the Euphrates.

For several decades Rome had to rely on Palmyrene power to prop up its declining influence in the east.

Palmyra was a great Middle Eastern achievement, and was unlike any other city of the Roman Empire.

It was unique, culturally and artistically. In other cities the landed elites normally controlled affairs, whereas in Palmyra a merchant class dominated political life, and the Palmyrenes specialised in protecting merchant caravans crossing the desert. [Continue reading…]

ISIS advance in Syria endangers ancient ruins at Palmyra

The New York Times reports: Islamic State militants advanced to the outskirts of the Syrian town of Palmyra on Thursday, putting the extremist group within striking distance of some of the world’s most magnificent antiquities.

That raised fears that the ancient city of Palmyra, with its complex of columns, tombs and ancient temples dating to the first century A.D., could be looted or destroyed. Militants from the Islamic State, also known as ISIS or ISIL, have already destroyed large parts of ancient sites at Nimrud, Hatra and Nineveh in Iraq. Islamic State leaders denounce pre-Islamic art and architecture as idolatrous even as they sell smaller, more portable artifacts to finance their violent rampage through the region.

The fighting on Thursday took place little more than a mile from the city’s grand 2,000-year-old ruins, which stand at the crossroads of Greek, Roman, Persian and Islamic cultures.

People in Palmyra described a state of anxiety and chaos, with residents trying to flee the northern neighborhoods. [Continue reading…]

Big drop in share of Americans calling themselves Christian

The New York Times reports: The Christian share of adults in the United States has declined sharply since 2007, affecting nearly all major Christian traditions and denominations, and crossing age, race and region, according to an extensive survey by the Pew Research Center.

Seventy-one percent of American adults were Christian in 2014, the lowest estimate from any sizable survey to date, and a decline of 5 million adults and 8 percentage points since a similar Pew survey in 2007.

The Christian share of the population has been declining for decades, but the pace rivals or even exceeds that of the country’s most significant demographic trends, like the growing Hispanic population. Nor is it confined to the coasts, the cities, the young, or other comparatively liberal and secular groups where one might expect it.

“The decline is taking place in every region of the country, including the Bible Belt,” said Alan Cooperman, the director of religion research at the Pew Research Center and the lead editor of the report. [Continue reading…]

The world beyond your head

Matthew Crawford, author of The World Beyond Your Head, talks to Ian Tuttle.

Crawford: Only by excluding all the things that grab at our attention are we able to immerse ourselves in something worthwhile, and vice versa: When you become absorbed in something that is intrinsically interesting, that burden of self-regulation is greatly reduced.

Tuttle: To the present-day consequences. The first, and perhaps most obvious, consequence is a moral one, which you address in your harrowing chapter on machine gambling: “If we have no robust and demanding picture of what a good life would look like, then we are unable to articulate any detailed criticism of the particular sort of falling away from a good life that something like machine gambling represents.” To modern ears that sentence sounds alarmingly paternalistic. Is the notion of “the good life” possible in our age? Or is it fundamentally at odds with our political and/or philosophical commitments?

Crawford: Once you start digging into the chilling details of machine gambling, and of other industries such as mobile gaming apps that emulate the business model of “addiction by design” through behaviorist conditioning, you may indeed start to feel a little paternalistic — if we can grant that it is the role of a pater to make scoundrels feel unwelcome in the town.

According to the prevailing notion, freedom manifests as “preference-satisfying behavior.” About the preferences themselves we are to maintain a principled silence, out of deference to the autonomy of the individual. They are said to express the authentic core of the self, and are for that reason unavailable for rational scrutiny. But this logic would seem to break down when our preferences are the object of massive social engineering, conducted not by government “nudgers” but by those who want to monetize our attention.

My point in that passage is that liberal/libertarian agnosticism about the human good disarms the critical faculties we need even just to see certain developments in the culture and economy. Any substantive notion of what a good life requires will be contestable. But such a contest is ruled out if we dogmatically insist that even to raise questions about the good life is to identify oneself as a would-be theocrat. To Capital, our democratic squeamishness – our egalitarian pride in being “nonjudgmental” — smells like opportunity. Commercial forces step into the void of cultural authority, where liberals and libertarians fear to tread. And so we get a massive expansion of an activity — machine gambling — that leaves people compromised and degraded, as well as broke. And by the way, Vegas is no longer controlled by the mob. It’s gone corporate.

And this gets back to what I was saying earlier, about how our thinking is captured by obsolete polemics from hundreds of years ago. Subjectivism — the idea that what makes something good is how I feel about it — was pushed most aggressively by Thomas Hobbes, as a remedy for civil and religious war: Everyone should chill the hell out. Live and let live. It made sense at the time. This required discrediting all those who claim to know what is best. But Hobbes went further, denying the very possibility of having a better or worse understanding of such things as virtue and vice. In our time, this same posture of value skepticism lays the public square bare to a culture industry that is not at all shy about sculpting souls – through manufactured experiences, engineered to appeal to our most reliable impulses. That’s how one can achieve economies of scale. The result is a massification of the individual. [Continue reading…]

From chimps to bees and bacteria, how animals hold elections

By Robert John Young, University of Salford

Lots of people find elections dull, but there’s nothing boring about the political manoeuvres that take place in the animal kingdom. In the natural world, jockeying for advantage, whether this is conscious or merely mechanical, can be a matter of life or death.

Chimpanzees, our closest relatives, are highly political. They’re smart enough to realise that in the natural world brute strength will only get you so far – getting to the top of a social group and remaining there requires political guile.

It’s all about making friends and influencing others. Chimps make friends by grooming each other and forming alliances; this behaviour is especially prominent in males wishing to be group leader. In times of dispute they call upon their friends for assistance or when they sense a coup may be successful. And the ruling group either reaffirms its position or a new group grabs control – but having the weight of numbers is normally critical to success.

You’ve got my vote. (Photo: IFAW, CC BY-NC)

Back in the 1980s, the leading Dutch primatologist Frans de Waal spent six years researching the world’s largest captive chimpanzee colony for his classic book Chimpanzee Politics. He soon realised that, in addition to forming cliques, chimp politics still involves some degree of aggression.

Humans in modern societies have largely replaced antagonistic takeovers with voting. Chimps, however, do not live in a democratic society. For them, the social structure of the ruling party is usually based on a male hierarchy, in which dominant individuals have the best access to the available resources – usually food and females.

[Read more…]

Jihadists destroy proposed world heritage site in Mali

The Associated Press reports: Jihadists have destroyed a mausoleum in central Mali that had been submitted as a U.N. World Heritage site, leaving behind a warning that they will come after all those who don’t follow their strict version of Islam, a witness said Monday.

The dynamite attack on the mausoleum of Cheick Amadou Barry mirrors similar ones that were carried out in northern Mali in 2012 when jihadists seized control of the major towns there. The destruction also comes as concerns grow about the emergence of a new extremist group active much further south and closer to the capital.

Barry was a marabout, or important Islamic religious leader, in the 19th century who helped to spread Islam among the animists of central Mali. One of his descendants, Bologo Amadou Barry, confirmed to The Associated Press that the site had been partially destroyed in Hamdallahi village on Sunday night.

The jihadists left behind a note on Sunday warning they would attack all those who did not follow the teachings of Islam’s prophet.

“They also threatened France and the U.N. peacekeepers and all those who work with them,” Bologo Amadou Barry said. [Continue reading…]

Chain reactions spreading ideas through science and culture

David Krakauer writes: On Dec. 2, 1942, just over three years into World War II, President Roosevelt was sent the following enigmatic cable: “The Italian navigator has landed in the new world.” The accomplishments of Christopher Columbus had long since ceased to be newsworthy. The progress of the Italian physicist, Enrico Fermi, navigator across the territories of Lilliputian matter — the abode of the microcosm of the atom — was another thing entirely. Fermi’s New World, discovered beneath a Midwestern football field in Chicago, was the province of newly synthesized radioactive elements. And Fermi’s landing marked the earliest sustained and controlled nuclear chain reaction required for the construction of an atomic bomb.

This physical chain reaction was one of the links of scientific and cultural chain reactions initiated by the Hungarian physicist, Leó Szilárd. The first was in 1933, when Szilárd proposed the idea of a neutron chain reaction. Another was in 1939, when Szilárd and Einstein sent the now famous “Szilárd-Einstein” letter to Franklin D. Roosevelt informing him of the destructive potential of atomic chain reactions: “This new phenomenon would also lead to the construction of bombs, and it is conceivable — though much less certain — that extremely powerful bombs of a new type may thus be constructed.”

This scientific information in turn generated political and policy chain reactions: Roosevelt created the Advisory Committee on Uranium which led in yearly increments to the National Defense Research Committee, the Office of Scientific Research and Development, and finally, the Manhattan Project.

Life itself is a chain reaction. Consider a cell that divides into two cells and then four and then eight great-granddaughter cells. Infectious diseases are chain reactions. Consider a contagious virus that infects one host that infects two or more susceptible hosts, in turn infecting further hosts. News is a chain reaction. Consider a report spread from one individual to another, who in turn spreads the message to their friends and then on to the friends of friends.
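
The doubling arithmetic in the cell example is simple exponential branching, and it can be sketched in a few lines of Python (an illustrative toy only; the function name and the branching-factor parameter `r` are my own, not from the text):

```python
def chain_reaction(r, generations):
    """Count of active units at each generation, starting from one unit,
    where every unit in a generation produces r units in the next
    (r = 2 is the cell-division example: 1, 2, 4, 8)."""
    counts = [1]
    for _ in range(generations):
        counts.append(counts[-1] * r)
    return counts

print(chain_reaction(2, 3))  # the dividing cells: [1, 2, 4, 8]
```

The same arithmetic underlies the infection and news examples: whenever each event triggers more than one successor on average, the cascade grows; below one, it fizzles out.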

These numerous connections that fasten together events are like expertly arranged dominoes of matter, life, and culture. As the modernist designer Charles Eames would have it, “Eventually everything connects — people, ideas, objects. The quality of the connections is the key to quality per se.”

Dominoes, atoms, life, infection, and news — all yield domino effects that require a sensitive combination of distances between pieces, physics of contact, and timing. When any one of these ingredients is off-kilter, the propagating cascade is likely to come to a halt. Premature termination is exactly what we might want to happen to a deadly infection, but it is the last thing we want to happen to a good idea. [Continue reading…]

The cautious rise of atheism and religious doubt in the Arab world

Ahmed Benchemsi writes: Last December, Dar Al Ifta, a venerable Cairo-based institution charged with issuing Islamic edicts, cited an obscure poll according to which the exact number of Egyptian atheists was 866. The poll provided equally precise counts of atheists in other Arab countries: 325 in Morocco, 320 in Tunisia, 242 in Iraq, 178 in Saudi Arabia, 170 in Jordan, 70 in Sudan, 56 in Syria, 34 in Libya, and 32 in Yemen. In total, exactly 2,293 nonbelievers in a population of 300 million.

Many commentators ridiculed these numbers. The Guardian asked Rabab Kamal, an Egyptian secularist activist, if she believed the 866 figure was accurate. “I could count more than that number of atheists at Al Azhar University alone,” she replied sarcastically, referring to the Cairo-based academic institution that has been a center of Sunni Islamic learning for almost 1,000 years. Brian Whitaker, a veteran Middle East correspondent and the author of Arabs Without God, wrote, “One possible clue is that the figure for Jordan (170) roughly corresponds to the membership of a Jordanian atheist group on Facebook. So it’s possible that the researchers were simply trying to identify atheists from various countries who are active in social media.”

Even by that standard, Dar Al Ifta’s figures are rather low. When I recently searched Facebook in both Arabic and English, combining the word “atheist” with names of different Arab countries, I turned up over 250 pages or groups, with memberships ranging from a few individuals to more than 11,000. And these numbers only pertain to Arab atheists (or Arabs concerned with the topic of atheism) who are committed enough to leave a trace online. “My guess is, every Egyptian family contains an atheist, or at least someone with critical ideas about Islam,” an atheist compatriot, Momen, told Egyptian historian Hamed Abdel-Samad recently. “They’re just too scared to say anything to anyone.”

While Arab states downplay the atheists among their citizens, the West is culpable in its inability to even conceive of an Arab atheist. In Western media, the question is not if Arabs are religious, but rather to what extent their (assumed) religiosity can harm the West. In Europe, the debate focuses on immigration (are “Muslim immigrants” averse to secular freedoms?) while in the United States, the central topic is terrorism (are “Muslims” sympathetic to it?). As for the political debate, those on the right suspect “Muslims” of being hostile to individual freedoms and sympathetic to jihad, while leftists seek to exonerate “Muslims” by highlighting their “peaceful” and “moderate” religiosity. But no one is letting the Arab populations off the hook for their Muslimhood. Both sides base their argument on the premise that when it comes to Arab people, religiosity is an unquestionable given, almost an ethnic mandate embedded in their DNA. [Continue reading…]

Afghan novelist: ‘We live in a vacuum, lacking heroes and ideals’

Mujib Mashal writes: Four large clocks tick out of sync, puncturing the silence of his Soviet-built apartment. A half-burned candle sits next to a stack of books. A small television is covered in soot.

This is where Rahnaward Zaryab, Afghanistan’s most celebrated novelist, locks himself up for weeks at a time, lost in bottles of smuggled vodka and old memories of Kabul, a capital city long transformed by war and money.

“We live in a vacuum, lacking heroes and ideals,” Mr. Zaryab reads from his latest manuscript, handwritten on the back of used paper. The smoke from his Pine cigarette, a harsh South Korean brand, clings to yellowed walls. “The heroes lie in dust, the ideals are ridiculed.”

The product of a rare period of peace and tolerance in Afghan history, Mr. Zaryab’s work first flourished in the 1970s, before the country was unraveled by invasion and civil war. Afghanistan still had a vibrant music and theater scene, and writers had a broad readership that stretched beyond just the political elite.

“I would receive letters from girls that would smell of perfume when you opened them,” Mr. Zaryab, who is 70, remembered fondly.

Mr. Zaryab’s stories are informed by his readings of Western philosophy and literature, the writer Homaira Qaderi said. He was educated on scholarships in New Zealand and Britain. But his heroes are indigenous and modest, delicately questioning the dogma and superstitions of a conservative society.

“He is the first writer to focus on the structure of stories, with the eye of someone well read,” Ms. Qaderi said. “We call him the father of new storytelling in Afghanistan.”

But after he became the standard-bearer for Afghan literature, Mr. Zaryab was forced to watch as Kabul, the muse he idealized as a city of music and chivalry in most of his 17 books, fell into rubble and chaos.

Some of the chaos has eased over the past decade, but that has caused him even more pain. He loathes how Kabul has been rebuilt: on a foundation of American cash and foreign values, paving over Afghan culture.

“Money, money, money,” he said, cringing. “Everyone is urged to make money, in any way they can. Art, culture and literature have been forgotten completely.” [Continue reading…]

Why do memes go viral, and should we care?

Abby Rabinowitz writes: On April 11, 2012, Zeddie Little appeared on Good Morning America, wearing the radiant, slightly perplexed smile of one enjoying instant fame. About a week earlier, Little had been a normal, if handsome, 25-year-old trying to make it in public relations. Then on March 31, he was photographed amid a crowd of runners in a South Carolina race by a stranger, Will King, who posted the image to a social networking website, Reddit. Dubbed “Ridiculously Photogenic Guy,” Little’s picture circulated on Facebook, Twitter, and Tumblr, accruing likes, comments, and captions (“Picture gets put up as employee of the month/for a company he doesn’t work for”). It spawned spinoffs (Ridiculously Photogenic Dog, Prisoner, and Syrian Rebel) and leapt to the mainstream media. At a high point, ABC Morning News reported that a Google search for “Zeddie Little” yielded 59 million hits.

Why the sudden fame? The truth is that Little hadn’t become famous: His meme had. According to website Know Your Meme, which documents viral Internet phenomena, a meme is “a piece of content or an idea that’s passed from person to person, changing and evolving along the way.” Ridiculously Photogenic Guy is a kind of Internet meme represented by LOL cats: that is, a photograph, video, or cartoon, often overlaid with a snarky message, perfect for incubating in the bored, fertile minds of cubicle workers and college students. In an age where politicians campaign through social media and viral marketers ponder the appeal of sneezing baby pandas, memes are more important than ever—however trivial they may seem.

But trawling the Internet, I found a strange paradox: While memes were everywhere, serious meme theory was almost nowhere. Richard Dawkins, the famous evolutionary biologist who coined the word “meme” in his classic 1976 book, The Selfish Gene, seemed bent on disowning the Internet variety, calling it a “hijacking” of the original term. The peer-reviewed Journal of Memetics folded in 2005. “The term has moved away from its theoretical beginnings, and a lot of people don’t know or care about its theoretical use,” philosopher and meme theorist Daniel Dennett told me. What has happened to the idea of the meme, and what does that evolution reveal about its usefulness as a concept? [Continue reading…]

Can a dying language be saved?

Judith Thurman writes: It is a singular fate to be the last of one’s kind. That is the fate of the men and women, nearly all of them elderly, who are — like Marie Wilcox, of California; Gyani Maiya Sen, of Nepal; Verdena Parker, of Oregon; and Charlie Mungulda, of Australia — the last known speakers of a language: Wukchumni, Kusunda, Hupa, and Amurdag, respectively. But a few years ago, in Chile, I met Joubert Yanten Gomez, who told me he was “the world’s only speaker of Selk’nam.” He was twenty-one.

Yanten Gomez, who uses the tribal name Keyuk, grew up modestly, in Santiago. His father, Blas Yanten, is a woodworker, and his mother, Ivonne Gomez Castro, practices traditional medicine. As a young girl, she was mocked at school for her mestizo looks, so she hesitated to tell her children — Keyuk and an older sister — about their ancestry. They hadn’t known that their maternal relatives descended from the Selk’nam, a nomadic tribe of unknown origin that settled in Tierra del Fuego. The first Europeans to encounter the Selk’nam, in the sixteenth century, were astonished by their height and their hardiness — they braved the frigid climate by coating their bodies with whale fat. The tribe lived mostly undisturbed until the late eighteen-hundreds, when an influx of sheep ranchers and gold prospectors who coveted their land put bounties on their heads. (One hunter boasted that he had received a pound sterling per corpse, redeemable with a pair of ears.) The survivors of the Selk’nam Genocide, as it is called — a population of about four thousand was reduced to some three hundred — were resettled on reservations run by missionaries. The last known fluent speaker of the language, Angela Loij, a laundress and farmer, died forty years ago.

Many children are natural mimics, but Keyuk could imitate speech like a mynah. His father, who is white, had spent part of his childhood in the Arauco region, which is home to the Mapuche, Chile’s largest native community, and he taught Keyuk their language, Mapudungun. The boy, a bookworm and an A student, easily became fluent. A third-grade research project impassioned him about indigenous peoples, and Ivonne, who descends from a line of shamans, took this as a sign that his ancestors were speaking through him. When she told him of their heritage, Keyuk vowed that he would master Selk’nam and also, eventually, Yagán — the nearly extinct language of a neighboring people in the far south — reckoning that he could pass them down to his children and perhaps reseed the languages among the tribes’ descendants. At fourteen, he travelled with his father to Puerto Williams, a town in Chile’s Antarctic province that calls itself “the world’s southernmost city,” to meet Cristina Calderón, the last native Yagán speaker. She subsequently tutored him by phone. [Continue reading…]

A phony populism is denying Americans the joys of serious thought

Steve Wasserman writes: The vast canvas afforded by the Internet has done little to encourage thoughtful and serious criticism. Mostly it has provided a vast Democracy Wall on which any crackpot can post his or her manifesto. Bloggers bloviate and insults abound. Discourse coarsens. Information is abundant, wisdom scarce. It is a striking irony, as Leon Wieseltier has noted, that with the arrival of the Internet, “a medium of communication with no limitations of physical space, everything on it has to be in six hundred words.” The Internet, he said, is the first means of communication invented by humankind that privileges one’s first thoughts as one’s best thoughts. And he rightly observed that if “value is a function of scarcity,” then “what is most scarce in our culture is long, thoughtful, patient, deliberate analysis of questions that do not have obvious or easy answers.” Time is required to think through difficult questions. Patience is a condition of genuine intellection. The thinking mind, the creating mind, said Wieseltier, should not be rushed. “And where the mind is rushed and made frenetic, neither thought nor creativity will ensue. What you will most likely get is conformity and banality. Writing is not typed talking.”

The fundamental idea at stake in the criticism of culture generally is the self-image of society: how it reasons with itself, describes itself, imagines itself. Nothing in the excitements made possible by the digital revolution banishes the need for the rigor such self-reckoning requires. It is, as Wieseltier says, the obligation of cultural criticism to bear down on what matters. [Continue reading…]

Steven Pinker is wrong about violence and war

In an essay challenging Steven Pinker’s thesis that the world is becoming progressively more peaceful, John Gray writes: While it is true that war has changed, it has not become less destructive. Rather than a contest between well-organised states that can at some point negotiate peace, it is now more often a many-sided conflict in fractured or collapsed states that no one has the power to end. The protagonists are armed irregulars, some of them killing and being killed for the sake of an idea or faith, others from fear or a desire for revenge and yet others from the world’s swelling armies of mercenaries, who fight for profit. For all of them, attacks on civilian populations have become normal. The ferocious conflict in Syria, in which methodical starvation and the systematic destruction of urban environments are deployed as strategies, is an example of this type of warfare.

It may be true that the modern state’s monopoly of force has led, in some contexts, to declining rates of violent death. But it is also true that the power of the modern state has been used for purposes of mass killing, and one should not pass too quickly over victims of state terror. With increasing historical knowledge it has become clear that the “Holocaust-by-bullets” – the mass shootings of Jews, mostly in the Soviet Union, during the second world war – was perpetrated on an even larger scale than previously realised. Soviet agricultural collectivisation incurred millions of foreseeable deaths, mainly as a result of starvation, with deportation to uninhabitable regions, life-threatening conditions in the Gulag and military-style operations against recalcitrant villages also playing an important role. Peacetime deaths due to internal repression under the Mao regime have been estimated to be around 70 million. Along with fatalities caused by state terror were unnumbered millions whose lives were irreparably broken and shortened. How these casualties fit into the scheme of declining violence is unclear. Pinker goes so far as to suggest that the 20th-century Hemoclysm [the tide of 20th-century mass murder in which Pinker includes the Holocaust] might have been a gigantic statistical fluke, and cautions that any history of the last century that represents it as having been especially violent may be “apt to exaggerate the narrative coherence of this history” (the italics are Pinker’s). However, there is an equal or greater risk in abandoning a coherent and truthful narrative of the violence of the last century for the sake of a spurious quantitative precision.

Estimating the numbers of those who die from violence involves complex questions of cause and effect, which cannot always be separated from moral judgments. There are many kinds of lethal force that do not produce immediate death. Are those who die of hunger or disease during war or its aftermath counted among the casualties? Do refugees whose lives are cut short appear in the count? Where torture is used in war, will its victims figure in the calculus if they succumb years later from the physical and mental damage that has been inflicted on them? Do infants who are born to brief and painful lives as a result of exposure to Agent Orange or depleted uranium find a place in the roll call of the dead? If women who have been raped as part of a military strategy of sexual violence die before their time, will their passing feature in the statistical tables?

While the seeming exactitude of statistics may be compelling, much of the human cost of war is incalculable. Deaths by violence are not all equal. It is terrible to die as a conscript in the trenches or a civilian in an aerial bombing campaign, but to perish from overwork, beating or cold in a labour camp can be a greater evil. It is worse still to be killed as part of a systematic campaign of extermination as happened to those who were consigned to death camps such as Treblinka. Disregarding these distinctions, the statistics presented by those who celebrate the arrival of the Long Peace are morally dubious if not meaningless. [Continue reading…]

A deficit in patience produces the illusion of a shortage of time

Chelsea Wald writes: Not long ago I diagnosed myself with the recently identified condition of sidewalk rage. It’s most pronounced when it comes to a certain friend who is a slow walker. Last month, as we sashayed our way to dinner, I found myself biting my tongue, thinking, I have to stop going places with her if I ever want to … get there!

You too can measure yourself on the “Pedestrian Aggressiveness Syndrome Scale,” a tool developed by University of Hawaii psychologist Leon James. While walking in a crowd, do you find yourself “acting in a hostile manner (staring, presenting a mean face, moving closer or faster than expected)” and “enjoying thoughts of violence?”

Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines — they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age.

Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin and yang balance, a finely tuned internal timer that told us when we had waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.

“Why are we impatient? It’s a heritage from our evolution,” says Marc Wittmann, a psychologist at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. Impatience made sure we didn’t die from spending too long on a single unrewarding activity. It gave us the impulse to act.

But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough — or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay. [Continue reading…]


The mythical secular civilization promoted by evangelical atheists

John Gray writes: Considering the alternatives that are on offer, liberal societies are well worth defending. But there is no reason for thinking these societies are the beginning of a species-wide secular civilisation of the kind of which evangelical atheists dream.

In ancient Greece and Rome, religion was not separate from the rest of human activity. Christianity was less tolerant than these pagan societies, but without it the secular societies of modern times would hardly have been possible. By adopting the distinction between what is owed to Caesar and what to God, Paul and Augustine – who turned the teaching of Jesus into a universal creed – opened the way for societies in which religion was no longer coextensive with life. Secular regimes come in many shapes, some liberal, others tyrannical. Some aim for a separation of church and state as in the US and France, while others – such as the Ataturkist regime that until recently ruled in Turkey – assert state control over religion. Whatever its form, a secular state is no guarantee of a secular culture. Britain has an established church, but despite that fact – or more likely because of it – religion has a smaller role in politics than in America and is less publicly divisive than it is in France.

There is no sign anywhere of religion fading away, but by no means all atheists have thought the disappearance of religion possible or desirable. Some of the most prominent – including the early 19th-century poet and philosopher Giacomo Leopardi, the philosopher Arthur Schopenhauer, the Austro-Hungarian philosopher and novelist Fritz Mauthner (who published a four-volume history of atheism in the early 1920s) and Sigmund Freud, to name a few – were all atheists who accepted the human value of religion. One thing these atheists had in common was a refreshing indifference to questions of belief. Mauthner – who is remembered today chiefly because of a dismissive one-line mention in Wittgenstein’s Tractatus – suggested that belief and unbelief were both expressions of a superstitious faith in language. For him, “humanity” was an apparition which melts away along with the departing Deity. Atheism was an experiment in living without taking human concepts as realities. Intriguingly, Mauthner saw parallels between this radical atheism and the tradition of negative theology in which nothing can be affirmed of God, and described the heretical medieval Christian mystic Meister Eckhart as being an atheist in this sense.

Above all, these unevangelical atheists accepted that religion is definitively human. Though not all human beings may attach great importance to them, every society contains practices that are recognisably religious. Why should religion be universal in this way? For atheist missionaries this is a decidedly awkward question. Invariably they claim to be followers of Darwin. Yet they never ask what evolutionary function this species-wide phenomenon serves. There is an irresolvable contradiction between viewing religion naturalistically – as a human adaptation to living in the world – and condemning it as a tissue of error and illusion. What if the upshot of scientific inquiry is that a need for illusion is built into the human mind? If religions are natural for humans and give value to their lives, why spend your life trying to persuade others to give them up?

The answer that will be given is that religion is implicated in many human evils. Of course this is true. Among other things, Christianity brought with it a type of sexual repression unknown in pagan times. Other religions have their own distinctive flaws. But the fault is not with religion, any more than science is to blame for the proliferation of weapons of mass destruction or medicine and psychology for the refinement of techniques of torture. The fault is in the intractable human animal. Like religion at its worst, contemporary atheism feeds the fantasy that human life can be remade by a conversion experience – in this case, conversion to unbelief. [Continue reading…]

Conversion can be thought of as an example of the miracle of neuroplasticity: that beliefs, firmly held, can, in the right circumstances, suddenly be upturned such that the world thereafter is perceived in a radically different way.

That transition is usually described in terms of a bridge that leads from weak faith, no faith, or false faith, to conviction, but as Gray points out, that bridge could also be imagined to be traversable in the opposite direction.

The mistake that all evangelicals make (be they religious evangelicals or new atheists) is to imagine that they have the right and ability to march others across this bridge.

Real conversion, by its nature, cannot be coercive, since it entails some kind of discovery and no one discovers anything under pressure from others.

In a world that remains predominantly religious, the new atheists have ostensibly embarked on a mission of staggering proportions in their effort to purge humanity of its unreasonable superstitions.

This could be viewed as a heroically ambitious undertaking, but there seem to be plenty of reasons not to see it that way.

If the new atheists genuinely hope to persuade religious believers to see the error of their ways, how can they make any progress if they start out by viewing their prospective converts with contempt?

When has a genuine process of persuasion ever begun with the assumption that the person being addressed is a fool?

As much as the new atheists may appear to be possessed by evangelical fervor, their appetite to condemn religion sometimes mirrors the religious fanaticism that condemns apostates.

“Some propositions are so dangerous that it may even be ethical to kill people for believing them,” writes Sam Harris in apparent agreement with the leaders of ISIS. Their only disagreement is over which propositions warrant a death sentence.

Still, much as the new atheists are often guilty of evangelical errors, I seriously doubt that their mission truly is to mount a challenge against the reign of religion.

On the contrary, I think their mission seems to have less to do with changing the world than it has with preaching to the converted. It’s about selling books, going on speaking tours, appearing on TV, amassing followers on Twitter, and doing everything else it takes to carve out a profitable cultural niche.

Who would have thought that it’s possible to pursue a career as a professional atheist? Sam Harris has, and I’m sure he has been rewarded handsomely and his success will continue, irrespective of the fate of religion.


The mystery of flying kicks


Humans and animals — the power and limitations of language

Stassa Edwards writes: In his Apology for Raymond Sebond (1576), Michel de Montaigne ascribed animals’ silence to man’s own wilful arrogance. The French essayist argued that animals could speak, that they were in possession of rich consciousness, but that man wouldn’t condescend to listen. ‘It is through the vanity of the same imagination that [man] equates himself with God,’ Montaigne wrote, ‘that he attributes divine attributes for himself, picks himself out and separates himself from the crowd of other creatures.’ Montaigne asked: ‘When I play with my cat, who knows if she is making more of a pastime of me than I of her?’

Montaigne’s question is as playful as his cat. Apology is not meant to answer the age-old question, but rather to provoke; to tap into an unending inquiry about the reasoning of animals. Perhaps, Montaigne implies, we simply misunderstand the foreign language of animals, and the ignorance is not theirs, but ours.

Montaigne’s position was a radical one – the idea that animals could actually speak to humans was decidedly anti-anthropocentric – and when he looked around for like-minded thinkers, he found himself one solitary essayist. But if Montaigne was a 16th-century loner, then he could appeal to the Classics. Apology is littered with references to Pliny and a particular appeal to Plato’s account of the Golden Age under Saturn. But even there, Montaigne had little to work with. Aristotle had argued that animals lacked logos (meaning, literally, ‘word’ but also ‘reason’) and, therefore, had no sense of the philosophical world inhabited and animated by humans. And a few decades after Montaigne, the French philosopher René Descartes delivered the final blow, arguing that the uniqueness of man stems from his ownership of reason, which animals are incapable of possessing, and which grants him dominion over them.

Everyone knows what it’s like to forget someone’s name. It could be the name of a celebrity and the need to remember might be non-existent, and yet, as though finding this name might be an antidote to looming senility, it’s hard to let go of such a compulsion until it is satisfied.

From infancy we are taught that success in life requires an unceasing commitment to colonize the world with language. To be lost for words is to be left out.

Without the ability to speak or understand, we would lose our most vital connection with the rest of humanity.

Montaigne understood that it was a human conceit to imagine that among all creatures, we were the only ones endowed with the capacity to communicate:

Can there be a more formall and better ordained policie, divided into so severall charges and offices, more constantly entertained, and better maintained, than that of Bees? Shall we imagine their so orderly disposing of their actions, and managing of their vocations, have so proportioned and formall a conduct without discourse, reason, and forecast?

What Montaigne logically inferred in the 1500s, science would confirm centuries later.

While Stassa Edwards enumerates the many expressions of a human desire for animals to speak, my sense is that behind this desire there is an intuition about the limitations of language: that our mute companions often see more because they can say less.

We view language as a prism that allows us to perceive order in the world, and yet this facility in representation is so successful and elegantly structured that most of the time we see the representations much more clearly than we see the world.

Our ability to describe and analyze the world has never been more advanced than it is today and yet for millennia, humans have observed that animals seem to be able to do something that we cannot: anticipate earthquakes.

Perhaps our word-constructed world only holds together on condition that our senses remain dull.

The world we imagine we can describe, quantify, and control, is in truth a world we barely understand.


A conversation with Adam Curtis

Jon Ronson writes: I’ve known Adam Curtis for nearly 20 years. We’re friends. We see movies together, and once even went to Romania on a mini-break to attend an auction of Nicolae Ceausescu’s belongings. But it would be wrong to characterise our friendship as frivolous. Most of the time when we’re together I’m just intensely cross-questioning him about some new book idea I have.

Sometimes Adam will say something that seems baffling and wrong at the time, but makes perfect sense a few years later. I could give you lots of examples, but here’s one: I’m about to publish a book – So You’ve Been Publicly Shamed – about how social media is evolving into a cold and conservative place, a giant echo chamber where what we believe is constantly reinforced by people who believe the same thing, and when people step out of line in the smallest ways we destroy them. Adam was warning me about Twitter’s propensity to turn this way six years ago, when it was still a Garden of Eden. Sometimes talking to Adam feels like finding the results of some horse race of the future, where the long-shot horse wins.

I suppose it’s no surprise that Adam would notice this stuff about social media so early on. It’s what his films are almost always about – power and social control. However, people don’t only enjoy them for the subject matter, but for how they look, too – his wonderful, strange use of archive.

His new film, Bitter Lake, is his most experimental yet. And I think it’s his best. It’s still journalism: it’s about our relationship with Afghanistan, and how we don’t know what to do, and so we just repeat the mistakes of the past. But he’s allowed his use of archive to blossom crazily. Fifty percent of the film has no commentary. Instead, he’s created this dreamlike, fantastical collage from historical footage and raw, unedited news footage. Sometimes it’s just a shot of a man walking down a road in some Afghan town, and you don’t know why he’s chosen it, and then something happens and you think, ‘Ah!’ (Or, more often, ‘Oh God.’) It might be something small and odd. Or it might be something huge and terrible.

Nightmarish things happen in Bitter Lake. There are shots of people dying. It’s a film that could never be on TV. It’s too disturbing. And it’s too long as well – nearly two and a half hours. And so he’s putting it straight onto BBC iPlayer. I think, with this film, he’s invented a whole new way of telling a nonfiction story.

VICE asked the two of us to have an email conversation about his work. We started just before Christmas, and carried on until after the New Year. [Continue reading…]
