Amy Ellis Nutt writes: For starters, he is the father of Western science and Western philosophy. He invented formal logic and the scientific method and wrote the first books about biology, physics, astronomy and psychology. Freedom and democracy, justice and equality, the importance of a middle class and the dangers of credit — they’re just a sampling of Aristotle’s political and economic principles. And, yes, Christianity, Islam and our Founding Fathers also owe him a lot.
Nearly two and a half millennia after Aristotle’s birth, we now know where his ashes most likely were laid to rest: in the city of his birth, Stagira, on a small, picturesque peninsula in northern Greece.
“We have no [concrete] evidence, but very strong indications reaching almost to certainty,” archaeologist Kostas Sismanidis said through a translator at this week’s World Congress celebrating “Aristotle 2400 Years.” [Continue reading…]
The Guardian reports: Four helipads will cluster around one of the largest domes in the world, like sideplates awaiting the unveiling of a momentous main course, which will be jacked up 45 storeys into the sky above the deserts of Mecca. It is the crowning feature of the holy city’s crowning glory, the superlative summit of what will be the world’s largest hotel when it opens in 2017.
With 10,000 bedrooms and 70 restaurants, plus five floors for the sole use of the Saudi royal family, the £2.3bn Abraj Kudai is an entire city of five-star luxury, catering to the increasingly high expectations of well-heeled pilgrims from the Gulf.
Modelled on a “traditional desert fortress”, seemingly filtered through the eyes of a Disneyland imagineer with classical pretensions, the steroidal scheme comprises 12 towers teetering on top of a 10-storey podium, which houses a bus station, shopping mall, food courts, conference centre and a lavishly appointed ballroom.
Located in the Manafia district, just over a mile south of the Grand Mosque, the complex is funded by the Saudi Ministry of Finance and designed by the Dar Al-Handasah group, a 7,000-strong global construction conglomerate that turns its hand to everything from designing cities in Kazakhstan to airports in Dubai. For the Abraj Kudai, it has followed the wedding-cake pastiche style of the city’s recent hotel boom: cornice is piled upon cornice, with fluted pink pilasters framing blue-mirrored windows, some arched with a vaguely Ottoman air. The towers seem to be packed so closely together that guests will be able to enjoy views into each other’s rooms.
“The city is turning into Mecca-hattan,” says Irfan Al-Alawi, director of the UK-based Islamic Heritage Research Foundation, which campaigns to try to save what little heritage is left in Saudi Arabia’s holy cities. “Everything has been swept away to make way for the incessant march of luxury hotels, which are destroying the sanctity of the place and pricing normal pilgrims out.”
The Grand Mosque is now loomed over by the second tallest building in the world, the Abraj al-Bait clocktower, home to thousands more luxury hotel rooms, where rates can reach £4,000 a night for suites with the best views of the Kaaba – the black cube at the centre of the mosque around which Muslims must walk. The hotel rises 600m (2,000ft) into the air, projecting a dazzling green laser-show by night, on a site where an Ottoman fortress once stood – razed for development, along with the hill on which it sat.
The list of heritage crimes goes on, driven by state-endorsed Wahhabism, the hardline interpretation of Islam that perceives historical sites as encouraging sinful idolatry – which spawned the ideology that is now driving Isis’s reign of destruction in Syria and Iraq. [Continue reading…]
The construction of towering luxury hotels in Mecca seems to conflict with what can be described as the leveling effect for pilgrims performing the annual Hajj.
A 2008 Harvard study, which compared the attitudes of 800 successful Hajj lottery applicants from Pakistan with those of an equal number of unsuccessful ones, found:
Hajjis have more positive views about people from other Muslim countries and are more likely to believe that different Pakistani ethnic and Islamic sectarian groups are equal and that they can live in harmony. Despite non-Muslims not being part of the hajj experience, these views also extend to adherents of other religions: Pilgrims are 22 percent more likely to declare that people of different religions are equal and 11 percent more likely to state that different religions can live in harmony by compromising over their disagreements.
Paralleling the findings on tolerance, hajjis report more positive views on women’s abilities, greater concern for their quality of life, and are also more likely to favor educating girls and women participating in the workforce.
Hajjis are also less likely to support the use of violence and show no evidence of any increased hostility toward the West. They are more than twice as likely to declare that the goals of Osama bin Laden are incorrect, more likely to express a preference for peace between Pakistan and India, and more likely to declare that it is incorrect to physically punish someone if they have dishonored the family. Hajjis also become more sensitive to crimes against women.
It thus seems that in many respects, the value of Hajj has less to do with the quality of accommodation available to pilgrims than it does with the avenues of access.
“These are the last days of Mecca,” Alawi tells The Guardian. “The pilgrimage is supposed to be a spartan, simple rite of passage, but it has turned into an experience closer to Las Vegas, which most pilgrims simply can’t afford.”
Ryan Ruby writes: For a word that literally means definition, the aphorism is a rather indefinite genre. It bears a family resemblance to the fragment, the proverb, the maxim, the hypomnema, the epigram, the mantra, the parable, and the prose poem. Coined sometime between the fifth and third centuries BC as the title for one of the books of the Corpus Hippocraticum, the Aphorismi were originally a compendium of the latest medical knowledge. The penultimate aphorism, “In chronic disease an excessive flux from the bowels is bad,” is more representative of the collection’s contents than the first — “Life is short, art is long” — for which it is best known.
But in those six words lies a clue to the particular space aphorisms were supposed to define. Thanks to a semantic slippage between the Greek word techne and its English translation (via the Latin ars), the saying is often taken to mean that the works of human beings outlast their days. But in its original context, Hippocrates or his editors probably intended something more pragmatic: the craft of medicine takes a long time to learn, and physicians have a short time in which to learn it. Although what aphorisms have in common with the forms listed above is their brevity, what is delimited by the aphorism is not the number of words in which ideas are expressed but the scope of their inquiry. Unlike Hebrew proverbs, in which the beginning of wisdom is the fear of God, the classical aphorism is a secular genre concerned with the short span of time we are allotted on earth. Books of aphorisms are also therapeutic in nature, collections of practical wisdom through which we can rid ourselves of unnecessary suffering and achieve what Hippocrates’ contemporary Socrates called eudaimonia, the good life.
This is certainly what the Stoic philosopher Arrian had in mind when he whittled down the discourses of his master, Epictetus, into a handbook of aphorisms. The Enchiridion is composed of that mixture of propositional assertion and assertive imperative that is now a hallmark of the form. In it, Epictetus, a former slave, outlines the Stoic view that, while “some things are in our control,” most things are ruled by fate. The way to the good life is to bring what is up to us — our attitudes, judgments, and desires — into harmony with what is not up to us: what happens to our bodies, possessions, and reputations. If we accept that what does happen must happen, we will never be disappointed by vain hopes or sudden misfortunes. Our dispositions, not our destinies, are the real source of our unhappiness. [Continue reading…]
Tania Lombrozo writes: Researchers have studied how people think about humans in relation to the natural world, and how the way we reason about humans and other animals changes over the course of development and as a function of education and culture.
The findings from this body of work suggest that by age 5, Western children growing up in urban environments are anomalous in the extent to which they regard humans as central to the biological world. Much of the rest of the world — including 3-year-olds, 5-year-olds in rural environments and adults from indigenous populations in South America — are more inclined to think about humans as one animal species among others, at least when it comes to reasoning about the properties that human and non-human animals are likely to possess.
To illustrate, consider a study by Patricia Herrmann, Sandra Waxman and Douglas Medin published in the Proceedings of the National Academy of Sciences in 2010. In one experiment, 64 urban children, aged 3 or 5, were asked a series of questions that assessed their willingness to generalize an unknown property from one object to another. For instance, they might be told that people “have andro inside,” and would then have to guess whether it’s right or wrong to say that dogs “have andro inside.”
The findings with 5-year-olds replicated classic work in developmental psychology and suggested a strong “anthropocentric” bias: The children were more likely to generalize from humans to non-humans than the other way around, consistent with a privileged place for humans in the biological world. The 3-year-olds, by contrast, showed no signs of this bias: They generalized from humans to non-humans and from non-humans to humans in just the same way. These findings suggest that an anthropocentric perspective isn’t a necessary starting point for human reasoning about the biological world, but rather a perspective we acquire through experience.
So what happens between the ages of 3 and 5 to induce an anthropocentric bias?
Perhaps surprisingly, one influence seems to be anthropomorphism in storybooks. [Continue reading…]
Specifically, something is undermining young people’s mental health, especially girls.
In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely to see a professional for mental health issues as those in the 1980s; more teens struggled to remember things in 2010-2012 compared to the earlier period; and 73% more reported trouble sleeping compared to their peers in the 1980s. These so-called “somatic” or “of-the-body” symptoms strongly predict depression.
“It indicates a lot of suffering,” Twenge told Quartz.
It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. And while the number of kids who reported feeling depressed spiked in the 1980s and 1990s and later fell, since 2008 it has started rising again.
Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and everyone aged 6-18 is seeking more mental health services, and more medication.
The trend is not a uniquely American phenomenon: In the UK, the number of teenagers aged 15-16 with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (those in Poland and Macedonia were the only ones who were more unhappy).
“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.
Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.
Twenge has observed a notable shift away from internal, or intrinsic goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.
Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.
“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.
“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]
Kacem El Ghazzali writes: When we say that nowadays to call for sexual freedom in Arab and Muslim societies is more dangerous than the demand to topple monarchies or dictatorial regimes, we are not playing with metaphor or attempting to gain sympathy. We are stating a bitter and painful fact of the reality in which we are living.
In Arab and Muslim milieus, sex is considered a means and not an end, hedged by many prickly restrictions that make it an objectionable matter and synonymous with sin. Its function within marriage is confined to procreation and nothing else, and all sexual activity outside the institution of marriage is banned legally and rejected socially. Innocent children born out of wedlock are socially rejected and considered foundlings.
This situation cannot be said to be characteristic of Arab societies only, but we experience these miseries in far darker and more intense ways than in other countries. This is especially so because of the dominance of machismo, which considers a man’s sexual adventures as heroics worthy of pride, while a woman who dares to give in to her sexual desires is destined to be killed — or at best beaten and expelled from home — because she has brought dishonor upon her family. [Continue reading…]
Antonia Malchik writes: The ranch my mother was born on was not built solely by her family’s labour. It relied on water aquifers deep beneath the surface, the health of soil on plains and hills beyond their borders, on hundreds – perhaps thousands – of years of care by the Blackfoot tribe whose land it should have remained, the weather over which they had no control, the sun, seeds, and a community who knew in their bones that nobody could do this alone. These things comprised an ecosystem that was vital to their survival, and the same holds true today. These are our shared natural resources, or what was once known as ‘the commons’.
We live on and in the commons, even if we don’t recognise it as such. Every time we take a breath, we’re drawing from the commons. Every time we walk down a road we’re using the commons. Every time we sit in the sunshine or shelter from the rain, listen to birdsong or shut our windows against the stench from a nearby oil refinery, we are engaging with the commons. But we have forgotten the critical role that the commons play in our existence. The commons make life possible. Beyond that, they make private property possible. When the commons become degraded or destroyed, enjoyment and use of private property become untenable. A Montana rancher could own ten thousand acres and still be dependent on the health of the commons. Neither a gated community nor high-rise penthouse apartments can close a human being off from the wider world that we all rely on. [Continue reading…]
Robert Kaplan writes: Orientalism, through which one culture appropriated and dominated another, is slowly evaporating in a world of cosmopolitan interactions and comparative studies, as [Edward] Said intuited it might. Europe has responded by artificially reconstructing national-cultural identities on the extreme right and left, to counter the threat from the civilization it once dominated.
Although the idea of an end to history — with all its ethnic and territorial disputes — turns out to have been a fantasy, this realization is no excuse for a retreat into nationalism. The cultural purity that Europe craves in the face of the Muslim-refugee influx is simply impossible in a world of increasing human interactions.
“The West,” if it does have a meaning beyond geography, manifests a spirit of ever more inclusive liberalism. Just as in the 19th century there was no going back to feudalism, there is no going back now to nationalism, not without courting disaster. [Continue reading…]
Jenny Anderson writes: Many of us worry about what technology is doing to our kids. A cascade of reports shows that their addiction to iAnything is diminishing empathy, increasing bullying, robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.
Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.
A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”
Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation, or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.
Unfortunately, it seems we parents are the solution. (Newsflash, kids aren’t going to give up their devices because they are worried about how it may influence their future ability to empathize.)
That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]
Kenan Malik writes: Cultural appropriation is, in the words of Susan Scafidi, professor of law at Fordham University, and author of Who Owns Culture? Appropriation and Authenticity in American Law, “Taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission”. This can include the “unauthorised use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”
But what is it for knowledge or an object to “belong” to a culture? And who gives permission for someone from another culture to use such knowledge or forms?
The idea that the world could be divided into distinct cultures, and that every culture belonged to a particular people, has its roots in late 18th-century Europe.
The Romantic movement, which developed in part in opposition to the rationalism of the Enlightenment, celebrated cultural differences and insisted on the importance of “authentic” ways of being.
For Johann Gottfried Herder, the German philosopher who best articulated the Romantic notion of culture, what made each people – or “volk” – unique was its particular language, history and modes of living. The unique nature of each volk was expressed through its “volksgeist” – the unchanging spirit of a people refined through history.
Herder was no reactionary – he was an important champion of equality – but his ideas about culture were adopted by reactionary thinkers. Those ideas became central to racial thinking – the notion of the volksgeist was transformed into the concept of racial make-up – and fuelled the belief that non-Western societies were “backward” because of their “backward” cultures.
Radicals challenging racism and colonialism rejected the Romantic view of culture, adopting instead a universalist perspective. From the struggle against slavery to the anti-colonial movements, the aim was not to protect one’s own special culture but to create a more universal culture in which all could participate on equal terms.
In recent decades, however, the universalist viewpoint has eroded, largely as many of the social movements that embodied that viewpoint have disintegrated. The social space vacated by that disintegration became filled by identity politics.
As the broader struggles for social transformation have faded, people have tended to retreat into their particular faiths or cultures, and to embrace more parochial forms of identity. In this process, the old cultural arguments of the racists have returned, but now rebranded as “antiracist”.
But how does creating gated cultures, and preventing others from trespassing upon one’s culture without permission, challenge racism or promote social justice? [Continue reading…]
Governments in Britain have tended to treat Muslim citizens much like colonial administrations treated their subjects. Intermediaries – tribal leaders or religious figures – are found to establish communication between the empire and its people. One positive thing about a recent ICM poll of British Muslims is that it offers an alternative. The survey, carried out for a Channel 4 documentary, was never going to be able to reflect the complexity of British Muslim life accurately, but it does signal a shift by engaging directly with Muslim citizens.
How poll data is used is one way to test how colonialism’s legacy might linger on. The Daily Mail chose for its headline the quote: “Muslims are not like us and we should just accept that they will not integrate …” while Sky News highlighted that: “Half of British Muslims want homosexuality banned.”
Few media outlets rushed to use the headline that “86% of Muslims feel strong affiliation with UK, higher than the national average”, although this too is one of the findings from the survey. It is an “us and them” framework that fails to spark debate about who “we” might be and why “they”, with all their differences, might need greater integration with us, as the report has suggested.
We don’t have space here to discuss how the category Muslim may be broken up across class, regional or ethnic background. Nor will we get into comparisons with others: whether, for instance, British Catholics, or for that matter, members of the Conservative Party, might have similar sentiments towards homosexuality.
Frans de Waal writes: Tickling a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself.
The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter.
Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal.
The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.” Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners.
Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. [Continue reading…]
Salman Rushdie writes: As we honour the four hundredth anniversaries of the deaths of William Shakespeare and Miguel de Cervantes Saavedra, it may be worth noting that while it’s generally accepted that the two giants died on the same date, 23 April 1616, it actually wasn’t the same day. By 1616 Spain had moved on to using the Gregorian calendar, while England still used the Julian, and was 10 days behind. (England clung to the old Julian dating system until 1752, and when the change finally came, there were riots and, it’s said, mobs in the streets shouting, “Give us back our 11 days!”) Both the coincidence of the dates and the difference in the calendars would, one suspects, have delighted the playful, erudite sensibilities of the two fathers of modern literature.
We don’t know if they were aware of each other, but they had a good deal in common, beginning right there in the “don’t know” zone, because they are both men of mystery; there are missing years in the record and, even more tellingly, missing documents. Neither man left behind much personal material. Very little to nothing in the way of letters, work diaries, abandoned drafts; just the colossal, completed oeuvres. “The rest is silence.” Consequently, both men have been prey to the kind of idiot theories that seek to dispute their authorship.
A cursory internet search “reveals”, for example, that not only did Francis Bacon write Shakespeare’s works, he wrote Don Quixote as well. (My favourite crazy Shakespeare theory is that his plays were not written by him but by someone else of the same name.) And of course Cervantes faced a challenge to his authorship in his own lifetime, when a certain pseudonymous Alonso Fernández de Avellaneda, whose identity is also uncertain, published his fake sequel to Don Quixote and goaded Cervantes into writing the real Book II, whose characters are aware of the plagiarist Avellaneda and hold him in much contempt. [Continue reading…]
Michael Kimmelman writes: Squares have defined urban living since the dawn of democracy, from which they are inseparable. The public square has always been synonymous with a society that acknowledges public life and a life in public, which is to say a society distinguishing the individual from the state. There were, strictly speaking, no public squares in ancient Egypt or India or Mesopotamia. There were courts outside temples and royal houses, and some wide processional streets.
By the sixth century BC, the agora in Athens was a civic center, and with the rise of democracy, became a center for democracy’s institutions, the heart of public life. In ancient Greek, the word “agora” is hard to translate. In Homer it could imply a “gathering” or “assembly”; by the time of Thucydides it had come to connote the public center of a city, the place around which the rest of the city was arranged, where business and politics were conducted in public — the place without which Greeks did not really regard a town or city as a town or city at all. Rather, such a place was, as the second-century writer Pausanias roughly put it, just a sorry assortment of houses and ancient shrines.
The agora announced the town as a polis. Agoras grew in significance during the Classical and Hellenistic years, physical expressions of civic order and life, with their temples and fishmongers and bankers at money-changing tables and merchants selling oil and wine and pottery. Stoas, or colonnades, surrounded the typical agora, and sometimes trees provided shade. People who didn’t like cities, and disliked democracy in its messiness, complained that agoras mixed religious and sacrilegious life, commerce, politics, and theater. But of course that was also their attraction and significance. The agora symbolized civil justice; it was organic, changeable, urbane. Even as government moved indoors and the agora evolved over time into the Roman forum, a grander, more formal place, the notion of the public square as the soul of urban life remained, for thousands of years, critical to the self-identity of the state.
I don’t think it’s coincidental that early in 2011 the Egyptian revolution centered around Tahrir Square, or that the Occupy Movement later that same year, partly inspired by the Arab Spring, expressed itself by taking over squares like Taksim in Istanbul, the Plaça de Catalunya in Barcelona, and Zuccotti Park in Lower Manhattan. And I don’t think it’s coincidental that the strangers who came together at places like Zuccotti and Taksim all formed pop-up towns on these sites, producing in miniature form (at least temporarily) what they imagined to be the outlines of a city, with distinct spaces designated for legal services, libraries, medical stations, media centers, kitchens serving free food, and general stores handing out free clothing. [Continue reading…]
Stephen Greenblatt writes: A few years ago, during a merciful remission in the bloodshed and mayhem that has for so many years afflicted Afghanistan, a young Afghan poet, Qais Akbar Omar, had an idea. It was, he brooded, not only lives and livelihood that had been ruthlessly attacked by the Taliban, it was also culture. The international emblem of that cultural assault was the dynamiting of the Bamiyan Buddhas, but the damage extended to painting, music, dance, fiction, film, and poetry. It extended as well to the subtle web of relations that link one culture to another across boundaries and make us, each in our provincial worlds, feel that we are part of a larger humanity. This web is not only a contemporary phenomenon, the result of modern technology; it is as old as culture itself, and it has been particularly dense and vital in Afghanistan with its ancient trade routes and its endless succession of would-be conquerors.
Omar thought that the time was ripe to mark the restoration of civil society and repair some of the cultural damage. He wanted to stage a play with both men and women actors performing in public in an old garden in Kabul. He chose a Shakespeare play. No doubt the choice had something to do with the old imperial presence of the British in Afghanistan, but it was not only this particular history that was at work. Shakespeare is the embodiment worldwide of a creative achievement that does not remain within narrow boundaries of the nation-state or lend itself to the secure possession of a particular faction or speak only for this or that chosen group. He is the antithesis of intolerant provinciality and fanaticism. He could make with effortless grace the leap from Stratford to Kabul, from English to Dari.
Omar did not wish to put on a tragedy; his country, he thought, had suffered through quite enough tragedy of its own. Considering possible comedies, he shied away from those that involved cross-dressing. It was risky enough simply to have men and women perform together on stage. In the end he chose Love’s Labour’s Lost, a comedy that arranged the sexes in distinct male and female groups, had relatively few openly transgressive or explicitly erotic moments, and decorously deferred the final consummation of desire into an unstaged future. As a poet, Omar was charmed by the play’s gorgeous language, language that he felt could be rendered successfully in Dari.
The complex story of the mounting of the play is told in semifictionalized form in a 2015 book Omar coauthored with Stephen Landrigan, A Night in the Emperor’s Garden. Measured by the excitement it generated, this production of Love’s Labour’s Lost was a great success. The overflow crowds on the opening night gave way to ever-larger crowds clamoring to get in, along with worldwide press coverage.
But the attention came at a high price. The Taliban took note of Shakespeare in Kabul and what it signified. In the wake of the production, virtually everyone involved in it began to receive menacing messages. Spouses, children, and the extended families of the actors were not exempt from harassment and warnings. The threats were not idle. The husband of one of the performers answered a loud knock on the door one night and did not return. His mutilated body was found the next morning.
What had seemed like a vigorous cultural renaissance in Afghanistan quickly faded and died. In the wake of the resurgence of the Taliban, Qais Akbar Omar and all the others who had had the temerity to mount Shakespeare’s delicious comedy of love were in terrible trouble. They are now, every one of them, in exile in different parts of the world.
Love’s labors lost indeed. But the subtitle of Omar’s account—“A True Story of Hope and Resilience in Afghanistan”—is not or at least not only ironic. The humane, inexhaustible imaginative enterprise that Shakespeare launched more than four hundred years ago in one small corner of the world is more powerful than all the oppressive forces that can be gathered against it. [Continue reading…]