Lee Vinsel & Andrew Russell write: Innovation is a dominant ideology of our era, embraced in America by Silicon Valley, Wall Street, and the Washington DC political elite. As the pursuit of innovation has inspired technologists and capitalists, it has also provoked critics who suspect that its peddlers radically overvalue it. What happens after innovation, they argue, is more important. Maintenance and repair, the building of infrastructures, the mundane labour that keeps them functioning and efficient – all this simply has more impact on people’s daily lives than the vast majority of technological innovations.
The fates of nations on opposing sides of the Iron Curtain illustrate the good reasons behind the rise of innovation as a buzzword and organising concept. Over the course of the 20th century, open societies that celebrated diversity, novelty, and progress performed better than closed societies that defended uniformity and order.
In the late 1960s, in the face of the Vietnam War, environmental degradation, the Kennedy and King assassinations, and other social and technological disappointments, it grew more difficult for many to keep faith in moral and social progress. In place of progress arose ‘innovation’: a smaller, morally neutral concept. Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement.
Before the dreams of the New Left had been dashed by massacres at My Lai and Altamont, economists had already turned to technology to explain the economic growth and high standards of living in capitalist democracies. Beginning in the late 1950s, the prominent economists Robert Solow and Kenneth Arrow found that traditional explanations – changes in education and capital, for example – could not account for significant portions of growth. They hypothesised that technological change was the hidden X factor. Their finding fit hand-in-glove with all of the technical marvels that had come out of the Second World War, the Cold War, the post-Sputnik craze for science and technology, and the post-war vision of material abundance. [Continue reading…]
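The ‘hidden X factor’ has a standard formulation in growth accounting. In textbook notation (the symbols here are the convention, not the excerpt’s), output growth is decomposed into the contributions of capital and labour, and whatever is left over – the Solow residual – is attributed to technological change:

```latex
g_A \;=\; g_Y \;-\; \alpha\, g_K \;-\; (1 - \alpha)\, g_L
```

where g_Y, g_K and g_L are the growth rates of output, capital and labour, and α is capital’s share of income. The surprise was the size of g_A: measured inputs accounted for only a fraction of growth per worker.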
Will we know extraterrestrial life when we see it?
Tina Hesman Saey writes: In a 1967 episode of Star Trek, Captain Kirk and crew investigated the mysterious murders of miners on the planet Janus VI. The killer, it turned out, was a rock monster called the Horta. But the Enterprise’s sensors hadn’t registered any signs of life in the creature. The Horta was a silicon-based life-form, rather than carbon-based like living things on Earth.
Still, it didn’t take long to determine that the Horta was alive. The first clue was that it skittered about. Spock closed the case with a mind meld, learning that the creature was the last of its kind, protecting its throng of eggs.
But recognizing life on different worlds isn’t likely to be this simple, especially if the recipe for life elsewhere doesn’t use familiar ingredients. There may even be things alive on Earth that have been overlooked because they don’t fit standard definitions of life, some scientists suspect. Astrobiologists need some ground rules — with some built-in wiggle room — for when they can confidently declare, “It’s alive!”
Among the researchers working out those rules is theoretical physicist Christoph Adami, who watches his own version of silicon-based life grow inside a computer at Michigan State University in East Lansing.
“It’s easy when it’s easy,” Adami says. “If you find something walking around and waving at you, it won’t be that hard to figure out that you’ve found life.” But chances are, the first aliens that humans encounter won’t be little green men. They will probably be tiny microbes of one color or another — or perhaps no color at all.
Trying to figure out how to recognize those alien microbes, especially if they are very strange, has led scientists to propose some basic criteria for distinguishing living from nonliving things. Many researchers insist that features such as active metabolism, reproduction and Darwinian evolution are de rigueur for any life, including extraterrestrials. Others add the requirement that life must have cells big enough to contain protein-building machines called ribosomes.
But such definitions can be overly restrictive. A list of specific criteria for life may give scientists tunnel vision, blinding them to the diversity of living things in the universe, especially in extreme environments, says philosopher of science Carol Cleland of the University of Colorado Boulder. Narrow definitions will “act as blinkers if you run into a form of life that’s very different.”
Some scientists, for instance, say viruses aren’t alive because they rely on their host cells to reproduce. But Adami disagrees. “There’s no doubt in my mind that biochemical viruses are alive,” he says. “They don’t carry with them everything they need to survive, but neither do we.” What’s important, Adami says, is that viruses transmit genetic information from one generation to another. Life, he says, is information that replicates.
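Adami’s criterion – information that replicates – is concrete enough to run on a computer, which is exactly what his digital organisms do. Below is a toy sketch loosely in the spirit of digital-life systems such as Avida, which Adami helped create; the bit-string genomes, mutation rate, and fitness rule are all invented for illustration:

```python
import random

# Toy "information that replicates": bit-string genomes copy themselves with
# occasional mutation, and genomes scoring higher on an arbitrary fitness rule
# replicate more often. Invented for illustration; this is not Adami's code.
random.seed(0)
GENOME_LEN = 20
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(50)]

def fitness(genome):
    return sum(genome) + 1  # arbitrary rule: more 1-bits -> more offspring

for generation in range(40):
    weights = [fitness(g) for g in population]
    parents = random.choices(population, weights=weights, k=len(population))
    population = [
        [bit ^ (random.random() < 0.01) for bit in genome]  # rare bit-flips
        for genome in parents
    ]

print(sum(map(fitness, population)) / len(population))  # rises over generations
```

After a few dozen generations the replicating information has adapted to its (arbitrary) environment, with no metabolism or membrane required – which is the thrust of Adami’s point about viruses.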
Darwinian evolution should be off the table, too, Cleland says. Humans probably won’t be able to tell at a quick glance whether something is evolving, anyway. “Evolvability is hard to detect,” she says, “because you’ve got a snapshot and you don’t have time to hang around and watch it evolve.” [Continue reading…]
Dolphins are helping us search for aliens
Daniel Oberhaus writes: When twelve men gathered at the Green Bank Observatory in West Virginia in 1961 to discuss the art and science of alien hunting, the Order of the Dolphin was born. A number of the brightest minds from a range of scientific disciplines, including three Nobel laureates, a young Carl Sagan, and an eccentric neuroscientist named John Lilly — who was best known for trying to talk to dolphins — were in attendance.
It was Lilly’s research that inspired the group’s name: If humans couldn’t even communicate with animals that shared most of our evolutionary history, he believed, they were a bit daft to think they could recognize signals from a distant planet. With that in mind, the Order of the Dolphin set out to determine what our ocean-going compatriots here on Earth might be able to teach us about talking to extraterrestrials.
Lilly’s work on interspecies communication has since gone in and out of vogue several times within the SETI (Search for Extraterrestrial Intelligence) community. Today, it’s back in fashion, thanks to new applications of information theory and to technological advancements, such as the Cetacean Hearing and Telemetry (CHAT) device, a submersible computer interface that establishes basic communication with dolphins. The return to dolphins as a model for alien intelligence came in 1999, when SETI Institute astronomer Laurance Doyle proposed using information theory to analyze animal communication systems, particularly the whistle repertoire of bottlenose dolphins. [Continue reading…]
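The information-theoretic measures Doyle applied are standard ones. A minimal sketch, assuming the whistles have already been categorised into discrete types – the toy repertoire below is invented:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """First-order Shannon entropy of a symbol sequence, in bits per symbol."""
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in Counter(seq).values())

def zipf_slope(seq):
    """Least-squares slope of log(frequency) vs log(rank)."""
    freqs = sorted(Counter(seq).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical toy repertoire: letters stand in for categorised whistle types.
repertoire = list("AAAAABBBCCD")
print(f"entropy: {shannon_entropy(repertoire):.2f} bits/symbol")
print(f"Zipf slope: {zipf_slope(repertoire):.2f}")
```

Human languages famously show a Zipf slope near −1, and Doyle and colleagues have reported something similar for bottlenose whistle repertoires – which is why such statistics look promising as a language-agnostic first filter for candidate alien signals.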
Technology is not ruining our kids. Parents (and their technology) are ruining them
Jenny Anderson writes: Many of us worry about what technology is doing to our kids. A cascade of reports shows that their addiction to iAnything is diminishing empathy, increasing bullying (pdf), and robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.
Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.
A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”
Turkle explains the cost of too much technology in stark terms: our children can’t engage in conversation or experience solitude, which makes it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.
Unfortunately, it seems we parents are the solution. (Newsflash, kids aren’t going to give up their devices because they are worried about how it may influence their future ability to empathize.)
That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]
Processing high-level math concepts uses the same neural networks as in basic math skills
Scientific American reports: Alan Turing, Albert Einstein, Stephen Hawking, John Nash — these “beautiful” minds never fail to enchant the public, but they also remain somewhat elusive. How do some people progress from being able to perform basic arithmetic to grasping advanced mathematical concepts and thinking at levels of abstraction that baffle the rest of the population? Neuroscience has now begun to pin down whether the brain of a math whiz somehow takes conceptual thinking to another level.
Specifically, scientists have long debated whether the basis of high-level mathematical thought is tied to the brain’s language-processing centers — that thinking at such a level of abstraction requires linguistic representation and an understanding of syntax — or to independent regions associated with number and spatial reasoning. In a study published this week in Proceedings of the National Academy of Sciences, a pair of researchers at the INSERM–CEA Cognitive Neuroimaging Unit in France reported that the brain areas involved in math are different from those engaged in equally complex nonmathematical thinking.
The team used functional magnetic resonance imaging (fMRI) to scan the brains of 15 professional mathematicians and 15 nonmathematicians of the same academic standing. While in the scanner the subjects listened to a series of 72 high-level mathematical statements, divided evenly among algebra, analysis, geometry and topology, as well as 18 high-level nonmathematical (mostly historical) statements. They had four seconds to reflect on each proposition and determine whether it was true, false or meaningless.
The researchers found that in the mathematicians only, listening to math-related statements activated a network involving bilateral intraparietal, dorsal prefrontal, and inferior temporal regions of the brain. This circuitry is usually not associated with areas involved in language processing and semantics, which were activated in both mathematicians and nonmathematicians when they were presented with the nonmathematical statements. “On the contrary,” says study co-author and graduate student Marie Amalric, “our results show that high-level mathematical reflection recycles brain regions associated with an evolutionarily ancient knowledge of number and space.” [Continue reading…]
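At its core the comparison rests on a within-subject contrast between the two statement types. The sketch below illustrates only that logic, with invented numbers; the actual study fit voxel-wise models that this does not reproduce:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-subject activation estimates (e.g. GLM betas) in one region
# of interest, for math vs nonmath statements. All numbers are simulated.
rng = np.random.default_rng(0)
n_subjects = 15
math_betas = rng.normal(1.0, 0.5, n_subjects)     # responses to math statements
nonmath_betas = rng.normal(0.2, 0.5, n_subjects)  # responses to control statements

t, p = ttest_rel(math_betas, nonmath_betas)       # paired test across subjects
print(f"math > nonmath contrast: t = {t:.2f}, p = {p:.4f}")
```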
The combustion engines of life
Alex Riley writes: For a large part of his life, Charles Darwin didn’t like peacocks. It wasn’t their loud vocalisations – a high-pitched, piercing combo of laughter and screaming. That he could deal with. What kept him up at night was the peacocks’ tails. As he wrote to a friend in 1860, the sight of those ornate feathers made him feel sick whenever he gazed at them. Why? Because he couldn’t explain them. The plumes of turquoise, blue and brown, trailing behind many times the bird’s body length or spread into a wide fan of flamboyancy, were an affront to his theory of evolution by natural selection, a process founded on efficiency and the removal of extravagance.
Not only is such a train of feathers metabolically costly, it is also readily visible to any carnivore looking for an easy meal. With all the predation, pathogens and diseases that living things need to overcome, Darwin wondered, how could such self-destructive beauty evolve? Why would an animal go to such extremes to make life harder, and death more likely? He finally hit on a plausible answer in 1871. In the second part of his book The Descent of Man, he explained that there is more to life than mere survival. Animals need to have sex, too. And because females are often more heavily invested than males in egg production and parental care, they are more likely to take the lead in choosing mates, too. As Darwin wrote: ‘It’s not a struggle for existence, but a struggle between the males for the possession of the females.’
Female choice takes many forms. In some species, courting males fight each other, and the female’s decision is made for her. But in many others, males win mates by showing off through ornaments and display. Females choose those males with superior appearance or antics, and over many millennia the selected traits become amplified; nature is red in tooth and claw, but it is also no place for the ugly. In the oceans, male fish flirt using elongated dorsal fins. In the air, male butterflies glisten with iridescent tints of colour. Even ancient dinosaurs weren’t just fighters but flaunters, too. Take Triceratops, an iconic dinosaur known for its three-horned face. Some studies suggest that this species’ large, bony frill, long regarded as a form of protection against predators, was actually a flashy come-hither sign to the opposite sex.
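The amplifying loop Darwin proposed is simple enough to caricature in code. In this toy simulation – every parameter invented – each female samples a handful of males and picks the showiest, and sons inherit their father’s display trait with some noise:

```python
import random

# Toy model of female choice amplifying a male display trait over generations.
random.seed(1)
males = [random.gauss(0.0, 1.0) for _ in range(500)]  # arbitrary trait units

for generation in range(51):
    next_males = []
    for _ in range(500):
        suitors = random.sample(males, 4)   # each female meets four males
        father = max(suitors)               # and chooses the showiest
        next_males.append(father + random.gauss(0.0, 0.3))  # heritable + noise
    males = next_males
    if generation % 10 == 0:
        print(f"gen {generation:2d}: mean trait = {sum(males) / len(males):.2f}")
```

Even with no survival cost in the model, the mean trait climbs steadily; the puzzle Darwin faced is what happens once that climb starts costing lives.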
No males go to greater lengths of seduction than do the dinosaurs’ modern-day feathered descendants. As Darwin noted, birds are the most aesthetically elaborate of all animal groups: ‘They are ornamented by all sorts of combs, wattles, protuberances, horns, air-distended sacks, top-knots, naked shafts, plumes, and lengthened feathers gracefully springing from all parts of the body.’ And they don’t stop with physical display. Birds of paradise – a group of 42 species from deep within the rainforests of New Guinea – augment their gaudy plumage with carefully choreographed rituals of courtship. In Australia, bowerbirds construct gardens of locally collected objects, each categorised by colour, texture and shape. Songbirds sing for love.
If you want to understand male ornamentation, then, birds are the animals to study. And there is a lot left to understand, because Darwin merely scraped the surface. He never fully answered his own question about the value of sexual displays. When a peahen chooses a particular peacock with beautiful outspread feathers, what exactly is it that she is choosing? What is the peacock displaying? [Continue reading…]
Culture without borders: The history of culture is the history of cultural appropriation
Kenan Malik writes: Cultural appropriation is, in the words of Susan Scafidi, professor of law at Fordham University, and author of Who Owns Culture? Appropriation and Authenticity in American Law, “Taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission”. This can include the “unauthorised use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”
But what is it for knowledge or an object to “belong” to a culture? And who gives permission for someone from another culture to use such knowledge or forms?
The idea that the world could be divided into distinct cultures, and that every culture belonged to a particular people, has its roots in late 18th-century Europe.
The Romantic movement, which developed in part in opposition to the rationalism of the Enlightenment, celebrated cultural differences and insisted on the importance of “authentic” ways of being.
For Johann Gottfried Herder, the German philosopher who best articulated the Romantic notion of culture, what made each people – or “volk” – unique was its particular language, history and modes of living. The unique nature of each volk was expressed through its “volksgeist” – the unchanging spirit of a people refined through history.
Herder was no reactionary – he was an important champion of equality – but his ideas about culture were adopted by reactionary thinkers. Those ideas became central to racial thinking – the notion of the volksgeist was transformed into the concept of racial make-up – and fuelled the belief that non-Western societies were “backward” because of their “backward” cultures.
Radicals challenging racism and colonialism rejected the Romantic view of culture, adopting instead a universalist perspective. From the struggle against slavery to the anti-colonial movements, the aim was not to protect one’s own special culture but to create a more universal culture in which all could participate on equal terms.
In recent decades, however, the universalist viewpoint has eroded, largely as many of the social movements that embodied that viewpoint have disintegrated. The social space vacated by that disintegration became filled by identity politics.
As the broader struggles for social transformation have faded, people have tended to retreat into their particular faiths or cultures, and to embrace more parochial forms of identity. In this process, the old cultural arguments of the racists have returned, but now rebranded as “antiracist”.
But how does creating gated cultures, and preventing others from trespassing upon one’s culture without permission, challenge racism or promote social justice? [Continue reading…]
How LSD helped us probe what the ‘sense of self’ looks like in the brain
By Nicolas Crossley, King’s College London and Ed Bullmore, University of Cambridge
Every single person is different. We all have different backgrounds, views, values and interests. And yet there is one universal feeling that we all experience at every single moment. Call it an “ego”, a “self” or just an “I” – it’s the idea that our thoughts and feelings are our own, and no one else has access to them in the same way. This may sound a bit like post-war French existentialism or psychoanalysis, but it’s actually a topic that’s being increasingly addressed by neuroscientists.
We were part of a team interested in finding out how this sense of self is expressed in the brain – and what happens when it dissolves. To do that, we used brain imaging and the psychedelic drug LSD.
Our sense of self is something so natural that we are not always fully aware of it. In fact, it is when it is disturbed that it becomes the most noticeable. This could be due to mental illnesses such as psychosis, when people might experience the delusional belief that their thoughts are no longer private, but can be accessed and even modified by other people. Or it could be due to the influence of psychedelic drugs such as LSD, when the user can feel that their ego is “dissolving” and they are becoming at one with the world. From a scientific point of view, these experiences of “ego death” or ego dissolution are also opportunities to search for this sense of self in the brain.
Our study, led by Enzo Tagliazucchi and published in Current Biology, set out to probe what happens in the brain when our sense of self is altered by psychedelic drugs. We studied 15 healthy volunteers before and after taking LSD, which altered their normal sense of self and of their relationship with the environment. These subjects were scanned while intoxicated and while receiving a placebo using functional MRI, a technique which allows us to study the brain’s activity by measuring changes in blood flow. By contrasting the activity of the brain when receiving a placebo with its activity after taking LSD, we could begin exploring the brain mechanisms involved in the normal experience of the self.
Why luck matters more than you might think
Robert H Frank writes: I’m a lucky man. Perhaps the most extreme example of my considerable good fortune occurred one chilly Ithaca morning in November 2007, while I was playing tennis with my longtime friend and collaborator, the Cornell psychologist Tom Gilovich. He later told me that early in the second set, I complained of feeling nauseated. The next thing he knew, I was lying motionless on the court.
He yelled for someone to call 911, and then started pounding on my chest—something he’d seen many times in movies but had never been trained to do. He got a cough out of me, but seconds later I was again motionless with no pulse. Very shortly, an ambulance showed up.
Ithaca’s ambulances are dispatched from the other side of town, more than five miles away. How did this one arrive so quickly? By happenstance, just before I collapsed, ambulances had been dispatched to two separate auto accidents close to the tennis center. Since one of them involved no serious injuries, an ambulance was able to peel off and travel just a few hundred yards to me. EMTs put electric paddles on my chest and rushed me to our local hospital. There, I was loaded onto a helicopter and flown to a larger hospital in Pennsylvania, where I was placed on ice overnight.
Doctors later told me that I’d suffered an episode of sudden cardiac arrest. Almost 90 percent of people who experience such episodes don’t survive, and the few who do are typically left with significant impairments. And for three days after the event, my family tells me, I spoke gibberish. But on day four, I was discharged from the hospital with a clear head. Two weeks later, I was playing tennis with Tom again.
If that ambulance hadn’t happened to have been nearby, I would be dead.
Not all random events lead to favorable outcomes, of course. Mike Edwards is no longer alive because chance frowned on him. Edwards, formerly a cellist in the British pop band the Electric Light Orchestra, was driving on a rural road in England in 2010 when a 1,300-pound bale of hay rolled down a steep hillside and landed on his van, crushing him. By all accounts, he was a decent, peaceful man. That a bale of hay snuffed out his life was bad luck, pure and simple.
Most people will concede that I’m fortunate to have survived and that Edwards was unfortunate to have perished. But in other arenas, randomness can play out in subtler ways, causing us to resist explanations that involve luck. In particular, many of us seem uncomfortable with the possibility that personal success might depend to any significant extent on chance. As E. B. White once wrote, “Luck is not something you can mention in the presence of self-made men.” [Continue reading…]
Brain scans reveal how LSD affects consciousness
Researchers from Imperial College London, working with the Beckley Foundation, have for the first time visualised the effects of LSD on the brain: In a series of experiments, scientists have gained a glimpse into how the psychedelic compound affects brain activity. The team administered LSD (Lysergic acid diethylamide) to 20 healthy volunteers in a specialist research centre and used various leading-edge and complementary brain scanning techniques to visualise how LSD alters the way the brain works.
The findings, published in Proceedings of the National Academy of Sciences (PNAS), reveal what happens in the brain when people experience the complex visual hallucinations that are often associated with the LSD state. They also shed light on the brain changes that underlie the profound altered state of consciousness the drug can produce.
A major finding of the research is the discovery of what happens in the brain when people experience complex dreamlike hallucinations under LSD. Under normal conditions, information from our eyes is processed in a part of the brain at the back of the head called the visual cortex. However, when the volunteers took LSD, many additional brain areas – not just the visual cortex – contributed to visual processing.
Dr Robin Carhart-Harris, from the Department of Medicine at Imperial, who led the research, explained: “We observed brain changes under LSD that suggested our volunteers were ‘seeing with their eyes shut’ – albeit they were seeing things from their imagination rather than from the outside world. We saw that many more areas of the brain than normal were contributing to visual processing under LSD – even though the volunteers’ eyes were closed. Furthermore, the size of this effect correlated with volunteers’ ratings of complex, dreamlike visions.”
The study also revealed what happens in the brain when people report a fundamental change in the quality of their consciousness under LSD.
Dr Carhart-Harris explained: “Normally our brain consists of independent networks that perform separate specialised functions, such as vision, movement and hearing – as well as more complex things like attention. However, under LSD the separateness of these networks breaks down and instead you see a more integrated or unified brain.
“Our results suggest that this effect underlies the profound altered state of consciousness that people often describe during an LSD experience. It is also related to what people sometimes call ‘ego-dissolution’, which means the normal sense of self is broken down and replaced by a sense of reconnection with themselves, others and the natural world. This experience is sometimes framed in a religious or spiritual way – and seems to be associated with improvements in well-being after the drug’s effects have subsided.” [Continue reading…]
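The ‘integration’ described here is typically quantified as a rise in correlation between the average time series of normally distinct brain networks. A synthetic sketch of that measure – the signals below are simulated, not scanner data:

```python
import numpy as np

# Compare between-network correlation under two conditions by giving the two
# networks' time series a weaker or stronger shared component. Purely synthetic.
rng = np.random.default_rng(42)
n_timepoints = 200
shared = rng.normal(size=n_timepoints)  # activity common to both networks

def network_signal(coupling):
    return coupling * shared + rng.normal(size=n_timepoints)

for label, coupling in [("placebo", 0.2), ("LSD", 1.0)]:
    visual = network_signal(coupling)
    frontal = network_signal(coupling)
    r = np.corrcoef(visual, frontal)[0, 1]
    print(f"{label}: between-network correlation r = {r:.2f}")
```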
Amanda Feilding, executive director of the Beckley Foundation, in an address she will deliver to the Royal Society tomorrow, says: I think Albert Hofmann would have been delighted to have his “Problem child” celebrated at the Royal Society, as in his long lifetime the academic establishment never recognised his great contribution. But for the taboo surrounding this field, he would surely have won the Nobel Prize. That was the beginning of the modern psychedelic age, which has fundamentally changed society.
After the discovery of the effects of LSD, there was a burst of excitement in the medical and therapeutic worlds – over 1000 experimental and clinical studies were undertaken. Then, in the early 60s, LSD escaped from the labs and began to spread into the world at large. Fuelled by its transformational insights, a cultural evolution took place, whose effects are still felt today. It sparked a wave of interest in Eastern mysticism, healthy living, nurturing the environment, individual freedoms and new music and art among many other changes. Then the establishment panicked and turned to prohibition, partly motivated by American youth becoming disenchanted with fighting a war in far-off Vietnam.
Aghast at the global devastation caused by the war on drugs, I set up the Beckley Foundation in 1998. With the advent of brain imaging technology, I realised that one could correlate the subjective experience of altered states of consciousness, brought about by psychedelic substances, with empirical findings. I realised that only through the very best science investigating how psychedelics work in the brain could one overcome the misplaced taboo which had transformed them from the food of the gods to the work of the devil. [Continue reading…]
Just to be clear, as valuable as this research is, it is an exercise in map-making. The map should never be confused with the territory.
Yuri Milner is spending $100 million on a probe that could travel to Alpha Centauri within a generation
Ross Andersen writes: In the Southern Hemisphere’s sky, there is a constellation, a centaur holding a spear, its legs raised in mid-gallop. The creature’s front hoof is marked by a star that has long hypnotized humanity with its brightness and, more recently, its proximity.
Since the dawn of written culture, at least, humans have dreamt of star travel. As the nearest star system to Earth, Alpha Centauri is the most natural subject of these dreams. To a certain cast of mind, the star seems destined to figure prominently in our future.
In the four centuries since the Scientific Revolution, a series of increasingly powerful instruments has slowly brought Alpha Centauri into focus. In 1689, the Jesuit priest Jean Richaud fixed his telescope on a comet, as it was streaking through the stick-figure centaur. He was startled to find not one, but two stars twinkling in its hoof. In 1915, a third star was spotted, this one a small, red satellite of the system’s two central, sunlike stars.
To say that Alpha Centauri is the nearest star system to Earth is not to say that it’s near. A 25 trillion mile abyss separates us. Alpha Centauri’s light travels to Earth at the absurd rate of 186,000 miles per second, and still takes more than four years to arrive. [Continue reading…]
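The excerpt’s figures are easy to check as straightforward arithmetic:

```latex
t \;=\; \frac{d}{c}
  \;\approx\; \frac{25 \times 10^{12}\ \text{mi}}{1.86 \times 10^{5}\ \text{mi/s}}
  \;\approx\; 1.34 \times 10^{8}\ \text{s}
  \;\approx\; 4.3\ \text{years}
```

which is indeed “more than four years”, and consistent with the roughly 4.37 light-years usually quoted for the Alpha Centauri system.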
What I learned from tickling apes
Frans de Waal writes: Tickling a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself.
The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only point to a tickling spot, not even touching it, and he will throw another fit of laughter.
Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal.
The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.” Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners.
Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. [Continue reading…]
How to count trees
Zach St. George writes: Gregor Hintler had what seemed like a simple question: How many trees are there? As part of Plant for the Planet, a youth initiative that aimed to plant one billion trees in every country by 2020, he needed a way to figure out how many trees the planet could fit. But when he tried to find out, he realized nobody knew the answer. One estimate suggested 400 billion trees. “That sounds like a lot,” he recalls thinking. “Could be right.” But Hintler, who was then a graduate student in environmental management at Yale University, started looking at data from plots in Germany, Norway, and the United States, where foresters had counted the number of trees. He discovered that the old figures weren’t even close — 400 billion was, in fact, far too low.
Forests cover about one third of the planet’s terrestrial area. They prevent desertification and erosion, store carbon, and provide habitat for millions of species. The recent Paris climate agreement highlights their importance, recommending that signing countries take steps to slow deforestation and enlist their forests in carbon credit markets. Knowing how many trees there are now, and how many there used to be, will help researchers assess human impact on the planet and any options going forward. [Continue reading…]
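The estimation strategy the excerpt gestures at – count trees in sample plots, then scale the densities up by forested area – can be sketched in a few lines. Every number below is an invented placeholder; the published census fit density models per biome, which this does not attempt:

```python
# Back-of-envelope plot-based extrapolation. All figures are hypothetical.
plot_densities = {                      # trees per hectare, from sample plots
    "boreal":    [1100, 950, 1300],
    "temperate": [500, 620, 480],
    "tropical":  [700, 820, 760],
}
forest_area_ha = {                      # hypothetical forested area per biome
    "boreal":    1.1e9,
    "temperate": 0.8e9,
    "tropical":  1.8e9,
}

total = sum(
    (sum(d) / len(d)) * forest_area_ha[biome]   # mean density x biome area
    for biome, d in plot_densities.items()
)
print(f"estimated total: {total:.2e} trees")    # lands in the trillions
```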
How Cervantes and Shakespeare wrote the modern literary rule book
Salman Rushdie writes: As we honour the four hundredth anniversaries of the deaths of William Shakespeare and Miguel de Cervantes Saavedra, it may be worth noting that while it’s generally accepted that the two giants died on the same date, 23 April 1616, it actually wasn’t the same day. By 1616 Spain had moved on to using the Gregorian calendar, while England still used the Julian, which was then 10 days behind. (England clung to the old Julian dating system until 1752, by which time the gap had grown to 11 days; when the change finally came, there were riots and, it’s said, mobs in the streets shouting, “Give us back our 11 days!”) Both the coincidence of the dates and the difference in the calendars would, one suspects, have delighted the playful, erudite sensibilities of the two fathers of modern literature.
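The gap between the two calendars follows a simple rule – the Gregorian reform drops three Julian leap days every 400 years – so the offset for a given era can be computed directly. A quick sketch:

```python
def julian_gregorian_gap(year):
    """Days by which the Julian calendar trails the Gregorian in that year
    (exact for dates from 1 March; the gap shifts in most century years)."""
    return year // 100 - year // 400 - 2

print(julian_gregorian_gap(1616))  # 10 -> England's 23 April fell ten days after Spain's
print(julian_gregorian_gap(1752))  # 11 -> "Give us back our 11 days!"
```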
We don’t know if they were aware of each other, but they had a good deal in common, beginning right there in the “don’t know” zone, because they are both men of mystery; there are missing years in the record and, even more tellingly, missing documents. Neither man left behind much personal material. Very little to nothing in the way of letters, work diaries, abandoned drafts; just the colossal, completed oeuvres. “The rest is silence.” Consequently, both men have been prey to the kind of idiot theories that seek to dispute their authorship.
A cursory internet search “reveals”, for example, that not only did Francis Bacon write Shakespeare’s works, he wrote Don Quixote as well. (My favourite crazy Shakespeare theory is that his plays were not written by him but by someone else of the same name.) And of course Cervantes faced a challenge to his authorship in his own lifetime, when a certain pseudonymous Alonso Fernández de Avellaneda, whose identity is also uncertain, published his fake sequel to Don Quixote and goaded Cervantes into writing the real Book II, whose characters are aware of the plagiarist Avellaneda and hold him in much contempt. [Continue reading…]
The craving for public squares
Michael Kimmelman writes: Squares have defined urban living since the dawn of democracy, from which they are inseparable. The public square has always been synonymous with a society that acknowledges public life and a life in public, which is to say a society distinguishing the individual from the state. There were, strictly speaking, no public squares in ancient Egypt or India or Mesopotamia. There were courts outside temples and royal houses, and some wide processional streets.
By the sixth century BC, the agora in Athens was a civic center, and with the rise of democracy, became a center for democracy’s institutions, the heart of public life. In ancient Greek, the word “agora” is hard to translate. In Homer it could imply a “gathering” or “assembly”; by the time of Thucydides it had come to connote the public center of a city, the place around which the rest of the city was arranged, where business and politics were conducted in public — the place without which Greeks did not really regard a town or city as a town or city at all. Rather, such a place was, as the second-century writer Pausanias roughly put it, just a sorry assortment of houses and ancient shrines.
The agora announced the town as a polis. Agoras grew in significance during the Classical and Hellenistic years, physical expressions of civic order and life, with their temples and fishmongers and bankers at money-changing tables and merchants selling oil and wine and pottery. Stoas, or colonnades, surrounded the typical agora, and sometimes trees provided shade. People who didn’t like cities, and disliked democracy in its messiness, complained that agoras mixed religious and sacrilegious life, commerce, politics, and theater. But of course that was also their attraction and significance. The agora symbolized civil justice; it was organic, changeable, urbane. Even as government moved indoors and the agora evolved over time into the Roman forum, a grander, more formal place, the notion of the public square as the soul of urban life remained, for thousands of years, critical to the self-identity of the state.
I don’t think it’s coincidental that early in 2011 the Egyptian revolution centered around Tahrir Square, or that the Occupy Movement later that same year, partly inspired by the Arab Spring, expressed itself by taking over squares like the Plaça de Catalunya in Barcelona and Zuccotti Park in Lower Manhattan – or that the Gezi protests of 2013 centered on Taksim Square in Istanbul. And I don’t think it’s coincidental that the strangers who came together at places like Zuccotti and Taksim all formed pop-up towns on these sites, producing in miniature form (at least temporarily) what they imagined to be the outlines of a city, with distinct spaces designated for legal services, libraries, medical stations, media centers, kitchens serving free food, and general stores handing out free clothing. [Continue reading…]
How Stalin and his successors maintained an iron grip on power
By Mark Harrison, University of Warwick
The Soviet Union was one of the world’s more durable police states – and it is now one of the best documented. From Stalin’s bloody terror to the less violent but still rigidly authoritarian rule of Khrushchev and Brezhnev, the Soviet police state underwent many changes. From this history emerge seven underlying habits that communist rulers cultivated in order to safeguard their rule.
1. Your enemy is hiding
A dictator is hated and feared because of those he has caused to suffer. The more he is feared, the more his enemies will hide their hostility. To stay safe, the enemies will try to blend in with the supporters. This sets up what the political scientist Ronald Wintrobe called the dictator’s dilemma: the ruler is afraid of enemies, but cannot easily know who they are.
From Lenin to Andropov, Soviet rulers saw hidden enemies as the greatest threat to their authority. Stalin called them “wolves in sheep’s clothing”: he noted that their best cover was to join the ruling party. A powerful secret police with plenty of undercover agents was the logical way to manage such hidden threats.
How Shakespeare lives now
Stephen Greenblatt writes: A few years ago, during a merciful remission in the bloodshed and mayhem that has for so many years afflicted Afghanistan, a young Afghan poet, Qais Akbar Omar, had an idea. It was, he brooded, not only lives and livelihoods that had been ruthlessly attacked by the Taliban; it was also culture. The international emblem of that cultural assault was the dynamiting of the Bamiyan Buddhas, but the damage extended to painting, music, dance, fiction, film, and poetry. It extended as well to the subtle web of relations that link one culture to another across boundaries and make us, each in our provincial worlds, feel that we are part of a larger humanity. This web is not only a contemporary phenomenon, the result of modern technology; it is as old as culture itself, and it has been particularly dense and vital in Afghanistan with its ancient trade routes and its endless succession of would-be conquerors.
Omar thought that the time was ripe to mark the restoration of civil society and repair some of the cultural damage. He wanted to stage a play with both men and women actors performing in public in an old garden in Kabul. He chose a Shakespeare play. No doubt the choice had something to do with the old imperial presence of the British in Afghanistan, but it was not only this particular history that was at work. Shakespeare is the embodiment worldwide of a creative achievement that does not remain within narrow boundaries of the nation-state or lend itself to the secure possession of a particular faction or speak only for this or that chosen group. He is the antithesis of intolerant provinciality and fanaticism. He could make with effortless grace the leap from Stratford to Kabul, from English to Dari.
Omar did not wish to put on a tragedy; his country, he thought, had suffered through quite enough tragedy of its own. Considering possible comedies, he shied away from those that involved cross-dressing. It was risky enough simply to have men and women perform together on stage. In the end he chose Love’s Labour’s Lost, a comedy that arranged the sexes in distinct male and female groups, had relatively few openly transgressive or explicitly erotic moments, and decorously deferred the final consummation of desire into an unstaged future. As a poet, Omar was charmed by the play’s gorgeous language, language that he felt could be rendered successfully in Dari.
The complex story of the mounting of the play is told in semifictionalized form in a 2015 book Omar coauthored with Stephen Landrigan, A Night in the Emperor’s Garden. Measured by the excitement it generated, this production of Love’s Labour’s Lost was a great success. The overflow crowds on the opening night gave way to ever-larger crowds clamoring to get in, along with worldwide press coverage.
But the attention came at a high price. The Taliban took note of Shakespeare in Kabul and what it signified. In the wake of the production, virtually everyone involved in it began to receive menacing messages. Spouses, children, and the extended families of the actors were not exempt from harassment and warnings. The threats were not idle. The husband of one of the performers answered a loud knock on the door one night and did not return. His mutilated body was found the next morning.
What had seemed like a vigorous cultural renaissance in Afghanistan quickly faded and died. In the wake of the resurgence of the Taliban, Qais Akbar Omar and all the others who had had the temerity to mount Shakespeare’s delicious comedy of love were in terrible trouble. They are now, every one of them, in exile in different parts of the world.
Love’s labors lost indeed. But the subtitle of Omar’s account—“A True Story of Hope and Resilience in Afghanistan”—is not or at least not only ironic. The humane, inexhaustible imaginative enterprise that Shakespeare launched more than four hundred years ago in one small corner of the world is more powerful than all the oppressive forces that can be gathered against it. [Continue reading…]