Category Archives: Ideas

The thoughts of our ancient ancestors

The discovery of what appear to have been deliberately etched markings made by a human ancestor, Homo erectus, on the surface of a shell calls for a reconsideration of assumptions that have been made about the origins of abstract thought.

While the meaning of these zigzag markings will most likely remain forever unknown, it can reasonably be inferred that for the individual who created them, the marks had some significance.

In a report in Nature, Josephine Joordens, a biologist at Leiden University whose team discovered the markings, says:

“We’ve looked at all possibilities, but in the end we are really certain that this must have been made by an agent who did a very deliberate action with a very sharp implement,” says Joordens. Her team tried replicating the pattern on fresh and fossilized shells, “and that made us realize how difficult it really was”, she says.

Saying much more about the engraving is tricky. “If you don’t know the intention of the person who made it, it’s impossible to call it art,” says Joordens.

“But on the other hand, it is an ancient drawing. It is a way of expressing yourself. What was meant by the person who did this, we simply don’t know,” she adds. “It could have been to impress his girlfriend, or to doodle a bit, or to mark the shell as his own property.”

Clive Finlayson, a zoologist at the Gibraltar Museum who was part of the team that described cross-hatch patterns linked to Neanderthals, is also agnostic about whether to call the H. erectus doodles art. What is more important, he says, is the growing realization that abilities such as abstract thinking, once ascribed to only H. sapiens, were present in other archaic humans, including, now, their ancestors.

“I’ve been suggesting increasingly strongly that a lot of these things that are meant to be modern human we’re finding in other hominids,” he says. “We really need to revisit these concepts and take stock.”

Palaeoanthropology, by necessity, is a highly speculative discipline — therein lies both its strength and its weakness.

The conservatism of hard science recoils at the idea that some scratches on a single shell amount to sufficient evidence to prompt a reconsideration of the origins of the human mind, and yet to refrain from such speculation seems like an effort to restrain the powers of the very thing we are trying to understand.

Rationally, there is as much reason to assume that abstract thinking long predates modern humans (in which case searching for evidence of it and finding none would leave us agnostic about its presence or absence) as there is to assume that at some juncture it was born.

My inclination is to believe that any living creature that has some capacity to construct a neurological representation of its surroundings is, by that very capacity, employing something akin to abstract thinking.

This ability for the inner to mirror the outer has no doubt evolved, becoming progressively more complex and more deeply abstract, and yet mind, if defined as world-mirroring, seems to have been born when life first moved.

ISIS is not the product of another era

Giles Fraser writes: Among the various reactions to the Church of England’s vote on women bishops, one comment really got under my skin: “Welcome to the 21st century.” Almost everything about it irritated me. For unless the person who made this comment was partying somewhere like Sydney on the evening of 31 December 1999, I suspect that we have both been sharing the 21st century for exactly the same amount of time. So how come he gets to welcome me to it? And with all the assumed and self-satisfied cultural superiority of a native welcoming an immigrant off the boat at Calais.

Back in 1983, the German anthropologist Johannes Fabian published a brilliant account of how western anthropologists often used the language of time to distance themselves from the object of their study and to secure the dominance of a western Enlightenment worldview. In Time and the Other he noted there was something fishy about the way early anthropologists went out and studied other cultures, talking and interacting with people in the same temporal space, yet when such encounters came to be written up, the people being studied/talked with tended to be situated back in time. The anthropologist always lives in the present. The people being studied live in the past. It’s what Fabian calls “a denial of coevalness” – a denial that we share the same temporal space with those who have different values or different political aspirations. This denial of coevalness, argues Fabian (very much in the style of Edward Said), is often a political power-play, a discourse of “otherness” that was commonly used to buttress the colonial exploitation of others.

But it’s not just colonialism-justifying anthropologists who play this linguistic/moral trick with the clock. The same thing happens in contemporary journalism all the time. Isis, for example, are often described as “medieval”. Travel to Damascus or Baghdad, and you travel not just to the Middle East but also to the middle ages. In part, this familiar trope is based on the idea that the extreme violence of contemporary jihadis has more in common with the extreme violence of the middle ages. As a comparison, this is most unfair on the middle ages, which is transformed from a rich and complex period of human history into modernity’s “other” – little more than that against which modernity comes to define itself. Forget about the founding of the great cathedrals and universities, forget about the Islamic development of mathematics, forget about Leonardo da Vinci and all of that: in secular salvation myth we are sold the simple story that we have been saved from the dark ages of barbarism and stupidity by the clear moral vision of science, rationality and Apple computers. This is just as much a salvation myth as any proposed by religion – though in this version of salvation it is religion itself that we need to be saved from. [Continue reading…]

Wonder and the ends of inquiry

Lorraine Daston writes: Science and wonder have a long and ambivalent relationship. Wonder is a spur to scientific inquiry but also a reproach and even an inhibition to inquiry. As philosophers never tire of repeating, only those ignorant of the causes of things wonder: the solar eclipse that terrifies illiterate peasants is no wonder to the learned astronomer who can explain and predict it. Romantic poets accused science of not just neutralizing wonder but of actually killing it. Modern popularizations of science make much of wonder — but expressions of that passion are notably absent in professional publications. This love-hate relationship between wonder and science started with science itself.

Wonder always comes at the beginning of inquiry. “For it is owing to their wonder that men both now begin and at first began to philosophize,” explains Aristotle; Descartes made wonder “the first of the passions,” and the only one without a contrary, opposing passion. In these and many other accounts of wonder, both soul and senses are ambushed by a puzzle or a surprise, something that catches us unawares and unprepared. Wonder widens the eyes, opens the mouth, stops the heart, freezes thought. Above all, at least in classical accounts like those of Aristotle and Descartes, wonder both diagnoses and cures ignorance. It reveals that there are more things in heaven and earth than have been dreamt of in our philosophy; ideally, it also spurs us on to find an explanation for the marvel.

Therein lies the paradox of wonder: it is the beginning of inquiry (Descartes remarks that people deficient in wonder “are ordinarily quite ignorant”), but the end of inquiry also puts an end to wonder. [Continue reading…]

Review: Trouble in Paradise and Absolute Recoil by Slavoj Žižek

Terry Eagleton writes: It is said that Jean-Paul Sartre turned white-faced with excitement when a colleague arrived hotfoot from Germany with the news that one could make philosophy out of the ashtray. In these two new books, Slavoj Žižek philosophises in much the same spirit about sex, swearing, decaffeinated coffee, vampires, Henry Kissinger, The Sound of Music, the Muslim Brotherhood, the South Korean suicide rate and a good deal more. If there seems no end to his intellectual promiscuity, it is because he suffers from a rare affliction known as being interested in everything. In Britain, philosophers tend to divide between academics who write for each other and meaning-of-life merchants who beam their reflections at the general public. Part of Žižek’s secret is that he is both at once: a formidably erudite scholar well-versed in Kant and Heidegger who also has a consuming passion for the everyday. He is equally at home with Hegel and Hitchcock, the Fall from Eden and the fall of Mubarak. If he knows about Wagner and Schoenberg, he is also an avid consumer of vampire movies and detective fiction. A lot of his readers have learned to understand Freud or Nietzsche by viewing them through the lens of Jaws or Mary Poppins.

Academic philosophers can be obscure, whereas popularisers aim to be clear. With his urge to dismantle oppositions, Žižek has it both ways here. If some of his ideas can be hard to digest, his style is a model of lucidity. Absolute Recoil is full of intractable stuff, but Trouble in Paradise reports on the political situation in Egypt, China, Korea, Ukraine and the world in general in a crisp, well-crafted prose that any newspaper should be proud to publish. Not that, given Žižek’s provocatively political opinions, many of them would. He sees the world as divided between liberal capitalism and fundamentalism – in other words, between those who believe too little and those who believe too much. Instead of taking sides, however, he stresses the secret complicity between the two camps. [Continue reading…]

The man who invented modern probability

Dr. Slava Gerovitch writes: If two statisticians were to lose each other in an infinite forest, the first thing they would do is get drunk. That way, they would walk more or less randomly, which would give them the best chance of finding each other. However, the statisticians should stay sober if they want to pick mushrooms. Stumbling around drunk and without purpose would reduce the area of exploration, and make it more likely that the seekers would return to the same spot, where the mushrooms are already gone.

Such considerations belong to the statistical theory of “random walk” or “drunkard’s walk,” in which the future depends only on the present and not the past. Today, random walk is used to model share prices, molecular diffusion, neural activity, and population dynamics, among other processes. It is also thought to describe how “genetic drift” can result in a particular gene—say, for blue eye color—becoming prevalent in a population. Ironically, this theory, which ignores the past, has a rather rich history of its own. It is one of the many intellectual innovations dreamed up by Andrei Kolmogorov, a mathematician of startling breadth and ability who revolutionized the role of the unlikely in mathematics, while carefully negotiating the shifting probabilities of political and academic life in Soviet Russia.

As a young man, Kolmogorov was nourished by the intellectual ferment of post-revolutionary Moscow, where literary experimentation, the artistic avant-garde, and radical new scientific ideas were in the air. In the early 1920s, as a 17-year-old history student, he presented a paper to a group of his peers at Moscow University, offering an unconventional statistical analysis of the lives of medieval Russians. It found, for example, that the tax levied on villages was usually a whole number, while taxes on individual households were often expressed as fractions. The paper concluded, controversially for the time, that taxes were imposed on whole villages and then split among the households, rather than imposed on households and accumulated by village. “You have found only one proof,” was his professor’s acid observation. “That is not enough for a historian. You need at least five proofs.” At that moment, Kolmogorov decided to change his concentration to mathematics, where one proof would suffice. [Continue reading…]
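As a side note (not part of Gerovitch’s article), here is a minimal sketch in Python of the one-dimensional “drunkard’s walk” the excerpt describes: each step depends only on the current position, never on the path already taken. The function name, step rule, and parameters are illustrative assumptions, not anything from the source.

```python
import random

def random_walk(steps, start=0):
    """Simulate a simple one-dimensional random walk.

    Each step is chosen independently of the history -- the
    'memoryless' property described in the excerpt above.
    """
    position = start
    path = [position]
    for _ in range(steps):
        position += random.choice([-1, 1])  # step left or right with equal probability
        path.append(position)
    return path

if __name__ == "__main__":
    walk = random_walk(1000)
    print("final position:", walk[-1])
    print("farthest excursion from start:", max(abs(p) for p in walk))
```

Because every step ignores the past, simulations of this kind underlie the models of share prices, diffusion, and genetic drift that the excerpt mentions.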

What can you really know?

The theoretical physicist Freeman Dyson writes: Jim Holt’s Why Does the World Exist?: An Existential Detective Story is a portrait gallery of leading modern philosophers. He visited each of them in turn, warning them in advance that he was coming to discuss with them a single question: “Why is there something rather than nothing?” He reports their reactions to this question, and embellishes their words with descriptions of their habits and personalities. Their answers give us vivid glimpses of the speakers but do not solve the riddle of existence.

The philosophers are more interesting than the philosophy. Most of them are eccentric characters who have risen to the top of their profession. They think their deep thoughts in places of unusual beauty such as Paris and Oxford. They are heirs to an ancient tradition of academic hierarchy, in which disciples sat at the feet of sages, and sages enlightened disciples with Delphic utterances. The universities of Paris and Oxford have maintained this tradition for eight hundred years. The great world religions have maintained it even longer. Universities and religions are the most durable of human institutions.

According to Holt, the two most influential philosophers of the twentieth century were Martin Heidegger and Ludwig Wittgenstein, Heidegger supreme in continental Europe, Wittgenstein in the English-speaking world. Heidegger was one of the founders of existentialism, a school of philosophy that was especially attractive to French intellectuals. Heidegger himself lost his credibility in 1933 when he accepted the position of rector of the University of Freiburg under the newly established Hitler government and became a member of the Nazi Party. Existentialism continued to flourish in France after it faded in Germany.

Wittgenstein, unlike Heidegger, did not establish an ism. He wrote very little, and everything that he wrote was simple and clear. The only book that he published during his lifetime was Tractatus Logico-Philosophicus, written in Vienna in 1918 and published in England with a long introduction by Bertrand Russell in 1922. It fills less than two hundred small pages, even though the original German and the English translation are printed side by side. I was lucky to be given a copy of the Tractatus as a prize when I was in high school. I read it through in one night, in an ecstasy of adolescent enthusiasm. Most of it is about mathematical logic. Only the last five pages deal with human problems. The text is divided into numbered sections, each consisting of one or two sentences. For example, section 6.521 says: “The solution of the problem of life is seen in the vanishing of this problem. Is not this the reason why men, to whom after long doubting the sense of life became clear, could not then say wherein this sense consisted?” The most famous sentence in the book is the final section 7: “Whereof one cannot speak, thereof one must be silent.”

I found the book enlightening and liberating. It said that philosophy is simple and has limited scope. Philosophy is concerned with logic and the correct use of language. All speculations outside this limited area are mysticism. Section 6.522 says: “There is indeed the inexpressible. This shows itself. It is the mystical.” Since the mystical is inexpressible, there is nothing more to be said. Holt summarizes the difference between Heidegger and Wittgenstein in nine words: “Wittgenstein was brave and ascetic, Heidegger treacherous and vain.” These words apply equally to their characters as human beings and to their intellectual output.

Wittgenstein’s intellectual asceticism had a great influence on the philosophers of the English-speaking world. It narrowed the scope of philosophy by excluding ethics and aesthetics. At the same time, his personal asceticism enhanced his credibility. During World War II, he wanted to serve his adopted country in a practical way. Being too old for military service, he took a leave of absence from his academic position in Cambridge and served in a menial job, as a hospital orderly taking care of patients. When I arrived at Cambridge University in 1946, Wittgenstein had just returned from his six years of duty at the hospital. I held him in the highest respect and was delighted to find him living in a room above mine on the same staircase. I frequently met him walking up or down the stairs, but I was too shy to start a conversation. Several times I heard him muttering to himself: “I get stupider and stupider every day.”

Finally, toward the end of my time in Cambridge, I ventured to speak to him. I told him I had enjoyed reading the Tractatus, and I asked him whether he still held the same views that he had expressed twenty-eight years earlier. He remained silent for a long time and then said, “Which newspaper do you represent?” I told him I was a student and not a journalist, but he never answered my question.

Wittgenstein’s response to me was humiliating, and his response to female students who tried to attend his lectures was even worse. If a woman appeared in the audience, he would remain standing silent until she left the room. I decided that he was a charlatan using outrageous behavior to attract attention. I hated him for his rudeness. Fifty years later, walking through a churchyard on the outskirts of Cambridge on a sunny morning in winter, I came by chance upon his tombstone, a massive block of stone lightly covered with fresh snow. On the stone was written the single word, “WITTGENSTEIN.” To my surprise, I found that the old hatred was gone, replaced by a deeper understanding. He was at peace, and I was at peace too, in the white silence. He was no longer an ill-tempered charlatan. He was a tortured soul, the last survivor of a family with a tragic history, living a lonely life among strangers, trying until the end to express the inexpressible. [Continue reading…]

Happy New Year?

Tali Sharot, author of The Optimism Bias: Why we’re wired to look on the bright side (this book is not available in the U.S. yet), writes: We like to think of ourselves as rational creatures. We watch our backs, weigh the odds, pack an umbrella. But both neuroscience and social science suggest that we are more optimistic than realistic. On average, we expect things to turn out better than they wind up being. People hugely underestimate their chances of getting divorced, losing their job or being diagnosed with cancer; expect their children to be extraordinarily gifted; envision themselves achieving more than their peers; and overestimate their likely life span (sometimes by 20 years or more).

The belief that the future will be much better than the past and present is known as the optimism bias. It abides in every race, region and socioeconomic bracket. Schoolchildren playing when-I-grow-up are rampant optimists, but so are grown-ups: a 2005 study found that adults over 60 are just as likely to see the glass half full as young adults.

You might expect optimism to erode under the tide of news about violent conflicts, high unemployment, tornadoes and floods and all the threats and failures that shape human life. Collectively we can grow pessimistic – about the direction of our country or the ability of our leaders to improve education and reduce crime. But private optimism, about our personal future, remains incredibly resilient. A survey conducted in 2007 found that while 70% thought families in general were less successful than in their parents’ day, 76% of respondents were optimistic about the future of their own family.

Overly positive assumptions can lead to disastrous miscalculations – make us less likely to get health checkups, apply sunscreen or open a savings account, and more likely to bet the farm on a bad investment. But the bias also protects and inspires us: it keeps us moving forward rather than to the nearest high-rise ledge. Without optimism, our ancestors might never have ventured far from their tribes and we might all be cave dwellers, still huddled together and dreaming of light and heat.

To make progress, we need to be able to imagine alternative realities – better ones – and we need to believe that we can achieve them. Such faith helps motivate us to pursue our goals. Optimists in general work longer hours and tend to earn more. Economists at Duke University found that optimists even save more. And although they are not less likely to divorce, they are more likely to remarry – an act that is, as Samuel Johnson wrote, the triumph of hope over experience.

Even if that better future is often an illusion, optimism has clear benefits in the present. Hope keeps our minds at ease, lowers stress and improves physical health. Researchers studying heart-disease patients found that optimists were more likely than non-optimistic patients to take vitamins, eat low-fat diets and exercise, thereby reducing their overall coronary risk. A study of cancer patients revealed that pessimistic patients under 60 were more likely to die within eight months than non-pessimistic patients of the same initial health, status and age.

In fact, a growing body of scientific evidence points to the conclusion that optimism may be hardwired by evolution into the human brain. The science of optimism, once scorned as an intellectually suspect province of pep rallies and smiley faces, is opening a new window on the workings of human consciousness. What it shows could fuel a revolution in psychology, as the field comes to grips with accumulating evidence that our brains aren’t just stamped by the past. They are constantly being shaped by the future.
Hardwired for hope?

I would have liked to tell you that my work on optimism grew out of a keen interest in the positive side of human nature. The reality is that I stumbled onto the brain’s innate optimism by accident. After living through 9/11, in New York City, I had set out to investigate people’s memories of the terrorist attacks. I was intrigued by the fact that people felt their memories were as accurate as a videotape, while often they were filled with errors. A survey conducted around the country showed that 11 months after the attacks, individuals’ recollections of their experience that day were consistent with their initial accounts (given in September 2001) only 63% of the time. They were also poor at remembering details of the event, such as the names of the airline carriers. Where did these mistakes in memory come from?

Scientists who study memory proposed an intriguing answer: memories are susceptible to inaccuracies partly because the neural system responsible for remembering episodes from our past might not have evolved for memory alone. Rather, the core function of the memory system could in fact be to imagine the future – to enable us to prepare for what has yet to come. The system is not designed to perfectly replay past events, the researchers claimed. It is designed to flexibly construct future scenarios in our minds. As a result, memory also ends up being a reconstructive process, and occasionally, details are deleted and others inserted. [Continue reading…]

Steven Pinker’s tilted measure of violence

Steven Pinker claims we are living in the most peaceable era of human existence. In a review of The Better Angels of Our Nature: Why Violence Has Declined, Timothy Snyder challenges Pinker’s thesis.

The central psychological virtue of modern civilization, Pinker claims, is “self-control.” Over the centuries, after people are pacified by the state, they learn to think ahead, to see the perspectives of others, and to pursue their ends without immediate violent action. Violence becomes not only impractical but also taboo. Nazi Germany, as Pinker seems to sense, represents a tremendous problem for this argument. Germany in the 1930s was probably the most functional state of its time, with low homicide rates and a highly literate population. Mastery of self was not the Nazis’ problem; self-control was in fact a major element of the SS ethos, as preached by Reinhard Heydrich and Heinrich Himmler. Even Adolf Hitler practiced his emotive speeches. Lack of self-control was also not the problem for Joseph Stalin’s executioners, or for Stalin and Stalinists generally. Individual Soviet NKVD men killed hundreds of people, one by one, in a single day; this can hardly be done without self-control of a very high order.

To rescue his argument from the problem posed by the mass killings of the mid-twentieth century, Pinker resorts to claiming that a single individual, in the German case Hitler, was “mostly responsible.” Here, he misrepresents the historians he cites. It is true that most historians would subscribe to some version of “no Hitler, no Holocaust.” But what they mean is that Hitler was a necessary condition for such a calamity, not that he was a sufficient one. There were many other necessary conditions for Nazi racial imperialism. Take, for example, worries about the food supply. In the 1930s, food was highly valued in both Berlin and Moscow. This fact did not dictate which ideologies would define the two states. But in practice, both Hitler and Stalin were obsessed with mastering and exploiting fertile soil, the former to transform Germany into a self-sufficient, racially pure empire, the latter to finance the industrialization of the Soviet Union.

Without recognizing the importance of scarce resources, it is impossible to understand the very different plans for agrarian colonization that the Nazi and Soviet ideologies sanctioned. But Pinker dismisses any claim that resources (rather than bad ideas) were related to the bloodiest conflicts in modern history as a “nutball conspiracy theory.” This is an odd position for him to take, since his own history begins in a premodern world of conflict over resources. By insisting that ideas alone were to blame, he oversimplifies the issue. A more rigorous explanation would explain how political ideas interacted with scarcity, rather than insist that either one or the other must have been the problem.

Modern ideologies were not, as in Pinker’s metaphors, “toxic” forces that “drove” people to do this or that. They provided narratives to explain why some groups and individuals had better access to resources, and appealing visions of the future after an aggressive reordering. Nazi Germany and the Soviet Union were ideological states, but they cannot be dismissed from history simply because they were organized around the wrong ideas. Each of them had plans for economic development that were meant to privilege one group at the expense of others — plans that were inextricably entangled with justifications for why some people deserved more, others less, and others nothing but death (the extreme and unprecedented case being the Holocaust). These ideologies were effective in part because they motivated, and they motivated in part because they delivered, if not plenty, then at least visions of plenty.

We are different from the Nazis and the Soviets not because we have more self-control — we don’t. We are different largely because postwar improvements in agricultural technology have provided the West with reliable supplies of food, our massive consumption of which says much about our limited self-control. But what if food were to become scarcer and more expensive, as seems now to be the trend? What if unfavorable climate change were to outrun our technical capacities? Or what if melting glaciers leave societies such as China without fresh water? Pinker claims, unpersuasively, that global warming poses little threat to modern ways of life. But it hardly matters whether he is right: states are already taking action to minimize its consequences. China, for example, is buying up land in Africa and Ukraine in order to compensate for its own shortage of arable soil. The fresh water of Siberia must beckon. If scientists continue to issue credible warnings about the consequences of climate change, it would be surprising if leaders did not conjure up new reasons for preemptive violent action, positioning their states for a new age of want.

Treating Nazi Germany as a historical aberration also allows Pinker to sidestep the question of how Germans and central and western Europeans became such peaceful people after the demise of Nazism. This is a strange oversight, since European pacifism and low European homicide rates are where he begins the book. Today’s Europe is Pinker’s gold standard, but he does not ask why its levels of violence are the lowest in all of his charts. If, as he contends, the “pleasures of bourgeois life” prevent people from fighting, Pinker should also consider the place where these are most fully developed, and how they became so. Pinker persuasively relates how postwar economic cooperation among European states led to a pacifying interdependence, but he fails to stress that the postwar rebirth of European economies was a state-led enterprise funded by a massive U.S. subsidy known as the Marshall Plan. And he says very little about the concurrent development of redistributive social policy within those states. State power goes missing in the very places where states became preoccupied with welfare rather than warfare.

The evolutionary roots of collective intelligence

Big Think: For much of the 20th century, social scientists assumed that competition and strife were the natural order of things, as ingrained as the need for food and shelter. The world would be a better place if we could all just be a little more like John Wayne, the thinking went.

Now researchers are beginning to see teamwork as a biological imperative, present in even the most basic life forms on Earth. And it’s not just about fairness, or the strong lifting up the weak. Collective problem-solving is simply more efficient than rugged individualism.

Today Maoism speaks to the world’s poor more fluently than ever

Pankaj Mishra writes:

In 2008 in Beijing I met the Chinese novelist Yu Hua shortly after he had returned from Nepal, where revolutionaries inspired by Mao Zedong had overthrown a monarchy. A young Red Guard during the Cultural Revolution, Yu Hua, like many Chinese of his generation, has extremely complicated views on Mao. Still, he was astonished, he told me, to see Nepalese Maoists singing songs from his Maoist youth – sentiments he never expected to hear again in his lifetime.

In fact, the success of Nepalese Maoists is only one sign of the “return” of Mao. In central India armed groups proudly calling themselves Maoists control a broad swath of territory, fiercely resisting the Indian government’s attempts to make the region’s resource-rich forests safe for the mining operations that, according to a recent report in Foreign Policy magazine, “major global companies like Toyota and Coca-Cola” now rely on.

And – as though not to be outdone by Mao’s foreign admirers – some Chinese have begun to carefully deploy Mao’s still deeply ambiguous memory in China. Texting Mao’s sayings to mobile phones, broadcasting “Red” songs from state-owned radio and television, and sending college students to the countryside, Bo Xilai, the ambitious communist party chief of the southwestern municipality of Chongqing, is leading an unexpected Mao revival in China.

It was the “return” of Marx, rather than of Mao, that was much heralded in academic and journalistic circles after the financial crisis of 2008. And it is true that Marxist theorists, rather than Marx himself, clearly anticipated the problems of excessive capital accumulation, and saw how eager and opportunistic investors cause wildly uneven development across regions and nations, enriching a few and impoverishing many others. But Mao’s “Sinified” and practical Marxism, which includes a blueprint for armed rebellion, appears to speak more directly to many people in poor countries.

Nelson Mandela: From prisoner to president

David Africa writes:

As South Africans celebrate the birthday of their national hero Nelson Mandela all the accolades again praise him as a peacemaker, moderate, and a saint. This image of Mandela is one that has been aggressively cultivated since his elevation from prisoner to president with the first democratic election in 1994, and is a curious part of a political project with the twin objectives of moderating one of the primary symbols of the South African liberation struggle on the one hand, and appropriating this ‘new Mandela’ for a moderate or even conservative political project.

The saint-like status that Mandela has acquired in the West, traditionally hostile to Mandela’s politics and that of his organisation the African National Congress (ANC), is mirrored in the false adulation showered upon him by the local parliamentary opposition party, the Democratic Alliance. The Democratic Alliance is mainly a coalition of former liberals and the remnants of the National Party that ruled South Africa until 1994. The capture of the Mandela icon and his transformation from militant to moderate saint is now almost complete.

And yet, this is not the Mandela that black South Africans know. The Mandela we know has always been a militant, from his days as a fiery youth leader in the 1940s, through leading the ANC Defiance Campaign against the Apartheid government in 1952 and being the first commander of that organisation’s armed wing when it turned to violent resistance in 1961.

His speech to the court in April 1964, as he and his fellow ANC comrades faced the real risk of the death penalty, is an articulation of a militancy that is at once reasonable and defiant. Throughout his long imprisonment Mandela refused offers of personal freedom in exchange for abandoning violent resistance to the Apartheid government.

How to survive the age of distraction

Johann Hari writes:

The book – the physical paper book – is being circled by a shoal of sharks, with sales down 9 per cent this year alone. It’s being chewed by the e-book. It’s being gored by the death of the bookshop and the library. And most importantly, the mental space it occupied is being eroded by the thousand Weapons of Mass Distraction that surround us all. It’s hard to admit, but we all sense it: it is becoming almost physically harder to read books.

In his gorgeous little book The Lost Art of Reading – Why Books Matter in a Distracted Time, the critic David Ulin admits to a strange feeling. All his life, he had taken reading as much for granted as eating – but then, a few years ago, he “became aware, in an apartment full of books, that I could no longer find within myself the quiet necessary to read”. He would sit down to do it at night, as he always had, and read a few paragraphs, then find his mind was wandering, imploring him to check his email, or Twitter, or Facebook. “What I’m struggling with,” he writes, “is the encroachment of the buzz, the sense that there’s something out there that merits my attention.”

I think most of us have this sense today, if we are honest. If you read a book with your laptop thrumming on the other side of the room, it can be like trying to read in the middle of a party, where everyone is shouting to each other. To read, you need to slow down. You need mental silence except for the words. That’s getting harder to find.

No, don’t misunderstand me. I adore the web, and they will have to wrench my Twitter feed from my cold dead hands. This isn’t going to turn into an antediluvian rant against the glories of our wired world. But there’s a reason why that word – “wired” – means both “connected to the internet” and “high, frantic, unable to concentrate”.

In the age of the internet, physical paper books are a technology we need more, not less. In the 1950s, the novelist Hermann Hesse wrote: “The more the need for entertainment and mainstream education can be met by new inventions, the more the book will recover its dignity and authority. We have not yet quite reached the point where young competitors, such as radio, cinema, etc, have taken over the functions from the book it can’t afford to lose.”

We have now reached that point. And here’s the function that the book – the paper book that doesn’t beep or flash or link or let you watch a thousand videos all at once – does for you that nothing else will. It gives you the capacity for deep, linear concentration. As Ulin puts it: “Reading is an act of resistance in a landscape of distraction…. It requires us to pace ourselves. It returns us to a reckoning with time. In the midst of a book, we have no choice but to be patient, to take each thing in its moment, to let the narrative prevail. We regain the world by withdrawing from it just a little, by stepping back from the noise.”

Hitchens on mortality

Christopher Hitchens interviewed by Australia’s ABC TV (the full interview can be viewed on broadband here on Windows Media Player):

TONY JONES: I want to ask you what you think about Martin Amis’ idea that writers like you must actually believe in some form of life after death because not all of you, not all of the parts of you are going to die because the printed words you leave behind constitute a form of immortality. I mean, is he just being kind, or do you think that there’s a truth to that?

CHRISTOPHER HITCHENS: Littera scripta manet – “The written word will remain”. That’s true, but it won’t be that much comfort to me.

Of course I do write – I’ve always had the sense of writing, as it were, posthumously. I once wrote an introduction to a collection of my own essays. I stole the formulation from Nadine Gordimer who said you should try and write as if for post-mortem publication because it only then can screen out all those influences: public opinion, some reviewer you might want to be impressing, some publisher who might want to publish you, someone you’re afraid of offending. All these distractions, you can write purely and honestly and clearly and for its own sake. And the best way of doing that is to imagine that you won’t live to see it actually written, then you can be sure that you’re being objective and you’re being scrupulous.

I think that’s a wonderful reflection, but it doesn’t – it isn’t the same term as immortality at all.

TONY JONES: As you say in your memoirs, you’ve written for decades day in, day out – I think you said at least 1,000 words a day for many, many years – despatches, articles, lectures, books – in particular books. Doesn’t it give you some comfort that your thoughts, and indeed some version of you, is going to exist after your death, is imperishable?

CHRISTOPHER HITCHENS: Well, if you want to know – because I try to avoid the blues when talking about all of this, but if you want to know one of the most sour reflections that I have when I think that I’m 61 now and I might not make 65 – I quite easily might not.

One of the bitter aspects of that is, well, I put in 60 years at the coalface, I worked very hard. In the last few years I’ve got a fair amount of recognition for it. In my opinion, actually, rather more than I deserve. Certainly more than I expected. And I could have looked forward to a few years of, shall we say, cruising speed, you know, just, as it were, relishing that, enjoying it.

Not ceasing to work, not resting on the laurels, but savouring it a bit and that – I was just getting ready for that, as a matter of fact. I was hit right at the top of my form, right in the middle of a successful book tour. I’m not going to get that and that does upset me. So that’s how I demarcate it from immortality.

Similarly, I’m not going to see my grandchildren – almost certainly not. One has children in the expectation of dying before them. In fact, you want to make damn sure you die before them, just as you plant a tree or build a house knowing, hoping that it will outlive you. That’s how the human species has done as well as it has.

The great Cuban writer Jose Marti said that a man – he happened to say it was a man – has three duties: to write a book, to plant a tree and to have a son. I remember the year my first son was born was the year I published my first real full-length book, and I had a book party for it and for him – Alexander, my son – and I planted a tree, a weeping willow, and felt pretty good for the age of, what?, I think 32 or something.

But, the thought of mortality, in other words of being outlived, is fine when it’s your children, your books or your trees, but it doesn’t reconcile you to an early death. No, it doesn’t.

Wade Davis on endangered cultures

You know, one of the intense pleasures of travel and one of the delights of ethnographic research is the opportunity to live amongst those who have not forgotten the old ways, who still feel their past in the wind, touch it in stones polished by rain, taste it in the bitter leaves of plants. Just to know that Jaguar shamans still journey beyond the Milky Way, or the myths of the Inuit elders still resonate with meaning, or that in the Himalaya, the Buddhists still pursue the breath of the Dharma, is to really remember the central revelation of anthropology, and that is the idea that the world in which we live does not exist in some absolute sense, but is just one model of reality, the consequence of one particular set of adaptive choices that our lineage made, albeit successfully, many generations ago.

And of course, we all share the same adaptive imperatives. We’re all born. We all bring our children into the world. We go through initiation rites. We have to deal with the inexorable separation of death, so it shouldn’t surprise us that we all sing, we all dance, we all have art.

But what’s interesting is the unique cadence of the song, the rhythm of the dance in every culture. And whether it is the Penan in the forests of Borneo, or the Voodoo acolytes in Haiti, or the warriors in the Kaisut desert of Northern Kenya, the Curandero in the mountains of the Andes, or a caravanserai in the middle of the Sahara. This is incidentally the fellow that I travelled into the desert with a month ago, or indeed a yak herder in the slopes of Qomolangma, Everest, the goddess mother of the world.

All of these peoples teach us that there are other ways of being, other ways of thinking, other ways of orienting yourself in the Earth. And this is an idea that, if you think about it, can only fill you with hope. Now, together the myriad cultures of the world make up a web of spiritual life and cultural life that envelops the planet, and is as important to the well-being of the planet as indeed is the biological web of life that you know as a biosphere. And you might think of this cultural web of life as being an ethnosphere and you might define the ethnosphere as being the sum total of all thoughts and dreams, myths, ideas, inspirations, intuitions brought into being by the human imagination since the dawn of consciousness. The ethnosphere is humanity’s great legacy. It’s the symbol of all that we are and all that we can be as an astonishingly inquisitive species.
