The man who invented modern probability

Dr. Slava Gerovitch writes: If two statisticians were to lose each other in an infinite forest, the first thing they would do is get drunk. That way, they would walk more or less randomly, which would give them the best chance of finding each other. However, the statisticians should stay sober if they want to pick mushrooms. Stumbling around drunk and without purpose would reduce the area of exploration, and make it more likely that the seekers would return to the same spot, where the mushrooms are already gone.

Such considerations belong to the statistical theory of “random walk” or “drunkard’s walk,” in which the future depends only on the present and not the past. Today, random walk is used to model share prices, molecular diffusion, neural activity, and population dynamics, among other processes. It is also thought to describe how “genetic drift” can result in a particular gene—say, for blue eye color—becoming prevalent in a population. Ironically, this theory, which ignores the past, has a rather rich history of its own. It is one of the many intellectual innovations dreamed up by Andrei Kolmogorov, a mathematician of startling breadth and ability who revolutionized the role of the unlikely in mathematics, while carefully negotiating the shifting probabilities of political and academic life in Soviet Russia.
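
The mushroom-picking intuition is easy to check numerically. Below is a minimal, hypothetical sketch in Python (not from Gerovitch’s article): it simulates a simple random walk on a 2D lattice, where each step depends only on the current position and not on the path so far, and counts how many distinct sites the walker visits compared with a systematic sweep that never retraces its steps.

```python
import random

def random_walk_sites(steps, seed=0):
    """Count the distinct 2D lattice sites visited by a simple random walk."""
    rng = random.Random(seed)
    x = y = 0
    visited = {(0, 0)}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(steps):
        dx, dy = rng.choice(moves)  # the next step depends only on where we are now
        x, y = x + dx, y + dy
        visited.add((x, y))
    return len(visited)

steps = 10_000
print("random walk, distinct sites visited:", random_walk_sites(steps))
print("systematic sweep, distinct sites visited:", steps + 1)  # a sweep never revisits a site
```

Typically the random walker covers only a fraction of the ground a deliberate search would, because it keeps returning to squares it has already visited, which is exactly why the sober, systematic mushroom picker does better.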

As a young man, Kolmogorov was nourished by the intellectual ferment of post-revolutionary Moscow, where literary experimentation, the artistic avant-garde, and radical new scientific ideas were in the air. In the early 1920s, as a 17-year-old history student, he presented a paper to a group of his peers at Moscow University, offering an unconventional statistical analysis of the lives of medieval Russians. It found, for example, that the tax levied on villages was usually a whole number, while taxes on individual households were often expressed as fractions. The paper concluded, controversially for the time, that taxes were imposed on whole villages and then split among the households, rather than imposed on households and accumulated by village. “You have found only one proof,” was his professor’s acid observation. “That is not enough for a historian. You need at least five proofs.” At that moment, Kolmogorov decided to change his concentration to mathematics, where one proof would suffice. [Continue reading...]

What can you really know?

The theoretical physicist, Freeman Dyson, writes: Jim Holt’s Why Does the World Exist?: An Existential Detective Story is a portrait gallery of leading modern philosophers. He visited each of them in turn, warning them in advance that he was coming to discuss with them a single question: “Why is there something rather than nothing?” He reports their reactions to this question, and embellishes their words with descriptions of their habits and personalities. Their answers give us vivid glimpses of the speakers but do not solve the riddle of existence.

The philosophers are more interesting than the philosophy. Most of them are eccentric characters who have risen to the top of their profession. They think their deep thoughts in places of unusual beauty such as Paris and Oxford. They are heirs to an ancient tradition of academic hierarchy, in which disciples sat at the feet of sages, and sages enlightened disciples with Delphic utterances. The universities of Paris and Oxford have maintained this tradition for eight hundred years. The great world religions have maintained it even longer. Universities and religions are the most durable of human institutions.

According to Holt, the two most influential philosophers of the twentieth century were Martin Heidegger and Ludwig Wittgenstein, Heidegger supreme in continental Europe, Wittgenstein in the English-speaking world. Heidegger was one of the founders of existentialism, a school of philosophy that was especially attractive to French intellectuals. Heidegger himself lost his credibility in 1933 when he accepted the position of rector of the University of Freiburg under the newly established Hitler government and became a member of the Nazi Party. Existentialism continued to flourish in France after it faded in Germany.

Wittgenstein, unlike Heidegger, did not establish an ism. He wrote very little, and everything that he wrote was simple and clear. The only book that he published during his lifetime was Tractatus Logico-Philosophicus, written in Vienna in 1918 and published in England with a long introduction by Bertrand Russell in 1922. It fills less than two hundred small pages, even though the original German and the English translation are printed side by side. I was lucky to be given a copy of the Tractatus as a prize when I was in high school. I read it through in one night, in an ecstasy of adolescent enthusiasm. Most of it is about mathematical logic. Only the last five pages deal with human problems. The text is divided into numbered sections, each consisting of one or two sentences. For example, section 6.521 says: “The solution of the problem of life is seen in the vanishing of this problem. Is not this the reason why men, to whom after long doubting the sense of life became clear, could not then say wherein this sense consisted?” The most famous sentence in the book is the final section 7: “Whereof one cannot speak, thereof one must be silent.”

I found the book enlightening and liberating. It said that philosophy is simple and has limited scope. Philosophy is concerned with logic and the correct use of language. All speculations outside this limited area are mysticism. Section 6.522 says: “There is indeed the inexpressible. This shows itself. It is the mystical.” Since the mystical is inexpressible, there is nothing more to be said. Holt summarizes the difference between Heidegger and Wittgenstein in nine words: “Wittgenstein was brave and ascetic, Heidegger treacherous and vain.” These words apply equally to their characters as human beings and to their intellectual output.

Wittgenstein’s intellectual asceticism had a great influence on the philosophers of the English-speaking world. It narrowed the scope of philosophy by excluding ethics and aesthetics. At the same time, his personal asceticism enhanced his credibility. During World War II, he wanted to serve his adopted country in a practical way. Being too old for military service, he took a leave of absence from his academic position in Cambridge and served in a menial job, as a hospital orderly taking care of patients. When I arrived at Cambridge University in 1946, Wittgenstein had just returned from his six years of duty at the hospital. I held him in the highest respect and was delighted to find him living in a room above mine on the same staircase. I frequently met him walking up or down the stairs, but I was too shy to start a conversation. Several times I heard him muttering to himself: “I get stupider and stupider every day.”

Finally, toward the end of my time in Cambridge, I ventured to speak to him. I told him I had enjoyed reading the Tractatus, and I asked him whether he still held the same views that he had expressed twenty-eight years earlier. He remained silent for a long time and then said, “Which newspaper do you represent?” I told him I was a student and not a journalist, but he never answered my question.

Wittgenstein’s response to me was humiliating, and his response to female students who tried to attend his lectures was even worse. If a woman appeared in the audience, he would remain standing silent until she left the room. I decided that he was a charlatan using outrageous behavior to attract attention. I hated him for his rudeness. Fifty years later, walking through a churchyard on the outskirts of Cambridge on a sunny morning in winter, I came by chance upon his tombstone, a massive block of stone lightly covered with fresh snow. On the stone was written the single word, “WITTGENSTEIN.” To my surprise, I found that the old hatred was gone, replaced by a deeper understanding. He was at peace, and I was at peace too, in the white silence. He was no longer an ill-tempered charlatan. He was a tortured soul, the last survivor of a family with a tragic history, living a lonely life among strangers, trying until the end to express the inexpressible. [Continue reading...]

Video — JP Rangaswami: Information is food

Happy New Year?

Tali Sharot, author of The Optimism Bias: Why we’re wired to look on the bright side (this book is not available in the U.S. yet), writes: We like to think of ourselves as rational creatures. We watch our backs, weigh the odds, pack an umbrella. But both neuroscience and social science suggest that we are more optimistic than realistic. On average, we expect things to turn out better than they wind up being. People hugely underestimate their chances of getting divorced, losing their job or being diagnosed with cancer; expect their children to be extraordinarily gifted; envision themselves achieving more than their peers; and overestimate their likely life span (sometimes by 20 years or more).

The belief that the future will be much better than the past and present is known as the optimism bias. It abides in every race, region and socioeconomic bracket. Schoolchildren playing when-I-grow-up are rampant optimists, but so are grown-ups: a 2005 study found that adults over 60 are just as likely to see the glass half full as young adults.

You might expect optimism to erode under the tide of news about violent conflicts, high unemployment, tornadoes and floods and all the threats and failures that shape human life. Collectively we can grow pessimistic – about the direction of our country or the ability of our leaders to improve education and reduce crime. But private optimism, about our personal future, remains incredibly resilient. A survey conducted in 2007 found that while 70% thought families in general were less successful than in their parents’ day, 76% of respondents were optimistic about the future of their own family.

Overly positive assumptions can lead to disastrous miscalculations – make us less likely to get health checkups, apply sunscreen or open a savings account, and more likely to bet the farm on a bad investment. But the bias also protects and inspires us: it keeps us moving forward rather than to the nearest high-rise ledge. Without optimism, our ancestors might never have ventured far from their tribes and we might all be cave dwellers, still huddled together and dreaming of light and heat.

To make progress, we need to be able to imagine alternative realities – better ones – and we need to believe that we can achieve them. Such faith helps motivate us to pursue our goals. Optimists in general work longer hours and tend to earn more. Economists at Duke University found that optimists even save more. And although they are not less likely to divorce, they are more likely to remarry – an act that is, as Samuel Johnson wrote, the triumph of hope over experience.

Even if that better future is often an illusion, optimism has clear benefits in the present. Hope keeps our minds at ease, lowers stress and improves physical health. Researchers studying heart-disease patients found that optimists were more likely than non-optimistic patients to take vitamins, eat low-fat diets and exercise, thereby reducing their overall coronary risk. A study of cancer patients revealed that pessimistic patients under 60 were more likely to die within eight months than non-pessimistic patients of the same initial health, status and age.

In fact, a growing body of scientific evidence points to the conclusion that optimism may be hardwired by evolution into the human brain. The science of optimism, once scorned as an intellectually suspect province of pep rallies and smiley faces, is opening a new window on the workings of human consciousness. What it shows could fuel a revolution in psychology, as the field comes to grips with accumulating evidence that our brains aren’t just stamped by the past. They are constantly being shaped by the future.

Hardwired for hope?

I would have liked to tell you that my work on optimism grew out of a keen interest in the positive side of human nature. The reality is that I stumbled onto the brain’s innate optimism by accident. After living through 9/11 in New York City, I had set out to investigate people’s memories of the terrorist attacks. I was intrigued by the fact that people felt their memories were as accurate as a videotape, while often they were filled with errors. A survey conducted around the country showed that 11 months after the attacks, individuals’ recollections of their experience that day were consistent with their initial accounts (given in September 2001) only 63% of the time. They were also poor at remembering details of the event, such as the names of the airline carriers. Where did these mistakes in memory come from?

Scientists who study memory proposed an intriguing answer: memories are susceptible to inaccuracies partly because the neural system responsible for remembering episodes from our past might not have evolved for memory alone. Rather, the core function of the memory system could in fact be to imagine the future – to enable us to prepare for what has yet to come. The system is not designed to perfectly replay past events, the researchers claimed. It is designed to flexibly construct future scenarios in our minds. As a result, memory also ends up being a reconstructive process, and occasionally, details are deleted and others inserted. [Continue reading...]

Steven Pinker’s tilted measure of violence

Steven Pinker claims we are living in the most peaceable era of human existence. In a review of The Better Angels of Our Nature: Why Violence Has Declined, Timothy Snyder challenges Pinker’s thesis.

The central psychological virtue of modern civilization, Pinker claims, is “self-control.” Over the centuries, after people are pacified by the state, they learn to think ahead, to see the perspectives of others, and to pursue their ends without immediate violent action. Violence becomes not only impractical but also taboo. Nazi Germany, as Pinker seems to sense, represents a tremendous problem for this argument. Germany in the 1930s was probably the most functional state of its time, with low homicide rates and a highly literate population. Mastery of self was not the Nazis’ problem; self-control was in fact a major element of the SS ethos, as preached by Reinhard Heydrich and Heinrich Himmler. Even Adolf Hitler practiced his emotive speeches. Lack of self-control was also not the problem for Joseph Stalin’s executioners, or for Stalin and Stalinists generally. Individual Soviet NKVD men killed hundreds of people, one by one, in a single day; this can hardly be done without self-control of a very high order.

To rescue his argument from the problem posed by the mass killings of the mid-twentieth century, Pinker resorts to claiming that a single individual, in the German case Hitler, was “mostly responsible.” Here, he misrepresents the historians he cites. It is true that most historians would subscribe to some version of “no Hitler, no Holocaust.” But what they mean is that Hitler was a necessary condition for such a calamity, not that he was a sufficient one. There were many other necessary conditions for Nazi racial imperialism. Take, for example, worries about the food supply. In the 1930s, food was highly valued in both Berlin and Moscow. This fact did not dictate which ideologies would define the two states. But in practice, both Hitler and Stalin were obsessed with mastering and exploiting fertile soil, the former to transform Germany into a self-sufficient, racially pure empire, the latter to finance the industrialization of the Soviet Union.

Without recognizing the importance of scarce resources, it is impossible to understand the very different plans for agrarian colonization that the Nazi and Soviet ideologies sanctioned. But Pinker dismisses any claim that resources (rather than bad ideas) were related to the bloodiest conflicts in modern history as a “nutball conspiracy theory.” This is an odd position for him to take, since his own history begins in a premodern world of conflict over resources. By insisting that ideas alone were to blame, he oversimplifies the issue. A more rigorous explanation would explain how political ideas interacted with scarcity, rather than insist that either one or the other must have been the problem.

Modern ideologies were not, as in Pinker’s metaphors, “toxic” forces that “drove” people to do this or that. They provided narratives to explain why some groups and individuals had better access to resources, and appealing visions of the future after an aggressive reordering. Nazi Germany and the Soviet Union were ideological states, but they cannot be dismissed from history simply because they were organized around the wrong ideas. Each of them had plans for economic development that were meant to privilege one group at the expense of others — plans that were inextricably entangled with justifications for why some people deserved more, others less, and others nothing but death (the extreme and unprecedented case being the Holocaust). These ideologies were effective in part because they motivated, and they motivated in part because they delivered, if not plenty, then at least visions of plenty.

We are different from the Nazis and the Soviets not because we have more self-control — we don’t. We are different largely because postwar improvements in agricultural technology have provided the West with reliable supplies of food, our massive consumption of which says much about our limited self-control. But what if food were to become scarcer and more expensive, as seems now to be the trend? What if unfavorable climate change were to outrun our technical capacities? Or what if melting glaciers leave societies such as China without fresh water? Pinker claims, unpersuasively, that global warming poses little threat to modern ways of life. But it hardly matters whether he is right: states are already taking action to minimize its consequences. China, for example, is buying up land in Africa and Ukraine in order to compensate for its own shortage of arable soil. The fresh water of Siberia must beckon. If scientists continue to issue credible warnings about the consequences of climate change, it would be surprising if leaders did not conjure up new reasons for preemptive violent action, positioning their states for a new age of want.

Treating Nazi Germany as a historical aberration also allows Pinker to sidestep the question of how Germans and central and western Europeans became such peaceful people after the demise of Nazism. This is a strange oversight, since European pacifism and low European homicide rates are where he begins the book. Today’s Europe is Pinker’s gold standard, but he does not ask why its levels of violence are the lowest in all of his charts. If, as he contends, the “pleasures of bourgeois life” prevent people from fighting, Pinker should also consider the place where these are most fully developed, and how they became so. Pinker persuasively relates how postwar economic cooperation among European states led to a pacifying interdependence, but he fails to stress that the postwar rebirth of European economies was a state-led enterprise funded by a massive U.S. subsidy known as the Marshall Plan. And he says very little about the concurrent development of redistributive social policy within those states. State power goes missing in the very places where states became preoccupied with welfare rather than warfare.

Antonio Damasio: The quest to understand consciousness

The evolutionary roots of collective intelligence

Big Think: For much of the 20th century, social scientists assumed that competition and strife were the natural order of things, as ingrained as the need for food and shelter. The world would be a better place if we could all just be a little more like John Wayne, the thinking went.

Now researchers are beginning to see teamwork as a biological imperative, present in even the most basic life forms on Earth. And it’s not just about fairness, or the strong lifting up the weak. Collective problem-solving is simply more efficient than rugged individualism.

Today Maoism speaks to the world’s poor more fluently than ever

Pankaj Mishra writes:

In 2008 in Beijing I met the Chinese novelist Yu Hua shortly after he had returned from Nepal, where revolutionaries inspired by Mao Zedong had overthrown a monarchy. A young Red Guard during the Cultural Revolution, Yu Hua, like many Chinese of his generation, has extremely complicated views on Mao. Still, he was astonished, he told me, to see Nepalese Maoists singing songs from his Maoist youth – sentiments he never expected to hear again in his lifetime.
[Illustration by Otto]

In fact, the success of Nepalese Maoists is only one sign of the “return” of Mao. In central India armed groups proudly calling themselves Maoists control a broad swath of territory, fiercely resisting the Indian government’s attempts to make the region’s resource-rich forests safe for the mining operations that, according to a recent report in Foreign Policy magazine, “major global companies like Toyota and Coca-Cola” now rely on.

And – as though not to be outdone by Mao’s foreign admirers – some Chinese have begun to carefully deploy Mao’s still deeply ambiguous memory in China. Texting Mao’s sayings to mobile phones, broadcasting “Red” songs from state-owned radio and television, and sending college students to the countryside, Bo Xilai, the ambitious communist party chief of the southwestern municipality of Chongqing, is leading an unexpected Mao revival in China.

It was the “return” of Marx, rather than of Mao, that was much heralded in academic and journalistic circles after the financial crisis of 2008. And it is true that Marxist theorists, rather than Marx himself, clearly anticipated the problems of excessive capital accumulation, and saw how eager and opportunistic investors cause wildly uneven development across regions and nations, enriching a few and impoverishing many others. But Mao’s “Sinified” and practical Marxism, which includes a blueprint for armed rebellion, appears to speak more directly to many people in poor countries.

Tim Harford: Trial, error and the God complex

Nelson Mandela: From prisoner to president

David Africa writes:

As South Africans celebrate the birthday of their national hero Nelson Mandela, the accolades once again praise him as a peacemaker, a moderate, and a saint. This image of Mandela has been aggressively cultivated since his elevation from prisoner to president with the first democratic election in 1994, and it is part of a curious political project with twin objectives: moderating one of the primary symbols of the South African liberation struggle, and appropriating this ‘new Mandela’ for a moderate or even conservative agenda.

The saint-like status that Mandela has acquired in the West, traditionally hostile to Mandela’s politics and that of his organisation the African National Congress (ANC), is mirrored in the false adulation showered upon him by the local parliamentary opposition party, the Democratic Alliance. The Democratic Alliance is mainly a coalition of former liberals and the remnants of the National Party that ruled South Africa until 1994. The capture of the Mandela icon and his transformation from militant to moderate saint is now almost complete.

And yet, this is not the Mandela that black South Africans know. The Mandela we know has always been a militant, from his days as a fiery youth leader in the 1940s, through his leadership of the ANC Defiance Campaign against the Apartheid government in 1952, to his service as the first commander of that organisation’s armed wing when it turned to violent resistance in 1961.

His speech to the court in April 1964, as he and his fellow ANC comrades faced the real risk of the death penalty, is an articulation of a militancy that is at once reasonable and defiant. Throughout his long imprisonment Mandela refused offers of personal freedom in exchange for abandoning violent resistance to the Apartheid government.

How to survive the age of distraction

Johann Hari writes:

The book – the physical paper book – is being circled by a shoal of sharks, with sales down 9 per cent this year alone. It’s being chewed by the e-book. It’s being gored by the death of the bookshop and the library. And most importantly, the mental space it occupied is being eroded by the thousand Weapons of Mass Distraction that surround us all. It’s hard to admit, but we all sense it: it is becoming almost physically harder to read books.

In his gorgeous little book The Lost Art of Reading – Why Books Matter in a Distracted Time, the critic David Ulin admits to a strange feeling. All his life, he had taken reading as much for granted as eating – but then, a few years ago, he “became aware, in an apartment full of books, that I could no longer find within myself the quiet necessary to read”. He would sit down to do it at night, as he always had, and read a few paragraphs, then find his mind was wandering, imploring him to check his email, or Twitter, or Facebook. “What I’m struggling with,” he writes, “is the encroachment of the buzz, the sense that there’s something out there that merits my attention.”

I think most of us have this sense today, if we are honest. If you read a book with your laptop thrumming on the other side of the room, it can be like trying to read in the middle of a party, where everyone is shouting to each other. To read, you need to slow down. You need mental silence except for the words. That’s getting harder to find.

No, don’t misunderstand me. I adore the web, and they will have to wrench my Twitter feed from my cold dead hands. This isn’t going to turn into an antediluvian rant against the glories of our wired world. But there’s a reason why that word – “wired” – means both “connected to the internet” and “high, frantic, unable to concentrate”.

In the age of the internet, physical paper books are a technology we need more, not less. In the 1950s, the novelist Hermann Hesse wrote: “The more the need for entertainment and mainstream education can be met by new inventions, the more the book will recover its dignity and authority. We have not yet quite reached the point where young competitors, such as radio, cinema, etc, have taken over the functions from the book it can’t afford to lose.”

We have now reached that point. And here’s the function that the book – the paper book that doesn’t beep or flash or link or let you watch a thousand videos all at once – does for you that nothing else will. It gives you the capacity for deep, linear concentration. As Ulin puts it: “Reading is an act of resistance in a landscape of distraction…. It requires us to pace ourselves. It returns us to a reckoning with time. In the midst of a book, we have no choice but to be patient, to take each thing in its moment, to let the narrative prevail. We regain the world by withdrawing from it just a little, by stepping back from the noise.”

Hitchens on mortality

Christopher Hitchens interviewed by Australia’s ABC TV:

TONY JONES: I want to ask you what you think about Martin Amis’ idea that writers like you must actually believe in some form of life after death because not all of you, not all of the parts of you are going to die because the printed words you leave behind constitute a form of immortality. I mean, is he just being kind, or do you think that there’s a truth to that?

CHRISTOPHER HITCHENS: Littera scripta manet – “The written word will remain”. That’s true, but it won’t be that much comfort to me.

Of course I do write – I’ve always had the sense of writing, as it were, posthumously. I once wrote an introduction to a collection of my own essays. I stole the formulation from Nadine Gordimer, who said you should try and write as if for post-mortem publication, because only then can you screen out all those influences: public opinion, some reviewer you might want to be impressing, some publisher who might want to publish you, someone you’re afraid of offending. Free of all these distractions, you can write purely and honestly and clearly and for its own sake. And the best way of doing that is to imagine that you won’t live to see it actually written, then you can be sure that you’re being objective and you’re being scrupulous.

I think that’s a wonderful reflection, but it doesn’t – it isn’t the same thing as immortality at all.

TONY JONES: As you say in your memoirs, you’ve written for decades day in, day out – I think you said at least 1,000 words a day for many, many years – despatches, articles, lectures, books – in particular books. Doesn’t it give you some comfort that your thoughts, and indeed some version of you, is going to exist after your death, is imperishable?

CHRISTOPHER HITCHENS: Well, if you want to know – because I try to avoid the blues when talking about all of this, but if you want to know one of the most sour reflections that I have when I think that I’m 61 now and I might not make 65 – I quite easily might not.

One of the bitter aspects of that is, well, I put in 60 years at the coalface, I worked very hard. In the last few years I’ve got a fair amount of recognition for it. In my opinion, actually, rather more than I deserve. Certainly more than I expected. And I could have looked forward to a few years of, shall we say, cruising speed, you know, just, as it were, relishing that, enjoying it.

Not ceasing to work, not resting on the laurels, but savouring it a bit and that – I was just getting ready for that, as a matter of fact. I was hit right at the top of my form, right in the middle of a successful book tour. I’m not going to get that and that does upset me. So that’s how I demarcate it from immortality.

Similarly, I’m not going to see my grandchildren – almost certainly not. One has children in the expectation of dying before them. In fact, you want to make damn sure you die before them, just as you plant a tree or build a house knowing, hoping that it will outlive you. That’s how the human species has done as well as it has.

The great Cuban writer Jose Marti said that a man – he happened to say it was a man – has three duties: to write a book, to plant a tree and to have a son. I remember the year my first son was born was the year I published my first real full-length book, and I had a book party for it and for him – Alexander, my son – and I planted a tree, a weeping willow, and felt pretty good for the age of, what?, I think 32 or something.

But, the thought of mortality, in other words of being outlived, is fine when it’s your children, your books or your trees, but it doesn’t reconcile you to an early death. No, it doesn’t.

Wade Davis on endangered cultures

You know, one of the intense pleasures of travel and one of the delights of ethnographic research is the opportunity to live amongst those who have not forgotten the old ways, who still feel their past in the wind, touch it in stones polished by rain, taste it in the bitter leaves of plants. Just to know that Jaguar shamans still journey beyond the Milky Way, or the myths of the Inuit elders still resonate with meaning, or that in the Himalaya, the Buddhists still pursue the breath of the Dharma, is to really remember the central revelation of anthropology, and that is the idea that the world in which we live does not exist in some absolute sense, but is just one model of reality, the consequence of one particular set of adaptive choices that our lineage made, albeit successfully, many generations ago.

And of course, we all share the same adaptive imperatives. We’re all born. We all bring our children into the world. We go through initiation rites. We have to deal with the inexorable separation of death, so it shouldn’t surprise us that we all sing, we all dance, we all have art.

But what’s interesting is the unique cadence of the song, the rhythm of the dance in every culture. And that is true whether it is the Penan in the forests of Borneo, or the Voodoo acolytes in Haiti, or the warriors in the Kaisut desert of Northern Kenya, the Curandero in the mountains of the Andes, or a caravanserai in the middle of the Sahara (this, incidentally, is the fellow I travelled into the desert with a month ago), or indeed a yak herder on the slopes of Qomolangma, Everest, the goddess mother of the world.

All of these peoples teach us that there are other ways of being, other ways of thinking, other ways of orienting yourself in the Earth. And this is an idea that, if you think about it, can only fill you with hope. Now, together the myriad cultures of the world make up a web of spiritual life and cultural life that envelops the planet, and is as important to the well-being of the planet as indeed is the biological web of life that you know as the biosphere. And you might think of this cultural web of life as being an ethnosphere, and you might define the ethnosphere as being the sum total of all thoughts and dreams, myths, ideas, inspirations, intuitions brought into being by the human imagination since the dawn of consciousness. The ethnosphere is humanity’s great legacy. It’s the symbol of all that we are and all that we can be as an astonishingly inquisitive species.

The role of place in the world

In recent years, the notion that the world, if not flat, is rapidly flattening as a result of the forces of globalization has gained currency to the point of becoming a platitude. So mobile, so interconnected, so integrated is this new world that historic barriers are no more, interaction is global, ever-freer trade rules the globe, the flow of ideas (and money and jobs) accelerates by the day, and choice, not constraint, is the canon of the converted. Join the “forces of flattening” and you will reap the benefits, say Thomas Friedman and others who advance this point of view. Don’t, and you will fall off the edge. The option is yours.

But is it? In truth, though the world has changed dramatically in the last 50 years, we are still parachuted into places so different that the common ground of globalization has just the thinnest of topsoil. One of some 7,000 languages will become our “mother tongue”; only a small minority of us will have the good fortune of being raised in a version of English, the primary language of globalization. One of tens of thousands of religious denominations is likely to transmit the indoctrination most of us will carry for life. A combination of genetic and environmental conditions defines health prospects that still vary widely around the planet. [continued...]

The Genesis 2.0 Project

Among the defining attributes of now are ever tinier gadgets, ever shorter attention spans, and the privileging of marketplace values above all. Life is manically parceled into financial quarters, three-minute YouTube videos, 140-character tweets. In my pocket is a phone/computer/camera/video recorder/TV/stereo system half the size of a pack of Marlboros. And what about pursuing knowledge purely for its own sake, without any real thought of, um, monetizing it? Cute.

And so in our hyper-capitalist flibbertigibbet day and age, the new Large Hadron Collider, buried about 330 feet beneath the Swiss-French border, near Geneva, is a bizarre outlier.

The L.H.C., which operates under the auspices of the European Organization for Nuclear Research, known by its French acronym, CERN, is an almost unimaginably long-term project. It was conceived a quarter-century ago, was given the green light in 1994, and has been under construction for the last 13 years, the product of tens of millions of man-hours. It’s also gargantuan: a circular tunnel 17 miles around, punctuated by shopping-mall-size subterranean caverns and fitted out with more than $9 billion worth of steel and pipe and cable more reminiscent of Jules Verne than Steve Jobs.

The believe-it-or-not superlatives are so extreme and Tom Swiftian they make you smile. The L.H.C. is not merely the world’s largest particle accelerator but the largest machine ever built. At the center of just one of the four main experimental stations installed around its circumference, and not even the biggest of the four, is a magnet that generates a magnetic field 100,000 times as strong as Earth’s. And because the super-conducting, super-colliding guts of the collider must be cooled by 120 tons of liquid helium, inside the machine it’s one degree colder than outer space, thus making the L.H.C. the coldest place in the universe. [continued...]

Of ants and men

In The Superorganism: The Beauty, Elegance, and Strangeness of Insect Societies, Bert Holldobler and E.O. Wilson survey the last 15 years of myrmecological research. Picking up where their Pulitzer Prize-winning The Ants left off, The Superorganism is a completely wonderful book. It is packed with astonishing findings and beautiful illustrations, and, happily, it also contains enough information about ant civilization to set up a few ants-vs.-humans scenarios. Let us skip lightly over the fact that to compare ants and humans is to pit thousands of species against just one. Rather, let’s start with the idea that we begin the contest evenly matched—at 6.6 billion humans and approximately 5 million billion ants, humans and ants have roughly the same biomass. What if a global disaster struck? Who would come out on top?

We won’t be able to declare one species smarter or better—each is wildly successful in its own niche, and at any rate, that would make as much sense as saying one is better-looking than the other. Still, we can wonder about how robust life is at such extreme ends of the genes-mind spectrum. What if, for example, you hammered the Earth with a volcano or a big rock from space? Who would survive? Or think about that classic of speculative fiction—mass sterility. Imagine that both ants’ and humans’ biological clocks sputter and stop, and reproduction just doesn’t work as it used to. Is life as we know it over? Perhaps a mysterious plague has moved unnoticed among us until one morning we awake and 70 percent of both populations have disappeared? Could civilization recover? Either one? [continued...]

PULSE: 20 Top Global Thinkers of 2009

On 30 November 2009 Foreign Policy magazine published its ‘Top 100 Global Thinkers’ list. We were naturally skeptical since the selection included Dick Cheney, General Petraeus, Larry Summers, Thomas Friedman, Bernard-Henri Lévy, David Kilcullen, Ayaan Hirsi Ali, Salam Fayyad, The Kagan Family (yes, all of them) and Ahmed Rashid among others. We don’t consider any of these people thinkers, let alone thinkers of global significance, and we couldn’t help but notice that the main thrust of all their work aligns with the global military and economic agenda of the US government. In response we asked twelve of our writers and editors to nominate their Top 20 global thinkers of 2009. Our criteria included choosing those who inspire critical thinking, as well as those who have been able to buck received wisdom and shape public debate. Always agreeing with their statements and positions was not a requisite, but in all cases our selections involved nominating those who have spurred people to challenge or enhance their own thinking in different ways. The following is our unranked list. [continued...]

PULSE: 20 Top Global Media Figures of 2009

After we published our list of 20 Top Global Thinkers, we thought we would be remiss if we did not also honor those media figures and institutions who bring these voices to us in the first place. With the goal of recognizing those individuals and institutions responsible for exemplary reportage and awareness-raising in 2009, we asked our editors and writers to name their choices for the top 20 media figures, be they journalists, publications or publishers. We aggregated these nominations into the following list. Like our 20 Top Global Thinkers, our criteria for choosing media figures included people/publications/publishers who have shown a commitment to challenging power, holding it accountable, highlighting issues pertaining to peace and social justice and producing output that encourages critical thinking and questions conventional wisdom. [continued...]

ANALYSIS: The end of Western hegemony

Crisis marks out a new geopolitical order

Blame greedy bankers. Blame Alan Greenspan’s careless stewardship of the US Federal Reserve. Blame feckless homeowners who took out loans they could never expect to repay. Blame politicians and regulators everywhere for closing their eyes to the approaching tempest.

All of the above are culpable. I am sure there are even more villains lurking out there. Sometimes, though, it is worth looking through the other end of the telescope. The wreckage of the financial system holds up a mirror to the changing geopolitical balance. It offers advice, and a warning, as to what the west should make of the emerging global order.

Until quite recently, the talk was about the humbling of America’s laisser faire capitalism. The US government’s $700bn bail-out was the price to be paid for past hubris. For reasons that still elude me, one or two European politicians seemed to delight in the troubles of an ally that still guarantees their security.

Schadenfreude comes before a fall. Solid, conservative Germany has been among the European nations forced to shore up its banks. Angela Merkel, the chancellor, has been driven to assure German voters publicly that their savings are safe.

Belgium and the Netherlands have rescued Fortis. Ireland and Greece have issued blanket guarantees to bank depositors. Others have done something similar. Most dramatically, Gordon Brown’s British government has part-nationalised all of its leading banks in a desperate bid to crack the ice of the credit freeze.

If the toxic mortgage securities and opaque credit swaps that infected the world’s financial system came with a made-in-the-US stamp, European banks were eager buyers. For the humbling of America, we should substitute the humbling of the west.

Asia, as we have seen in the markets this week, is not immune from the shocks and stresses. Japan, which has only quite recently emerged from the long twilight of its 1990s banking collapse, has now been hit anew by the global storm. China felt compelled this week to follow western central banks in cutting interest rates. So did a host of smaller Asian countries. Recession in the US and Europe will slow the growth of Asia’s rising economies.

Standing back, though, two things mark out this crisis as unique. The first is its sheer ferocity. I am not sure how useful it is to make comparisons with the 1930s. History never travels in a straight line. What is evident is that governments and central banks have had no previous experience of coping with shocks and stresses of the intensity and ubiquity we have seen during the past year.

The second difference is one of geography. For the first time, the epicentre has been in the west. Viewed from Washington, London or Paris, financial crises used to be things that happened to someone else – to Latin America, to Asia, to Russia.

The shock waves would sometimes lap at western shores, usually in the form of demands that the rich nations rescue their own imprudent banks. But these crises drew a line between north and south, between the industrialised and developing world. Emerging nations got into a mess; the west told them sternly what they must do to get out of it.

The instructions came in the form of the aptly-named Washington consensus: the painful prescriptions, including market liberalisation and fiscal consolidation, imposed as the price of financial support from the International Monetary Fund.

This time the crisis started on Wall Street, triggered by the steep decline in US house prices. The emerging nations have been the victims rather than the culprits. And the reason for this reversal of roles? They had supped enough of the west’s medicine.

A decade ago, after the crisis of 1997-98 wrought devastation on some of its most vibrant economies, Asia said never again. There would be no more going cap in hand when the going got rough. To avoid the IMF’s ruinous rules, governments would build their own defences against adversity by accumulating reserves of foreign currency.

Those reserves – more than $4,000bn-worth at the present count – financed credit in the US and Europe. There were other sources of liquidity, of course, notably the Fed and the reserves accumulated by energy producers. It also took financial chicanery to turn reckless mortgage lending into triple-A-rated securities. But as a Chinese official told my FT colleague David Pilling the other day: “America drowned itself in Asian liquidity.”

Owning up to the geopolitical implications will be as painful for the rich nations as paying the domestic price for the profligacy. The erosion of the west’s moral authority that began with the Iraq war has been greatly accelerated. The west, now the debtor, can no longer expect its creditors to listen to its lectures. Here lies the broader lesson. The shift eastwards in global economic power has become a commonplace of political discourse. Almost everyone in the west now speaks with awe of the pace of China’s rise, of India’s emergence as a geopolitical player, of the growing roles in international relations of Brazil and South Africa.

Yet the rich nations have yet to face up properly to the implications. They can imagine sharing power, but they assume the bargain will be struck on their terms: that the emerging nations will be absorbed – at a pace, mind you, of the west’s choosing – into familiar international forums and institutions.

When American and European diplomats talk about the rising powers becoming responsible stakeholders in the global system, what they really mean is that China, India and the rest must not be allowed to challenge existing standards and norms.

This is the frame of mind that sees the Benelux countries still holding a bigger share than China of the votes at the IMF; and the Group of Seven leading industrialised nations presuming this weekend that it remains the right forum to redesign the global financial system.

I have no inhibitions about promoting the values of the west – of preaching the virtues of the rule of law, pluralist politics and fundamental human rights. Nor of asserting that, for all the financial storms, a liberal market system is the worst option except for all the others. The case for global rules – that open markets need multilateral governance – could not have been made more forcefully than by the present crisis.

Yet the big lesson is that the west can no longer assume the global order will be remade in its own image. For more than two centuries, the US and Europe have exercised an effortless economic, political and cultural hegemony. That era is ending. [complete article]
