Category Archives: Culture

Astra Taylor: Misogyny and the cult of internet openness

In December, and again in February, at the Google Bus blockades in San Francisco, one thing struck me forcefully: the technology corporation employees waiting for their buses were all staring so intently at their phones that they apparently didn’t notice the unusual things going on around them until their buses were surrounded. Sometimes I feel like I’m living in a science-fiction novel, because my region is so afflicted with people who stare at the tiny screens in their hands on trains, in restaurants, while crossing the street, and too often while driving. San Francisco is, after all, where director Phil Kaufman set the 1978 remake of Invasion of the Body Snatchers, the movie wherein a ferny spore-spouting form of alien life colonizes human beings so that they become zombie-like figures.

In the movies, such colonization took place secretly, or by force, or both: it was a war, and (once upon a time) an allegory for the Cold War and a possible communist takeover. Today, however — Hypercapitalism Invades! — we not only choose to carry around those mobile devices, but pay corporations hefty monthly fees to do so. In return, we get to voluntarily join the great hive of people being in touch all the time, so much so that human nature seems in the process of being remade, with the young immersed in a kind of contact that makes solitude seem like polar ice, something that’s melting away.

We got phones, and then Smart Phones, and then Angry Birds and a million apps — and a golden opportunity to be tracked all the time thanks to the GPS devices in those phones. Your cell phone is the shackle that chains you to corporate America (and potentially to government surveillance as well) and like the ankle bracelets that prisoners wear, it’s your monitor. It connects you to the Internet and so to the brave new world that has such men as Larry Ellison and Mark Zuckerberg in it. That world — maybe not so brave after all — is the subject of Astra Taylor’s necessary, alarming, and exciting new book, The People’s Platform: Taking Back Power and Culture in the Digital Age.

The Internet arose with little regulation, little public decision-making, and a whole lot of fantasy about how it was going to make everyone powerful and how everything would be free. Free, as in unregulated and open, got confused with free, as in not getting paid, and somehow everyone from Facebook to Arianna Huffington created massively lucrative sites (based on advertising dollars) in which the people who made the content went unpaid. Just as Russia woke up with oil oligarchs spreading like mushrooms after a night’s heavy rain, so we woke up with new titans of the tech industry throwing their billionaire weight around. The Internet turns out to be a superb mechanism for consolidating money and power and control, even as it gives toys and minor voices to the rest of us.

As Taylor writes in her book, “The online sphere inspires incessant talk of gift economies and public-spiritedness and democracy, but commercialization and privatization and inequality lurk beneath the surface. This contradiction is captured in a single word: ‘open,’ a concept capacious enough to contain both the communal and capitalistic impulses central to Web 2.0.” And she goes on to discuss “the tendency of open systems to amplify inequality — and new media thinkers’ glib disregard for this fundamental characteristic.” Part of what makes her book exceptional, in fact, is its breadth. It reviews much of the existing critique of the Internet and connects the critiques of specific aspects of it into an overview of how a phenomenon supposed to be wildly democratic has become wildly not that way at all.

And at a certain juncture, she turns to gender. Though far from the only weak point of the Internet as an egalitarian space — after all, there’s privacy (lack of), the environment (massive server farms), and economics (tax cheats, “content providers” like musicians fleeced) — gender politics, as she shows in today’s post adapted from her book, is one of the most spectacular problems online. Let’s imagine this as science fiction: a group of humans apparently dissatisfied with how things were going on Earth — where women were increasing their rights, representation, and participation — left our orbit and started their own society on their own planet. The new planet wasn’t far away or hard to get to (if you could afford the technology): it was called the Internet. We all know it by name; we all visit it; but we don’t name the society that dominates it much.

Taylor does: the dominant society, celebrating itself and pretty much silencing everyone else, makes the Internet bear a striking resemblance to Congress in 1850 or a gentlemen’s club (minus any gentleness). It’s a gated community, and as Taylor describes today, the security detail is ferocious, patrolling its borders by trolling and threatening dissident voices, and just having a female name or being identified as female is enough to become a target of hate and threats.

Early this year, a few essays were published on Internet misogyny that were so compelling I thought 2014 might be the year we revisit these online persecutions, the way that we revisited rape in 2013, thanks to the Steubenville and New Delhi assault cases of late 2012. But the subject hasn’t (yet) quite caught fire, and so not much gets said and less gets done about this dynamic new machinery for privileging male and silencing female voices. Which is why we need to keep examining and discussing this, as well as the other problems of the Internet. And why you need to read Astra Taylor’s book. This excerpt is part of her diagnosis of the problems; the book ends with ideas about a cure. Rebecca Solnit

Open systems and glass ceilings
The disappearing woman and life on the internet
By Astra Taylor

The Web is regularly hailed for its “openness” and that’s where the confusion begins, since “open” in no way means “equal.” While the Internet may create space for many voices, it also reflects and often amplifies real-world inequities in striking ways.

An elaborate system organized around hubs and links, the Web has a surprising degree of inequality built into its very architecture. Its traffic, for instance, tends to be distributed according to “power laws,” which follow what’s known as the 80/20 rule — 80% of a desirable resource goes to 20% of the population.
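As a back-of-the-envelope illustration of that 80/20 pattern (a sketch added here for concreteness, not a calculation from Taylor’s book): if site traffic is drawn from a Pareto distribution with a shape parameter near 1.16 — the textbook value at which roughly 80% of a total accrues to the top 20% — the concentration falls out directly. The site count and parameters below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: Pareto-distributed "traffic" across 100,000 sites.
# A shape parameter of ~1.16 is the classic value at which about 80% of
# the total goes to the top 20%; both numbers are illustrative, not
# figures from the book.
rng = np.random.default_rng(seed=1)
alpha = 1.16
traffic = rng.pareto(alpha, size=100_000) + 1.0  # visits per site, arbitrary units

ranked = np.sort(traffic)[::-1]                  # busiest sites first
top_fifth = ranked[: ranked.size // 5]
print(f"Top 20% of sites receive {top_fifth.sum() / ranked.sum():.0%} of all traffic")
```

Nudging the shape parameter closer to 1 makes the concentration even more extreme, which is one way of reading the next point.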

In fact, as anyone knows who has followed the histories of Google, Apple, Amazon, and Facebook, now among the biggest companies in the world, the Web is increasingly a winner-take-all, rich-get-richer sort of place, which means the disparate percentages in those power laws are only likely to look uglier over time.



America’s huge appetite for conspiracy theories

“Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” a paper recently published in the American Journal of Political Science, finds that half of Americans consistently endorse at least one conspiracy theory.

Tom Jacobs writes: It’s easy to assume this represents widespread ignorance, but these findings suggest otherwise. Oliver and Wood report that, except for the Obama “birthers” and the 9/11 “truthers,” “respondents who endorse conspiracy theories are not less-informed about basic political facts than average citizens.”

So what does drive belief in these contrived explanations? The researchers argue the tendency to accept them is “derived from two innate psychological predispositions.”

The first, which has an evolutionary explanation, is an “unconscious cognitive bias to draw causal connections between seemingly related phenomena.” Jumping to conclusions based on weak evidence allows us to “project feelings of control in uncertain situations,” the researchers note.

The second is our “natural attraction towards melodramatic narratives as explanations for prominent events — particularly those that interpret history (in terms of) universal struggles between good and evil.”

Stories that fit that pattern “provide compelling explanations for otherwise confusing or ambiguous events,” they write, noting that “many predominant belief systems … draw heavily upon the idea of unseen, intentional forces shaping contemporary events.”

“For many Americans, complicated or nuanced explanations for political events are both cognitively taxing and have limited appeal,” write Oliver and Wood. “A conspiracy narrative may provide a more accessible and convincing account of political events.”

That said, they add, “Even highly engaged or ideological segments of the population can be swayed by the power of these narratives, particularly when they coincide with their other political views.”


Cahokia: North America’s first melting pot?

Christian Science Monitor: The first experiment in “melting pot” politics in North America appears to have emerged nearly 1,000 years ago in the bottom lands of the Mississippi River near today’s St. Louis, according to archaeologists piecing together the story of the rise and fall of the native American urban complex known as Cahokia.

During its heyday, Cahokia’s population reached an estimated 20,000 people – a level the continent north of the Rio Grande wouldn’t see again until the eve of the American Revolution and the growth of New York and Philadelphia.

Cahokia’s ceremonial center, seven miles northeast of St. Louis’s Gateway Arch, boasted 120 earthen mounds, including a broad, tiered mound some 10 stories high. In East St. Louis, one of two major satellites hosts another 50 earthen mounds, as well as residences. St. Louis hosted another 26 mounds and associated dwellings.

These are three of the four largest native-American mound centers known, “all within spitting distance of one another,” says Thomas Emerson, Illinois State Archaeologist and a member of a team testing the melting-pot idea. “That’s some kind of large, integrated complex to some degree.”

Where did all those people come from? Archaeologists have been debating that question for years, Dr. Emerson says. Unfortunately, the locals left no written record of the complex’s history. Artifacts such as pottery, tools, or body ornaments give an ambiguous answer.

Artifacts from Cahokia have been found in other native-American centers from Arkansas and northern Louisiana to Oklahoma, Iowa, and Wisconsin, just as artifacts from these areas appear in digs at Cahokia.

“Archaeologists are always struggling with this: Are artifacts moving, or are people moving?” Emerson says.

Emerson and two colleagues at the University of Illinois at Urbana-Champaign tried to tackle the question using two isotopes of the element strontium found in human teeth. They discovered that throughout the 300 years that native Americans occupied Cahokia, the complex appeared to receive a steady stream of immigrants who stayed. [Continue reading…]


Throughout our existence humans have always been the most destructive creatures to roam this planet


For those of us who see industrial civilization as the guarantor of humanity’s destruction, it’s easy to picture an idyllic era earlier in our evolution, located perhaps during the cultural flowering of the Great Leap Forward.

Communities then remained relatively egalitarian without workers enslaved in back-breaking labor, while subsistence on few material resources meant that time was neither controlled by the dictates of a stratified social hierarchy nor by the demands of survival.

When people could accord as much value to storytelling, ritual, and music-making, as they did to hunting and gathering food, we might like to think that human beings were living in balance with nature.

As George Monbiot reveals, the emerging evidence about our early ancestors paints a much grimmer picture — one in which human nature appears to have always been profoundly destructive.

You want to know who we are? Really? You think you do, but you will regret it. This article, if you have any love for the world, will inject you with a venom – a soul-scraping sadness – without an obvious antidote.

The Anthropocene, now a popular term among scientists, is the epoch in which we live: one dominated by human impacts on the living world. Most date it from the beginning of the industrial revolution. But it might have begun much earlier, with a killing spree that commenced two million years ago. What rose onto its hind legs on the African savannahs was, from the outset, death: the destroyer of worlds.

Before Homo erectus, perhaps our first recognisably human ancestor, emerged in Africa, the continent abounded with monsters. There were several species of elephants. There were sabretooths and false sabretooths, giant hyenas and creatures like those released in The Hunger Games: amphicyonids, or bear dogs, vast predators with an enormous bite.

Prof Blaire van Valkenburgh has developed a means by which we could roughly determine how many of these animals there were. When there are few predators and plenty of prey, the predators eat only the best parts of the carcass. When competition is intense, they eat everything, including the bones. The more bones a carnivore eats, the more likely its teeth are to be worn or broken. The breakages in carnivores’ teeth were massively greater in the pre-human era.

Not only were there more species of predators, including species much larger than any found on Earth today, but they appear to have been much more abundant – and desperate. We evolved in a terrible, wonderful world – that was no match for us. [Continue reading…]


Devastating consequences of losing ‘knowledgeable elders’ in non-human cultures


Culture — something we generally associate with its expressions through art, music, literature and so forth — is commonly viewed as one of the defining attributes of humanity. We supposedly rose above animal instinct when we started creating bodies of knowledge, held collectively and passed down from generation to generation.

But it increasingly appears that this perspective has less to do with an appreciation of what makes us human than it has with our ignorance about non-human cultures.

Although non-human cultures don’t produce the kind of artifacts we create, the role of knowledge-sharing seems to be just as vital to the success of these societies as it is to ours. In other words, what makes these creatures what they are cannot be reduced to the structure of their DNA — it also involves a dynamic and learned element: the transmission of collective knowledge.

The survival of some species doesn’t simply depend on their capacity to replicate their DNA; it depends on their ability to pass on what they know.

Scuola Internazionale Superiore di Studi Avanzati: Small changes in a population may lead to dramatic consequences, like the disappearance of the migratory route of a species. A study carried out in collaboration with the SISSA has created a model of the behaviour of a group of individuals on the move (like a school of fish, a herd of sheep or a flock of birds, etc.) which, by changing a few simple parameters, reproduces the collective behaviour patterns observed in the wild. The model shows that small quantitative changes in the number of knowledgeable individuals and availability of food can lead to radical qualitative changes in the group’s behaviour.

Until the ’50s, bluefin tuna fishing was a thriving industry in Norway, second only to sardine fishing. Every year, bluefin tuna used to migrate from the eastern Mediterranean up to the Norwegian coasts. Suddenly, however, over no more than 4-5 years, the tuna never went back to Norway. In an attempt to solve this problem, Giancarlo De Luca from SISSA (the International School for Advanced Studies of Trieste) together with an international team of researchers (from the Centre for Theoretical Physics — ICTP — of Trieste and the Technical University of Denmark) started to devise a model based on an “adaptive stochastic network.” The physicists wanted to simulate, simplifying it, the collective behaviour of animal groups. Their findings, published in the journal Interface, show that the number of “informed individuals” in a group, sociality and the strength of the decision of the informed individuals are “critical” variables, such that even minimal fluctuations in these variables can result in catastrophic changes to the system.

“We started out by taking inspiration from the phenomenon that affected the bluefin tuna, but in actual fact we then developed a general model that can be applied to many situations of groups ‘on the move’,” explains De Luca.

The collective behaviour of a group can be treated as an “emergent property,” that is, the result of the self-organization of each individual’s behaviour. “The majority of individuals in a group may not possess adequate knowledge, for example, about where to find rich feeding grounds,” explains De Luca. “However, for the group to function, it is enough that only a minority of individuals possess that information. The others, the ones who don’t, will obey simple social rules, for example by following their neighbours.”

The tendency to comply with the norm, the number of knowledgeable individuals and the determination with which they follow their preferred route (which the researchers interpreted as being directly related to the appeal, or abundance, of the resource) are critical variables. “When the number of informed individuals falls below a certain level, or the strength of their determination to go in a certain direction falls below a certain threshold, the migratory pathway disappears abruptly.”

“In our networks the individuals are ‘points’, with interconnections that form and disappear in the course of the process, following some established rules. It’s a simple and general way to model the system, which has the advantage of being able to be solved analytically,” comments De Luca.
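To make the “informed minority” idea concrete, here is a deliberately crude mean-field sketch — a toy example added for illustration, not the adaptive stochastic network of De Luca and colleagues. A group starts off heading the wrong way, uninformed individuals simply copy the group’s average heading, and a small informed fraction nudges that consensus toward a preferred migratory direction. Sweeping the informed fraction shows how the group’s alignment with the route degrades once the informed minority becomes too small. All parameter values are invented.

```python
import numpy as np

def route_alignment(n_agents=200, informed_frac=0.05, pull=0.6,
                    steps=300, noise=0.2, seed=0):
    """Toy informed-minority model (illustrative only, not the published
    adaptive-network model). Uninformed agents copy the group's mean
    heading; informed agents blend that social heading with a preferred
    migratory direction (0 radians). Returns the group's final alignment
    with the route: 1 = fully on course, ~0 = lost."""
    rng = np.random.default_rng(seed)
    # The group starts off course, heading roughly 90 degrees from the route.
    headings = np.pi / 2 + 0.1 * rng.standard_normal(n_agents)
    informed = rng.random(n_agents) < informed_frac
    preferred = 0.0
    for _ in range(steps):
        mean_heading = np.angle(np.mean(np.exp(1j * headings)))
        social = np.full(n_agents, mean_heading)
        new = social.copy()
        # Informed individuals pull the consensus toward the migratory route.
        new[informed] = np.angle((1 - pull) * np.exp(1j * social[informed])
                                 + pull * np.exp(1j * preferred))
        headings = new + noise * rng.standard_normal(n_agents)
    return float(np.cos(np.angle(np.mean(np.exp(1j * headings))) - preferred))

for frac in (0.002, 0.01, 0.02, 0.05, 0.20):
    print(f"informed fraction {frac:5.3f} -> alignment with route "
          f"{route_alignment(informed_frac=frac):+.2f}")
```

In the published model the individuals and their interactions also form and dissolve as a network, which is what produces the sharp, catastrophic loss of the migratory route described above; the sketch only conveys the qualitative dependence on a knowledgeable minority.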

So whatever happened to the Norwegian tuna? “Based on our results we formulated some hypotheses which will, however, have to be tested experimentally,” says De Luca. In the ’50s Norway experienced a reduction in biomass and in the quantity of herrings, the main prey of tuna, which might have played a role in their disappearance. “This is consistent with our model, but there’s more to the story. In a short time the herring population returned to normal levels, whereas the tuna never came back. Why?”

One hypothesis is that, although the overall number of Mediterranean tuna has not changed, what has changed is the composition of the population: “The most desirable tuna specimens for the fishing industry are the larger, older individuals, which are presumably also those with the greater amount of knowledge, in other words the knowledgeable elders,” concludes De Luca.

Another curious fact: what happens if there are too many knowledgeable elders? “Too many know-alls are useless,” jokes De Luca. “In fact, above a certain number of informed individuals, the group performance does not improve so much as to justify the “cost” of their training. The best cost-benefit ratio is obtained by keeping the number of informed individuals above a certain level, provided they remain a minority of the whole population.”


In unseen worlds, science invariably crosses paths with fantasy

Philip Ball writes: For centuries, scientists studied light to comprehend the visible world. Why are things colored? What is a rainbow? How do our eyes work? And what is light itself? These are questions that have preoccupied scientists and philosophers since the time of Aristotle, including Roger Bacon, Isaac Newton, Michael Faraday, Thomas Young, and James Clerk Maxwell.

But in the late 19th century all that changed, and it was largely Maxwell’s doing. This was the period in which the whole focus of physics — then still emerging as a distinct scientific discipline — shifted from the visible to the invisible. Light itself was instrumental to that change. Not only were the components of light invisible “fields,” but light was revealed as merely a small slice of a rainbow extending far into the unseen.

Physics has never looked back. Today its theories and concepts are concerned largely with invisible entities: not only unseen force fields and insensible rays but particles too small to see even with the most advanced microscopes. We now know that our everyday perception grants us access to only a tiny fraction of reality. Telescopes responding to radio waves, infrared radiation, and X-rays have vastly expanded our view of the universe, while electron microscopes, X-ray beams, and other fine probes of nature’s granularity have unveiled the microworld hidden beyond our visual acuity. Theories at the speculative forefront of physics flesh out this unseen universe with parallel worlds and with mysterious entities named for their very invisibility: dark matter and dark energy.

This move beyond the visible has become a fundamental part of science’s narrative. But it’s a more complicated shift than we often appreciate. Making sense of what is unseen — of what lies “beyond the light” — has a much longer history in human experience. Before science had the means to explore that realm, we had to make do with stories that became enshrined in myth and folklore. Those stories aren’t banished as science advances; they are simply reinvented. Scientists working at the forefront of the invisible will always be confronted with gaps in knowledge, understanding, and experimental capability. In the face of those limits, they draw unconsciously on the imagery of the old stories. This is a necessary part of science, and these stories can sometimes suggest genuinely productive scientific ideas. But the danger is that we will start to believe them at face value, mistaking them for theories.

A backward glance at the history of the invisible shows how the narratives and tropes of myth and folklore can stimulate science, while showing that the truth will probably turn out to be far stranger and more unexpected than these old stories can accommodate. [Continue reading…]


The roots of America’s narcissism epidemic

Will Storr writes: For much of human history, our beliefs have been based on the assumption that people are fundamentally bad. Strip away a person’s smile and you’ll find a grotesque, writhing animal-thing. Human instincts have to be controlled, and religions have often been guides for containing the demons. Sigmund Freud held a similar view: Psychotherapy was his method of making the unconscious conscious, helping people restrain their bestial desires and accord with the moral laws of civilization.

In the middle of the 20th century, an alternative school of thought appeared. It was popularized by Carl Rogers, an influential psychotherapist at the University of Chicago, and it reversed the presumption of original sin. Rogers argued that people are innately decent. Children, he believed, should be raised in an environment of “unconditional positive regard”. They should be liberated from the inhibitions and restraints that prevented them from attaining their full potential.

It was a characteristically American idea — perhaps even the American idea. Underneath it all, people are good, and to get the best out of themselves, they just need to be free.

Economic change gave Rogers’s theory traction. It was the 1950s, and a nation of workmen was turning into a nation of salesmen. To make good in life, interpersonal sunniness was becoming essential. Meanwhile, rising divorce rates and the surge of women into the workplace were triggering anxieties about the lives of children born into the baby boom. Parents wanted to counteract the stresses of modern family life, and boosting their children’s self-esteem seemed like the solution.

By the early 1960s, wild thinkers in California were pushing Rogers’s idea even further. The “human potential movement” argued that most people were using just 10 percent of their intellectual capacity. It leaned on the work of Abraham Maslow, who studied exceptional people such as Albert Einstein and Eleanor Roosevelt and said there were five human needs, the most important of which was self-actualization—the realization of one’s maximum potential. Number two on the list was esteem.

At the close of the decade, the idea that self-esteem was the key to psychological riches finally exploded. The trigger was Nathaniel Branden, a handsome Canadian psychotherapist who had moved to Los Angeles as a disciple of the philosopher Ayn Rand. One of Rand’s big ideas was that moral good would arise when humans ruthlessly pursued their own self-interest. She and Branden began a tortuous love affair, and her theories had an intense impact on the young psychotherapist. In The Psychology of Self-Esteem, published in 1969, Branden argued that self-esteem “has profound effects on a man’s thinking processes, emotions, desires, values and goals. It is the single most significant key to his behavior.” It was an international bestseller, and it propelled the self-esteem movement out of the counterculture and into the mainstream.

The year that Branden published his book, a sixteen-year-old in Euclid, Ohio, named Roy Baumeister was grappling with his own self-esteem problem: his dad. [Continue reading…]


The great rewilding

Orion magazine: One day, the British environmental writer George Monbiot was digging in his garden when he had a revelation—that his life had become too tidy and constrained. While exploring what it would take to re-ignite his own sense of wonder, he waded into a sea of ideas about restoration and rewilding that so captured his imagination that it became the focus of his next book. Feral: Searching for Enchantment on the Frontiers of Rewilding was published in the United Kingdom in 2013, to much acclaim, and is forthcoming in the U.S. in 2014. Orion editor Jennifer Sahn caught up with Monbiot to talk about rewilding — what it means for people, for nature, and for an environmental movement that is in great need of having far wider appeal.

***

Jennifer Sahn: It’s sort of an obvious starting place, but I think it makes sense to begin by asking how you define rewilding.

George Monbiot: Actually, there are two definitions of rewilding that appeal to me. One is the mass restoration of ecosystems. By restoration, I really mean bringing back their trophic function. Trophic function involves feeding. It’s about eating and being eaten. Trophic function is the interactions between animals and plants in the food chain. Most of our ecosystems are very impoverished as far as those interactions are concerned. They’re missing the top predators and the big herbivores, and so they’re missing a lot of their ecological dynamism. That, above all, is what I want to restore.

I see the mass restoration of ecosystems, meaning taking down the fences, blocking up the drainage ditches, enabling wildlife to spread. Reintroducing missing species, and particularly missing species which are keystone species, or ecosystem engineers. These are species which have impacts greater than their biomass alone would suggest. They create habitats, and create opportunities for many other species. Good examples would be beavers, wolves, wild boar, elephants, whales — all of which have huge ramifying effects on the ecosystem, including parts of the ecosystem with which they have no direct contact.

Otherwise, I see humans having very little continuing management role in the ecosystem. Having brought back the elements which can restore that dynamism, we then step back and stop trying to interfere. That, in a way, is the hardest thing of all — to stop believing that, without our help, everything’s going to go horribly wrong. I think in many ways we still suffer from the biblical myth of dominion where we see ourselves as the guardians or the stewards of the planet, whereas I think it does best when we have as little influence as we can get away with.

The other definition of rewilding that interests me is the rewilding of our own lives. I believe the two processes are closely intertwined—if we have spaces on our doorsteps in which nature is allowed to do its own thing, in which it can be to some extent self-willed, driven by its own dynamic processes, that, I feel, is a much more exciting and thrilling ecosystem to explore and discover, and it enables us to enrich our lives, to fill them with wonder and enchantment.

Jennifer: So you’re using rewilding in part as a reflexive verb?

George: Absolutely. Of all the species that need rewilding, I think human beings come at the top of the list. I would love to see a more intense and emotional engagement of human beings with the living world. The process of rewilding the ecosystem gives us an opportunity to make our lives richer and rawer than they tend to be in our very crowded and overcivilized and buttoned-down societies. [Continue reading…]


How the north ended up on top of the map

Nick Danforth writes: Why do maps always show the north as up? For those who don’t just take it for granted, the common answer is that Europeans made the maps and they wanted to be on top. But there’s really no good reason for the north to claim top-notch cartographic real estate over any other bearing, as an examination of old maps from different places and periods can confirm.

The profound arbitrariness of our current cartographic conventions was made evident by McArthur’s Universal Corrective Map of the World, an iconic “upside down” view of the world that recently celebrated its 35th anniversary. Launched by Australian Stuart McArthur on Jan. 26, 1979 (Australia Day, naturally), this map is supposed to challenge our casual acceptance of European perspectives as global norms. But seen today with the title “Australia: No Longer Down Under,” it’s hard not to wonder why the upside-down map, for all its subversiveness, wasn’t called “Botswana: Back Where It Belongs” or perhaps “Paraguay Paramount!”

The McArthur map also makes us wonder why we are so quick to assume that Northern Europeans were the ones who invented the modern map — and decided which way to hold it — in the first place. As is so often the case, our eagerness to invoke Eurocentrism displays a certain bias of its own, since in fact, the north’s elite cartographic status owes more to Byzantine monks and Majorcan Jews than it does to any Englishman. [Continue reading…]


Studying ritual in order to understand politics in Libya

When I was an undergraduate, early on I learned about the value of interdisciplinary studies. Had I been on a conventional academic track, that probably wouldn’t have happened, but I was lucky enough to be in a department that brought together anthropologists, sociologists, philosophers, theologians, and religious studies scholars. In such an environment, the sharp defense of disciplinary turf was not only unwelcome — it simply made no sense.

Even so, universities remain structurally antagonistic to interdisciplinarity, partly for intellectual reasons but perhaps more than anything for professional ones. Anyone who wants to set themselves on a track towards tenure needs to get published, and academic journals all fall within, and help sustain, disciplinary boundaries.

I mention this because when questions are raised such as what’s happening in Libya? or, the more loaded, what’s gone wrong in Libya? the range of experts who get called on to respond tends to be quite limited. There will be regional experts, political scientists, and perhaps economists. But calling on someone with an understanding of the human function of ritual, along with the role different forms of ritual may have had in the development of civilization, is not an obvious way of trying to gain insight into events in Benghazi.

Moreover, within discourse that is heavily influenced by secular assumptions about the problematic nature of religion and the irrational roots of extremism, there is a social bias in the West that favors a popular dismissal.

What’s wrong with Libya? Those people are nuts.

Philip Weiss helped popularize the expression Progressive Except on Palestine — an accusation that most frequently gets directed at American liberal Zionists. But over the last two years a new variant, perhaps even more commonplace, has proliferated across the Left, one that with only slight overstatement could be called Progressive Except on the Middle East.

From this perspective, a suspicion of Muslim men with beards — especially those in Libya and Syria — has become a way through which a Clash of Civilizations narrative is unwittingly being reborn. Add to that the influence of the likes of Richard Dawkins and his cohorts on their mission to “decry supernaturalism in all its forms” and what you end up with is a stifling of curiosity — a lack of any genuine interest in trying to understand why people behave the way they do if you’ve already concluded that their behavior is something to be condemned.

A year ago, the science journal Nature published an article on human rituals and their role in the growth of community and the emergence of civilization.

The report focuses on a global project one of whose principal aims is to test a theory that rituals come in two basic forms: one that through intense and often traumatic experience can forge tight bonds in small groups and the other that provides social cohesion less intensely but on a larger scale through doctrinal unity.

Last week, the State Department designated three branches of Ansar al Shariah — two in Libya and one in Tunisia — as terrorist organizations. The information provided gives no indication about how or if the groups are linked beyond the fact that they share the same name — a name used by separate groups in eight different countries.

There’s reason to suspect that the U.S. government is engaged in its own form of ritualistic behavior much like the Spanish Inquisition busily branding heretics.

Maybe if the Obama administration spent a bit more time talking to anthropologists and archeologists rather than political consultants and security advisers, they would be able to develop a more coherent and constructive policy on Libya. I’m not kidding.

In Nature, Dan Jones writes: By July 2011, when Brian McQuinn made the 18-hour boat trip from Malta to the Libyan port of Misrata, the bloody uprising against Libyan dictator Muammar Gaddafi had already been under way for five months.

“The whole city was under siege, with Gaddafi forces on all sides,” recalls Canadian-born McQuinn. He was no stranger to such situations, having spent the previous decade working for peace-building organizations in countries including Rwanda and Bosnia. But this time, as a doctoral student in anthropology at the University of Oxford, UK, he was taking the risk for the sake of research. His plan was to make contact with rebel groups and travel with them as they fought, studying how they used ritual to create solidarity and loyalty amid constant violence.

It worked: McQuinn stayed with the rebels for seven months, compiling a strikingly close and personal case study of how rituals evolved through combat and eventual victory. And his work was just one part of a much bigger project: a £3.2-million (US$5-million) investigation into ritual, community and conflict, which is funded until 2016 by the UK Economic and Social Research Council (ESRC) and headed by McQuinn’s supervisor, Oxford anthropologist Harvey Whitehouse.

Rituals are a human universal — “the glue that holds social groups together”, explains Whitehouse, who leads the team of anthropologists, psychologists, historians, economists and archaeologists from 12 universities in the United Kingdom, the United States and Canada. Rituals can vary enormously, from the recitation of prayers in church, to the sometimes violent and humiliating initiations of US college fraternity pledges, to the bleeding of a young man’s penis with bamboo razors and pig incisors in purity rituals among the Ilahita Arapesh of New Guinea. But beneath that diversity, Whitehouse believes, rituals are always about building community — which arguably makes them central to understanding how civilization itself began.

To explore these possibilities, and to tease apart how this social glue works, Whitehouse’s project will combine fieldwork such as McQuinn’s with archaeological digs and laboratory studies around the world, from Vancouver, Canada, to the island archipelago of Vanuatu in the south Pacific Ocean. “This is the most wide-ranging scientific project on rituals attempted to date,” says Scott Atran, director of anthropological research at the CNRS, the French national research organization, in Paris, and an adviser to the project.
Human rites

A major aim of the investigation is to test Whitehouse’s theory that rituals come in two broad types, which have different effects on group bonding. Routine actions such as prayers at church, mosque or synagogue, or the daily pledge of allegiance recited in many US elementary schools, are rituals operating in what Whitehouse calls the ‘doctrinal mode’. He argues that these rituals, which are easily transmitted to children and strangers, are well suited to forging religions, tribes, cities and nations — broad-based communities that do not depend on face-to-face contact.

Rare, traumatic activities such as beating, scarring or self-mutilation, by contrast, are rituals operating in what Whitehouse calls the ‘imagistic mode’. “Traumatic rituals create strong bonds among those who experience them together,” he says, which makes them especially suited to creating small, intensely committed groups such as cults, military platoons or terrorist cells. “With the imagistic mode, we never find groups of the same kind of scale, uniformity, centralization or hierarchical structure that typifies the doctrinal mode,” he says.

Whitehouse has been developing this theory of ‘divergent modes of ritual and religion’ since the late 1980s, based on his field work in Papua New Guinea and elsewhere. His ideas have attracted the attention of psychologists, archaeologists and historians.

Until recently, however, the theory was largely based on selected ethnographic and historical case studies, leaving it open to the charge of cherry-picking. The current rituals project is an effort by Whitehouse and his colleagues to answer that charge with deeper, more systematic data.

The pursuit of such data sent McQuinn to Libya. His strategy was to look at how the defining features of the imagistic and doctrinal modes — emotionally intense experiences shared among a small number of people, compared with routine, daily practices that large numbers of people engage in — fed into the evolution of rebel fighting groups from small bands to large brigades.

At first, says McQuinn, neighbourhood friends formed small groups comprising “the number of people you could fit in a car”. Later, fighters began living together in groups of 25–40 in disused buildings and the mansions of rich supporters. Finally, after Gaddafi’s forces were pushed out of Misrata, much larger and hierarchically organized brigades emerged that patrolled long stretches of the defensive border of the city. There was even a Misratan Union of Revolutionaries, which by November 2011 had registered 236 rebel brigades.

McQuinn interviewed more than 300 fighters from 21 of these rebel groups, which varied in size from 12 to just over 1,000 members. He found that the early, smaller brigades tended to form around pre-existing personal ties, and became more cohesive and the members more committed to each other as they collectively experienced the fear and excitement of fighting a civil war on the streets of Misrata.

But six of the groups evolved into super-brigades of more than 750 fighters, becoming “something more like a corporate entity with their own organizational rituals”, says McQuinn. A number of the group leaders had run successful businesses, and would bring everyone together each day for collective training, briefings and to reiterate their moral codes of conduct — the kinds of routine group activities characteristic of the doctrinal mode. “These daily practices moved people from being ‘our little group’ to ‘everyone training here is part of our group’,” says McQuinn.

McQuinn and Whitehouse’s work with Libyan fighters underscores how small groups can be tightly fused by the shared trauma of war, just as imagistic rituals induce terror to achieve the same effect. Whitehouse says that he is finding the same thing in as-yet-unpublished studies of the scary, painful and humiliating ‘hazing’ rituals of fraternity and sorority houses on US campuses, as well as in surveys of Vietnam veterans showing how shared trauma shaped loyalty to their fellow soldiers. [Continue reading…]

When people talk about nation-building, they talk about the need to establish security, the rule of law, and the development of democratic institutions. They focus on political and civil structures through which social stability takes on a recognizable form — the operation, for instance, of effective court systems and law enforcement authorities that do not abuse their powers. But what makes all this work, or fail to work, is a sufficient level of social cohesion, and if that is lacking, the institutional structures will probably be of little value.

Over the last year and a half, American interest in Libya seems to have been reduced to analysis about what happened on one day in Benghazi. But what might help Libya much more than America’s obsessive need to spot terrorists would be to focus instead on things like promoting football. A win for the national team could work wonders.

*

In the video below, Harvey Whitehouse describes the background to his research.


Bees translate polarized light into a navigational dance


Queensland Brain Institute: QBI scientists at The University of Queensland have found that honeybees use the pattern of polarised light in the sky, invisible to humans, to direct one another to a honey source.

The study, conducted in Professor Mandyam Srinivasan’s laboratory at the Queensland Brain Institute, a member of the Australian Research Council Centre of Excellence in Vision Science (ACEVS), demonstrated that bees navigate to and from honey sources by reading the pattern of polarised light in the sky.

“The bees tell each other where the nectar is by converting their polarised ‘light map’ into dance movements,” Professor Srinivasan said.

“The more we find out how honeybees make their way around the landscape, the more awed we feel at the elegant way they solve very complicated problems of navigation that would floor most people – and then communicate them to other bees,” he said.

The discovery shines new light on the astonishing navigational and communication skills of an insect with a brain the size of a pinhead.

The researchers allowed bees to fly down a tunnel to a sugar source, shining only polarised light from above, either aligned with the tunnel or at right angles to the tunnel.

They then filmed what the bees ‘told’ their peers, by waggling their bodies when they got back to the hive.

“It is well known that bees steer by the sun, adjusting their compass as it moves across the sky, and then convert that information into instructions for other bees by waggling their body to signal the direction of the honey,” Professor Srinivasan said.

“Other laboratories have shown from studying their eyes that bees can see a pattern of polarised light in the sky even when the sun isn’t shining: the big question was could they translate the navigational information it provides into their waggle dance.”

The researchers conclude that even when the sun is not shining, bees can tell one another where to find food by reading and dancing to their polarised sky map.

In addition to revealing how bees perform their remarkable tasks, Professor Srinivasan says it also adds to our understanding of some of the most basic machinery of the brain itself.

Professor Srinivasan’s team conjectures that flight under polarised illumination activates discrete populations of cells in the insect’s brain.

When the polarised light was aligned with the tunnel, one pair of ‘place cells’ – neurons important for spatial navigation – became activated, whereas when the light was oriented across the tunnel a different pair of place cells was activated.

The researchers suggest that depending on which set of cells is activated, the bee can work out if the food source lies in a direction toward or opposite the direction of the sun, or in a direction ninety degrees to the left or right of it.

The study, “Honeybee navigation: critically examining the role of polarization compass”, is published in the 6 January 2014 issue of the Philosophical Transactions of the Royal Society B.


Why non-believers need rituals too

Suzanne Moore writes: The last time I put my own atheism through the spin cycle rather than simply wiping it clean was when I wanted to make a ceremony after the birth of my third child. Would it be a blessing? From who? What does the common notion of a new baby as a gift mean? How would we make it meaningful to the people we invited who were from different faiths? And, importantly, what would it look like?

One of the problems I have with the New Atheism is that it fixates on ethics, ignoring aesthetics at its peril. It tends also towards atomisation, relying on abstracts such as “civic law” to conjure a collective experience. But I love ritual, because it is through ritual that we remake and strengthen our social bonds. As I write, down the road there is a memorial being held for Lou Reed, hosted by the local Unitarian church. Most people there will have no belief in God but will feel glad to be part of a shared appreciation of a man whose god was rock’n’roll.

When it came to making a ceremony, I really did not want the austerity of some humanist events I have attended, where I feel the sensual world is rejected. This is what I mean about aesthetics. Do we cede them to the religious and just look like a bunch of Calvinists? I found myself turning to flowers, flames and incense. Is there anything more beautiful than the offerings made all over the world, of tiny flames and blossom on leaves floating on water?

Already, I am revealing a kind of neo-paganism that hardcore rationalists will find unacceptable. But they find most human things unacceptable. For me, not believing in God does not mean one has to forgo poetry, magic, the chaos of ritual, the remaking of shared bonds. I fear ultra-orthodox atheism has come to resemble a rigid and patriarchal faith itself. [Continue reading…]


Americans give thanks — for television

Jason Lynch writes: Thanksgiving is a day when more than 100 million Americans will observe the most honored of traditions: gathering with family and friends to watch as many as 15 hours straight of TV.

More than any other major American holiday, Thanksgiving has become a TV-centric day, where people seem to spend far more time in front of the television than they do at the dinner table. And the broadcast networks are taking advantage of that rapt audience through marquee programs that last year attracted more than 114 million viewers.

The TV turkey day festivities kick off at 9am with the Macy’s Thanksgiving Day Parade on NBC, which averaged 22.4 million viewers last year, its largest audience since 2001. NBC Research estimates that 43.2 million people watched at least a portion of the parade. An additional 7.5 million CBS viewers watched that network’s unofficial coverage of the New York City event, billed as The Thanksgiving Day Parade on CBS. The parade concludes at 12pm, and segues into NBC’s coverage of The National Dog Show, which drew 9.2 million viewers in 2012. NBC Research estimates that 19.3 million viewers took in at least part of the Dog Show. [Continue reading…]


Vanishing tribal cultures

“Before They Pass Away,” by British photographer Jimmy Nelson, is described by an Amazon reviewer as “an essential item on everyone’s coffee table.”

It’s ironically fitting that this description comes from a “place” whose name — at least in the U.S. — now more frequently refers to the online mega-store rather than to the South American region. An indication perhaps that we care more about what we buy than what we breathe.

Leaving aside the question as to whether anything can be said to be essential on a coffee table, the fact that a record of vanishing peoples would be trivialized by being ascribed this value says a lot about why they are vanishing.

Are we to superficially mourn the loss of cultures yet simultaneously be glad that something was preserved in the form of exquisite photographs? Content, perhaps, that before their demise we were able to snatch images of their exotic dress and thereby from the comfort of a couch somehow enhance our own appreciation of a world gradually being lost?

One could view cultural loss as a representation of cultural failure — that those under threat are those who proved least capable of adaptation. Or, one can see the failure as ours — that this represents yet another frontier in the destructive impact of those who have claimed global cultural domination and in so doing are busy destroying the atmosphere, the biosphere, and the ethnosphere.



Plato foresaw the danger of automation

Automation takes many forms, and as members of a culture that reveres technology, we generally perceive automation in terms of its output: what it accomplishes, be it through manufacturing, financial transactions, flying aircraft, and so forth.

But automation doesn’t merely accomplish things for human beings; it simultaneously changes us by externalizing intelligence. The intelligence required by a person is transferred to a machine with its embedded commands, allowing the person to turn his intelligence elsewhere — or nowhere.

Automation is invariably sold on the twin claims that it offers greater efficiency, while freeing people from tedious tasks so that — at least in theory — they can give their attention to something more fulfilling.

There’s no disputing the efficiency argument — there could never have been such a thing as mass production without automation — but the promise of freedom has always been oversold. Automation has resulted in the creation of many of the most tedious, soul-destroying forms of labor in human history.

Automated systems are, however, never perfect, and when they break, they reveal the corrupting effect they have had on human intelligence — intelligence whose skilful application has atrophied through lack of use.

Nicholas Carr writes: On the evening of February 12, 2009, a Continental Connection commuter flight made its way through blustery weather between Newark, New Jersey, and Buffalo, New York. As is typical of commercial flights today, the pilots didn’t have all that much to do during the hour-long trip. The captain, Marvin Renslow, manned the controls briefly during takeoff, guiding the Bombardier Q400 turboprop into the air, then switched on the autopilot and let the software do the flying. He and his co-pilot, Rebecca Shaw, chatted — about their families, their careers, the personalities of air-traffic controllers — as the plane cruised uneventfully along its northwesterly route at 16,000 feet. The Q400 was well into its approach to the Buffalo airport, its landing gear down, its wing flaps out, when the pilot’s control yoke began to shudder noisily, a signal that the plane was losing lift and risked going into an aerodynamic stall. The autopilot disconnected, and the captain took over the controls. He reacted quickly, but he did precisely the wrong thing: he jerked back on the yoke, lifting the plane’s nose and reducing its airspeed, instead of pushing the yoke forward to gain velocity. Rather than preventing a stall, Renslow’s action caused one. The plane spun out of control, then plummeted. “We’re down,” the captain said, just before the Q400 slammed into a house in a Buffalo suburb.

The crash, which killed all 49 people on board as well as one person on the ground, should never have happened. A National Transportation Safety Board investigation concluded that the cause of the accident was pilot error. The captain’s response to the stall warning, the investigators reported, “should have been automatic, but his improper flight control inputs were inconsistent with his training” and instead revealed “startle and confusion.” An executive from the company that operated the flight, the regional carrier Colgan Air, admitted that the pilots seemed to lack “situational awareness” as the emergency unfolded.

The Buffalo crash was not an isolated incident. An eerily similar disaster, with far more casualties, occurred a few months later. On the night of May 31, an Air France Airbus A330 took off from Rio de Janeiro, bound for Paris. The jumbo jet ran into a storm over the Atlantic about three hours after takeoff. Its air-speed sensors, coated with ice, began giving faulty readings, causing the autopilot to disengage. Bewildered, the pilot flying the plane, Pierre-Cédric Bonin, yanked back on the stick. The plane rose and a stall warning sounded, but he continued to pull back heedlessly. As the plane climbed sharply, it lost velocity. The airspeed sensors began working again, providing the crew with accurate numbers. Yet Bonin continued to slow the plane. The jet stalled and began to fall. If he had simply let go of the control, the A330 would likely have righted itself. But he didn’t. The plane dropped 35,000 feet in three minutes before hitting the ocean. All 228 passengers and crew members died.

The first automatic pilot, dubbed a “metal airman” in a 1930 Popular Science article, consisted of two gyroscopes, one mounted horizontally, the other vertically, that were connected to a plane’s controls and powered by a wind-driven generator behind the propeller. The horizontal gyroscope kept the wings level, while the vertical one did the steering. Modern autopilot systems bear little resemblance to that rudimentary device. Controlled by onboard computers running immensely complex software, they gather information from electronic sensors and continuously adjust a plane’s attitude, speed, and bearings. Pilots today work inside what they call “glass cockpits.” The old analog dials and gauges are mostly gone. They’ve been replaced by banks of digital displays. Automation has become so sophisticated that on a typical passenger flight, a human pilot holds the controls for a grand total of just three minutes. What pilots spend a lot of time doing is monitoring screens and keying in data. They’ve become, it’s not much of an exaggeration to say, computer operators.

And that, many aviation and automation experts have concluded, is a problem. Overuse of automation erodes pilots’ expertise and dulls their reflexes, leading to what Jan Noyes, an ergonomics expert at Britain’s University of Bristol, terms “a de-skilling of the crew.” No one doubts that autopilot has contributed to improvements in flight safety over the years. It reduces pilot fatigue and provides advance warnings of problems, and it can keep a plane airborne should the crew become disabled. But the steady overall decline in plane crashes masks the recent arrival of “a spectacularly new type of accident,” says Raja Parasuraman, a psychology professor at George Mason University and a leading authority on automation. When an autopilot system fails, too many pilots, thrust abruptly into what has become a rare role, make mistakes. Rory Kay, a veteran United captain who has served as the top safety official of the Air Line Pilots Association, put the problem bluntly in a 2011 interview with the Associated Press: “We’re forgetting how to fly.” The Federal Aviation Administration has become so concerned that in January it issued a “safety alert” to airlines, urging them to get their pilots to do more manual flying. An overreliance on automation, the agency warned, could put planes and passengers at risk.

The experience of airlines should give us pause. It reveals that automation, for all its benefits, can take a toll on the performance and talents of those who rely on it. The implications go well beyond safety. Because automation alters how we act, how we learn, and what we know, it has an ethical dimension. The choices we make, or fail to make, about which tasks we hand off to machines shape our lives and the place we make for ourselves in the world. That has always been true, but in recent years, as the locus of labor-saving technology has shifted from machinery to software, automation has become ever more pervasive, even as its workings have become more hidden from us. Seeking convenience, speed, and efficiency, we rush to off-load work to computers without reflecting on what we might be sacrificing as a result. [Continue reading…]

Now if we think of automation as a form of forgetfulness, we can see that it reaches much deeper into civilization than its modern manifestations in mechanization and digitization.

In the beginning was the Word and later came the Fall: the point at which language — the primary tool for shaping, expressing and sharing human intelligence — was cut adrift from the human mind and given autonomy in the form of writing.

Through the written word, thought can be immortalized and made universal. No other mechanism could have ever had such a dramatic effect on the exchange of ideas. Without writing, there would have been no humanity as we know it. But we also incurred a loss, and because we have so little awareness of this loss, we might find it hard to imagine that preliterate people possessed forms of intelligence we now lack.

Plato described what writing would do — and by extension, what would happen to pilots.

In Phaedrus, he describes an exchange between Thamus, king and ruler of all Egypt, and the god Theuth, who has invented writing. Theuth, who is very proud of what he has created, says: “This invention, O king, will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.” But Thamus points out that while one man has the ability to invent, the ability to judge an invention’s usefulness or harmfulness belongs to another.

“If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.”

Bedazzled by our ingenuity and its creations, we are fast forgetting the value of this quality that can never be implanted in a machine (or a text): wisdom.

Even the word itself is beginning to sound arcane — as though it should be reserved for philosophers and storytellers and is no longer something we should all strive to possess.

Master of many trades

Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.

We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?

The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading…]

Smartphones are killing society

Henry Grabar writes: The host collects phones at the door of the dinner party. At a law firm, partners maintain a no-device policy at meetings. Each day, a fleet of vans assembles outside New York’s high schools, offering, for a small price, to store students’ contraband during the day. In situations where politeness and concentration are expected, backlash is mounting against our smartphones.

In public, of course, it’s a free country. It’s hard to think of a place beyond the sublime darkness of the movie theater where phone use is shunned, let alone regulated. (Even the cinematic exception is up for debate.) At restaurants, phones occupy that choice tablecloth real estate once reserved for a pack of cigarettes. In truly public space — on sidewalks, in parks, on buses and on trains — we move face down, our phones cradled like amulets.

No observer can fail to notice how deeply this development has changed urban life. A deft user can digitally enhance her experience of the city. She can study a map; discover an out-of-the-way restaurant; identify the trees that line the block and the architect who designed the building at the corner. She can photograph that building, share it with friends, and in doing so contribute her observations to a digital community. On her way to the bus (knowing just when it will arrive) she can report the existence of a pothole and check a local news blog.

It would be unfair to say this person isn’t engaged in the city; on the contrary, she may be more finely attuned to neighborhood history and happenings than her companions. But her awareness is secondhand: She misses the quirks and cues of the sidewalk ballet, fails to make eye contact, and limits her perception to a claustrophobic one-fifth of normal. Engrossed in the virtual, she really isn’t here with the rest of us.

Consider the case of a recent murder on a San Francisco train. On Sept. 23, in a crowded car, a man pulls a pistol from his jacket. In Vivian Ho’s words: “He raises the gun, pointing it across the aisle, before tucking it back against his side. He draws it out several more times, once using the hand holding the gun to wipe his nose. Dozens of passengers stand and sit just feet away — but none reacts. Their eyes, focused on smartphones and tablets, don’t lift until the gunman fires a bullet into the back of a San Francisco State student getting off the train.” [Continue reading…]
