Richard Dawkins has lost it: ignorant sexism gives atheists a bad name

Adam Lee writes: I became an atheist on my own, but it was Richard Dawkins who strengthened and confirmed my decision. For a long time, I admired his insightful science writing, his fierce polemics, his uncompromising passion for the truth. When something I’d written got a (brief) mention in The God Delusion, it was one of the high points of my life.

So, I’m not saying this is easy, but I have to say it: Richard Dawkins, I’m just not that into you anymore.

The atheist movement – a loosely-knit community of conference-goers, advocacy organizations, writers and activists – has been wracked by infighting the last few years over its persistent gender imbalance and the causes of it. Many female atheists have explained that they don’t get more involved because of the casual sexism endemic to the movement: parts of it see nothing problematic about hosting conferences with all-male speakers or having all-male leadership – and that’s before you get to the vitriolic and dangerous sexual harassment, online and off, that’s designed to intimidate women into silence.

Richard Dawkins has involved himself in some of these controversies, and rarely for the better – as with his infamous “Dear Muslima” letter in 2011, in which he essentially argued that, because women in Muslim countries suffer more from sexist mistreatment, women in the west shouldn’t speak up about sexual harassment or physical intimidation. There was also his sneer at women who advocate anti-sexual harassment policies.

But over the last few months, Dawkins showed signs of détente with his feminist critics – even progress. He signed a joint letter with the writer Ophelia Benson, denouncing and rejecting harassment; he even apologized for the “Dear Muslima” letter. On stage at a conference in Oxford in August, Dawkins claimed to be a feminist and said that everyone else should be, too.

Then another prominent male atheist, Sam Harris, crammed his foot in his mouth and said that atheist activism lacks an “estrogen vibe” and was “to some degree intrinsically male”. And, just like that, the brief Dawkins Spring was over. [Continue reading...]

Ants are cool but teach us nothing

E.O. Wilson writes: For nearly seven decades, starting in boyhood, I’ve studied hundreds of kinds of ants around the world, and this qualifies me, I believe, to offer some advice on ways their lives can be applied to ours. I’ll start with the question I’m most often asked: “What can I do about the ants in my kitchen?” My response comes from the heart: Watch your step, be careful of little lives. Ants especially like honey, tuna and cookie crumbs. So put down bits of those on the floor, and watch as the first scout finds the bait and reports back to her colony by laying an odor trail. Then, as a little column follows her out to the food, you will see social behavior so strange it might be on another planet. Think of kitchen ants not as pests or bugs, but as your personal guest superorganism.

Another question I hear a lot is, “What can we learn of moral value from the ants?” Here again I will answer definitively: nothing. Nothing at all can be learned from ants that our species should even consider imitating. For one thing, all working ants are female. Males are bred and appear in the nest only once a year, and then only briefly. They are pitiful creatures with wings, huge eyes, small brains and genitalia that make up a large portion of their rear body segment. They have only one function in life: to inseminate the virgin queens during the nuptial season. They are built to be robot flying sexual missiles. Upon mating or doing their best to mate, they are programmed to die within hours, usually as victims of predators.

Many kinds of ants eat their dead — and their injured, too. You may have seen ant workers retrieve nestmates that you have mangled or killed underfoot (accidentally, I hope), thinking it battlefield heroism. The purpose, alas, is more sinister. [Continue reading...]

Will misogyny bring down the atheist movement?

Mark Oppenheimer writes: Several women told me that women new to the movement were often warned about the intentions of certain older men, especially [Michael] Shermer [the founder of Skeptic magazine]. Two more women agreed to go on the record, by name, with their Shermer stories… These stories help flesh out a man who, whatever his progressive views on science and reason, is decidedly less evolved when it comes to women.

Yet Shermer remains a leader in freethought — arguably the leader. And in his attitudes, he is hardly an exception. Hitchens, the best-selling author of God Is Not Great, who died in 2011, wrote a notorious Vanity Fair article called “Why Women Aren’t Funny.” Richard Dawkins, another author whose books have brought atheism to the masses, has alienated many women — and men — by belittling accusations of sexism in the movement; he seems to go out of his way to antagonize feminists generally, and just this past July 29 he tweeted, “Date rape is bad. Stranger rape at knifepoint is worse. If you think that’s an endorsement of date rape, go away and learn how to think.” And Penn Jillette, the talking half of the Penn and Teller duo, famously revels in using words like “cunt.”

The reality of sexism in freethought is not limited to a few famous leaders; it has implications throughout the small but quickly growing movement. Thanks to the internet, and to popular authors like Dawkins, Hitchens, and Sam Harris, atheism has greater visibility than at any time since the 18th-century Enlightenment. Yet it is now cannibalizing itself. For the past several years, Twitter, Facebook, Reddit, and online forums have become hostile places for women who identify as feminists or express concern about widely circulated tales of sexism in the movement. Some women say they are now harassed or mocked at conventions, and the online attacks — which include Jew-baiting, threats of anal rape, and other pleasantries — are so vicious that two activists I spoke with have been diagnosed with post-traumatic stress disorder. One of these women has been bedridden for two years.

To those outside the community, freethought would seem an unlikely candidate for this sort of internal strife. Aren’t atheists and agnostics supposed to be liberal, forward-thinking types? But from the beginning, there has been a division in freethought between the humanists, who see atheism as one part of a larger progressive vision for society, and the libertarians, for whom the banishment of God sits comfortably with capitalism, gun rights, and free-speech absolutism. One group sees men like Michael Shermer as freethought’s big problem, while the other sees defending them as crucial to freethought’s mission. [Continue reading...]

ISIS is about to destroy Biblical history in Iraq

Christopher Dickey reports that soon after ISIS took control of Mosul, the minions of the self-appointed caliph of the freshly self-declared Islamic State, Abu Bakr al-Baghdadi, paid a visit to the Mosul Museum. It had been closed for years for restoration, ever since it was looted along with many of Iraq’s other institutions in the wake of the culturally oblivious American-led invasion of 2003. But the Mosul Museum was on the verge of reopening, at last, and the full collection had been stored there.

“These groups of terrorists—their arrival was a brutal shock, with no warning,” Iraqi National Museum Director Qais Hussein Rashid told me when he visited Paris last week with a mission pleading for international help. “We were not able to take preventive measures.”

Indeed, museum curators and staff were no better prepared than any other part of the Iraqi government. They could have learned from al-Baghdadi’s operations in neighboring Syria that a major source of revenue for his insurgency has been the sale of looted antiquities on the black market. As reported in The Guardian, a windfall of intelligence just before Mosul fell revealed that al-Baghdadi had accumulated a $2 billion war chest, in part by selling off ancient artifacts from captured Syrian sites. But the Iraqi officials concerned with antiquities said the Iraqi intelligence officers privy to that information have not shared it with them.

So the risk now — the virtual certainty, in fact — is that irreplaceable history will be annihilated or sold into the netherworld of corrupt and cynical collectors. And it was plain when I met with Rashid and his colleagues that they are desperate to stop it, but have neither the strategy nor the resources to do so. [Continue reading...]

Maya Angelou: American titan who lived as though there were no tomorrow

Following the death of Maya Angelou, Gary Younge writes: By the time she reached 40 she had been a professional dancer, prostitute, madam, lecturer, activist, singer and editor. She had worked with Martin Luther King and Malcolm X, lived in Ghana and Egypt, toured Europe with a dance troupe and settled in pretty much every region of the United States. And then she wrote about it, the whole time crafting a path as a poet, epigrammist and performer. “My life has been long,” she wrote in one of her last books. “And believing that life loves the liver of it, I have dared to try many things, sometimes trembling, but daring still.”

In a subsequent interview I described her as the “Desiderata in human form” and “a professional hopemonger”. She lived as though there were no tomorrow. And now that there really is no tomorrow, for her, we are left to contemplate – for us as well as her – where daring can get you.

But with her passing, America has not just lost a talented Renaissance woman and gifted raconteur. It has lost a connection to its recent past that had helped it make sense of its present. At a time when so many Americans seek to travel ‘color blind’, and free from the baggage of the nation’s racial history, here she stood, tall, straight and true: a black woman from the south intimately connected to the transformative people and politics who helped shape much of America’s racial landscape.

A woman determined to give voice to both frustration and a militancy without being so consumed by either that she could not connect with those who did not instinctively relate to it. A woman who, in her own words, was determined to go through life with “passion, compassion, humor and some style”, and would use all those attributes and more to remind America of where this frustration and militancy was coming from.

She described the 9/11 attacks as a “hate crime”, and said: “Living in a state of terror was new to many white people in America, but black people have been living in a state of terror in this country for more than 400 years.” [Continue reading...]

The cloud of unknowing

Karl Taro Greenfeld writes: I can’t help it. Every few weeks, my wife mentions the latest book her book club is reading, and no matter what it is, whether I’ve read it or not, I offer an opinion of the work, based entirely on … what, exactly? Often, these are books I’ve not even read a review or essay about, yet I freely hold forth on the grandiosity of Cheryl Strayed or the restrained sentimentality of Edwidge Danticat. These data motes are gleaned, apparently, from the ether — or, more realistically, from various social media feeds.

What was Solange Knowles’s elevator attack on Jay-Z about? I didn’t watch the security-camera video on TMZ — it would have taken too long — but I scrolled through enough chatter to know that Solange had scrubbed her Instagram feed of photos of her sister, Beyoncé. How about this season of “Game of Thrones” and that nonconsensual intercourse in the crypt? I don’t watch the show, but I’ve scanned the recaps on Vulture.com, and I am prepared to argue that this was deeply offensive. Is Pope Francis a postmodern pontiff? I’ve never listened to one of his homilies nor watched his recent “60 Minutes” appearance, but I’ve seen plenty of his @Pontifex tweets retweeted, so I’m ready to say his position on inequality and social justice is remarkably progressive.

It’s never been so easy to pretend to know so much without actually knowing anything. We pick topical, relevant bits from Facebook, Twitter or emailed news alerts, and then regurgitate them. Instead of watching “Mad Men” or the Super Bowl or the Oscars or a presidential debate, you can simply scroll through someone else’s live-tweeting of it, or read the recaps the next day. Our cultural canon is becoming determined by whatever gets the most clicks.

In his 1987 book “Cultural Literacy: What Every American Needs to Know,” E. D. Hirsch Jr. listed 5,000 essential concepts and names — 1066, Babbitt, Pickwickian — that educated people should be familiar with. (Or at least that’s what I believe he wrote, not having actually read the book.) Mr. Hirsch’s book, along with its contemporary “The Closing of the American Mind” by Allan Bloom, made the point that cultural literacy — Mr. Bloom’s canon — was the bedrock of our agreed-upon values.

What we all feel now is the constant pressure to know enough, at all times, lest we be revealed as culturally illiterate. So that we can survive an elevator pitch, a business meeting, a visit to the office kitchenette, a cocktail party, so that we can post, tweet, chat, comment, text as if we have seen, read, watched, listened. What matters to us, awash in petabytes of data, is not necessarily having actually consumed this content firsthand but simply knowing that it exists — and having a position on it, being able to engage in the chatter about it. We come perilously close to performing a pastiche of knowledgeability that is really a new model of know-nothingness. [Continue reading...]

The mounting casualties in the war of the Anthropocene

Justin E.H. Smith writes: There is a great die-off under way, one that may justly be compared to the disappearance of dinosaurs at the end of the Cretaceous, or the sudden downfall of so many great mammals at the beginning of the Holocene. But how far can such a comparison really take us in assessing the present moment?

The hard data tell us that what is happening to animals right now is part of the same broad historical process that has swept up humans: We are all being homogenized, subjected to uniform standards, domesticated. A curiosity that might help to drive this home: At present, the total biomass of mammals raised for food vastly exceeds the biomass of all mammalian wildlife on the planet (it also exceeds that of the human species itself). This was certainly not the case 10,000 or so years ago, at the dawn of the age of pastoralism.

It is hard to know where exactly, or even inexactly, to place the boundary between prehistory and history. Indeed, some authors argue that the very idea of prehistory is a sort of artificial buffer zone set up to protect properly human society from the vast expanse of mere nature that preceded us. But if we must set up a boundary, I suggest the moment when human beings began to dominate and control other large mammals for their own, human ends.

We tend to think about history as human history. Yet a suitably wide-focused perspective reveals that nothing in the course of human affairs makes complete sense without some account of animal actors. History has, in fact, been a question of human-animal interaction all along. Cherchez la vache is how the anthropologist E.E. Evans-­Pritchard argued that the social life of the cattle-herding Nuer of southern Sudan might best be summed up — “look for the cow” — but one could probably, without much stretching, extend that principle to human society in general. The cattle that now outweigh us are a mirror of our political and economic crisis, just as cattle were once a mirror of the sociocosmic harmony that characterized Nuer life. [Continue reading...]

Astra Taylor: Misogyny and the cult of internet openness

In December, and again in February, at the Google Bus blockades in San Francisco, one thing struck me forcefully: the technology corporation employees waiting for their buses were all staring so intently at their phones that they apparently didn’t notice the unusual things going on around them until their buses were surrounded. Sometimes I feel like I’m living in a science-fiction novel, because my region is so afflicted with people who stare at the tiny screens in their hands on trains, in restaurants, while crossing the street, and too often while driving. San Francisco is, after all, where director Phil Kaufman set the 1978 remake of Invasion of the Body Snatchers, the movie wherein a ferny spore-spouting form of alien life colonizes human beings so that they become zombie-like figures.

In the movies, such colonization took place secretly, or by force, or both: it was a war, and (once upon a time) an allegory for the Cold War and a possible communist takeover. Today, however — Hypercapitalism Invades! — we not only choose to carry around those mobile devices, but pay corporations hefty monthly fees to do so. In return, we get to voluntarily join the great hive of people being in touch all the time, so much so that human nature seems in the process of being remade, with the young immersed in a kind of contact that makes solitude seem like polar ice, something that’s melting away.

We got phones, and then Smart Phones, and then Angry Birds and a million apps — and a golden opportunity to be tracked all the time thanks to the GPS devices in those phones. Your cell phone is the shackle that chains you to corporate America (and potentially to government surveillance as well) and like the ankle bracelets that prisoners wear, it’s your monitor. It connects you to the Internet and so to the brave new world that has such men as Larry Ellison and Mark Zuckerberg in it. That world — maybe not so brave after all — is the subject of Astra Taylor’s necessary, alarming, and exciting new book, The People’s Platform: Taking Back Power and Culture in the Digital Age.

The Internet arose with little regulation, little public decision-making, and a whole lot of fantasy about how it was going to make everyone powerful and how everything would be free. Free, as in unregulated and open, got confused with free, as in not getting paid, and somehow everyone from Facebook to Arianna Huffington created massively lucrative sites (based on advertising dollars) in which the people who made the content went unpaid. Just as Russia woke up with oil oligarchs spreading like mushrooms after a night’s heavy rain, so we woke up with new titans of the tech industry throwing their billionaire weight around. The Internet turns out to be a superb mechanism for consolidating money and power and control, even as it gives toys and minor voices to the rest of us.

As Taylor writes in her book, “The online sphere inspires incessant talk of gift economies and public-spiritedness and democracy, but commercialization and privatization and inequality lurk beneath the surface. This contradiction is captured in a single word: ‘open,’ a concept capacious enough to contain both the communal and capitalistic impulses central to Web 2.0.” And she goes on to discuss “the tendency of open systems to amplify inequality — and new media thinkers’ glib disregard for this fundamental characteristic.” Part of what makes her book exceptional, in fact, is its breadth. It reviews much of the existing critique of the Internet and connects the critiques of specific aspects of it into an overview of how a phenomenon supposed to be wildly democratic has become wildly not that way at all.

And at a certain juncture, she turns to gender. Though far from the only weak point of the Internet as an egalitarian space — after all, there’s privacy (lack of), the environment (massive server farms), and economics (tax cheats, “content providers” like musicians fleeced) — gender politics, as she shows in today’s post adapted from her book, is one of the most spectacular problems online. Let’s imagine this as science fiction: a group of humans apparently dissatisfied with how things were going on Earth — where women were increasing their rights, representation, and participation — left our orbit and started their own society on their own planet. The new planet wasn’t far away or hard to get to (if you could afford the technology): it was called the Internet. We all know it by name; we all visit it; but we don’t name the society that dominates it much.

Taylor does: the dominant society, celebrating itself and pretty much silencing everyone else, makes the Internet bear a striking resemblance to Congress in 1850 or a gentlemen’s club (minus any gentleness). It’s a gated community, and as Taylor describes today, the security detail is ferocious, patrolling its borders by trolling and threatening dissident voices, and just having a female name or being identified as female is enough to become a target of hate and threats.

Early this year, a few essays were published on Internet misogyny that were so compelling I thought 2014 might be the year we revisit these online persecutions, the way that we revisited rape in 2013, thanks to the Steubenville and New Delhi assault cases of late 2012. But the subject hasn’t (yet) quite caught fire, and so not much gets said and less gets done about this dynamic new machinery for privileging male and silencing female voices. Which is why we need to keep examining and discussing this, as well as the other problems of the Internet. And why you need to read Astra Taylor’s book. This excerpt is part of her diagnosis of the problems; the book ends with ideas about a cure. Rebecca Solnit

Open systems and glass ceilings
The disappearing woman and life on the internet
By Astra Taylor

The Web is regularly hailed for its “openness” and that’s where the confusion begins, since “open” in no way means “equal.” While the Internet may create space for many voices, it also reflects and often amplifies real-world inequities in striking ways.

An elaborate system organized around hubs and links, the Web has a surprising degree of inequality built into its very architecture. Its traffic, for instance, tends to be distributed according to “power laws,” which follow what’s known as the 80/20 rule — 80% of a desirable resource goes to 20% of the population.

In fact, as anyone knows who has followed the histories of Google, Apple, Amazon, and Facebook, now among the biggest companies in the world, the Web is increasingly a winner-take-all, rich-get-richer sort of place, which means the disparate percentages in those power laws are only likely to look uglier over time.

[Read more...]
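To see how a rich-get-richer dynamic produces that kind of concentration, here is a rough sketch in Python (a toy preferential-attachment model with invented parameters, not anything from Taylor's book): each new site links to existing sites with probability proportional to the links they already hold, and we then measure what share of all inbound links the top 20% of sites end up with. In runs like this the top fifth typically captures well over half of the links, which is the 80/20 flavour of inequality the power-law description points to.

import random

def preferential_attachment(n_sites=10000, links_per_site=5, seed=1):
    """Toy rich-get-richer model: each new site links to existing sites
    with probability proportional to their current inbound-link counts."""
    random.seed(seed)
    counts = [1, 1]      # inbound-link counts; start with two linked sites
    targets = [0, 1]     # one entry per unit of "attractiveness"
    for new_site in range(2, n_sites):
        counts.append(0)
        for _ in range(links_per_site):
            chosen = random.choice(targets)   # popular sites get picked more often
            counts[chosen] += 1
            targets.append(chosen)
        targets.append(new_site)              # the newcomer can now be linked to as well
    return counts

counts = sorted(preferential_attachment(), reverse=True)
top_share = sum(counts[: len(counts) // 5]) / sum(counts)
print(f"share of all links held by the top 20% of sites: {top_share:.0%}")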

America’s huge appetite for conspiracy theories

“Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” a paper recently published in the American Journal of Political Science, finds that half of Americans consistently endorse at least one conspiracy theory.

Tom Jacobs writes: It’s easy to assume this represents widespread ignorance, but these findings suggest otherwise. Oliver and Wood report that, except for the Obama “birthers” and the 9/11 “truthers,” “respondents who endorse conspiracy theories are not less-informed about basic political facts than average citizens.”

So what does drive belief in these contrived explanations? The researchers argue the tendency to accept them is “derived from two innate psychological predispositions.”

The first, which has an evolutionary explanation, is an “unconscious cognitive bias to draw causal connections between seemingly related phenomena.” Jumping to conclusions based on weak evidence allows us to “project feelings of control in uncertain situations,” the researchers note.

The second is our “natural attraction towards melodramatic narratives as explanations for prominent events — particularly those that interpret history (in terms of) universal struggles between good and evil.”

Stories that fit that pattern “provide compelling explanations for otherwise confusing or ambiguous events,” they write, noting that “many predominant belief systems … draw heavily upon the idea of unseen, intentional forces shaping contemporary events.”

“For many Americans, complicated or nuanced explanations for political events are both cognitively taxing and have limited appeal,” write Oliver and Wood. “A conspiracy narrative may provide a more accessible and convincing account of political events.”

That said, they add, “Even highly engaged or ideological segments of the population can be swayed by the power of these narratives, particularly when they coincide with their other political views.”

Cahokia: North America’s first melting pot?

Christian Science Monitor: The first experiment in “melting pot” politics in North America appears to have emerged nearly 1,000 years ago in the bottom lands of the Mississippi River near today’s St. Louis, according to archaeologists piecing together the story of the rise and fall of the native American urban complex known as Cahokia.

During its heyday, Cahokia’s population reached an estimated 20,000 people – a level the continent north of the Rio Grande wouldn’t see again until the eve of the American Revolution and the growth of New York and Philadelphia.

Cahokia’s ceremonial center, seven miles northeast of St. Louis’s Gateway Arch, boasted 120 earthen mounds, including a broad, tiered mound some 10 stories high. East St. Louis, one of two major satellites, hosted another 50 earthen mounds, as well as residences. St. Louis hosted another 26 mounds and associated dwellings.

These are three of the four largest native-American mound centers known, “all within spitting distance of one another,” says Thomas Emerson, Illinois State Archaeologist and a member of a team testing the melting-pot idea. “That’s some kind of large, integrated complex to some degree.”

Where did all those people come from? Archaeologists have been debating that question for years, Dr. Emerson says. Unfortunately, the locals left no written record of the complex’s history. Artifacts such as pottery, tools, or body ornaments give an ambiguous answer.

Artifacts from Cahokia have been found in other native-American centers from Arkansas and northern Louisiana to Oklahoma, Iowa, and Wisconsin, just as artifacts from these areas appear in digs at Cahokia.

“Archaeologists are always struggling with this: Are artifacts moving, or are people moving?” Emerson says.

Emerson and two colleagues at the University of Illinois at Urbana-Champaign tried to tackle the question using the ratio of two isotopes of the element strontium found in human teeth. They discovered that throughout the 300 years that native Americans occupied Cahokia, the complex appeared to receive a steady stream of immigrants who stayed. [Continue reading...]
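The logic of that teeth test can be made concrete with a small hypothetical sketch in Python (the local baseline range and the burial values below are invented for illustration, not the Illinois team's data): the ratio of the two strontium forms locked into tooth enamel reflects the geology of the place where a person grew up, so individuals whose ratios fall outside the locally measured range are flagged as probable immigrants.

LOCAL_RANGE = (0.7090, 0.7102)   # assumed local strontium-ratio baseline around Cahokia

burials = {
    "burial_01": 0.7094,   # inside the assumed local range
    "burial_02": 0.7121,   # outside it
    "burial_03": 0.7088,
}

def classify(ratio, local_range=LOCAL_RANGE):
    low, high = local_range
    return "grew up locally" if low <= ratio <= high else "probable immigrant"

for name, ratio in burials.items():
    print(f"{name}: ratio {ratio} -> {classify(ratio)}")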

Throughout our existence, humans have been the most destructive creatures to roam this planet

For those of us who see industrial civilization as the guarantor of humanity’s destruction, it’s easy to picture an idyllic era earlier in our evolution, located perhaps during the cultural flowering of the Great Leap Forward.

Communities then remained relatively egalitarian without workers enslaved in back-breaking labor, while subsistence on few material resources meant that time was neither controlled by the dictates of a stratified social hierarchy nor by the demands of survival.

When people could accord as much value to storytelling, ritual, and music-making as they did to hunting and gathering food, we might like to think that human beings were living in balance with nature.

As George Monbiot reveals, the emerging evidence about our early ancestors paints a much grimmer picture — one in which human nature appears to have always been profoundly destructive.

You want to know who we are? Really? You think you do, but you will regret it. This article, if you have any love for the world, will inject you with a venom – a soul-scraping sadness – without an obvious antidote.

The Anthropocene, now a popular term among scientists, is the epoch in which we live: one dominated by human impacts on the living world. Most date it from the beginning of the industrial revolution. But it might have begun much earlier, with a killing spree that commenced two million years ago. What rose onto its hind legs on the African savannahs was, from the outset, death: the destroyer of worlds.

Before Homo erectus, perhaps our first recognisably human ancestor, emerged in Africa, the continent abounded with monsters. There were several species of elephants. There were sabretooths and false sabretooths, giant hyenas and creatures like those released in The Hunger Games: amphicyonids, or bear dogs, vast predators with an enormous bite.

Prof Blaire van Valkenburgh has developed a means by which we could roughly determine how many of these animals there were. When there are few predators and plenty of prey, the predators eat only the best parts of the carcass. When competition is intense, they eat everything, including the bones. The more bones a carnivore eats, the more likely its teeth are to be worn or broken. The breakages in carnivores’ teeth were massively greater in the pre-human era.

Not only were there more species of predators, including species much larger than any found on Earth today, but they appear to have been much more abundant – and desperate. We evolved in a terrible, wonderful world – that was no match for us. [Continue reading...]
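Van Valkenburgh's proxy boils down to a simple comparison, sketched here in Python with invented counts (not her data): tally broken or heavily worn teeth in each fossil assemblage, and read a higher breakage rate as more bone-eating, and therefore fiercer competition for carcasses.

# Invented counts for illustration only.
assemblages = {
    "pre-human African carnivores": {"teeth": 500, "broken_or_worn": 70},
    "modern African carnivores":    {"teeth": 500, "broken_or_worn": 18},
}

for label, a in assemblages.items():
    rate = a["broken_or_worn"] / a["teeth"]
    print(f"{label}: {rate:.1%} damaged teeth "
          "(more bone-eating implies fiercer competition for prey)")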

Devastating consequences of losing ‘knowledgeable elders’ in non-human cultures

Culture — something we generally associate with its expressions through art, music, literature and so forth — is commonly viewed as one of the defining attributes of humanity. We supposedly rose above animal instinct when we started creating bodies of knowledge, held collectively and passed down from generation to generation.

But it increasingly appears that this perspective has less to do with an appreciation of what makes us human than with our ignorance about non-human cultures.

Although non-human cultures don’t produce the kind of artifacts we create, the role of knowledge-sharing seems to be just as vital to the success of these societies as it is to ours. In other words, what makes these creatures what they are cannot be reduced to the structure of their DNA — it also involves a dynamic and learned element: the transmission of collective knowledge.

The survival of some species doesn’t simply depend on their capacity to replicate their DNA; it depends on their ability to pass on what they know.

Scuola Internazionale Superiore di Studi Avanzati: Small changes in a population may lead to dramatic consequences, like the disappearance of the migratory route of a species. A study carried out in collaboration with the SISSA has created a model of the behaviour of a group of individuals on the move (like a school of fish, a herd of sheep or a flock of birds, etc.) which, by changing a few simple parameters, reproduces the collective behaviour patterns observed in the wild. The model shows that small quantitative changes in the number of knowledgeable individuals and availability of food can lead to radical qualitative changes in the group’s behaviour.

Until the ’50s, bluefin tuna fishing was a thriving industry in Norway, second only to sardine fishing. Every year, bluefin tuna used to migrate from the eastern Mediterranean up to the Norwegian coasts. Then, over no more than 4-5 years, the tuna abruptly stopped returning to Norway. In an attempt to solve this problem, Giancarlo De Luca from SISSA (the International School for Advanced Studies of Trieste) together with an international team of researchers (from the Centre for Theoretical Physics — ICTP — of Trieste and the Technical University of Denmark) started to devise a model based on an “adaptive stochastic network.” The physicists wanted to simulate, in simplified form, the collective behaviour of animal groups. Their findings, published in the journal Interface, show that the number of “informed individuals” in a group, sociality and the strength of the decision of the informed individuals are “critical” variables, such that even minimal fluctuations in these variables can result in catastrophic changes to the system.

“We started out by taking inspiration from the phenomenon that affected the bluefin tuna, but in actual fact we then developed a general model that can be applied to many situations of groups ‘on the move’,” explains De Luca.

The collective behaviour of a group can be treated as an “emerging property,” that is, the result of the self-organization of each individual’s behaviour. “The majority of individuals in a group may not possess adequate knowledge, for example, about where to find rich feeding grounds,” explains De Luca. “However, for the group to function, it is enough that only a minority of individuals possess that information. The others, the ones who don’t, will obey simple social rules, for example by following their neighbours.”

The tendency to comply with the norm, the number of knowledgeable individuals and the determination with which they follow their preferred route (which the researchers interpreted as being directly related to the appeal, or abundance, of the resource) are critical variables. “When the number of informed individuals falls below a certain level, or the strength of their determination to go in a certain direction falls below a certain threshold, the migratory pathway disappears abruptly.”

“In our networks the individuals are ‘points’, with interconnections that form and disappear in the course of the process, following some established rules. It’s a simple and general way to model the system which has the advantage of being able to be solved analytically,” comments De Luca.

So what ever happened to the Norwegian tuna? “Based on our results we formulated some hypotheses which will, however, have to be tested experimentally,” says De Luca. In the ’50s, Norway experienced a reduction in biomass and in the quantity of herrings, the main prey of tuna, which might have played a role in their disappearance. “This is consistent with our model, but there’s more to the story. In a short time the herring population returned to normal levels, whereas the tuna never came back. Why?”

One hypothesis is that, although the overall number of Mediterranean tuna has not changed, what has changed is the composition of the population: “The most desirable tuna specimens for the fishing industry are the larger, older individuals, which are presumably also those with the greater amount of knowledge, in other words, the knowledgeable elders,” concludes De Luca.

Another curious fact: what happens if there are too many knowledgeable elders? “Too many know-alls are useless,” jokes De Luca. “In fact, above a certain number of informed individuals, the group performance does not improve so much as to justify the “cost” of their training. The best cost-benefit ratio is obtained by keeping the number of informed individuals above a certain level, provided they remain a minority of the whole population.”
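A toy simulation gives a feel for the threshold behaviour the researchers describe. The sketch below, in Python, is not the paper's adaptive stochastic network and every parameter is invented: a small "informed" minority is pulled toward the migration route, everyone else simply averages the headings of a few random companions plus some noise, and we measure how well the group's final direction matches the route as the informed fraction changes. With no informed individuals the group still moves together, but in an arbitrary direction, so its alignment with the route averages out near zero.

import math
import random

def route_alignment(n=120, frac_informed=0.05, pull=0.5, noise=0.25,
                    steps=200, trials=6):
    """Toy informed-minority model (invented parameters, not the paper's model).
    Each individual keeps a heading; at every step it adopts the average heading
    of a few random companions plus noise, and informed individuals are also
    pulled toward the route at heading 0 with strength `pull`. Returns the
    group's alignment with the route (1 = strong migration along it),
    averaged over independent trials."""
    total = 0.0
    for trial in range(trials):
        random.seed(trial)
        headings = [random.uniform(-math.pi, math.pi) for _ in range(n)]
        n_informed = round(n * frac_informed)
        for _ in range(steps):
            updated = []
            for i in range(n):
                companions = random.sample(range(n), 6)
                avg = math.atan2(sum(math.sin(headings[j]) for j in companions),
                                 sum(math.cos(headings[j]) for j in companions))
                h = avg + random.gauss(0.0, noise)
                if i < n_informed:
                    h *= (1.0 - pull)   # informed individuals: nudged toward the route
                updated.append(h)
            headings = updated
        total += sum(math.cos(h) for h in headings) / n
    return total / trials

for frac in (0.0, 0.01, 0.03, 0.05, 0.10):
    print(f"informed fraction {frac:.2f}: route alignment {route_alignment(frac_informed=frac):.2f}")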

In unseen worlds, science invariably crosses paths with fantasy

Philip Ball writes: For centuries, scientists studied light to comprehend the visible world. Why are things colored? What is a rainbow? How do our eyes work? And what is light itself? These are questions that have preoccupied scientists and philosophers since the time of Aristotle, including Roger Bacon, Isaac Newton, Michael Faraday, Thomas Young, and James Clerk Maxwell.

But in the late 19th century all that changed, and it was largely Maxwell’s doing. This was the period in which the whole focus of physics — then still emerging as a distinct scientific discipline — shifted from the visible to the invisible. Light itself was instrumental to that change. Not only were the components of light invisible “fields,” but light was revealed as merely a small slice of a rainbow extending far into the unseen.

Physics has never looked back. Today its theories and concepts are concerned largely with invisible entities: not only unseen force fields and insensible rays but particles too small to see even with the most advanced microscopes. We now know that our everyday perception grants us access to only a tiny fraction of reality. Telescopes responding to radio waves, infrared radiation, and X-rays have vastly expanded our view of the universe, while electron microscopes, X-ray beams, and other fine probes of nature’s granularity have unveiled the microworld hidden beyond our visual acuity. Theories at the speculative forefront of physics flesh out this unseen universe with parallel worlds and with mysterious entities named for their very invisibility: dark matter and dark energy.

This move beyond the visible has become a fundamental part of science’s narrative. But it’s a more complicated shift than we often appreciate. Making sense of what is unseen — of what lies “beyond the light” — has a much longer history in human experience. Before science had the means to explore that realm, we had to make do with stories that became enshrined in myth and folklore. Those stories aren’t banished as science advances; they are simply reinvented. Scientists working at the forefront of the invisible will always be confronted with gaps in knowledge, understanding, and experimental capability. In the face of those limits, they draw unconsciously on the imagery of the old stories. This is a necessary part of science, and these stories can sometimes suggest genuinely productive scientific ideas. But the danger is that we will start to believe them at face value, mistaking them for theories.

A backward glance at the history of the invisible shows how the narratives and tropes of myth and folklore can stimulate science, while showing that the truth will probably turn out to be far stranger and more unexpected than these old stories can accommodate. [Continue reading...]

The roots of America’s narcissism epidemic

Will Storr writes: For much of human history, our beliefs have been based on the assumption that people are fundamentally bad. Strip away a person’s smile and you’ll find a grotesque, writhing animal-thing. Human instincts have to be controlled, and religions have often been guides for containing the demons. Sigmund Freud held a similar view: Psychotherapy was his method of making the unconscious conscious, helping people restrain their bestial desires and accord with the moral laws of civilization.

In the middle of the 20th century, an alternative school of thought appeared. It was popularized by Carl Rogers, an influential psychotherapist at the University of Chicago, and it reversed the presumption of original sin. Rogers argued that people are innately decent. Children, he believed, should be raised in an environment of “unconditional positive regard”. They should be liberated from the inhibitions and restraints that prevented them from attaining their full potential.

It was a characteristically American idea — perhaps even the American idea. Underneath it all, people are good, and to get the best out of themselves, they just need to be free.

Economic change gave Rogers’s theory traction. It was the 1950s, and a nation of workmen was turning into a nation of salesmen. To make good in life, interpersonal sunniness was becoming essential. Meanwhile, rising divorce rates and the surge of women into the workplace were triggering anxieties about the lives of children born into the baby boom. Parents wanted to counteract the stresses of modern family life, and boosting their children’s self-esteem seemed like the solution.

By the early 1960s, wild thinkers in California were pushing Rogers’s idea even further. The “human potential movement” argued that most people were using just 10 percent of their intellectual capacity. It leaned on the work of Abraham Maslow, who studied exceptional people such as Albert Einstein and Eleanor Roosevelt and said there were five human needs, the most important of which was self-actualization—the realization of one’s maximum potential. Number two on the list was esteem.

At the close of the decade, the idea that self-esteem was the key to psychological riches finally exploded. The trigger was Nathaniel Branden, a handsome Canadian psychotherapist who had moved to Los Angeles as a disciple of the philosopher Ayn Rand. One of Rand’s big ideas was that moral good would arise when humans ruthlessly pursued their own self-interest. She and Branden began a tortuous love affair, and her theories had an intense impact on the young psychotherapist. In The Psychology of Self-Esteem, published in 1969, Branden argued that self-esteem “has profound effects on a man’s thinking processes, emotions, desires, values and goals. It is the single most significant key to his behavior.” It was an international bestseller, and it propelled the self-esteem movement out of the counterculture and into the mainstream.

The year that Branden published his book, a sixteen-year-old in Euclid, Ohio named Roy Baumeister was grappling with his own self-esteem problem: his Dad. [Continue reading...]

The great rewilding

Orion magazine: One day, the British environmental writer George Monbiot was digging in his garden when he had a revelation—that his life had become too tidy and constrained. While exploring what it would take to re-ignite his own sense of wonder, he waded into a sea of ideas about restoration and rewilding that so captured his imagination that it became the focus of his next book. Feral: Searching for Enchantment on the Frontiers of Rewilding was published in the United Kingdom in 2013, to much acclaim, and is forthcoming in the U.S. in 2014. Orion editor Jennifer Sahn caught up with Monbiot to talk about rewilding — what it means for people, for nature, and for an environmental movement that is in great need of having far wider appeal.

***

Jennifer Sahn: It’s sort of an obvious starting place, but I think it makes sense to begin by asking how you define rewilding.

George Monbiot: Actually, there are two definitions of rewilding that appeal to me. One is the mass restoration of ecosystems. By restoration, I really mean bringing back their trophic function. Trophic function involves feeding. It’s about eating and being eaten. Trophic function is the interactions between animals and plants in the food chain. Most of our ecosystems are very impoverished as far as those interactions are concerned. They’re missing the top predators and the big herbivores, and so they’re missing a lot of their ecological dynamism. That, above all, is what I want to restore.

I see the mass restoration of ecosystems, meaning taking down the fences, blocking up the drainage ditches, enabling wildlife to spread. Reintroducing missing species, and particularly missing species which are keystone species, or ecosystem engineers. These are species which have impacts greater than their biomass alone would suggest. They create habitats, and create opportunities for many other species. Good examples would be beavers, wolves, wild boar, elephants, whales — all of which have huge ramifying effects on the ecosystem, including parts of the ecosystem with which they have no direct contact.

Otherwise, I see humans having very little continuing management role in the ecosystem. Having brought back the elements which can restore that dynamism, we then step back and stop trying to interfere. That, in a way, is the hardest thing of all — to stop believing that, without our help, everything’s going to go horribly wrong. I think in many ways we still suffer from the biblical myth of dominion where we see ourselves as the guardians or the stewards of the planet, whereas I think it does best when we have as little influence as we can get away with.

The other definition of rewilding that interests me is the rewilding of our own lives. I believe the two processes are closely intertwined—if we have spaces on our doorsteps in which nature is allowed to do its own thing, in which it can be to some extent self-willed, driven by its own dynamic processes, that, I feel, is a much more exciting and thrilling ecosystem to explore and discover, and it enables us to enrich our lives, to fill them with wonder and enchantment.

Jennifer: So you’re using rewilding in part as a reflexive verb?

George: Absolutely. Of all the species that need rewilding, I think human beings come at the top of the list. I would love to see a more intense and emotional engagement of human beings with the living world. The process of rewilding the ecosystem gives us an opportunity to make our lives richer and rawer than they tend to be in our very crowded and overcivilized and buttoned-down societies. [Continue reading...]

How the north ended up on top of the map

Nick Danforth writes: Why do maps always show the north as up? For those who don’t just take it for granted, the common answer is that Europeans made the maps and they wanted to be on top. But there’s really no good reason for the north to claim top-notch cartographic real estate over any other bearing, as an examination of old maps from different places and periods can confirm.

The profound arbitrariness of our current cartographic conventions was made evident by McArthur’s Universal Corrective Map of the World, an iconic “upside down” view of the world that recently celebrated its 35th anniversary. Launched by Australian Stuart McArthur on Jan. 26, 1979 (Australia Day, naturally), this map is supposed to challenge our casual acceptance of European perspectives as global norms. But seen today with the title “Australia: No Longer Down Under,” it’s hard not to wonder why the upside-down map, for all its subversiveness, wasn’t called “Botswana: Back Where It Belongs” or perhaps “Paraguay Paramount!”

The McArthur map also makes us wonder why we are so quick to assume that Northern Europeans were the ones who invented the modern map — and decided which way to hold it — in the first place. As is so often the case, our eagerness to invoke Eurocentrism displays a certain bias of its own, since in fact, the north’s elite cartographic status owes more to Byzantine monks and Majorcan Jews than it does to any Englishman. [Continue reading...]

Studying ritual in order to understand politics in Libya

When I was an undergraduate, early on I learned about the value of interdisciplinary studies. Had I been on a conventional academic track, that probably wouldn’t have happened, but I was lucky enough to be in a department that brought together anthropologists, sociologists, philosophers, theologians, and religious studies scholars. In such an environment, the sharp defense of disciplinary turf was not only unwelcome — it simply made no sense.

Even so, universities remain structurally antagonistic to interdisciplinarity, partly for intellectual reasons but perhaps more than anything for professional ones. Anyone who wants to set themselves on a track towards tenure needs to get published, and academic journals all fall within and help sustain disciplinary boundaries.

I mention this because when questions are raised such as what’s happening in Libya? or the more loaded, what’s gone wrong in Libya? the range of experts who get called on to respond tends to be quite limited. There will be regional experts, political scientists, and perhaps economists. But calling on someone who understands the human function of ritual, and the role different forms of ritual may have played in the development of civilization, is not an obvious way of trying to gain insight into events in Benghazi.

Moreover, within discourse that is heavily influenced by secular assumptions about the problematic nature of religion and the irrational roots of extremism, there is a social bias in the West that favors popular dismissal:

What’s wrong with Libya? Those people are nuts.

Philip Weiss helped popularize the expression Progressive Except on Palestine — an accusation that most frequently gets directed at American liberal Zionists. But over the last two years a new variant, perhaps even more commonplace, has proliferated across the Left, one which, with only slight overstatement, could be called Progressive Except on the Middle East.

From this perspective, a suspicion of Muslim men with beards — especially those in Libya and Syria — has become a way through which a Clash of Civilizations narrative is unwittingly being reborn. Add to that the influence of the likes of Richard Dawkins and his cohorts on their mission to “decry supernaturalism in all its forms” and what you end up with is a stifling of curiosity — a lack of any genuine interest in trying to understand why people behave the way they do if you’ve already concluded that their behavior is something to be condemned.

A year ago, the science journal Nature published an article on human rituals and their role in the growth of community and the emergence of civilization.

The report focuses on a global project one of whose principal aims is to test a theory that rituals come in two basic forms: one that through intense and often traumatic experience can forge tight bonds in small groups and the other that provides social cohesion less intensely but on a larger scale through doctrinal unity.

Last week, the State Department designated three branches of Ansar al Shariah — two in Libya and one in Tunisia — as terrorist organizations. The information provided gives no indication about how or if the groups are linked beyond the fact that they share the same name — a name used by separate groups in eight different countries.

There’s reason to suspect that the U.S. government is engaged in its own form of ritualistic behavior much like the Spanish Inquisition busily branding heretics.

Maybe if the Obama administration spent a bit more time talking to anthropologists and archeologists rather than political consultants and security advisers, they would be able to develop a more coherent and constructive policy on Libya. I’m not kidding.

In Nature, Dan Jones writes: By July 2011, when Brian McQuinn made the 18-hour boat trip from Malta to the Libyan port of Misrata, the bloody uprising against Libyan dictator Muammar Gaddafi had already been under way for five months.

“The whole city was under siege, with Gaddafi forces on all sides,” recalls Canadian-born McQuinn. He was no stranger to such situations, having spent the previous decade working for peace-building organizations in countries including Rwanda and Bosnia. But this time, as a doctoral student in anthropology at the University of Oxford, UK, he was taking the risk for the sake of research. His plan was to make contact with rebel groups and travel with them as they fought, studying how they used ritual to create solidarity and loyalty amid constant violence.

It worked: McQuinn stayed with the rebels for seven months, compiling a strikingly close and personal case study of how rituals evolved through combat and eventual victory. And his work was just one part of a much bigger project: a £3.2-million (US$5-million) investigation into ritual, community and conflict, which is funded until 2016 by the UK Economic and Social Research Council (ESRC) and headed by McQuinn’s supervisor, Oxford anthropologist Harvey Whitehouse.

Rituals are a human universal — “the glue that holds social groups together”, explains Whitehouse, who leads the team of anthropologists, psychologists, historians, economists and archaeologists from 12 universities in the United Kingdom, the United States and Canada. Rituals can vary enormously, from the recitation of prayers in church, to the sometimes violent and humiliating initiations of US college fraternity pledges, to the bleeding of a young man’s penis with bamboo razors and pig incisors in purity rituals among the Ilahita Arapesh of New Guinea. But beneath that diversity, Whitehouse believes, rituals are always about building community — which arguably makes them central to understanding how civilization itself began.

To explore these possibilities, and to tease apart how this social glue works, Whitehouse’s project will combine fieldwork such as McQuinn’s with archaeological digs and laboratory studies around the world, from Vancouver, Canada, to the island archipelago of Vanuatu in the south Pacific Ocean. “This is the most wide-ranging scientific project on rituals attempted to date,” says Scott Atran, director of anthropological research at the CNRS, the French national research organization, in Paris, and an adviser to the project.
Human rites

A major aim of the investigation is to test Whitehouse’s theory that rituals come in two broad types, which have different effects on group bonding. Routine actions such as prayers at church, mosque or synagogue, or the daily pledge of allegiance recited in many US elementary schools, are rituals operating in what Whitehouse calls the ‘doctrinal mode’. He argues that these rituals, which are easily transmitted to children and strangers, are well suited to forging religions, tribes, cities and nations — broad-based communities that do not depend on face-to-face contact.

Rare, traumatic activities such as beating, scarring or self-mutilation, by contrast, are rituals operating in what Whitehouse calls the ‘imagistic mode’. “Traumatic rituals create strong bonds among those who experience them together,” he says, which makes them especially suited to creating small, intensely committed groups such as cults, military platoons or terrorist cells. “With the imagistic mode, we never find groups of the same kind of scale, uniformity, centralization or hierarchical structure that typifies the doctrinal mode,” he says.

Whitehouse has been developing this theory of ‘divergent modes of ritual and religion’ since the late 1980s, based on his field work in Papua New Guinea and elsewhere. His ideas have attracted the attention of psychologists, archaeologists and historians.

Until recently, however, the theory was largely based on selected ethnographic and historical case studies, leaving it open to the charge of cherry-picking. The current rituals project is an effort by Whitehouse and his colleagues to answer that charge with deeper, more systematic data.

The pursuit of such data sent McQuinn to Libya. His strategy was to look at how the defining features of the imagistic and doctrinal modes — emotionally intense experiences shared among a small number of people, compared with routine, daily practices that large numbers of people engage in — fed into the evolution of rebel fighting groups from small bands to large brigades.

At first, says McQuinn, neighbourhood friends formed small groups comprising “the number of people you could fit in a car”. Later, fighters began living together in groups of 25–40 in disused buildings and the mansions of rich supporters. Finally, after Gaddafi’s forces were pushed out of Misrata, much larger and hierarchically organized brigades emerged that patrolled long stretches of the defensive border of the city. There was even a Misratan Union of Revolutionaries, which by November 2011 had registered 236 rebel brigades.

McQuinn interviewed more than 300 fighters from 21 of these rebel groups, which varied in size from 12 to just over 1,000 members. He found that the early, smaller brigades tended to form around pre-existing personal ties, and became more cohesive and the members more committed to each other as they collectively experienced the fear and excitement of fighting a civil war on the streets of Misrata.

But six of the groups evolved into super-brigades of more than 750 fighters, becoming “something more like a corporate entity with their own organizational rituals”, says McQuinn. A number of the group leaders had run successful businesses, and would bring everyone together each day for collective training, briefings and to reiterate their moral codes of conduct — the kinds of routine group activities characteristic of the doctrinal mode. “These daily practices moved people from being ‘our little group’ to ‘everyone training here is part of our group’,” says McQuinn.

McQuinn and Whitehouse’s work with Libyan fighters underscores how small groups can be tightly fused by the shared trauma of war, just as imagistic rituals induce terror to achieve the same effect. Whitehouse says that he is finding the same thing in as-yet-unpublished studies of the scary, painful and humiliating ‘hazing’ rituals of fraternity and sorority houses on US campuses, as well as in surveys of Vietnam veterans showing how shared trauma shaped loyalty to their fellow soldiers. [Continue reading...]

When people talk about nation-building, they talk about the need to establish security, the rule of law and the development of democratic institutions. They focus on political and civil structures through which social stability takes on a recognizable form — the operation for instance of effective court systems and law enforcement authorities that do not abuse their powers. But what makes all this work, or fail to work, is a sufficient level of social cohesion and if that is lacking, the institutional structures will probably be of little value.

Over the last year and a half, American interest in Libya seems to have been reduced to analysis about what happened on one day in Benghazi. But what might help Libya much more than America’s obsessive need to spot terrorists would be to focus instead on things like promoting football. A win for the national team could work wonders.

*

In the video below, Harvey Whitehouse describes the background to his research.

Bees translate polarized light into a navigational dance

Queensland Brain Institute: QBI scientists at The University of Queensland have found that honeybees use the pattern of polarised light in the sky, invisible to humans, to direct one another to a honey source.

The study, conducted in Professor Mandyam Srinivasan’s laboratory at the Queensland Brain Institute, a member of the Australian Research Council Centre of Excellence in Vision Science (ACEVS), demonstrated that bees navigate to and from honey sources by reading the pattern of polarised light in the sky.

“The bees tell each other where the nectar is by converting their polarised ‘light map’ into dance movements,” Professor Srinivasan said.

“The more we find out how honeybees make their way around the landscape, the more awed we feel at the elegant way they solve very complicated problems of navigation that would floor most people – and then communicate them to other bees,” he said.

The discovery shines new light on the astonishing navigational and communication skills of an insect with a brain the size of a pinhead.

The researchers allowed bees to fly down a tunnel to a sugar source, shining only polarised light from above, either aligned with the tunnel or at right angles to the tunnel.

They then filmed what the bees ‘told’ their peers, by waggling their bodies when they got back to the hive.

“It is well known that bees steer by the sun, adjusting their compass as it moves across the sky, and then convert that information into instructions for other bees by waggling their body to signal the direction of the honey,” Professor Srinivasan said.

“Other laboratories have shown from studying their eyes that bees can see a pattern of polarised light in the sky even when the sun isn’t shining: the big question was could they translate the navigational information it provides into their waggle dance.”

The researchers conclude that even when the sun is not shining, bees can tell one another where to find food by reading and dancing to their polarised sky map.

In addition to revealing how bees perform their remarkable tasks, Professor Srinivasan says it also adds to our understanding of some of the most basic machinery of the brain itself.

Professor Srinivasan’s team conjectures that flight under polarised illumination activates discrete populations of cells in the insect’s brain.

When the polarised light was aligned with the tunnel, one pair of ‘place cells’ – neurons important for spatial navigation – became activated, whereas when the light was oriented across the tunnel a different pair of place cells was activated.

The researchers suggest that depending on which set of cells is activated, the bee can work out if the food source lies in a direction toward or opposite the direction of the sun, or in a direction ninety degrees to the left or right of it.

The study, “Honeybee navigation: critically examining the role of polarization compass”, is published in the 6 January 2014 issue of the Philosophical Transactions of the Royal Society B.
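The ambiguity described above can be pictured with a small hypothetical sketch in Python (an illustration of the logic, not the study's model or data): polarized skylight tells the bee only the axis of its flight relative to the sun, not a signed direction, so each reading leaves two opposite candidate bearings for the food, which is why the inferred location is either toward or away from the sun, or ninety degrees to its left or right.

def candidate_bearings(axis_relative_to_sun_deg):
    """Hypothetical sketch of the ambiguity described above. Polarized skylight
    tells the bee the axis along which it has flown relative to the sun, but an
    axis has no sign, so the food could lie in either of two opposite directions.
    The input is that axis in degrees (0 = along the sun's azimuth,
    90 = perpendicular to it); deriving this axis from the e-vector overhead is
    the polarization-compass step itself."""
    axis = axis_relative_to_sun_deg % 180
    if axis == 0:
        return ("toward the sun", "directly away from the sun")
    if axis == 90:
        return ("90 degrees to the left of the sun", "90 degrees to the right of the sun")
    return (f"{axis} degrees clockwise from the sun", f"{axis + 180} degrees clockwise from the sun")

for axis in (0, 90):
    print(f"flight axis at {axis} degrees to the sun -> candidate food directions: {candidate_bearings(axis)}")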
