We are all confident idiots

David Dunning writes: Last March, during the enormous South by Southwest music festival in Austin, Texas, the late-night talk show Jimmy Kimmel Live! sent a camera crew out into the streets to catch hipsters bluffing. “People who go to music festivals pride themselves on knowing who the next acts are,” Kimmel said to his studio audience, “even if they don’t actually know who the new acts are.” So the host had his crew ask festival-goers for their thoughts about bands that don’t exist.

“The big buzz on the street,” said one of Kimmel’s interviewers to a man wearing thick-framed glasses and a whimsical T-shirt, “is Contact Dermatitis. Do you think he has what it takes to really make it to the big time?”

“Absolutely,” came the dazed fan’s reply.

The prank was an installment of Kimmel’s recurring “Lie Witness News” feature, which involves asking pedestrians a variety of questions with false premises. In another episode, Kimmel’s crew asked people on Hollywood Boulevard whether they thought the 2014 film Godzilla was insensitive to survivors of the 1954 giant lizard attack on Tokyo; in a third, they asked whether Bill Clinton gets enough credit for ending the Korean War, and whether his appearance as a judge on America’s Got Talent would damage his legacy. “No,” said one woman to this last question. “It will make him even more popular.”

One can’t help but feel for the people who fall into Kimmel’s trap. Some appear willing to say just about anything on camera to hide their cluelessness about the subject at hand (which, of course, has the opposite effect). Others seem eager to please, not wanting to let the interviewer down by giving the most boringly appropriate response: I don’t know. But for some of these interviewees, the trap may be an even deeper one. The most confident-sounding respondents often seem to think they do have some clue—as if there is some fact, some memory, or some intuition that assures them their answer is reasonable. [Continue reading...]


Cooperation is what makes us human

Kat McGowan writes: Tales about the origins of our species always start off like this: A small band of hunter-gatherers roams the savannah, loving, warring, and struggling for survival under the African sun. They do not start like this: A fat guy falls off a New York City subway platform onto the tracks.

But what happens next is a quintessential story of who we are as human beings.

On Feb. 17, 2013, around 2:30 a.m., Garrett O’Hanlon, a U.S. Air Force Academy cadet third class, was out celebrating his 22nd birthday in New York City. He and his sister were in the subway waiting for a train when a sudden silence came over the platform, followed by a shriek. People pointed down to the tracks.

O’Hanlon turned and saw a man sprawled facedown on the tracks. “The next thing that happened, I was on the tracks, running toward him,” he says. “I honestly didn’t have a thought process.”

O’Hanlon grabbed the unconscious man by the shoulders, lifting his upper body off the tracks, but struggled to move him. He was deadweight. According to the station clock, the train would arrive in less than two minutes. From the platform, O’Hanlon’s sister was screaming at him to save himself.

Suddenly other arms were there: Personal trainer Dennis Codrington Jr. and his friend Matt Foley had also jumped down to help. “We grabbed him, one by the legs, one by the shoulders, one by the chest,” O’Hanlon says. They got the man to the edge of the platform, where a dozen or more people muscled him up and over. More hands seized the rescuers’ arms and shoulders, helping them up to safety as well.

In the aftermath of the rescue, O’Hanlon says he has been surprised that so many people have asked him why he did it. “I get stunned by the question,” he says. In his view, anybody else would’ve done the same thing. “I feel like it’s a normal reaction,” he says. “To me that’s just what people do.”

More precisely, it is something only people do, according to developmental psychologist Michael Tomasello, codirector of the Max Planck Institute for Evolutionary Anthropology.

For decades Tomasello has explored what makes humans distinctive. His conclusion? We cooperate. Many species, from ants to orcas to our primate cousins, cooperate in the wild. But Tomasello has identified a special form of cooperation. In his view, humans alone are capable of shared intentionality—they intuitively grasp what another person is thinking and act toward a common goal, as the subway rescuers did. This supremely human cognitive ability, Tomasello says, launched our species on its extraordinary trajectory. It forged language, tools, and cultures—stepping-stones to our colonization of every corner of the planet. [Continue reading...]


The faster we go, the more time we lose

Mark C. Taylor writes: “Sleeker. Faster. More Intuitive” (The New York Times); “Welcome to a world where speed is everything” (Verizon FiOS); “Speed is God, and time is the devil” (chief of Hitachi’s portable-computer division). In “real” time, life speeds up until time itself seems to disappear—fast is never fast enough, everything has to be done now, instantly. To pause, delay, stop, slow down is to miss an opportunity and to give an edge to a competitor. Speed has become the measure of success—faster chips, faster computers, faster networks, faster connectivity, faster news, faster communications, faster transactions, faster deals, faster delivery, faster product cycles, faster brains, faster kids. Why are we so obsessed with speed, and why can’t we break its spell?

The cult of speed is a modern phenomenon. In “The Futurist Manifesto” in 1909, Filippo Tommaso Marinetti declared, “We say that the splendor of the world has been enriched by a new beauty: the beauty of speed.” The worship of speed reflected and promoted a profound shift in cultural values that occurred with the advent of modernity and modernization. With the emergence of industrial capitalism, the primary values governing life became work, efficiency, utility, productivity, and competition. When Frederick Winslow Taylor took his stopwatch to the factory floor in the early 20th century to increase workers’ efficiency, he began a high-speed culture of surveillance so memorably depicted in Charlie Chaplin’s Modern Times. Then, as now, efficiency was measured by the maximization of rapid production through the programming of human behavior.

With the transition from mechanical to electronic technologies, speed increased significantly. The invention of the telegraph, telephone, and stock ticker liberated communication from the strictures imposed by the physical means of conveyance. Previously, messages could be sent no faster than people, horses, trains, or ships could move. By contrast, immaterial words, sounds, information, and images could be transmitted across great distances at very high speed. During the latter half of the 19th century, railway and shipping companies established transportation networks that became the backbone of national and international information networks. When the trans-Atlantic cable (1858) and transcontinental railroad (1869) were completed, the foundation for the physical infrastructure of today’s digital networks was in place.

Fast-forward 100 years. During the latter half of the 20th century, information, communications, and networking technologies expanded rapidly, and transmission speed increased exponentially. But more than data and information were moving faster. Moore’s Law, according to which the number of transistors on a chip (and with it, roughly, computing power) doubles every two years, now seems to apply to life itself. Plugged in 24/7/365, we are constantly struggling to keep up but are always falling further behind. The faster we go, the less time we seem to have. As our lives speed up, stress increases, and anxiety trickles down from managers to workers, and parents to children. [Continue reading...]


Richard Dawkins has lost it: ignorant sexism gives atheists a bad name

Adam Lee writes: I became an atheist on my own, but it was Richard Dawkins who strengthened and confirmed my decision. For a long time, I admired his insightful science writing, his fierce polemics, his uncompromising passion for the truth. When something I’d written got a (brief) mention in The God Delusion, it was one of the high points of my life.

So, I’m not saying this is easy, but I have to say it: Richard Dawkins, I’m just not that into you anymore.

The atheist movement – a loosely-knit community of conference-goers, advocacy organizations, writers and activists – has been wracked by infighting for the last few years over its persistent gender imbalance and its causes. Many female atheists have explained that they don’t get more involved because of the casual sexism endemic to the movement: parts of it see nothing problematic about hosting conferences with all-male speakers or having all-male leadership – and that’s before you get to the vitriolic and dangerous sexual harassment, online and off, that’s designed to intimidate women into silence.

Richard Dawkins has involved himself in some of these controversies, and rarely for the better – as with his infamous “Dear Muslima” letter in 2011, in which he essentially argued that, because women in Muslim countries suffer more from sexist mistreatment, women in the west shouldn’t speak up about sexual harassment or physical intimidation. There was also his sneer at women who advocate anti-sexual harassment policies.

But over the last few months, Dawkins showed signs of détente with his feminist critics – even progress. He signed a joint letter with the writer Ophelia Benson, denouncing and rejecting harassment; he even apologized for the “Dear Muslima” letter. On stage at a conference in Oxford in August, Dawkins claimed to be a feminist and said that everyone else should be, too.

Then another prominent male atheist, Sam Harris, crammed his foot in his mouth and said that atheist activism lacks an “estrogen vibe” and was “to some degree intrinsically male”. And, just like that, the brief Dawkins Spring was over. [Continue reading...]


Ants are cool but teach us nothing

E.O. Wilson writes: For nearly seven decades, starting in boyhood, I’ve studied hundreds of kinds of ants around the world, and this qualifies me, I believe, to offer some advice on ways their lives can be applied to ours. I’ll start with the question I’m most often asked: “What can I do about the ants in my kitchen?” My response comes from the heart: Watch your step, be careful of little lives. Ants especially like honey, tuna and cookie crumbs. So put down bits of those on the floor, and watch as the first scout finds the bait and reports back to her colony by laying an odor trail. Then, as a little column follows her out to the food, you will see social behavior so strange it might be on another planet. Think of kitchen ants not as pests or bugs, but as your personal guest superorganism.

Another question I hear a lot is, “What can we learn of moral value from the ants?” Here again I will answer definitively: nothing. Nothing at all can be learned from ants that our species should even consider imitating. For one thing, all working ants are female. Males are bred and appear in the nest only once a year, and then only briefly. They are pitiful creatures with wings, huge eyes, small brains and genitalia that make up a large portion of their rear body segment. They have only one function in life: to inseminate the virgin queens during the nuptial season. They are built to be robot flying sexual missiles. Upon mating or doing their best to mate, they are programmed to die within hours, usually as victims of predators.

Many kinds of ants eat their dead — and their injured, too. You may have seen ant workers retrieve nestmates that you have mangled or killed underfoot (accidentally, I hope), thinking it battlefield heroism. The purpose, alas, is more sinister. [Continue reading...]


Will misogyny bring down the atheist movement?

Mark Oppenheimer writes: Several women told me that women new to the movement were often warned about the intentions of certain older men, especially [Michael] Shermer [the founder of Skeptic magazine]. Two more women agreed to go on the record, by name, with their Shermer stories… These stories help flesh out a man who, whatever his progressive views on science and reason, is decidedly less evolved when it comes to women.

Yet Shermer remains a leader in freethought — arguably the leader. And in his attitudes, he is hardly an exception. Hitchens, the best-selling author of God Is Not Great, who died in 2011, wrote a notorious Vanity Fair article called “Why Women Aren’t Funny.” Richard Dawkins, another author whose books have brought atheism to the masses, has alienated many women — and men — by belittling accusations of sexism in the movement; he seems to go out of his way to antagonize feminists generally, and just this past July 29 he tweeted, “Date rape is bad. Stranger rape at knifepoint is worse. If you think that’s an endorsement of date rape, go away and learn how to think.” And Penn Jillette, the talking half of the Penn and Teller duo, famously revels in using words like “cunt.”

The reality of sexism in freethought is not limited to a few famous leaders; it has implications throughout the small but quickly growing movement. Thanks to the internet, and to popular authors like Dawkins, Hitchens, and Sam Harris, atheism has greater visibility than at any time since the 18th-century Enlightenment. Yet it is now cannibalizing itself. For the past several years, Twitter, Facebook, Reddit, and online forums have become hostile places for women who identify as feminists or express concern about widely circulated tales of sexism in the movement. Some women say they are now harassed or mocked at conventions, and the online attacks — which include Jew-baiting, threats of anal rape, and other pleasantries — are so vicious that two activists I spoke with have been diagnosed with post-traumatic stress disorder. One of these women has been bedridden for two years.

To those outside the community, freethought would seem an unlikely candidate for this sort of internal strife. Aren’t atheists and agnostics supposed to be liberal, forward-thinking types? But from the beginning, there has been a division in freethought between the humanists, who see atheism as one part of a larger progressive vision for society, and the libertarians, for whom the banishment of God sits comfortably with capitalism, gun rights, and free-speech absolutism. One group sees men like Michael Shermer as freethought’s big problem, while the other sees defending them as crucial to freethought’s mission. [Continue reading...]


ISIS is about to destroy Biblical history in Iraq

Christopher Dickey reports that soon after ISIS took control of Mosul: the minions of the self-appointed caliph of the freshly self-declared Islamic State, Abu Bakr al-Baghdadi, paid a visit to the Mosul Museum. It has been closed for years for restoration, ever since it was looted along with many of Iraq’s other institutions in the wake of the culturally oblivious American-led invasion of 2003. But the Mosul Museum was on the verge of reopening, at last, and the full collection had been stored there.

“These groups of terrorists—their arrival was a brutal shock, with no warning,” Iraqi National Museum Director Qais Hussein Rashid told me when he visited Paris last week with a mission pleading for international help. “We were not able to take preventive measures.”

Indeed, museum curators and staff were no better prepared than any other part of the Iraqi government. They could have learned from al-Baghdadi’s operations in neighboring Syria that a major source of revenue for his insurgency has been the sale of looted antiquities on the black market. As reported in The Guardian, a windfall of intelligence just before Mosul fell revealed that al-Baghdadi had accumulated a $2 billion war chest, in part by selling off ancient artifacts from captured Syrian sites. But the Iraqi officials concerned with antiquities said the Iraqi intelligence officers privy to that information have not shared it with them.

So the risk now — the virtual certainty, in fact — is that irreplaceable history will be annihilated or sold into the netherworld of corrupt and cynical collectors. And it was plain when I met with Rashid and his colleagues that they are desperate to stop it, but have neither the strategy nor the resources to do so. [Continue reading...]


Maya Angelou: American titan who lived as though there were no tomorrow

Following the death of Maya Angelou, Gary Younge writes: By the time she reached 40 she had been a professional dancer, prostitute, madam, lecturer, activist, singer and editor. She had worked with Martin Luther King and Malcolm X, lived in Ghana and Egypt, toured Europe with a dance troupe and settled in pretty much every region of the United States. And then she wrote about it, the whole time crafting a path as a poet, epigrammist and performer. “My life has been long,” she wrote in one of her last books. “And believing that life loves the liver of it, I have dared to try many things, sometimes trembling, but daring still.”

In a subsequent interview I described her as the “Desiderata in human form” and “a professional hopemonger”. She lived as though there were no tomorrow. And now that there really is no tomorrow, for her, we are left to contemplate – for us as well as her – where daring can get you.

But with her passing, America has not just lost a talented Renaissance woman and gifted raconteur. It has lost a connection to its recent past that had helped it make sense of its present. At a time when so many Americans seek to travel ‘color blind’, and free from the baggage of the nation’s racial history, here she stood, tall, straight and true: a black woman from the south intimately connected to the transformative people and politics who helped shape much of America’s racial landscape.

A woman determined to give voice to both frustration and militancy without being so consumed by either that she could not connect with those who did not instinctively relate to them. A woman who, in her own words, was determined to go through life with “passion, compassion, humor and some style”, and would use all those attributes and more to remind America of where this frustration and militancy was coming from.

She described the 9/11 attacks as a “hate crime”, and said: “Living in a state of terror was new to many white people in America, but black people have been living in a state of terror in this country for more than 400 years.” [Continue reading...]


The cloud of unknowing

Karl Taro Greenfeld writes: I can’t help it. Every few weeks, my wife mentions the latest book her book club is reading, and no matter what it is, whether I’ve read it or not, I offer an opinion of the work, based entirely on … what, exactly? Often, these are books I’ve not even read a review or essay about, yet I freely hold forth on the grandiosity of Cheryl Strayed or the restrained sentimentality of Edwidge Danticat. These data motes are gleaned, apparently, from the ether — or, more realistically, from various social media feeds.

What was Solange Knowles’s elevator attack on Jay-Z about? I didn’t watch the security-camera video on TMZ — it would have taken too long — but I scrolled through enough chatter to know that Solange had scrubbed her Instagram feed of photos of her sister, Beyoncé. How about this season of “Game of Thrones” and that nonconsensual intercourse in the crypt? I don’t watch the show, but I’ve scanned the recaps on Vulture.com, and I am prepared to argue that this was deeply offensive. Is Pope Francis a postmodern pontiff? I’ve never listened to one of his homilies nor watched his recent “60 Minutes” appearance, but I’ve seen plenty of his @Pontifex tweets retweeted, so I’m ready to say his position on inequality and social justice is remarkably progressive.

It’s never been so easy to pretend to know so much without actually knowing anything. We pick topical, relevant bits from Facebook, Twitter or emailed news alerts, and then regurgitate them. Instead of watching “Mad Men” or the Super Bowl or the Oscars or a presidential debate, you can simply scroll through someone else’s live-tweeting of it, or read the recaps the next day. Our cultural canon is becoming determined by whatever gets the most clicks.

In his 1987 book “Cultural Literacy: What Every American Needs to Know,” E. D. Hirsch Jr. listed 5,000 essential concepts and names — 1066, Babbitt, Pickwickian — that educated people should be familiar with. (Or at least that’s what I believe he wrote, not having actually read the book.) Mr. Hirsch’s book, along with its contemporary “The Closing of the American Mind” by Allan Bloom, made the point that cultural literacy — Mr. Bloom’s canon — was the bedrock of our agreed-upon values.

What we all feel now is the constant pressure to know enough, at all times, lest we be revealed as culturally illiterate. So that we can survive an elevator pitch, a business meeting, a visit to the office kitchenette, a cocktail party, so that we can post, tweet, chat, comment, text as if we have seen, read, watched, listened. What matters to us, awash in petabytes of data, is not necessarily having actually consumed this content firsthand but simply knowing that it exists — and having a position on it, being able to engage in the chatter about it. We come perilously close to performing a pastiche of knowledgeability that is really a new model of know-nothingness. [Continue reading...]


The mounting casualties in the war of the Anthropocene

Justin E.H. Smith writes: There is a great die-off under way, one that may justly be compared to the disappearance of dinosaurs at the end of the Cretaceous, or the sudden downfall of so many great mammals at the beginning of the Holocene. But how far can such a comparison really take us in assessing the present moment?

The hard data tell us that what is happening to animals right now is part of the same broad historical process that has swept up humans: We are all being homogenized, subjected to uniform standards, domesticated. A curiosity that might help to drive this home: At present, the total biomass of mammals raised for food vastly exceeds the biomass of all mammalian wildlife on the planet (it also exceeds that of the human species itself). This was certainly not the case 10,000 or so years ago, at the dawn of the age of pastoralism.

It is hard to know where exactly, or even inexactly, to place the boundary between prehistory and history. Indeed, some authors argue that the very idea of prehistory is a sort of artificial buffer zone set up to protect properly human society from the vast expanse of mere nature that preceded us. But if we must set up a boundary, I suggest the moment when human beings began to dominate and control other large mammals for their own, human ends.

We tend to think about history as human history. Yet a suitably wide-focused perspective reveals that nothing in the course of human affairs makes complete sense without some account of animal actors. History has, in fact, been a question of human-animal interaction all along. Cherchez la vache is how the anthropologist E.E. Evans-­Pritchard argued that the social life of the cattle-herding Nuer of southern Sudan might best be summed up — “look for the cow” — but one could probably, without much stretching, extend that principle to human society in general. The cattle that now outweigh us are a mirror of our political and economic crisis, just as cattle were once a mirror of the sociocosmic harmony that characterized Nuer life. [Continue reading...]


Astra Taylor: Misogyny and the cult of internet openness

In December, and again in February, at the Google Bus blockades in San Francisco, one thing struck me forcefully: the technology corporation employees waiting for their buses were all staring so intently at their phones that they apparently didn’t notice the unusual things going on around them until their buses were surrounded. Sometimes I feel like I’m living in a science-fiction novel, because my region is so afflicted with people who stare at the tiny screens in their hands on trains, in restaurants, while crossing the street, and too often while driving. San Francisco is, after all, where director Phil Kaufman set the 1978 remake of Invasion of the Body Snatchers, the movie wherein a ferny spore-spouting form of alien life colonizes human beings so that they become zombie-like figures.

In the movies, such colonization took place secretly, or by force, or both: it was a war, and (once upon a time) an allegory for the Cold War and a possible communist takeover. Today, however — Hypercapitalism Invades! — we not only choose to carry around those mobile devices, but pay corporations hefty monthly fees to do so. In return, we get to voluntarily join the great hive of people being in touch all the time, so much so that human nature seems in the process of being remade, with the young immersed in a kind of contact that makes solitude seem like polar ice, something that’s melting away.

We got phones, and then Smart Phones, and then Angry Birds and a million apps — and a golden opportunity to be tracked all the time thanks to the GPS devices in those phones. Your cell phone is the shackle that chains you to corporate America (and potentially to government surveillance as well) and like the ankle bracelets that prisoners wear, it’s your monitor. It connects you to the Internet and so to the brave new world that has such men as Larry Ellison and Mark Zuckerberg in it. That world — maybe not so brave after all — is the subject of Astra Taylor’s necessary, alarming, and exciting new book, The People’s Platform: Taking Back Power and Culture in the Digital Age.

The Internet arose with little regulation, little public decision-making, and a whole lot of fantasy about how it was going to make everyone powerful and how everything would be free. Free, as in unregulated and open, got confused with free, as in not getting paid, and somehow everyone from Facebook to Arianna Huffington created massively lucrative sites (based on advertising dollars) in which the people who made the content went unpaid. Just as Russia woke up with oil oligarchs spreading like mushrooms after a night’s heavy rain, so we woke up with new titans of the tech industry throwing their billionaire weight around. The Internet turns out to be a superb mechanism for consolidating money and power and control, even as it gives toys and minor voices to the rest of us.

As Taylor writes in her book, “The online sphere inspires incessant talk of gift economies and public-spiritedness and democracy, but commercialization and privatization and inequality lurk beneath the surface. This contradiction is captured in a single word: ‘open,’ a concept capacious enough to contain both the communal and capitalistic impulses central to Web 2.0.” And she goes on to discuss “the tendency of open systems to amplify inequality — and new media thinkers’ glib disregard for this fundamental characteristic.” Part of what makes her book exceptional, in fact, is its breadth. It reviews much of the existing critique of the Internet and connects the critiques of specific aspects of it into an overview of how a phenomenon supposed to be wildly democratic has become wildly not that way at all.

And at a certain juncture, she turns to gender. Though far from the only weak point of the Internet as an egalitarian space — after all, there’s privacy (lack of), the environment (massive server farms), and economics (tax cheats, “content providers” like musicians fleeced) — gender politics, as she shows in today’s post adapted from her book, is one of the most spectacular problems online. Let’s imagine this as science fiction: a group of humans apparently dissatisfied with how things were going on Earth — where women were increasing their rights, representation, and participation — left our orbit and started their own society on their own planet. The new planet wasn’t far away or hard to get to (if you could afford the technology): it was called the Internet. We all know it by name; we all visit it; but we don’t name the society that dominates it much.

Taylor does: the dominant society, celebrating itself and pretty much silencing everyone else, makes the Internet bear a striking resemblance to Congress in 1850 or a gentlemen’s club (minus any gentleness). It’s a gated community, and as Taylor describes today, the security detail is ferocious, patrolling its borders by trolling and threatening dissident voices, and just having a female name or being identified as female is enough to become a target of hate and threats.

Early this year, a few essays were published on Internet misogyny that were so compelling I thought 2014 might be the year we revisit these online persecutions, the way that we revisited rape in 2013, thanks to the Steubenville and New Delhi assault cases of late 2012. But the subject hasn’t (yet) quite caught fire, and so not much gets said and less gets done about this dynamic new machinery for privileging male and silencing female voices. Which is why we need to keep examining and discussing this, as well as the other problems of the Internet. And why you need to read Astra Taylor’s book. This excerpt is part of her diagnosis of the problems; the book ends with ideas about a cure. Rebecca Solnit

Open systems and glass ceilings
The disappearing woman and life on the internet
By Astra Taylor

The Web is regularly hailed for its “openness” and that’s where the confusion begins, since “open” in no way means “equal.” While the Internet may create space for many voices, it also reflects and often amplifies real-world inequities in striking ways.

An elaborate system organized around hubs and links, the Web has a surprising degree of inequality built into its very architecture. Its traffic, for instance, tends to be distributed according to “power laws,” which follow what’s known as the 80/20 rule — 80% of a desirable resource goes to 20% of the population.
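
To see how steep those disparities can be, here is a minimal sketch in Python of the 80/20 pattern; the Pareto shape parameter 1.16 is the value conventionally associated with the 80/20 rule, and the site count and "traffic" units are arbitrary choices for illustration, not figures from Taylor's book:

    # A minimal power-law sketch, assuming a Pareto shape of 1.16 (the
    # textbook 80/20 value); site count and traffic units are invented.
    import random

    random.seed(42)
    alpha = 1.16
    traffic = sorted((random.paretovariate(alpha) for _ in range(10_000)),
                     reverse=True)

    top = traffic[: len(traffic) // 5]   # the best-connected 20% of sites
    share = sum(top) / sum(traffic)      # their share of all "traffic"

    print(f"Top 20% of sites capture {share:.0%} of total traffic")
    # Typically prints a figure in the neighborhood of 80%.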

In fact, as anyone knows who has followed the histories of Google, Apple, Amazon, and Facebook, now among the biggest companies in the world, the Web is increasingly a winner-take-all, rich-get-richer sort of place, which means the disparate percentages in those power laws are only likely to look uglier over time.

[Read more...]


America’s huge appetite for conspiracy theories

“Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” a paper recently published in the American Journal of Political Science, finds that half of Americans consistently endorse at least one conspiracy theory.

Tom Jacobs writes: It’s easy to assume this represents widespread ignorance, but these findings suggest otherwise. Oliver and Wood report that, except for the Obama “birthers” and the 9/11 “truthers,” “respondents who endorse conspiracy theories are not less-informed about basic political facts than average citizens.”

So what does drive belief in these contrived explanations? The researchers argue the tendency to accept them is “derived from two innate psychological predispositions.”

The first, which has an evolutionary explanation, is an “unconscious cognitive bias to draw causal connections between seemingly related phenomena.” Jumping to conclusions based on weak evidence allows us to “project feelings of control in uncertain situations,” the researchers note.

The second is our “natural attraction towards melodramatic narratives as explanations for prominent events — particularly those that interpret history (in terms of) universal struggles between good and evil.”

Stories that fit that pattern “provide compelling explanations for otherwise confusing or ambiguous events,” they write, noting that “many predominant belief systems … draw heavily upon the idea of unseen, intentional forces shaping contemporary events.”

“For many Americans, complicated or nuanced explanations for political events are both cognitively taxing and have limited appeal,” write Oliver and Wood. “A conspiracy narrative may provide a more accessible and convincing account of political events.”

That said, they add, “Even highly engaged or ideological segments of the population can be swayed by the power of these narratives, particularly when they coincide with their other political views.”


Cahokia: North America’s first melting pot?

Christian Science Monitor: The first experiment in “melting pot” politics in North America appears to have emerged nearly 1,000 years ago in the bottom lands of the Mississippi River near today’s St. Louis, according to archaeologists piecing together the story of the rise and fall of the native American urban complex known as Cahokia.

During its heyday, Cahokia’s population reached an estimated 20,000 people – a level the continent north of the Rio Grande wouldn’t see again until the eve of the American Revolution and the growth of New York and Philadelphia.

Cahokia’s ceremonial center, seven miles northeast of St. Louis’s Gateway Arch, boasted 120 earthen mounds, including a broad, tiered mound some 10 stories high. In East St. Louis, one of two major satellites hosts another 50 earthen mounds, as well as residences. St. Louis hosted another 26 mounds and associated dwellings.

These are three of the four largest native-American mound centers known, “all within spitting distance of one another,” says Thomas Emerson, Illinois State Archaeologist and a member of a team testing the melting-pot idea. “That’s some kind of large, integrated complex to some degree.”

Where did all those people come from? Archaeologists have been debating that question for years, Dr. Emerson says. Unfortunately, the locals left no written record of the complex’s history. Artifacts such as pottery, tools, or body ornaments give an ambiguous answer.

Artifacts from Cahokia have been found in other native-American centers from Arkansas and northern Louisiana to Oklahoma, Iowa, and Wisconsin, just as artifacts from these areas appear in digs at Cahokia.

“Archaeologists are always struggling with this: Are artifacts moving, or are people moving?” Emerson says.

Emerson and two colleagues at the University of Illinois at Urbana-Champaign tried to tackle the question using two isotopes of the element strontium found in human teeth. They discovered that throughout the 300 years that native Americans occupied Cahokia, the complex appeared to receive a steady stream of immigrants who stayed. [Continue reading...]
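
The logic of the strontium comparison is simple enough to sketch. Tooth enamel formed in childhood locks in the isotope signature of the region where a person grew up, so a ratio outside the local baseline range marks that individual as having come from elsewhere. The Python snippet below is only a toy illustration of that screening step, with a hypothetical baseline and invented burial values, not the team's data:

    # Toy strontium-isotope screening; baseline and values are invented.
    LOCAL_RANGE = (0.7089, 0.7098)   # hypothetical local 87Sr/86Sr range

    burials = {
        "burial_01": 0.7092,
        "burial_02": 0.7121,
        "burial_03": 0.7085,
    }

    def is_local(ratio):
        lo, hi = LOCAL_RANGE
        return lo <= ratio <= hi

    for burial, ratio in burials.items():
        origin = "local" if is_local(ratio) else "non-local (immigrant)"
        print(f"{burial}: {ratio:.4f} -> {origin}")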


Throughout our existence, humans have been the most destructive creatures to roam this planet


For those of us who see industrial civilization as the guarantor of humanity’s destruction, it’s easy to picture an idyllic era earlier in our evolution, located perhaps during the cultural flowering of the Great Leap Forward.

Communities then remained relatively egalitarian without workers enslaved in back-breaking labor, while subsistence on few material resources meant that time was neither controlled by the dictates of a stratified social hierarchy nor by the demands of survival.

When people could accord as much value to storytelling, ritual, and music-making as they did to hunting and gathering food, we might like to think that human beings were living in balance with nature.

As George Monbiot reveals, the emerging evidence about our early ancestors paints a much grimmer picture — one in which human nature appears to have always been profoundly destructive.

You want to know who we are? Really? You think you do, but you will regret it. This article, if you have any love for the world, will inject you with a venom – a soul-scraping sadness – without an obvious antidote.

The Anthropocene, now a popular term among scientists, is the epoch in which we live: one dominated by human impacts on the living world. Most date it from the beginning of the industrial revolution. But it might have begun much earlier, with a killing spree that commenced two million years ago. What rose onto its hind legs on the African savannahs was, from the outset, death: the destroyer of worlds.

Before Homo erectus, perhaps our first recognisably human ancestor, emerged in Africa, the continent abounded with monsters. There were several species of elephants. There were sabretooths and false sabretooths, giant hyenas and creatures like those released in The Hunger Games: amphicyonids, or bear dogs, vast predators with an enormous bite.

Prof Blaire van Valkenburgh has developed a means by which we could roughly determine how many of these animals there were. When there are few predators and plenty of prey, the predators eat only the best parts of the carcass. When competition is intense, they eat everything, including the bones. The more bones a carnivore eats, the more likely its teeth are to be worn or broken. The breakages in carnivores’ teeth were massively greater in the pre-human era.

Not only were there more species of predators, including species much larger than any found on Earth today, but they appear to have been much more abundant – and desperate. We evolved in a terrible, wonderful world – that was no match for us. [Continue reading...]


Devastating consequences of losing ‘knowledgeable elders’ in non-human cultures


Culture — something we generally associate with its expressions through art, music, literature and so forth — is commonly viewed as one of the defining attributes of humanity. We supposedly rose above animal instinct when we started creating bodies of knowledge, held collectively and passed down from generation to generation.

But it increasingly appears that this perspective has less to do with an appreciation of what makes us human than with our ignorance about non-human cultures.

Although non-human cultures don’t produce the kind of artifacts we create, the role of knowledge-sharing seems to be just as vital to the success of these societies as it is to ours. In other words, what makes these creatures what they are cannot be reduced to the structure of their DNA — it also involves a dynamic and learned element: the transmission of collective knowledge.

The survival of some species doesn’t simply depend on their capacity to replicate their DNA; it depends on their ability to pass on what they know.

Scuola Internazionale Superiore di Studi Avanzati: Small changes in a population may lead to dramatic consequences, like the disappearance of the migratory route of a species. A study carried out in collaboration with the SISSA has created a model of the behaviour of a group of individuals on the move (like a school of fish, a herd of sheep or a flock of birds, etc.) which, by changing a few simple parameters, reproduces the collective behaviour patterns observed in the wild. The model shows that small quantitative changes in the number of knowledgeable individuals and availability of food can lead to radical qualitative changes in the group’s behaviour.

Until the ’50s, bluefin tuna fishing was a thriving industry in Norway, second only to sardine fishing. Every year, bluefin tuna used to migrate from the eastern Mediterranean up to the Norwegian coasts. Suddenly, however, within the space of no more than 4-5 years, the tuna stopped returning to Norway. In an attempt to solve this problem, Giancarlo De Luca from SISSA (the International School for Advanced Studies of Trieste) together with an international team of researchers (from the Centre for Theoretical Physics — ICTP — of Trieste and the Technical University of Denmark) started to devise a model based on an “adaptive stochastic network.” The physicists wanted to simulate, simplifying it, the collective behaviour of animal groups. Their findings, published in the journal Interface, show that the number of “informed individuals” in a group, sociality and the strength of the decision of the informed individuals are “critical” variables, such that even minimal fluctuations in these variables can result in catastrophic changes to the system.

“We started out by taking inspiration from the phenomenon that affected the bluefin tuna, but in actual fact we then developed a general model that can be applied to many situations of groups ‘on the move,’” explains De Luca.

The collective behaviour of a group can be treated as an “emerging property,” that is, the result of the self-organization of each individual’s behaviour. “The majority of individuals in a group may not possess adequate knowledge, for example, about where to find rich feeding grounds,” explains De Luca. “However, for the group to function, it is enough that only a minority of individuals possess that information. The others, the ones who don’t, will obey simple social rules, for example by following their neighbours.”

The tendency to comply with the norm, the number of knowledgeable individuals and the determination with which they follow their preferred route (which the researchers interpreted as being directly related to the appeal, or abundance, of the resource) are critical variables. “When the number of informed individuals falls below a certain level, or the strength of their determination to go in a certain direction falls below a certain threshold, the migratory pathway disappears abruptly.”
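
The published model is an adaptive stochastic network solved analytically; the Python sketch below is a much cruder agent-based stand-in meant only to illustrate the threshold effect. A small informed minority is pulled toward the migratory heading while everyone else copies the majority of three random groupmates. Every parameter here (group size, noise, pull strength) is an invented simplification, not a value from the SISSA study:

    # Crude agent-based sketch of informed-minority dynamics; all
    # parameters are invented, not taken from De Luca et al.
    import random

    def final_alignment(n=200, informed_frac=0.04, pull=0.8, noise=0.25,
                        steps=150, rng=None):
        """Mean heading after `steps` rounds; +1 is the migratory route."""
        rng = rng or random.Random()
        headings = [rng.choice([-1, 1]) for _ in range(n)]  # undecided start
        n_informed = int(n * informed_frac)
        for _ in range(steps):
            for i in range(n):
                if rng.random() < noise:                 # random reorientation
                    headings[i] = rng.choice([-1, 1])
                elif i < n_informed and rng.random() < pull:
                    headings[i] = 1                      # informed pull to route
                else:                                    # copy majority of 3 peers
                    sample = [headings[rng.randrange(n)] for _ in range(3)]
                    headings[i] = 1 if sum(sample) > 0 else -1
        return sum(headings) / n

    for frac in (0.01, 0.02, 0.04, 0.08):
        runs = [final_alignment(informed_frac=frac, rng=random.Random(s))
                for s in range(10)]
        print(f"informed {frac:.0%}: mean alignment {sum(runs)/len(runs):+.2f}")
    # Below a critical informed fraction the group tips at random and the
    # route is not reliably sustained; above it, runs lock onto the route.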

“In our networks the individuals are ‘points,’ with interconnections that form and disappear in the course of the process, following some established rules. It’s a simple and general way to model the system, which has the advantage of being able to be solved analytically,” comments De Luca.

So what ever happened to the Norwegian tuna? “Based on our results we formulated some hypotheses which will, however, have to be tested experimentally,” says De Luca. In the ’50s Norway experienced a reduction in biomass and in the quantity of herrings, the main prey of tuna, which might have played a role in their disappearance. “This is consistent with our model, but there’s more to the story. In a short time the herring population returned to normal levels, whereas the tuna never came back. Why?”

One hypothesis is that, although the overall number of Mediterranean tuna has not changed, what has changed is the composition of the population: “The most desirable tuna specimens for the fishing industry are the larger, older individuals, which are presumably also those with the greater amount of knowledge, in other words the knowledgeable elders,” concludes De Luca.

Another curious fact: what happens if there are too many knowledgeable elders? “Too many know-alls are useless,” jokes De Luca. “In fact, above a certain number of informed individuals, the group performance does not improve so much as to justify the ‘cost’ of their training. The best cost-benefit ratio is obtained by keeping the number of informed individuals above a certain level, provided they remain a minority of the whole population.”


In unseen worlds, science invariably crosses paths with fantasy

Philip Ball writes: For centuries, scientists studied light to comprehend the visible world. Why are things colored? What is a rainbow? How do our eyes work? And what is light itself? These are questions that have preoccupied scientists and philosophers since the time of Aristotle, including Roger Bacon, Isaac Newton, Michael Faraday, Thomas Young, and James Clerk Maxwell.

But in the late 19th century all that changed, and it was largely Maxwell’s doing. This was the period in which the whole focus of physics — then still emerging as a distinct scientific discipline — shifted from the visible to the invisible. Light itself was instrumental to that change. Not only were the components of light invisible “fields,” but light was revealed as merely a small slice of a rainbow extending far into the unseen.

Physics has never looked back. Today its theories and concepts are concerned largely with invisible entities: not only unseen force fields and insensible rays but particles too small to see even with the most advanced microscopes. We now know that our everyday perception grants us access to only a tiny fraction of reality. Telescopes responding to radio waves, infrared radiation, and X-rays have vastly expanded our view of the universe, while electron microscopes, X-ray beams, and other fine probes of nature’s granularity have unveiled the microworld hidden beyond our visual acuity. Theories at the speculative forefront of physics flesh out this unseen universe with parallel worlds and with mysterious entities named for their very invisibility: dark matter and dark energy.

This move beyond the visible has become a fundamental part of science’s narrative. But it’s a more complicated shift than we often appreciate. Making sense of what is unseen — of what lies “beyond the light” — has a much longer history in human experience. Before science had the means to explore that realm, we had to make do with stories that became enshrined in myth and folklore. Those stories aren’t banished as science advances; they are simply reinvented. Scientists working at the forefront of the invisible will always be confronted with gaps in knowledge, understanding, and experimental capability. In the face of those limits, they draw unconsciously on the imagery of the old stories. This is a necessary part of science, and these stories can sometimes suggest genuinely productive scientific ideas. But the danger is that we will start to believe them at face value, mistaking them for theories.

A backward glance at the history of the invisible shows how the narratives and tropes of myth and folklore can stimulate science, while showing that the truth will probably turn out to be far stranger and more unexpected than these old stories can accommodate. [Continue reading...]


The roots of America’s narcissism epidemic

Will Storr writes: For much of human history, our beliefs have been based on the assumption that people are fundamentally bad. Strip away a person’s smile and you’ll find a grotesque, writhing animal-thing. Human instincts have to be controlled, and religions have often been guides for containing the demons. Sigmund Freud held a similar view: Psychotherapy was his method of making the unconscious conscious, helping people restrain their bestial desires and accord with the moral laws of civilization.

In the middle of the 20th century, an alternative school of thought appeared. It was popularized by Carl Rogers, an influential psychotherapist at the University of Chicago, and it reversed the presumption of original sin. Rogers argued that people are innately decent. Children, he believed, should be raised in an environment of “unconditional positive regard”. They should be liberated from the inhibitions and restraints that prevented them from attaining their full potential.

It was a characteristically American idea — perhaps even the American idea. Underneath it all, people are good, and to get the best out of themselves, they just need to be free.

Economic change gave Rogers’s theory traction. It was the 1950s, and a nation of workmen was turning into a nation of salesmen. To make good in life, interpersonal sunniness was becoming essential. Meanwhile, rising divorce rates and the surge of women into the workplace were triggering anxieties about the lives of children born into the baby boom. Parents wanted to counteract the stresses of modern family life, and boosting their children’s self-esteem seemed like the solution.

By the early 1960s, wild thinkers in California were pushing Rogers’s idea even further. The “human potential movement” argued that most people were using just 10 percent of their intellectual capacity. It leaned on the work of Abraham Maslow, who studied exceptional people such as Albert Einstein and Eleanor Roosevelt and said there were five human needs, the most important of which was self-actualization—the realization of one’s maximum potential. Number two on the list was esteem.

At the close of the decade, the idea that self-esteem was the key to psychological riches finally exploded. The trigger was Nathaniel Branden, a handsome Canadian psychotherapist who had moved to Los Angeles as a disciple of the philosopher Ayn Rand. One of Rand’s big ideas was that moral good would arise when humans ruthlessly pursued their own self-interest. She and Branden began a tortuous love affair, and her theories had an intense impact on the young psychotherapist. In The Psychology of Self-Esteem, published in 1969, Branden argued that self-esteem “has profound effects on a man’s thinking processes, emotions, desires, values and goals. It is the single most significant key to his behavior.” It was an international bestseller, and it propelled the self-esteem movement out of the counterculture and into the mainstream.

The year that Branden published his book, a sixteen-year-old in Euclid, Ohio, named Roy Baumeister was grappling with his own self-esteem problem: his dad. [Continue reading...]


The great rewilding

Orion magazine: One day, the British environmental writer George Monbiot was digging in his garden when he had a revelation—that his life had become too tidy and constrained. While exploring what it would take to re-ignite his own sense of wonder, he waded into a sea of ideas about restoration and rewilding that so captured his imagination that it became the focus of his next book. Feral: Searching for Enchantment on the Frontiers of Rewilding was published in the United Kingdom in 2013, to much acclaim, and is forthcoming in the U.S. in 2014. Orion editor Jennifer Sahn caught up with Monbiot to talk about rewilding — what it means for people, for nature, and for an environmental movement that is in great need of having far wider appeal.

***

Jennifer Sahn: It’s sort of an obvious starting place, but I think it makes sense to begin by asking how you define rewilding.

George Monbiot: Actually, there are two definitions of rewilding that appeal to me. One is the mass restoration of ecosystems. By restoration, I really mean bringing back their trophic function. Trophic function involves feeding. It’s about eating and being eaten. Trophic function is the interactions between animals and plants in the food chain. Most of our ecosystems are very impoverished as far as those interactions are concerned. They’re missing the top predators and the big herbivores, and so they’re missing a lot of their ecological dynamism. That, above all, is what I want to restore.

I see the mass restoration of ecosystems, meaning taking down the fences, blocking up the drainage ditches, enabling wildlife to spread. Reintroducing missing species, and particularly missing species which are keystone species, or ecosystem engineers. These are species which have impacts greater than their biomass alone would suggest. They create habitats, and create opportunities for many other species. Good examples would be beavers, wolves, wild boar, elephants, whales — all of which have huge ramifying effects on the ecosystem, including parts of the ecosystem with which they have no direct contact.

Otherwise, I see humans having very little continuing management role in the ecosystem. Having brought back the elements which can restore that dynamism, we then step back and stop trying to interfere. That, in a way, is the hardest thing of all — to stop believing that, without our help, everything’s going to go horribly wrong. I think in many ways we still suffer from the biblical myth of dominion where we see ourselves as the guardians or the stewards of the planet, whereas I think it does best when we have as little influence as we can get away with.

The other definition of rewilding that interests me is the rewilding of our own lives. I believe the two processes are closely intertwined—if we have spaces on our doorsteps in which nature is allowed to do its own thing, in which it can be to some extent self-willed, driven by its own dynamic processes, that, I feel, is a much more exciting and thrilling ecosystem to explore and discover, and it enables us to enrich our lives, to fill them with wonder and enchantment.

Jennifer: So you’re using rewilding in part as a reflexive verb?

George: Absolutely. Of all the species that need rewilding, I think human beings come at the top of the list. I would love to see a more intense and emotional engagement of human beings with the living world. The process of rewilding the ecosystem gives us an opportunity to make our lives richer and rawer than they tend to be in our very crowded and overcivilized and buttoned-down societies. [Continue reading...]
