Category Archives: Humanity

Cooperation is what makes us human

Kat McGowan writes: Tales about the origins of our species always start off like this: A small band of hunter-gatherers roams the savannah, loving, warring, and struggling for survival under the African sun. They do not start like this: A fat guy falls off a New York City subway platform onto the tracks.

But what happens next is a quintessential story of who we are as human beings.

On Feb. 17, 2013, around 2:30 a.m., Garrett O’Hanlon, a U.S. Air Force Academy cadet third class, was out celebrating his 22nd birthday in New York City. He and his sister were in the subway waiting for a train when a sudden silence came over the platform, followed by a shriek. People pointed down to the tracks.

O’Hanlon turned and saw a man sprawled facedown on the tracks. “The next thing that happened, I was on the tracks, running toward him,” he says. “I honestly didn’t have a thought process.”

O’Hanlon grabbed the unconscious man by the shoulders, lifting his upper body off the tracks, but struggled to move him. He was deadweight. According to the station clock, the train would arrive in less than two minutes. From the platform, O’Hanlon’s sister was screaming at him to save himself.

Suddenly other arms were there: Personal trainer Dennis Codrington Jr. and his friend Matt Foley had also jumped down to help. “We grabbed him, one by the legs, one by the shoulders, one by the chest,” O’Hanlon says. They got the man to the edge of the platform, where a dozen or more people muscled him up and over. More hands seized the rescuers’ arms and shoulders, helping them up to safety as well.

In the aftermath of the rescue, O’Hanlon says he has been surprised that so many people have asked him why he did it. “I get stunned by the question,” he says. In his view, anybody else would’ve done the same thing. “I feel like it’s a normal reaction,” he says. “To me that’s just what people do.”

More precisely, it is something only people do, according to developmental psychologist Michael Tomasello, codirector of the Max Planck Institute for Evolutionary Anthropology.

For decades Tomasello has explored what makes humans distinctive. His conclusion? We cooperate. Many species, from ants to orcas to our primate cousins, cooperate in the wild. But Tomasello has identified a special form of cooperation. In his view, humans alone are capable of shared intentionality—they intuitively grasp what another person is thinking and act toward a common goal, as the subway rescuers did. This supremely human cognitive ability, Tomasello says, launched our species on its extraordinary trajectory. It forged language, tools, and cultures—stepping-stones to our colonization of every corner of the planet. [Continue reading…]


The faster we go, the more time we lose

Mark C. Taylor writes: “Sleeker. Faster. More Intuitive” (The New York Times); “Welcome to a world where speed is everything” (Verizon FiOS); “Speed is God, and time is the devil” (chief of Hitachi’s portable-computer division). In “real” time, life speeds up until time itself seems to disappear—fast is never fast enough, everything has to be done now, instantly. To pause, delay, stop, slow down is to miss an opportunity and to give an edge to a competitor. Speed has become the measure of success—faster chips, faster computers, faster networks, faster connectivity, faster news, faster communications, faster transactions, faster deals, faster delivery, faster product cycles, faster brains, faster kids. Why are we so obsessed with speed, and why can’t we break its spell?

The cult of speed is a modern phenomenon. In “The Futurist Manifesto” in 1909, Filippo Tommaso Marinetti declared, “We say that the splendor of the world has been enriched by a new beauty: the beauty of speed.” The worship of speed reflected and promoted a profound shift in cultural values that occurred with the advent of modernity and modernization. With the emergence of industrial capitalism, the primary values governing life became work, efficiency, utility, productivity, and competition. When Frederick Winslow Taylor took his stopwatch to the factory floor in the early 20th century to increase workers’ efficiency, he inaugurated the high-speed culture of surveillance so memorably depicted in Charlie Chaplin’s Modern Times. Then, as now, efficiency was measured by the maximization of rapid production through the programming of human behavior.

With the transition from mechanical to electronic technologies, speed increased significantly. The invention of the telegraph, telephone, and stock ticker liberated communication from the strictures imposed by the physical means of conveyance. Previously, messages could be sent no faster than people, horses, trains, or ships could move. By contrast, immaterial words, sounds, information, and images could be transmitted across great distances at very high speed. During the latter half of the 19th century, railway and shipping companies established transportation networks that became the backbone of national and international information networks. When the trans-Atlantic cable (1858) and transcontinental railroad (1869) were completed, the foundation for the physical infrastructure of today’s digital networks was in place.

Fast-forward 100 years. During the latter half of the 20th century, information, communications, and networking technologies expanded rapidly, and transmission speed increased exponentially. But more than data and information were moving faster. Moore’s Law, the observation that the number of transistors on a computer chip doubles roughly every two years, now seems to apply to life itself. Plugged in 24/7/365, we are constantly struggling to keep up but are always falling further behind. The faster we go, the less time we seem to have. As our lives speed up, stress increases, and anxiety trickles down from managers to workers, and from parents to children. [Continue reading…]
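For readers who want the arithmetic rather than the metaphor, here is a minimal sketch of the doubling curve Taylor alludes to, assuming the conventional two-year doubling period; the 1971 Intel 4004 figure of roughly 2,300 transistors is used purely as an illustrative starting point.

```python
# Minimal sketch of the exponential growth behind Moore's Law:
# count(t) = count(0) * 2^(t / doubling_period).
# The two-year doubling period is the conventional figure, assumed here.

def transistor_count(initial: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward by `years` under steady doubling."""
    return initial * 2 ** (years / doubling_period)

# Illustrative run, starting from the Intel 4004 (1971, ~2,300 transistors):
for years in (10, 20, 30, 40):
    print(f"{1971 + years}: ~{transistor_count(2_300, years):,.0f} transistors")
```

Steady doubling compounds into roughly a thousandfold increase every twenty years, which is the exponential character the essay trades on.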


35,000-year-old Indonesian cave paintings suggest art came out of Africa

The Guardian reports: Paintings of wild animals and hand markings left by adults and children on cave walls in Indonesia are at least 35,000 years old, making them some of the oldest artworks known.

The rock art was originally discovered in caves on the island of Sulawesi in the 1950s, but dismissed as younger than 10,000 years old because scientists thought older paintings could not possibly survive in a tropical climate.

But fresh analysis of the pictures by an Australian-Indonesian team has stunned researchers, dating one hand marking to at least 39,900 years old and two paintings of animals, one a pig-deer or babirusa and the other probably a wild pig, to at least 35,400 and 35,700 years old respectively.

The work reveals that rather than Europe being at the heart of an explosion of creative brilliance when modern humans arrived from Africa, the early settlers of Asia were creating their own artworks at the same time or even earlier.

Archaeologists have not ruled out that the different groups of colonising humans developed their artistic skills independently of one another, but an enticing alternative is that the modern human ancestors of both were artists before they left the African continent.

“Our discovery on Sulawesi shows that cave art was made at opposite ends of the Pleistocene Eurasian world at about the same time, suggesting these practices have deeper origins, perhaps in Africa before our species left this continent and spread across the globe,” said Dr Maxime Aubert, an archaeologist at the University of Wollongong. [Continue reading…]


When digital nature replaces nature

Diane Ackerman writes: Last summer, I watched as a small screen in a department store window ran a video of surfing in California. That simple display mesmerized high-heeled, pin-striped, well-coiffed passersby who couldn’t take their eyes off the undulating ocean and curling waves that dwarfed the human riders. Just as our ancient ancestors drew animals on cave walls and carved animals from wood and bone, we decorate our homes with animal prints and motifs, give our children stuffed animals to clutch, cartoon animals to watch, animal stories to read. Our lives trumpet, stomp, and purr with animal tales, such as The Bat Poet, The Velveteen Rabbit, Aesop’s Fables, The Wind in the Willows, The Runaway Bunny, and Charlotte’s Web. I first read these wondrous books as a grown-up, when both the adult and the kid in me were completely spellbound. We call each other by “pet” names, wear animal-print clothes. We ogle plants and animals up close on screens of one sort or another. We may not worship or hunt the animals we see, but we still regard them as necessary physical and spiritual companions. It seems the more we exile ourselves from nature, the more we crave its miracle waters. Yet technological nature can’t completely satisfy that ancient yearning.

What if, through novelty and convenience, digital nature replaces biological nature? Gradually, we may grow used to shallower and shallower experiences of nature. Studies show that we’ll suffer. Richard Louv writes of widespread “nature deficit disorder” among children who mainly play indoors — an oddity quite new in the history of humankind. He documents an upswell in attention disorders, obesity, depression, and lack of creativity. A San Diego fourth-grader once told him: “I like to play indoors because that’s where all the electrical outlets are.” Adults suffer equally. It’s telling that hospital patients with a view of trees heal faster than those gazing at city buildings and parking lots. In studies conducted by Peter H. Kahn and his colleagues at the University of Washington, office workers in windowless cubicles were given flat-screen views of nature. They were healthier, happier, and more efficient than those without virtual windows. But they weren’t as happy, healthy, or creative as people given real windows with real views of nature.

As a species, we’ve somehow survived large and small ice ages, genetic bottlenecks, plagues, world wars, and all manner of natural disasters, but I sometimes wonder if we’ll survive our own ingenuity. At first glance, it seems like we may be living in sensory overload. The new technology, for all its boons, also bedevils us with speed demons, alluring distractors, menacing hijinks, cyber-bullies, thought-nabbers, calm-frayers, and a spiky wad of miscellaneous news. Some days it feels like we’re drowning in a twittering bog of information. But, at exactly the same time, we’re living in sensory poverty, learning about the world without experiencing it up close, right here, right now, in all its messy, majestic, riotous detail. Like seeing icebergs without the cold, without squinting in the Antarctic glare, without the bracing breaths of dry air, without hearing the chorus of lapping waves and shrieking gulls. We lose the salty smell of the cold sea, the burning touch of ice. If, reading this, you can taste those sensory details in your mind, is that because you’ve experienced them in some form before, as actual experience? If younger people never experience them, can they respond to words on the page in the same way?

The farther we distance ourselves from the spell of the present, explored by all our senses, the harder it will be to understand and protect nature’s precarious balance, let alone the balance of our own human nature. [Continue reading…]


The violence of faith cannot be exorcised by demonising religion

John Gray reviews Fields of Blood: Religion and the History of Violence, by Karen Armstrong: Not long after the Iranian Revolution of 1979, in which Ayatollah Ruhollah Khomeini became supreme leader, a US official was heard to exclaim: “Who ever took religion seriously?” The official was baffled at the interruption of what he assumed was an overwhelmingly powerful historical trend. Pretty well everyone at the time took it for granted that religion was on the way out, not only as a matter of personal belief, but even more as a deciding factor in politics. Secularisation was advancing everywhere, and with increasing scientific knowledge and growing prosperity it was poised to become a universal human condition. True, there were some countries that remained stubbornly religious – including, ironically, the United States. But these were exceptions. Religion was an atavistic way of thinking which was gradually but inexorably losing its power. In universities, grandiose theories of secularisation were taught as established fact, while politicians dismissed ideas they didn’t like as “mere theology”. The unimportance of religion was part of conventional wisdom, an unthinking assumption of those who liked to see themselves as thinking people.

Today no one could ask why religion should be taken seriously. Those who used to dismiss religion are terrified by the intensity of its revival. Karen Armstrong, who cites the US official, describes the current state of opinion: “In the west the idea that religion is inherently violent is now taken for granted and seems self-evident.” She goes on:

As one who speaks on religion, I constantly hear how cruel and aggressive it has been, a view that, eerily, is expressed in the same way almost every time: “Religion has been the cause of all the major wars in history.” I have heard this sentence recited like a mantra by American commentators and psychiatrists, London taxi drivers and Oxford academics. It is an odd remark. Obviously the two world wars were not fought on account of religion . . . Experts in political violence or terrorism insist that people commit atrocities for a complex range of reasons. Yet so indelible is the aggressive image of religious faith in our secular consciousness that we routinely load the violent sins of the 20th century on to the back of “religion” and drive it out into the political wilderness.

The idea that religion is fading away has been replaced in conventional wisdom by the notion that religion lies behind most of the world’s conflicts. Many among the present crop of atheists hold both ideas at the same time. They will fulminate against religion, declaring that it is responsible for much of the violence of the present time, then a moment later tell you with equally dogmatic fervour that religion is in rapid decline. Of course it’s a mistake to expect logic from rationalists. More than anything else, the evangelical atheism of recent years is a symptom of moral panic. Worldwide secularisation, which was believed to be an integral part of the process of becoming modern, shows no signs of happening. Quite the contrary: in much of the world, religion is in the ascendant. For many people the result is a condition of acute cognitive dissonance. [Continue reading…]


Is a vulnerable world teetering on the edge of a new Dark Age?

By Joseph Camilleri, La Trobe University

We appear to have reached one of those extraordinary moments in history when people everywhere, communities and even entire nations, feel increasingly stressed and vulnerable. The same may be said of the planet as a whole.

Whether intellectually or intuitively, many are asking the same questions: Where are we heading? How do we explain the long list of financial, environmental and humanitarian emergencies, epidemics, small and larger conflicts, genocides, war crimes, terrorist attacks and military interventions? Why does the international community seem powerless to prevent any of this?

There is no simple or single answer to this conundrum, but two factors can shed much light.

The first involves a global power shift and the prospect of a new Cold War. The second relates to globalisation and the crises generated by the sheer scale of cross-border flows.

Is a new Cold War in the making?

The geopolitical shift has resulted in a dangerous souring of America’s relations with Russia and China.

The dispute over Ukraine is the latest chapter in the rapidly deteriorating relationship between Washington and Moscow. In what is essentially a civil war in which over 3,000 people have been killed, the two great powers have chosen to support opposing sides by all means short of outright intervention.

The incorporation of Crimea into Russia, Moscow’s decision to use force in Georgia in 2008 and its support for the independence of the two breakaway regions of Abkhazia and South Ossetia are part of the same dynamic.

The conduct of Russian governments in the Putin era has been at times coercive and often clumsy at home and abroad. But the United States has also much to answer for. For the last 25 years its foreign policy has been unashamedly triumphalist.

In his 1992 State of the Union address, President George Bush senior declared:

By the grace of God, America won the Cold War.

Since then we have seen the bombing of Serbia without UN Security Council approval, US withdrawal from the Anti-Ballistic Missile Treaty, the US invasion of Iraq in defiance of UN opposition, overt support for the colour revolutions on Russia’s doorstep (Ukraine, Georgia, Kyrgyzstan), and the Magnitsky Act singling out Russia for human rights violations. Western military intervention in Libya, which contrary to assurances brought about regime change, dealt a further blow to the relationship.

And now the Ukraine crisis has led to steadily expanding US and European sanctions against Russia and renewed efforts to ramp up NATO deployments and joint exercises in Eastern Europe.

Are we seeing the emergence of a new Cold War? Though ideology is now less conspicuous, the underlying structure of the conflict is remarkably similar. The trans-Atlantic alliance is once again seeking to contain and erode Russian power and influence, this time round by reaching ever closer to Russian borders. [Continue reading…]


We are more rational than those who nudge us

Steven Poole writes: Humanity’s achievements and its self-perception are today at curious odds. We can put autonomous robots on Mars and genetically engineer malarial mosquitoes to be sterile, yet the news from popular psychology, neuroscience, economics and other fields is that we are not as rational as we like to assume. We are prey to a dismaying variety of hard-wired errors. We prefer winning to being right. At best, so the story goes, our faculty of reason is at constant war with an irrational darkness within. At worst, we should abandon the attempt to be rational altogether.

The present climate of distrust in our reasoning capacity draws much of its impetus from the field of behavioural economics, and particularly from work by Daniel Kahneman and Amos Tversky in the 1980s, summarised in Kahneman’s bestselling Thinking, Fast and Slow (2011). There, Kahneman divides the mind into two allegorical systems, the intuitive ‘System 1’, which often gives wrong answers, and the reflective reasoning of ‘System 2’. ‘The attentive System 2 is who we think we are,’ he writes; but it is the intuitive, biased, ‘irrational’ System 1 that is in charge most of the time.

Other versions of the message are expressed in more strongly negative terms. You Are Not So Smart (2011) is a bestselling book by David McRaney on cognitive bias. According to the study ‘Why Do Humans Reason?’ (2011) by the cognitive scientists Hugo Mercier and Dan Sperber, our supposedly rational faculties evolved not to find ‘truth’ but merely to win arguments. And in The Righteous Mind (2012), the psychologist Jonathan Haidt calls the idea that reason is ‘our most noble attribute’ a mere ‘delusion’. The worship of reason, he adds, ‘is an example of faith in something that does not exist’. Your brain, runs the now-prevailing wisdom, is mainly a tangled, damp and contingently cobbled-together knot of cognitive biases and fear.

This is a scientised version of original sin. And its eager adoption by today’s governments threatens social consequences that many might find troubling. A culture that believes its citizens are not reliably competent thinkers will treat those citizens differently to one that respects their reflective autonomy. Which kind of culture do we want to be? And we do have a choice. Because it turns out that the modern vision of compromised rationality is more open to challenge than many of its followers accept. [Continue reading…]


Green modernism would destroy wilderness

Brandon Keim writes: Several years ago, I asked a biologist friend what she thought of a recently fashionable notion in environmentalist circles: that pristine nature was an illusion, and our beloved wilderness an outdated construct that didn’t actually exist. She’d just finished her shift at the local boardwalk, a volunteer-tended path through a lovely little peat bog that formed after the last ice age, near what is today eastern Maine’s largest commercial shopping area.

After a moment’s reflection, she said this was probably true, in an academic sense, but she didn’t pay it much mind. The fact remained that places such as the bog, affected by human activity, were special, and ought to be protected; other places were affected far less, but they were special and needed protection, too.

It was a simple, practical answer, from someone who’d devoted much of her life to tending the natural world. I find myself recalling it now that the ideals of conservation are under attack by the movement’s own self-appointed vanguard: the green modernists (aka the New Conservationists, post-environmentalists or eco-pragmatists), a group of influential thinkers who argue that we should embrace our planetary lordship and re-conceive Earth as a giant garden.

Get over your attachment to wilderness, they say. There’s no such thing, and thinking otherwise is downright counterproductive. As for wildness, some might exist in the margins of our gardens – designed and managed to serve human wants – but it’s not especially important. And if you appreciate wild animals and plants for their own sake? Well, get over that, too. Those sentiments are as outdated as a daguerreotype of Henry David Thoreau’s beard, dead as a dodo in an Anthropocene age characterised by humanity’s literally awesome domination of Earth.

That humanity has vast power is true. Human purposes divert roughly one-fourth of all terrestrial photosynthetic activity and half the planet’s available fresh water. We’re altering ocean currents and atmospheric patterns, and moving as much rock as natural erosion does. The sheer biomass of humanity and our domesticated animals dwarfs that of other land mammals; our plastic permeates the oceans. We’re driving other creatures extinct at rates last seen 65 million years ago, when an asteroid struck Earth and ended the age of dinosaurs.

By midcentury, there could be 10 billion humans, all demanding and deserving a quality of life presently experienced by only a few. It will be an extraordinary, planet-defining challenge. Meeting it will require, as green modernists correctly observe, new ideas and tools. It also demands a deep, abiding respect for non-human life, no less than the respect we extend to one another. Power is not the same thing as supremacy.

If humanity is to be more than a biological asteroid, nature-lovers should not ‘jettison their idealised notions of nature, parks and wilderness’ and quit ‘pursuing the protection of biodiversity for biodiversity’s sake’, as urged in a seminal essay co-authored by Peter Kareiva, chief scientist at the Nature Conservancy, the world’s largest conservation organisation. Nor can we replace these ideals with what the science writer Emma Marris imagines as ‘a global, half-wild rambunctious garden, tended by us’.

Well-intentioned as these visions might be, they’re inadequate for the Anthropocene. We need to embrace more wilderness, not less. And though framing humanity’s role as global gardening sounds harmless, even pleasant, the idea contains a seed of industrial society’s fundamental flaw: an ethical vision in which only human interests matter. It’s a blueprint not for a garden, but for a landscaped graveyard. [Continue reading…]
