Study suggests human proclivity for violence gets modulated but is not necessarily diminished by culture

A new study (by José Maria Gómez et al.) challenges Steven Pinker’s rosy picture of the state of the world. Science magazine reports: Though group-living primates are relatively violent, the rates vary. Nearly 4.5% of chimpanzee deaths are caused by another chimp, for example, whereas bonobos are responsible for only 0.68% of their compatriots’ deaths. Based on the rates of lethal violence seen in our close relatives, Gómez and his team predicted that 2% of human deaths would be caused by another human.

To see whether that was true, the researchers dove into the scientific literature documenting lethal violence among humans, from prehistory to today. They combined data from archaeological excavations, historical records, modern national statistics, and ethnographies to tally up the number of humans killed by other humans in different time periods and societies. From 50,000 years ago to 10,000 years ago, when humans lived in small groups of hunter-gatherers, the rate of killing was “statistically indistinguishable” from the predicted rate of 2%, based on archaeological evidence, Gómez and his colleagues report today in Nature.

Later, as human groups consolidated into chiefdoms and states, rates of lethal violence shot up — as high as 12% in medieval Eurasia, for example. But in the contemporary era, when industrialized states exert the rule of law, violence is lower than our evolutionary heritage would predict, hovering around 1.3% when combining statistics from across the world. That means evolution “is not a straitjacket,” Gómez says. Culture modulates our bloodthirsty tendencies.

The study is “innovative and meticulously conducted,” says Douglas Fry, an anthropologist at the University of Alabama, Birmingham. The 2% figure is significantly lower than Harvard University psychologist Steven Pinker’s much publicized estimate that 15% of deaths are due to lethal violence among hunter-gatherers. The lower figure resonates with Fry’s extensive studies of nomadic hunter-gatherers, whom he has observed to be less violent than Pinker’s work suggests. “Along with archaeology and nomadic forager research, this [study] shoots holes in the view that the human past and human nature are shockingly violent,” Fry says. [Continue reading…]


Evidence rebuts Chomsky’s theory of language learning

Paul Ibbotson and Michael Tomasello write: The idea that we have brains hardwired with a mental template for learning grammar — famously espoused by Noam Chomsky of the Massachusetts Institute of Technology — has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages—and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.

The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique hu­­­man ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.

This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature. [Continue reading…]


The race to save a dying language

Ross Perlin writes: In 2013, at a conference on endangered languages, a retired teacher named Linda Lambrecht announced the extraordinary discovery of a previously unknown language. Lambrecht – who is Chinese-Hawaiian, 71 years old, warm but no-nonsense – called it Hawaii Sign Language, or HSL. In front of a room full of linguists, she demonstrated that its core vocabulary – words such as “mother”, “pig” and “small” – was distinct from that of other sign languages.

The linguists were immediately convinced. William O’Grady, the chair of the linguistics department at the University of Hawaii, called it “the first time in 80 years that a new language has been discovered in the United States — and maybe the last time.” But the new language found 80 years ago was in remote Alaska, whereas HSL was hiding in plain sight in Honolulu, a metropolitan area of nearly a million people. It was the kind of discovery that made the world seem larger.

The last-minute arrival of recognition and support for HSL was a powerful, almost surreal vindication for Lambrecht, whose first language is HSL. For decades, it was stigmatised or ignored; now the language has acquired an agreed-upon name, an official “language code” from the International Organization for Standardization, the attention of linguists around the world, and a three-year grant from the Endangered Languages Documentation Programme at the School of Oriental and African Studies in London.

But just as linguists were substantiating its existence, HSL stood on the brink of extinction, remembered by just a handful of signers. Unless the language made a miraculous recovery, Lambrecht feared that her announcement might turn out to be HSL’s obituary.

Three years after announcing its existence, Lambrecht is still unearthing her language sign by sign. She may be the only person in the world who still uses HSL on a regular basis, signing into a camera while a linguist named James “Woody” Woodward and a handful of graduate students from the University of Hawaii document her every move. [Continue reading…]


Cricket’s famous ‘legover’ moment and why getting the giggles is so contagious

By Sophie Scott, UCL

It is 25 years since cricket commentators Brian Johnston and Jonathan Agnew famously got the uncontrollable giggles on live radio, while reporting on that day’s Test Match between England and the West Indies. The pair were commentating on the wicket of England’s Ian Botham, when he stumbled on to his stumps and, as Agnew put it: “Didn’t get his leg over”.

The resulting infectious two minutes of laughter has since been voted the greatest moment of sporting commentary ever. It’s worth listening to again – see if you can help giggling along with them.

I research the neurobiology of human vocal communication, and recently I’ve been spending a lot of time looking at laughter, which is easily the most common non-verbal emotional expression we come across (though in some cultures laughter is considered rather impolite and is encountered less often when out and about). There are four key features of the science of laughter that the Botham clip illustrates.

[Read more…]


Your Olympic team may be an illusion

Niko Besnier and Susan Brownell write: The parade of athletes in the opening ceremony of the Olympic Games often evokes strong feelings of national pride. After the 2012 Summer Games in London, the Armenian National Committee of America sent a letter of protest to NBC’s CEO and president, Stephen Burke, to complain about the short shrift Armenia received from the commentator, who only said four words about their country: “Armenia, now walking in.” Their grievance paled, however, in comparison to the Olympics-related protest that took place in 1996. Thousands of Chinese people and organizations in the U.S. and elsewhere collected US$21,000 to buy advertisements in prominent newspapers protesting the fact that NBC commentator Bob Costas mentioned human rights abuses, doping allegations, and property rights disputes as the Chinese delegation entered the stadium for the parade.

About a billion people are expected to watch the opening ceremony of the Rio de Janeiro Olympic Games on television on August 5. For most people, the highlight will be watching their country’s athletes walk proudly into the stadium behind their national flag.

The parade of athletes displays a neat world order filled with proud, loyal citizens. But nations are not really the clear political units presented in this happy family portrait. Beneath the surface is a mess of transnational wheeling and dealing by power brokers as well as athletes seeking to get the most reward for their hard work and talent—for themselves and for their families and friends.

In the last few years, well-heeled Persian Gulf states have attracted athletes from other countries by offering them money, training facilities, and the possibility of qualifying for the Olympics more easily than in their home countries. The diminutive but oil-rich emirate of Qatar, for example, has until now played a very modest role in world sports. But in recent years the country has made huge investments in sports and adopted a liberal citizenship policy for athletes. The Qatari national handball team, which reached the finals at the men’s 2015 Handball World Championship, had only four players originating from Qatar on their 17-person squad — the rest had been recruited from overseas. By our calculation, more than half of the 38 athletes who will represent Qatar in Rio were born elsewhere. [Continue reading…]


Your gut bacteria predate the appearance of humans, genetic study finds

The Guardian reports: The evolutionary history of the bacteria in your guts predates the appearance of humans, and mirrors that of our great ape relatives, according to a genetic study.

The research suggests that microbes in our ancestors’ intestines split into new evolutionary lineages in parallel with splits in the ape family tree.

This came as a surprise to scientists, who had thought that most of our gut bacteria came from our surroundings – what we eat, where we live, even what kind of medicine we take. The new research suggests that evolutionary history is much more important than previously thought.

“When there were no humans or gorillas, just ancestral African apes, they harboured gut bacteria. Then the apes split into different branches, and there was also a parallel divergence of different gut bacteria,” said Prof Andrew Moeller of the University of California, Berkeley, who led the study, published in Science. This happened when gorillas separated somewhere between 10 and 15 million years ago, and again when humans split from chimps and bonobos 5 million years ago. [Continue reading…]


Anthropology is far from licking the problem of fossil ages

Paige Madison writes: Last September, scientists announced the discovery of a never-before-seen human relative (hominin), now known as Homo naledi, deep in a South African cave. The site yielded more than 1,500 bone fragments, an astonishing number in a field that often celebrates the identification of a single tooth. That rich fossil cache revealed much about the creatures, yet it left one glaring question unanswered: when did Homo naledi live? The scientists had no evidence for how old the fossils were. Without that information, it was very hard to know where the new species fits on the tangled human family tree, and to figure out its true meaning.

Difficulties in dating fossils have plagued anthropology since its inception. In 1856, a fossilised skeleton discovered in a small cave in the Neander Valley in Germany became the first hominin ever recognised by science. Quarry workers uncovered the fossils while clearing out a limestone cave, but before the bones were flagged as important, the workers had shovelled them out of the cave mouth. The fossils tumbled to the valley floor 20 metres below, obscuring contextual information that could have provided clues to their age – for example, how deep the skeleton was buried, and whether any fossilised animals had been found nearby.

Identifying the age of this Neanderthal (‘man from the Neander Valley’) was crucial for interpreting his significance. The skeleton had been found right around the time Charles Darwin published On the Origin of Species (1859), and its vaguely human appearance suggested it had the potential to illuminate the human past, but only if it were truly ancient. Some scientists suggested the Neanderthal was an ape-like ancestor or belonged to an ancient European race. Others dismissed him as a recent human, explaining away his strange skull shape by calling him a diseased idiot. [Continue reading…]


How China is rewriting the book on human origins

Jane Qiu writes: On the outskirts of Beijing, a small limestone mountain named Dragon Bone Hill rises above the surrounding sprawl. Along the northern side, a path leads up to some fenced-off caves that draw 150,000 visitors each year, from schoolchildren to grey-haired pensioners. It was here, in 1929, that researchers discovered a nearly complete ancient skull that they determined was roughly half a million years old. Dubbed Peking Man, it was among the earliest human remains ever uncovered, and it helped to convince many researchers that humanity first evolved in Asia.

Since then, the central importance of Peking Man has faded. Although modern dating methods put the fossil even earlier — at up to 780,000 years old — the specimen has been eclipsed by discoveries in Africa that have yielded much older remains of ancient human relatives. Such finds have cemented Africa’s status as the cradle of humanity — the place from which modern humans and their predecessors spread around the globe — and relegated Asia to a kind of evolutionary cul-de-sac.

But the tale of Peking Man has haunted generations of Chinese researchers, who have struggled to understand its relationship to modern humans. “It’s a story without an ending,” says Wu Xinzhi, a palaeontologist at the Chinese Academy of Sciences’ Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) in Beijing. They wonder whether the descendants of Peking Man and fellow members of the species Homo erectus died out or evolved into a more modern species, and whether they contributed to the gene pool of China today.

Keen to get to the bottom of its people’s ancestry, China has in the past decade stepped up its efforts to uncover evidence of early humans across the country. It is reanalysing old fossil finds and pouring tens of millions of dollars a year into excavations. And the government is setting up a US$1.1-million laboratory at the IVPP to extract and sequence ancient DNA. [Continue reading…]


Goats, sheep and cows could challenge dogs for title of ‘man’s best friend’

By Catherine Douglas, Newcastle University

Since the evolution of dogs from wolves tens of thousands of years ago, they have been selectively bred for various roles as guards, hunters, workers and companions. But dogs are not the only animals humans have domesticated, and although dogs get all the attention, there is reason to argue that other species could also deserve the title of “man’s best friend”.

Anthrozoology, the study of human-animal relationships, has established that dogs demonstrate complex communication with humans. Charles Darwin thought that dogs experienced love, but it was only in 2015 that Japanese scientists demonstrated what we all intuitively knew. Miho Nagasawa and colleagues sprayed the “love hormone” oxytocin up dogs’ noses, measured the loving gaze between dog and human, and then measured the oxytocin levels in the humans’ urine, finding them to be higher. Rest assured, dog owners, that science has verified your bond with your faithful hound.

Horses also show intentional communicative behaviour with humans, and another recent paper published in the Royal Society’s Biology Letters from researchers at Queen Mary University of London has shown that goats also demonstrate an affinity with humans. The experiments tested goats’ intelligence and ability to communicate with humans. What the team found may come as no surprise to anyone who has worked with livestock: goats are highly intelligent, capable of complex communication with humans, and are able to form bonds with us – treating us as potential partners to help in problem-solving situations.

Our attitudes to animals tend to reflect the familiarity we have with them. Dogs score higher in perceived intelligence ratings than cows, for example, yet a study in the 1970s demonstrated that in a test cows could navigate a maze as well as dogs, and only slightly less well than children. The point was made that our perception of an animal’s ability is influenced by how we test them.

[Read more…]


Contrary to popular belief, peace and quiet is all about the noise in your head

Daniel A Gross writes: The word “noise” comes from a Latin root meaning either queasiness or pain. According to the historian Hillel Schwartz, there’s even a Mesopotamian legend in which the gods grow so angry at the clamor of earthly humans that they go on a killing spree. (City-dwellers with loud neighbors may empathize, though hopefully not too closely.)

Dislike of noise has produced some of history’s most eager advocates of silence, as Schwartz explains in his book Making Noise: From Babel to the Big Bang and Beyond. In 1859, the British nurse and social reformer Florence Nightingale wrote, “Unnecessary noise is the most cruel absence of care that can be inflicted on sick or well.” Every careless clatter or banal bit of banter, Nightingale argued, can be a source of alarm, distress, and loss of sleep for recovering patients. She even quoted a lecture that identified “sudden noises” as a cause of death among sick children.

Surprisingly, recent research supports some of Nightingale’s zealous claims. In the mid 20th century, epidemiologists discovered correlations between high blood pressure and chronic noise sources like highways and airports. Later research seemed to link noise to increased rates of sleep loss, heart disease, and tinnitus. (It’s this line of research that hatched the 1960s-era notion of “noise pollution,” a name that implicitly refashions transitory noises as toxic and long-lasting.)

Studies of human physiology help explain how an invisible phenomenon can have such a pronounced physical effect. Sound waves vibrate the bones of the ear, which transmit movement to the snail-shaped cochlea. The cochlea converts physical vibrations into electrical signals that the brain receives. The body reacts immediately and powerfully to these signals, even in the middle of deep sleep. Neurophysiological research suggests that noises first activate the amygdalae, clusters of neurons located in the temporal lobes of the brain, associated with memory formation and emotion. The activation prompts an immediate release of stress hormones like cortisol. People who live in consistently loud environments often experience chronically elevated levels of stress hormones.

Just as the whooshing of a hundred individual cars accumulates into an irritating wall of background noise, the physical effects of noise add up. In 2011, the World Health Organization tried to quantify its health burden in Europe. It concluded that the 340 million residents of western Europe—roughly the same population as that of the United States—annually lost a million years of healthy life because of noise. It even argued that 3,000 heart disease deaths were, at their root, the result of excessive noise.

So we like silence for what it doesn’t do—it doesn’t wake, annoy, or kill us—but what does it do? When Florence Nightingale attacked noise as a “cruel absence of care,” she also insisted on the converse: Quiet is a part of care, as essential for patients as medication or sanitation. It’s a strange notion, but one that researchers have begun to bear out as true. [Continue reading…]


How patriotism brings people together — and divides them

Adam Piore writes: It started with one man quietly sipping a Tom Collins in the lounge car of the Cleveland-bound train.

“God bless America,” he sang, “land that I love …”

It didn’t take long. Others joined in. “Stand beside her … and guide her …” Soon the entire train car had taken up the melody, belting out the patriotic song at the top of their lungs.

It was 1940 and such spontaneous outpourings, this one described in a letter to the song’s creator Irving Berlin, were not unusual. That was the year the simple, 32-bar arrangement was somehow absorbed into the fabric of American culture, finding its way into American Legion halls, churches and synagogues, schools, and even a Louisville, Kentucky, insurance office, where the song reportedly sprang to the lips of the entire sales staff one day. The song has reemerged in times of national crisis or pride over and over, to be sung in ballparks, school assemblies, and on the steps of the United States Capitol after 9/11.

Berlin immigrated to the U.S. at age 5. His family fled Russia to escape a wave of murderous pogroms directed at Jews. His mother often murmured “God Bless America” as he was growing up. “And not casually, but with emotion which was almost exaltation,” Berlin later recalled.

“He always talked about it like a love song,” says Sheryl Kaskowitz, the author of God Bless America, the Surprising History of an Iconic Song. “It came from this really genuine love and a sense of gratitude to the U.S.”

It might seem ironic that someone born in a foreign land would compose a song that so powerfully expressed a sense of national belonging—that this song embraced by an entire nation was the expression of love from an outsider for his adopted land. But in the U.S., a nation of immigrants built on the prospect of renewal, it’s not the least bit surprising. It is somehow appropriate.

Patriotism is an innate human sentiment. It is part of a deeper subconscious drive toward group formation and allegiance. It operates as much in one nation under God as it does in a football stadium. Group bonding is in our evolutionary history, our nature. According to some recent studies, the factors that make us patriotic are in our very genes.

But this allegiance—this blurring of the lines between individual and group—has a closely related flipside; it’s not always a warm feeling of connection in the Cleveland-bound lounge car. Sometimes our instinct for group identification serves as a powerful wedge to single out those among us who are different. Sometimes what makes us feel connected is not a love of home and country but a common enemy. [Continue reading…]


Farming invented twice in Middle East, genomes study reveals


Nature reports: Two Middle Eastern populations independently developed farming and then spread the technology to Europe, Africa and Asia, according to the genomes of 44 people who lived thousands of years ago in present-day Armenia, Turkey, Israel, Jordan and Iran.

Posted on 17 June on the bioRxiv preprint server, the research supports archaeological evidence about the multiple origins of farming, and represents the first detailed look at the ancestry of the individuals behind one of the most important periods in human history — the Neolithic revolution.

Some 11,000 years ago, humans living in the ancient Middle East region called the Fertile Crescent shifted from a nomadic existence, based on hunting game and gathering wild plants, to a more sedentary lifestyle that would later give rise to permanent settlements. Over thousands of years, these early farmers domesticated the first crops and transformed sheep, wild boars and other creatures into domestic animals.

Dozens of studies have examined the genetics of the first European farmers, who emigrated from the Middle East beginning some 8,000 years ago, but the hot climes of the Fertile Crescent had made it difficult to obtain ancient DNA from remains found there. Advances in extracting DNA from a tiny ear bone called the petrous allowed a team led by Iosif Lazaridis and David Reich, population geneticists at Harvard Medical School in Boston, Massachusetts, to analyse the genomes of the 44 Middle Eastern individuals, who lived between 14,000 and 3,500 years ago. [Continue reading…]


Should chimps be considered people under the law?

Jay Schwartz writes: In late 2013, the Nonhuman Rights Project (NhRP) filed a first-ever lawsuit to free a pet chimpanzee named Tommy from the inadequate conditions provided by his owner. The NhRP, a legal group focused on animal protection, argued that Tommy is an autonomous being who is held against his will and that he is entitled to a common-law writ of habeas corpus, a legal means of determining the legality of imprisonment. Granting habeas corpus to a chimpanzee would mean viewing chimpanzees as legal persons with rights, rather than as mere things, so this case was rather controversial.

The Tommy case came to a close on December 4, 2014, as the Appellate Division of the New York State Supreme Court’s five-judge panel ruled against the NhRP. (The state’s highest court is the Court of Appeals.) Justice Karen K. Peters, the presiding judge, wrote: “Needless to say, unlike human beings, chimpanzees cannot bear any legal duties, submit to societal responsibilities or be held legally accountable for their actions. In our view, it is this incapability to bear any legal responsibilities and societal duties that renders it inappropriate to confer upon chimpanzees the legal rights … that have been afforded to human beings.” [Continue reading…]

A lot of people will regard the effort to confer legal rights on non-humans as being driven by anthropomorphism. But consider the court’s argument. Could not the exact same line of reasoning be used to argue that small children or adults with developmental disabilities should be deprived of legal rights? Of course, such an argument would rightly be decried as inhumane and barbaric.


Earliest evidence of fire making by prehumans in Europe found

Science News reports: Prehumans living around 800,000 years ago in what’s now southeastern Spain were, literally, trailblazers. They lit small, controlled blazes in a cave, a new study finds.

Discoveries in the cave provide the oldest evidence of fire making in Europe and support proposals that members of the human genus, Homo, regularly ignited fires starting at least 1 million years ago, say paleontologist Michael Walker of the University of Murcia in Spain and his colleagues. Fire making started in Africa (SN: 5/5/12, p. 18) and then moved north to the Middle East (SN: 5/1/04, p. 276) and Europe, the researchers conclude in the June Antiquity.

If the age estimate for the Spain find holds up, the new report adds to a “surprising number” of sites from deep in the Stone Age that retain evidence of small, intentionally lit fires, says archaeologist John Gowlett of the University of Liverpool in England.

Excavations conducted since 2011 at the Spanish cave, Cueva Negra del Estrecho del Río Quípar, have uncovered more than 165 stones and stone artifacts that had been heated, as well as about 2,300 animal-bone fragments displaying signs of heating and charring. Microscopic and chemical analyses indicate that these finds had been heated to between 400° and 600° Celsius, consistent with having been burned in a fire. [Continue reading…]


How Neanderthal DNA helps humanity

Emily Singer writes: Early human history was a promiscuous affair. As modern humans began to spread out of Africa roughly 50,000 years ago, they encountered other species that looked remarkably like them — the Neanderthals and Denisovans, two groups of archaic humans that shared an ancestor with us roughly 600,000 years earlier. This motley mix of humans coexisted in Europe for at least 2,500 years, and we now know that they interbred, leaving a lasting legacy in our DNA. The DNA of non-Africans is made up of roughly 1 to 2 percent Neanderthal DNA, and some Asian and Oceanic island populations have as much as 6 percent Denisovan DNA.

Over the last few years, scientists have dug deeper into the Neanderthal and Denisovan sections of our genomes and come to a surprising conclusion. Certain Neanderthal and Denisovan genes seem to have swept through the modern human population — one variant, for example, is present in 70 percent of Europeans — suggesting that these genes brought great advantage to their bearers and spread rapidly.

“In some spots of our genome, we are more Neanderthal than human,” said Joshua Akey, a geneticist at the University of Washington. “It seems pretty clear that at least some of the sequences we inherited from archaic hominins were adaptive, that they helped us survive and reproduce.”

But what, exactly, do these fragments of Neanderthal and Denisovan DNA do? What survival advantage did they confer on our ancestors? Scientists are starting to pick up hints. Some of these genes are tied to our immune system, to our skin and hair, and perhaps to our metabolism and tolerance for cold weather, all of which might have helped emigrating humans survive in new lands.

“What allowed us to survive came from other species,” said Rasmus Nielsen, an evolutionary biologist at the University of California, Berkeley. “It’s not just noise, it’s a very important substantial part of who we are.” [Continue reading…]


How philosophy came to disdain the wisdom of oral cultures

Justin E H Smith writes: A poet, somewhere in Siberia, or the Balkans, or West Africa, some time in the past 60,000 years, recites thousands of memorised lines in the course of an evening. The lines are packed with fixed epithets and clichés. The bard is not concerned with originality, but with intonation and delivery: he or she is perfectly attuned to the circumstances of the day, and to the mood and expectations of his or her listeners.

If this were happening 6,000-plus years ago, the poet’s words would in no way have been anchored in visible signs, in text. For the vast majority of the time that human beings have been on Earth, words have had no worldly reality other than the sound made when they are spoken.

As the theorist Walter J Ong pointed out in Orality and Literacy: Technologizing the Word (1982), it is difficult, perhaps even impossible, now to imagine how differently language would have been experienced in a culture of ‘primary orality’. There would be nowhere to ‘look up a word’, no authoritative source telling us the shape the word ‘actually’ takes. There would be no way to affirm the word’s existence at all except by speaking it – and this necessary condition of survival is important for understanding the relatively repetitive nature of epic poetry. Say it over and over again, or it will slip away. In the absence of fixed, textual anchors for words, there would be a sharp sense that language is charged with power, almost magic: the idea that words, when spoken, can bring about new states of affairs in the world. They do not so much describe, as invoke.

As a consequence of the development of writing, first in the ancient Near East and soon after in Greece, old habits of thought began to die out, and certain other, previously latent, mental faculties began to express themselves. Words were now anchored and, though spellings could change from one generation to another, or one region to another, there were now physical traces that endured, which could be transmitted, consulted and pointed to in settling questions about the use or authority of spoken language.

Writing rapidly turned customs into laws, agreements into contracts, genealogical lore into history. In each case, what had once been fundamentally temporal and singular was transformed into something eternal (as in, ‘outside of time’) and general. Even the simple act of making everyday lists of common objects – an act impossible in a primary oral culture – was already a triumph of abstraction and systematisation. From here it was just one small step to what we now call ‘philosophy’. [Continue reading…]


The camaraderie of outrage


“The killing of a gorilla at the Cincinnati Zoo in order to save a child who fell in its enclosure has sparked nationwide outrage,” reports CBS News.

I share the outrage.

I happen to be among those who believe that the incarceration of wild animals for the entertainment of sightseers cannot be justified. It does little to elevate the consciousness of the spectators and even less for the well-being of the captives. The protection of endangered species requires first and foremost the protection of endangered habitats.

Upon seeing the news of the gorilla’s death, like many others, I also thought that if a four-year-old boy could even get into a situation like this, there had to be negligence on the part of parents, bystanders, and/or the zoo operators. Likewise, the decision to shoot and kill the 17-year-old gorilla, Harambe (a Swahili name meaning “all pull together”), seemed very questionable.

Among the outraged voices showing up on Facebook, the most venomous attacks have been directed at Michelle Gregg, the boy’s mother.

Jan Dadaista Subert:

The crappy mother should have gotten shot instead, not the poor innocent gorilla!

Andrew Weprin:

Michelle Gregg says, “God protected my child until the authorities were able to get to him.” No, Harambe protected your child after you & God failed to stop him from climbing into the enclosure! And innocent Harambe ended up dead for his efforts, shot with a bullet that would have been better spent on you, for failing to look after your own child and being the cause of all this!

The creator of a Facebook page, Justice For Harambe (which has already received over 60,000 likes), propagated the claim that Gregg was planning to sue the zoo, and yet when asked to support this claim with some evidence simply said: “Educated guess.” The page’s stated objective is: “We wish to see charges brought against those responsible!!”

The outrage directed at Gregg has prompted a smaller wave of outrage coming from those who underline the fact that even when under the supervision of the most attentive of parents, small children do have a talent for slipping out of sight.

Meanwhile, the United Nations refugee agency announced on Sunday that at least 700 people are believed to have drowned in the Mediterranean this week as tens of thousands of refugees continue to seek safety in Europe.

The latest chapter in the worst humanitarian crisis since World War II has prompted very little outrage on this side of the Atlantic.

For observers of social media in the U.S., it’s hard to avoid concluding that the life of a gorilla is commonly regarded here as being more precious than the lives of countless human beings.

Although to some extent it’s heartening that this much concern is being shown about the premature death of a gorilla, it’s disturbing that over the last year and longer there has been such widespread indifference shown towards millions of people in desperate need.

Is there really such a compassion deficit in America, or does this reveal more about the psychology of rage?

My guess is that among those now seeking justice for Harambe, many, prior to this weekend, had not taken a great deal of interest in the welfare of western lowland gorillas.

The guiding emotions here were outrage at what seemed like the unnecessary loss of an innocent life, and a certain sympathy with fellow primates that all children feel and most adults have learned to sublimate.

The great apes fascinate us because on some level we recognize them as kin. We don’t just look at them; we see them with reflective awareness looking at us.

Yet why would a sense of kinship be able to extend outside our own species while falling short among other members of the human race?

What is at play here seems to have less to do with who or what we identify with than it does with the pathways that facilitate our connections.

It turns out that in the age of social media, outrage has become such a potent force because it allows strangers to bond.

Teddy Wayne writes:

A 2013 study, from Beihang University in Beijing, of Weibo, a Twitter-like site, found that anger is the emotion that spreads the most easily over social media. Joy came in a distant second. The main difference, said Ryan Martin, a psychology professor at the University of Wisconsin, Green Bay, who studies anger, is that although we tend to share the happiness only of people we are close to, we are willing to join in the rage of strangers. As the study suggests, outrage is lavishly rewarded on social media, whether through supportive comments, retweets or Facebook likes. People prone to Internet outrage are looking for validation, Professor Martin said. “They want to hear that others share it,” he said, “because they feel they’re vindicated and a little less lonely and isolated in their belief.”

Harambe’s death pulled strangers together in their shared anger. The sad and stern face of a silverback resonated across a population that, struggling to find common ground in what it can affirm, finds it much more easily in its discontent.


The unexpected sophistication of Neanderthals

Discover reports: Circular structures discovered in a French cave continue to build the case that Neanderthals were more intelligent than we give them credit for.

Deep inside the Bruniquel Cave, researchers discovered two rings of stalactites and stalagmites that appeared to have been deliberately stacked and arranged to form a structure. The site also contained charred animal bones, which may have served as torches to illuminate the dark depths of the cave or keep bears at bay. The thing is, a new dating analysis suggests these structures were built more than 170,000 years ago, long before Homo sapiens arrived in the area. That means Neanderthals were the likely architects, and we didn’t expect them to be such adept builders and cave explorers.

The structures in Bruniquel were first discovered in 1990 and dated at the time to roughly 50,000 years ago based on carbon dating techniques. However, in 2013, Sophie Verheyden of the Royal Belgian Institute of Natural Sciences conducted a new study, drilling into the stalactites and stalagmites to measure differences between layers of rock that accumulated before and after they were broken. Her analysis, published Wednesday in Nature, revealed an astounding age of roughly 176,500 years, more than three times the previous estimate. By contrast, the oldest known human cave art is only around 42,000 years old. [Continue reading…]
