Paul Ibbotson and Michael Tomasello write: The idea that we have brains hardwired with a mental template for learning grammar — famously espoused by Noam Chomsky of the Massachusetts Institute of Technology — has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages—and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.
The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.
This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature. [Continue reading…]
Ross Perlin writes: In 2013, at a conference on endangered languages, a retired teacher named Linda Lambrecht announced the extraordinary discovery of a previously unknown language. Lambrecht – who is Chinese-Hawaiian, 71 years old, warm but no-nonsense – called it Hawaii Sign Language, or HSL. In front of a room full of linguists, she demonstrated that its core vocabulary – words such as “mother”, “pig” and “small” – was distinct from that of other sign languages.
The linguists were immediately convinced. William O’Grady, the chair of the linguistics department at the University of Hawaii, called it “the first time in 80 years that a new language has been discovered in the United States — and maybe the last time.” But the new language found 80 years ago was in remote Alaska, whereas HSL was hiding in plain sight in Honolulu, a metropolitan area of nearly a million people. It was the kind of discovery that made the world seem larger.
The last-minute arrival of recognition and support for HSL was a powerful, almost surreal vindication for Lambrecht, whose first language is HSL. For decades, it was stigmatised or ignored; now the language has acquired an agreed-upon name, an official “language code” from the International Organization for Standardization, the attention of linguists around the world, and a three-year grant from the Endangered Languages Documentation Programme at the School of Oriental and African Studies in London.
But just as linguists were substantiating its existence, HSL stood on the brink of extinction, remembered by just a handful of signers. Unless the language made a miraculous recovery, Lambrecht feared that her announcement might turn out to be HSL’s obituary.
Three years after announcing its existence, Lambrecht is still unearthing her language sign by sign. She may be the only person in the world who still uses HSL on a regular basis, signing into a camera while a linguist named James “Woody” Woodward and a handful of graduate students from the University of Hawaii document her every move. [Continue reading…]
It is 25 years since cricket commentators Brian Johnston and Jonathan Agnew famously got the uncontrollable giggles on live radio, while reporting on that day’s Test Match between England and the West Indies. The pair were commentating on the wicket of England’s Ian Botham, when he stumbled on to his stumps and, as Agnew put it: “Didn’t get his leg over”.
The resulting infectious two minutes of laughter has since been voted the greatest moment of sporting commentary ever. It’s worth listening to again – see if you can keep from giggling along with them.
I research the neurobiology of human vocal communication, and recently I’ve been spending a lot of time looking at laughter, easily the most common non-verbal emotional expression one comes across (though in some cultures laughter is considered impolite and is less frequently encountered out and about). There are four key features of the science of laughter that the Botham clip illustrates.
Niko Besnier and Susan Brownell write: The parade of athletes in the opening ceremony of the Olympic Games often evokes strong feelings of national pride. After the 2012 Summer Games in London, the Armenian National Committee of America sent a letter of protest to NBC’s CEO and president, Stephen Burke, to complain about the short shrift Armenia received from the commentator, who only said four words about their country: “Armenia, now walking in.” Their grievance paled, however, in comparison to the Olympics-related protest that took place in 1996. Thousands of Chinese people and organizations in the U.S. and elsewhere collected US$21,000 to buy advertisements in prominent newspapers protesting the fact that NBC commentator Bob Costas mentioned human rights abuses, doping allegations, and property rights disputes as the Chinese delegation entered the stadium for the parade.
About a billion people are expected to watch the opening ceremony of the Rio de Janeiro Olympic Games on television on August 5. For most people, the highlight will be watching their country’s athletes walk proudly into the stadium behind their national flag.
The parade of athletes displays a neat world order filled with proud, loyal citizens. But nations are not really the clear political units presented in this happy family portrait. Beneath the surface is a mess of transnational wheeling and dealing by power brokers as well as athletes seeking to get the most reward for their hard work and talent—for themselves and for their families and friends.
In the last few years, well-heeled Persian Gulf states have attracted athletes from other countries by offering them money, training facilities, and the possibility of qualifying for the Olympics more easily than in their home countries. The diminutive but oil-rich emirate of Qatar, for example, has until now played a very modest role in world sports. But in recent years the country has made huge investments in sports and adopted a liberal citizenship policy for athletes. The Qatari national handball team, which reached the finals at the men’s 2015 Handball World Championship, had only four players originating from Qatar on their 17-person squad — the rest had been recruited from overseas. By our calculation, more than half of the 38 athletes who will represent Qatar in Rio were born elsewhere. [Continue reading…]
The research suggests that microbes in our ancestors’ intestines split into new evolutionary lineages in parallel with splits in the ape family tree.
This came as a surprise to scientists, who had thought that most of our gut bacteria came from our surroundings – what we eat, where we live, even what kind of medicine we take. The new research suggests that evolutionary history is much more important than previously thought.
“When there were no humans or gorillas, just ancestral African apes, they harboured gut bacteria. Then the apes split into different branches, and there was also a parallel divergence of different gut bacteria,” said Prof Andrew Moeller of the University of California, Berkeley, who led the study, published in Science. This happened when gorillas separated between 10 million and 15 million years ago, and again when humans split from chimps and bonobos 5 million years ago. [Continue reading…]
Paige Madison writes: Last September, scientists announced the discovery of a never-before-seen human relative (hominin), now known as Homo naledi, deep in a South African cave. The site yielded more than 1,500 bone fragments, an astonishing number in a field that often celebrates the identification of a single tooth. That rich fossil cache revealed much about the creatures, yet it left one glaring question unanswered: when did Homo naledi live? The scientists had no evidence for how old the fossils were. Without that information, it was very hard to know where the new species fits on the tangled human family tree, and to figure out its true meaning.
Difficulties in dating fossils have plagued anthropology since its inception. In 1856, a fossilised skeleton discovered in a small cave in the Neander Valley in Germany became the first hominin ever recognised by science. Quarry workers uncovered the fossils while clearing out a limestone cave, but before the bones were flagged as important, the workers had shovelled them out of the cave mouth. The fossils tumbled to the valley floor 20 metres below, obscuring contextual information that could have provided clues to their age – for example, how deep the skeleton was buried, and whether any fossilised animals had been found nearby.
Identifying the age of this Neanderthal (‘man from the Neander Valley’) was crucial for interpreting his significance. The skeleton had been found right around the time Charles Darwin published On the Origin of Species (1859), and its vaguely human appearance suggested it had the potential to illuminate the human past, but only if it were truly ancient. Some scientists suggested the Neanderthal was an ape-like ancestor or belonged to an ancient European race. Others dismissed him as a recent human, explaining away his strange skull shape by calling him a diseased idiot. [Continue reading…]
Jane Qiu writes: On the outskirts of Beijing, a small limestone mountain named Dragon Bone Hill rises above the surrounding sprawl. Along the northern side, a path leads up to some fenced-off caves that draw 150,000 visitors each year, from schoolchildren to grey-haired pensioners. It was here, in 1929, that researchers discovered a nearly complete ancient skull that they determined was roughly half a million years old. Dubbed Peking Man, it was among the earliest human remains ever uncovered, and it helped to convince many researchers that humanity first evolved in Asia.
Since then, the central importance of Peking Man has faded. Although modern dating methods put the fossil even earlier — at up to 780,000 years old — the specimen has been eclipsed by discoveries in Africa that have yielded much older remains of ancient human relatives. Such finds have cemented Africa’s status as the cradle of humanity — the place from which modern humans and their predecessors spread around the globe — and relegated Asia to a kind of evolutionary cul-de-sac.
But the tale of Peking Man has haunted generations of Chinese researchers, who have struggled to understand its relationship to modern humans. “It’s a story without an ending,” says Wu Xinzhi, a palaeontologist at the Chinese Academy of Sciences’ Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) in Beijing. They wonder whether the descendants of Peking Man and fellow members of the species Homo erectus died out or evolved into a more modern species, and whether they contributed to the gene pool of China today.
Keen to get to the bottom of its people’s ancestry, China has in the past decade stepped up its efforts to uncover evidence of early humans across the country. It is reanalysing old fossil finds and pouring tens of millions of dollars a year into excavations. And the government is setting up a US$1.1-million laboratory at the IVPP to extract and sequence ancient DNA. [Continue reading…]
Since dogs evolved from wolves tens of thousands of years ago, they have been selectively bred for various roles as guards, hunters, workers and companions. But dogs are not the only animal humans have domesticated, and although dogs get all the attention, there is reason to argue that other species also deserve the title of “man’s best friend”.
Anthrozoology, the study of human-animal relationships, has established that dogs demonstrate complex communication with humans. Charles Darwin thought that dogs experienced love, but it was only in 2015 that Japanese scientists demonstrated what we all intuitively knew. Miho Nagasawa and colleagues sprayed the “love hormone” oxytocin up dogs’ noses, measured the loving gaze between dog and human, and then measured oxytocin levels in the humans’ urine, finding that they rose after the shared gaze. Rest assured, dog owners, that science has verified your bond with your faithful hound.
Horses also show intentional communicative behaviour with humans, and another recent paper published in the Royal Society’s Biology Letters from researchers at Queen Mary University of London has shown that goats also demonstrate an affinity with humans. The experiments tested goats’ intelligence and ability to communicate with humans. What the team found may come as no surprise to anyone who has worked with livestock: goats are highly intelligent, capable of complex communication with humans, and are able to form bonds with us – treating us as potential partners to help in problem-solving situations.
Our attitudes to animals tend to reflect the familiarity we have with them. Dogs score higher in perceived intelligence ratings than cows, for example, yet a study in the 1970s demonstrated that in a test cows could navigate a maze as well as dogs, and only slightly less well than children. The point was made that our perception of an animal’s ability is influenced by how we test them.
Daniel A Gross writes: The word “noise” comes from a Latin root meaning either queasiness or pain. According to the historian Hillel Schwartz, there’s even a Mesopotamian legend in which the gods grow so angry at the clamor of earthly humans that they go on a killing spree. (City-dwellers with loud neighbors may empathize, though hopefully not too closely.)
Dislike of noise has produced some of history’s most eager advocates of silence, as Schwartz explains in his book Making Noise: From Babel to the Big Bang and Beyond. In 1859, the British nurse and social reformer Florence Nightingale wrote, “Unnecessary noise is the most cruel absence of care that can be inflicted on sick or well.” Every careless clatter or banal bit of banter, Nightingale argued, can be a source of alarm, distress, and loss of sleep for recovering patients. She even quoted a lecture that identified “sudden noises” as a cause of death among sick children.
Surprisingly, recent research supports some of Nightingale’s zealous claims. In the mid-20th century, epidemiologists discovered correlations between high blood pressure and chronic noise sources like highways and airports. Later research seemed to link noise to increased rates of sleep loss, heart disease, and tinnitus. (It’s this line of research that hatched the 1960s-era notion of “noise pollution,” a name that implicitly refashions transitory noises as toxic and long-lasting.)
Studies of human physiology help explain how an invisible phenomenon can have such a pronounced physical effect. Sound waves vibrate the bones of the ear, which transmit movement to the snail-shaped cochlea. The cochlea converts physical vibrations into electrical signals that the brain receives. The body reacts immediately and powerfully to these signals, even in the middle of deep sleep. Neurophysiological research suggests that noises first activate the amygdalae, clusters of neurons located in the temporal lobes of the brain, associated with memory formation and emotion. The activation prompts an immediate release of stress hormones like cortisol. People who live in consistently loud environments often experience chronically elevated levels of stress hormones.
Just as the whooshing of a hundred individual cars accumulates into an irritating wall of background noise, the physical effects of noise add up. In 2011, the World Health Organization tried to quantify its health burden in Europe. It concluded that the 340 million residents of western Europe—roughly the same population as that of the United States—annually lost a million years of healthy life because of noise. It even argued that 3,000 heart disease deaths were, at their root, the result of excessive noise.
So we like silence for what it doesn’t do—it doesn’t wake, annoy, or kill us—but what does it do? When Florence Nightingale attacked noise as a “cruel absence of care,” she also insisted on the converse: Quiet is a part of care, as essential for patients as medication or sanitation. It’s a strange notion, but one that researchers have begun to bear out as true. [Continue reading…]
Adam Piore writes: It started with one man quietly sipping a Tom Collins in the lounge car of the Cleveland-bound train.
“God bless America,” he sang, “land that I love …”
It didn’t take long. Others joined in. “Stand beside her … and guide her …” Soon the entire train car had taken up the melody, belting out the patriotic song at the top of their lungs.
It was 1940 and such spontaneous outpourings, this one described in a letter to the song’s creator Irving Berlin, were not unusual. That was the year the simple, 32-bar arrangement was somehow absorbed into the fabric of American culture, finding its way into American Legion halls, churches and synagogues, schools, and even a Louisville, Kentucky, insurance office, where the song reportedly sprang to the lips of the entire sales staff one day. The song has reemerged in times of national crisis or pride over and over, to be sung in ballparks, school assemblies, and on the steps of the United States Capitol after 9/11.
Berlin immigrated to the U.S. at age 5. His family fled Russia to escape a wave of murderous pogroms directed at Jews. His mother often murmured “God Bless America” as he was growing up. “And not casually, but with emotion which was almost exaltation,” Berlin later recalled.
“He always talked about it like a love song,” says Sheryl Kaskowitz, the author of God Bless America: The Surprising History of an Iconic Song. “It came from this really genuine love and a sense of gratitude to the U.S.”
It might seem ironic that someone born in a foreign land would compose a song that so powerfully expressed a sense of national belonging—that this song embraced by an entire nation was the expression of love from an outsider for his adopted land. But in the U.S., a nation of immigrants built on the prospect of renewal, it is not the least bit surprising. It is somehow appropriate.
Patriotism is an innate human sentiment. It is part of a deeper subconscious drive toward group formation and allegiance. It operates as much in one nation under God as it does in a football stadium. Group bonding is in our evolutionary history, our nature. According to some recent studies, the factors that make us patriotic are in our very genes.
But this allegiance—this blurring of the lines between individual and group—has a closely related flipside; it’s not always a warm feeling of connection in the Cleveland-bound lounge car. Sometimes our instinct for group identification serves as a powerful wedge to single out those among us who are different. Sometimes what makes us feel connected is not a love of home and country but a common enemy. [Continue reading…]
Nature reports: Two Middle Eastern populations independently developed farming and then spread the technology to Europe, Africa and Asia, according to the genomes of 44 people who lived thousands of years ago in present-day Armenia, Turkey, Israel, Jordan and Iran.
Posted on 17 June on the bioRxiv preprint server, the research supports archaeological evidence about the multiple origins of farming, and represents the first detailed look at the ancestry of the individuals behind one of the most important periods in human history — the Neolithic revolution.
Some 11,000 years ago, humans living in the ancient Middle East region called the Fertile Crescent shifted from a nomadic existence, based on hunting game and gathering wild plants, to a more sedentary lifestyle that would later give rise to permanent settlements. Over thousands of years, these early farmers domesticated the first crops and transformed sheep, wild boars and other creatures into domestic animals.
Dozens of studies have examined the genetics of the first European farmers, who emigrated from the Middle East beginning some 8,000 years ago, but the hot climes of the Fertile Crescent had made it difficult to obtain ancient DNA from remains found there. Advances in extracting DNA from a tiny ear bone called the petrous allowed a team led by Iosif Lazaridis and David Reich, population geneticists at Harvard Medical School in Boston, Massachusetts, to analyse the genomes of the 44 Middle Eastern individuals, who lived between 14,000 and 3,500 years ago. [Continue reading…]
Jay Schwartz writes: In late 2013, the Nonhuman Rights Project (NhRP) filed a first-ever lawsuit to free a pet chimpanzee named Tommy from the inadequate conditions provided by his owner. The NhRP, a legal group focused on animal protection, argued that Tommy is an autonomous being who is held against his will and that he is entitled to a common-law writ of habeas corpus, a legal means of determining the legality of imprisonment. Granting habeas corpus to a chimpanzee would mean viewing chimpanzees as legal persons with rights, rather than as mere things, so this case was rather controversial.
The Tommy case came to a close on December 4, 2014, as the Appellate Division of the New York State Supreme Court’s five-judge panel ruled against the NhRP. (The state’s highest court is the Court of Appeals.) Justice Karen K. Peters, the presiding judge, wrote: “Needless to say, unlike human beings, chimpanzees cannot bear any legal duties, submit to societal responsibilities or be held legally accountable for their actions. In our view, it is this incapability to bear any legal responsibilities and societal duties that renders it inappropriate to confer upon chimpanzees the legal rights … that have been afforded to human beings.” [Continue reading…]
A lot of people will regard the effort to confer legal rights on non-humans as being driven by anthropomorphism. But consider the court’s argument. Could not the exact same line of reasoning be used to argue that small children, or adults with developmental disabilities, should be deprived of legal rights? Of course, such an argument would rightly be decried as inhumane and barbaric.