Ben Panko writes: Thirteen years ago, a deadly strain of avian flu known as H5N1 was tearing through Asia’s bird populations. In January 2004, Chinese scientists reported that pigs too had become infected with the virus—an alarming development, since pigs are susceptible to human viruses and could potentially act as a “mixing vessel” that would allow the virus to jump to humans. “Urgent attention should be paid to the pandemic preparedness of these two subtypes of influenza,” the scientists wrote in their study.
Yet at the time, little attention was paid outside of China—because the study was published only in Chinese, in a small Chinese journal of veterinary medicine.
It wasn’t until August of that year that the World Health Organization and the United Nations learned of the study’s results and rushed to have it translated. Those scientists and policy makers ran headlong into one of science’s biggest unsolved dilemmas: language. A new study in the journal PLOS Biology sheds light on how widespread the gulf can be between English-language science and any-other-language science, and how that gap can lead to situations like the avian flu case, or worse. [Continue reading…]
After one of the most divisive presidential elections in American history, many of us may be anxious about dinner-table dialogue with family and friends this Thanksgiving. There is no denying that the way we communicate about politics has fundamentally changed with the proliferation of technology and social media. Twitter bots, fake news and echo chambers are just a few of the highlights from this election season. Much of how we’re conversing online can’t – and shouldn’t – be replicated around the family table. We are getting out of practice at conducting meaningful, respectful conversation.
There’s not a quick fix. We need more empathic communication – the slow, deep (inter)personal discourse that can nurture identity and build and strengthen relationships. Yet contemporary communication platforms can make it harder to build empathy with conversational partners. Even the phrase “conversational partners” seems unfitting in the world of 140-character limits, followers, likes and shares. In many ways, our devices help us talk at (@?) instead of with one another.
Literally meaning “in-feeling,” empathy is a process of internalizing another person’s perspective. Empathy-building is unselfish; you suspend your own sensibilities and try to fully imagine and embrace those of someone else. You can gain empathy by learning about other cultures from different media, by experiencing what others have gone through personally, or by having deep conversations with others.
My research into cross-cultural communications has taught me that empathy is not only the key to feeling connected – “I understand you” – but also the foundation for changing our narratives about one another – “now I see we are not so different.” That’s an important point to remember after such a difficult political experience. Building empathy requires communication, specifically talking to one another. But, not just any talking will suffice – especially not the type of talking promoted by today’s highly popular communication technologies.
Ivan Krastev writes: In our increasingly Anglophone world, Americans have become nakedly transparent to English speakers everywhere, yet the world remains bafflingly and often frighteningly opaque to monolingual Americans. While the world devours America’s movies and follows its politics closely, Americans know precious little about how non-Americans think and live. Americans have never heard of other countries’ movie stars and have only the vaguest notions of what their political conflicts are about.
This gross epistemic asymmetry is a real weakness. When WikiLeaks revealed the secret cables of the American State Department or leaked the emails of the Clinton campaign, it became a global news sensation and a major embarrassment for American diplomacy. Leaking Chinese diplomatic cables or Russian officials’ emails could never become a worldwide human-interest story, simply because only a relative handful of non-Chinese or non-Russians could read them, let alone make sense of them. [Continue reading…]
Although I’m pessimistic about the prospects of the meek inheriting the earth, the bilingual are in a very promising position. And Anglo-Americans should never forget that this is after all a country with a Spanish name. As for where I stand personally, I’m with the bilingual camp in spirit even if my own claim to be bilingual is a bit tenuous — an English-speaker who understands American English but speaks British English; does that count?
Julie Sedivy writes: What defines who we are? Our habits? Our aesthetic tastes? Our memories? If pressed, I would answer that if there is any part of me that sits at my core, that is an essential part of who I am, then surely it must be my moral center, my deep-seated sense of right and wrong.
And yet, like many other people who speak more than one language, I often have the sense that I’m a slightly different person in each of my languages — more assertive in English, more relaxed in French, more sentimental in Czech. Is it possible that, along with these differences, my moral compass also points in somewhat different directions depending on the language I’m using at the time?
Psychologists who study moral judgments have become very interested in this question. Several recent studies have focused on how people think about ethics in a non-native language — as might take place, for example, among a group of delegates at the United Nations using a lingua franca to hash out a resolution. The findings suggest that when people are confronted with moral dilemmas, they do indeed respond differently when considering them in a foreign language than when using their native tongue.
In a 2014 paper led by Albert Costa, volunteers were presented with a moral dilemma known as the “trolley problem”: imagine that a runaway trolley is careening toward a group of five people standing on the tracks, unable to move. You are next to a switch that can shift the trolley to a different set of tracks, thereby sparing the five people, but resulting in the death of one who is standing on the side tracks. Do you pull the switch?
Most people agree that they would. But what if the only way to stop the trolley is by pushing a large stranger off a footbridge into its path? People tend to be very reluctant to say they would do this, even though in both scenarios, one person is sacrificed to save five. But Costa and his colleagues found that posing the dilemma in a language that volunteers had learned as a foreign tongue dramatically increased their stated willingness to shove the sacrificial person off the footbridge, from fewer than 20% of respondents working in their native language to about 50% of those using the foreign one. [Continue reading…]
Paul Ibbotson and Michael Tomasello write: The idea that we have brains hardwired with a mental template for learning grammar — famously espoused by Noam Chomsky of the Massachusetts Institute of Technology — has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages—and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.
The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.
This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature. [Continue reading…]
Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.
More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.
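Forsyth's ordering claim can be made concrete with a toy sketch. The small lexicon below is an illustrative assumption (real adjectives can belong to more than one category); the check simply asks whether each adjective's category rank is non-decreasing across the string.

```python
# Forsyth's claimed sequence of attributive-adjective categories.
ORDER = ["opinion", "size", "age", "shape", "colour",
         "origin", "material", "purpose"]

# Hypothetical category assignments for a handful of adjectives.
LEXICON = {
    "lovely": "opinion", "mischievous": "opinion", "little": "size",
    "old": "age", "rectangular": "shape", "green": "colour",
    "brown": "colour", "french": "origin", "silver": "material",
    "whittling": "purpose",
}

def follows_order(adjectives):
    """Return True if the adjectives appear in Forsyth's sequence."""
    ranks = [ORDER.index(LEXICON[a.lower()]) for a in adjectives]
    return ranks == sorted(ranks)

print(follows_order(["lovely", "little", "old", "rectangular",
                     "green", "French", "silver", "whittling"]))  # True
print(follows_order(["green", "little", "old"]))  # False
```

On this sketch, Forsyth's famous knife passes while a reshuffled "green little old" fails — though, as the article notes, real usage also caps how many adjectives speakers tolerate before a noun at all.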
Ross Perlin writes: In 2013, at a conference on endangered languages, a retired teacher named Linda Lambrecht announced the extraordinary discovery of a previously unknown language. Lambrecht – who is Chinese-Hawaiian, 71 years old, warm but no-nonsense – called it Hawaii Sign Language, or HSL. In front of a room full of linguists, she demonstrated that its core vocabulary – words such as “mother”, “pig” and “small” – was distinct from that of other sign languages.
The linguists were immediately convinced. William O’Grady, the chair of the linguistics department at the University of Hawaii, called it “the first time in 80 years that a new language has been discovered in the United States — and maybe the last time.” But the new language found 80 years ago was in remote Alaska, whereas HSL was hiding in plain sight in Honolulu, a metropolitan area of nearly a million people. It was the kind of discovery that made the world seem larger.
The last-minute arrival of recognition and support for HSL was a powerful, almost surreal vindication for Lambrecht, whose first language is HSL. For decades, it was stigmatised or ignored; now the language has acquired an agreed-upon name, an official “language code” from the International Organization for Standardization, the attention of linguists around the world, and a three-year grant from the Endangered Languages Documentation Programme at the School of Oriental and African Studies in London.
But just as linguists were substantiating its existence, HSL stood on the brink of extinction, remembered by just a handful of signers. Unless the language made a miraculous recovery, Lambrecht feared that her announcement might turn out to be HSL’s obituary.
Three years after announcing its existence, Lambrecht is still unearthing her language sign by sign. She may be the only person in the world who still uses HSL on a regular basis, signing into a camera while a linguist named James “Woody” Woodward and a handful of graduate students from the University of Hawaii document her every move. [Continue reading…]
By Gaia Vince, Mosaic
If you read a sentence (such as this one) about kicking a ball, neurons related to the motor function of your leg and foot will be activated in your brain. Similarly, if you talk about cooking garlic, neurons associated with smelling will fire up. Since it is almost impossible to do or think about anything without using language – whether this entails an internal talk-through by your inner voice or following a set of written instructions – language pervades our brains and our lives like no other skill.
For more than a century, it’s been established that our capacity to use language is usually located in the left hemisphere of the brain, specifically in two areas: Broca’s area (associated with speech production and articulation) and Wernicke’s area (associated with comprehension). Damage to either of these, caused by a stroke or other injury, can lead to language and speech problems or aphasia, a loss of language.
In the past decade, however, neurologists have discovered it’s not that simple: language is not restricted to two areas of the brain or even just to one side, and the brain itself can grow when we learn new languages.
Holly Root-Gutteridge writes: Dialects, or regional differences in the form and use of vocalisations, have been observed in birds, bats, chimpanzees and now an increasingly long list of other species. This has been most beautifully heard in whales, where the songs of humpbacks are transmitted across hundreds of miles, telling a listener which part of the ocean the whale lives in, and tracing its family group by the influences on song formations. The bioacousticians Katharine Payne and Roger Payne first listened to the whales on underwater microphone recordings in the 1960s, and used musical notation to explore the changes that occurred in each male’s song, year on year. Whalesong, heard by humans as long ago as Aristotle, became the subject of intense study and public interest. Their research showed that there were geographic differences in humpback whale songs and that we could tell apart populations just by using those songs, which change throughout their lives. So the whales were controlling their singing and subject to cultural influences. The Paynes had found dialects in whale song. Would we find the same for canids?
Despite their cultural popularity, wolf howls haven’t been the subject of focussed research until recently. Now, following the lead of marine biologists and ornithologists, and with improved sound recording equipment and analysis programs, researchers can study them in depth. The first step in understanding what animals are saying to one another is to figure out what aspects of the voice are functional and what parts are formed by the structure of the throat and mouth, or what is the piano and what is the tune. Studies since the 1960s have shown that the howls that have haunted our dreams for centuries can tell us a lot about the particular wolf vocalising. Like humans, each wolf has its own voice. Each pack also shares howl similarities, making different families sound distinct from each other (wolves respond more favourably to familiar howls). This much we knew. What we didn’t know was whether the differences seen between packs were true of subspecies or of species, and if an Indian wolf howl would be distinct from a Canadian one.
More questions follow. If howls from different subspecies are different, do the howls convey the same message? Is there a shared culture of howl-meanings, where an aggressive howl from a European wolf means the same thing as an aggressive howl of a Himalayan? And can a coyote differentiate between a red wolf howling with aggressive intent and one advertising the desire to mate? Even without grammar or syntax, howls can convey intent, and if the shape of the howl changes enough while the intent remains constant, the foundations of distinctive culture can begin to appear. [Continue reading…]
Ryan Ruby writes: For a word that literally means definition, the aphorism is a rather indefinite genre. It bears a family resemblance to the fragment, the proverb, the maxim, the hypomnema, the epigram, the mantra, the parable, and the prose poem. Coined sometime between the fifth and third centuries BC as the title for one of the books of the Corpus Hippocraticum, the Aphorismi were originally a compendium of the latest medical knowledge. The penultimate aphorism, “In chronic disease an excessive flux from the bowels is bad,” is more representative of the collection’s contents than the first — “Life is short, art is long” — for which it is best known.
But in those six words lies a clue to the particular space aphorisms were supposed to define. Thanks to a semantic slippage between the Greek word techne and its English translation (via the Latin ars), the saying is often taken to mean that the works of human beings outlast their days. But in its original context, Hippocrates or his editors probably intended something more pragmatic: the craft of medicine takes a long time to learn, and physicians have a short time in which to learn it. Although what aphorisms have in common with the forms listed above is their brevity, what is delimited by the aphorism is not the number of words in which ideas are expressed but the scope of their inquiry. Unlike Hebrew proverbs, in which the beginning of wisdom is the fear of God, the classical aphorism is a secular genre concerned with the short span of time we are allotted on earth. Books of aphorisms are also therapeutic in nature, collections of practical wisdom through which we can rid ourselves of unnecessary suffering and achieve what Hippocrates’ contemporary Socrates called eudaimonia, the good life.
This is certainly what the Stoic philosopher Arrian had in mind when he whittled down the discourses of his master, Epictetus, into a handbook of aphorisms. The Enchiridion is composed of that mixture of propositional assertion and assertive imperative that is now a hallmark of the form. In it, Epictetus, a former slave, outlines the Stoic view that, while “some things are in our control,” most things are ruled by fate. The way to the good life is to bring what is up to us — our attitudes, judgments, and desires — into harmony with what is not up to us: what happens to our bodies, possessions, and reputations. If we accept that what does happen must happen, we will never be disappointed by vain hopes or sudden misfortunes. Our dispositions, not our destinies, are the real source of our unhappiness. [Continue reading…]
UC Berkeley reports: What if a map of the brain could help us decode people’s inner thoughts?
UC Berkeley scientists have taken a step in that direction by building a “semantic atlas” that shows in vivid colors and multiple dimensions how the human brain organizes language. The atlas identifies brain areas that respond to words that have similar meanings.
The findings, published in the journal Nature, are based on a brain imaging study that recorded neural activity while study volunteers listened to stories from the “Moth Radio Hour.” They show that at least one-third of the brain’s cerebral cortex, including areas dedicated to high-level cognition, is involved in language processing.
Notably, the study found that different people share similar language maps: “The similarity in semantic topography across different subjects is really surprising,” said study lead author Alex Huth, a postdoctoral researcher in neuroscience at UC Berkeley. Click here for Huth’s online brain viewer. [Continue reading…]
Daniel Oberhaus writes: When twelve men gathered at the Green Bank Observatory in West Virginia to discuss the art and science of alien hunting in 1961, the Order of the Dolphin was born. A number of the brightest minds from a range of scientific disciplines, including three Nobel laureates, a young Carl Sagan, and an eccentric neuroscientist named John Lilly — who was best known for trying to talk to dolphins — were in attendance.
It was Lilly’s research that inspired the group’s name: If humans couldn’t even communicate with animals that shared most of our evolutionary history, he believed, they were a bit daft to think they could recognize signals from a distant planet. With that in mind, the Order of the Dolphin set out to determine what our ocean-going compatriots here on Earth might be able to teach us about talking to extraterrestrials.
Lilly’s work on interspecies communication has since gone in and out of vogue several times within the SETI (Search for Extraterrestrial Intelligence) community. Today, it’s back in fashion, thanks to new applications of information theory and to technological advancements, such as the Cetacean Hearing and Telemetry (CHAT) device, a submersible computer interface that establishes basic communication with dolphins. The return to dolphins as a model for alien intelligence came in 1999, when SETI Institute astronomer Laurance Doyle proposed using information theory to analyze animal communication systems, particularly the whistle repertoire of bottlenose dolphins. [Continue reading…]
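One basic tool in the information-theoretic approach Doyle proposed is the first-order Shannon entropy of a categorized signal sequence — a measure of how much information, in bits per symbol, a repertoire carries. A minimal sketch, with invented whistle-type labels standing in for a real bottlenose recording:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """First-order Shannon entropy, in bits per symbol, of a sequence."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical whistle-type labels from a single recording session.
whistles = ["A", "B", "A", "C", "A", "B", "A", "A", "D", "B"]
print(round(shannon_entropy(whistles), 3))  # → 1.685
```

A uniformly random sequence over four types would score 2 bits per symbol; structured communication systems, animal or human, fall below that ceiling, and comparing such statistics across species is the core of the approach.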
Salman Rushdie writes: As we honour the four hundredth anniversaries of the deaths of William Shakespeare and Miguel de Cervantes Saavedra, it may be worth noting that while it’s generally accepted that the two giants died on the same date, 23 April 1616, it actually wasn’t the same day. By 1616 Spain had moved on to using the Gregorian calendar, while England still used the Julian, and was 10 days behind. (England clung to the old Julian dating system until 1752, and when the change finally came, there were riots and, it’s said, mobs in the streets shouting, “Give us back our 11 days!”) Both the coincidence of the dates and the difference in the calendars would, one suspects, have delighted the playful, erudite sensibilities of the two fathers of modern literature.
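The drift between the two calendars is simple arithmetic: the Gregorian reform of 1582 dropped 10 days outright and thereafter skips the leap day in century years not divisible by 400, so the gap grows by one day at most century boundaries. A small sketch of that computation (valid from 1582 onward, ignoring the exact within-year changeover date):

```python
def julian_gregorian_gap(year):
    """Days by which the Julian calendar trails the Gregorian in a given year.

    Century years are leap years in the Julian calendar but, unless
    divisible by 400, not in the Gregorian; the running count of those
    skipped leap days yields the gap.
    """
    century = year // 100
    return century - century // 4 - 2

print(julian_gregorian_gap(1616))  # → 10: Cervantes and Shakespeare
print(julian_gregorian_gap(1752))  # → 11: "Give us back our 11 days!"
```

So in 1616 the calendars stood 10 days apart (Shakespeare's 23 April Julian was 3 May Gregorian), and by England's 1752 changeover the gap had grown to the famous 11.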
We don’t know if they were aware of each other, but they had a good deal in common, beginning right there in the “don’t know” zone, because they are both men of mystery; there are missing years in the record and, even more tellingly, missing documents. Neither man left behind much personal material. Very little to nothing in the way of letters, work diaries, abandoned drafts; just the colossal, completed oeuvres. “The rest is silence.” Consequently, both men have been prey to the kind of idiot theories that seek to dispute their authorship.
A cursory internet search “reveals”, for example, that not only did Francis Bacon write Shakespeare’s works, he wrote Don Quixote as well. (My favourite crazy Shakespeare theory is that his plays were not written by him but by someone else of the same name.) And of course Cervantes faced a challenge to his authorship in his own lifetime, when a certain pseudonymous Alonso Fernández de Avellaneda, whose identity is also uncertain, published his fake sequel to Don Quixote and goaded Cervantes into writing the real Book II, whose characters are aware of the plagiarist Avellaneda and hold him in much contempt. [Continue reading…]
Stephen Greenblatt writes: A few years ago, during a merciful remission in the bloodshed and mayhem that has for so many years afflicted Afghanistan, a young Afghan poet, Qais Akbar Omar, had an idea. It was, he brooded, not only lives and livelihood that had been ruthlessly attacked by the Taliban, it was also culture. The international emblem of that cultural assault was the dynamiting of the Bamiyan Buddhas, but the damage extended to painting, music, dance, fiction, film, and poetry. It extended as well to the subtle web of relations that link one culture to another across boundaries and make us, each in our provincial worlds, feel that we are part of a larger humanity. This web is not only a contemporary phenomenon, the result of modern technology; it is as old as culture itself, and it has been particularly dense and vital in Afghanistan with its ancient trade routes and its endless succession of would-be conquerors.
Omar thought that the time was ripe to mark the restoration of civil society and repair some of the cultural damage. He wanted to stage a play with both men and women actors performing in public in an old garden in Kabul. He chose a Shakespeare play. No doubt the choice had something to do with the old imperial presence of the British in Afghanistan, but it was not only this particular history that was at work. Shakespeare is the embodiment worldwide of a creative achievement that does not remain within narrow boundaries of the nation-state or lend itself to the secure possession of a particular faction or speak only for this or that chosen group. He is the antithesis of intolerant provinciality and fanaticism. He could make with effortless grace the leap from Stratford to Kabul, from English to Dari.
Omar did not wish to put on a tragedy; his country, he thought, had suffered through quite enough tragedy of its own. Considering possible comedies, he shied away from those that involved cross-dressing. It was risky enough simply to have men and women perform together on stage. In the end he chose Love’s Labour’s Lost, a comedy that arranged the sexes in distinct male and female groups, had relatively few openly transgressive or explicitly erotic moments, and decorously deferred the final consummation of desire into an unstaged future. As a poet, Omar was charmed by the play’s gorgeous language, language that he felt could be rendered successfully in Dari.
The complex story of the mounting of the play is told in semifictionalized form in a 2015 book Omar coauthored with Stephen Landrigan, A Night in the Emperor’s Garden. Measured by the excitement it generated, this production of Love’s Labour’s Lost was a great success. The overflow crowds on the opening night gave way to ever-larger crowds clamoring to get in, along with worldwide press coverage.
But the attention came at a high price. The Taliban took note of Shakespeare in Kabul and what it signified. In the wake of the production, virtually everyone involved in it began to receive menacing messages. Spouses, children, and the extended families of the actors were not exempt from harassment and warnings. The threats were not idle. The husband of one of the performers answered a loud knock on the door one night and did not return. His mutilated body was found the next morning.
What had seemed like a vigorous cultural renaissance in Afghanistan quickly faded and died. In the wake of the resurgence of the Taliban, Qais Akbar Omar and all the others who had had the temerity to mount Shakespeare’s delicious comedy of love were in terrible trouble. They are now, every one of them, in exile in different parts of the world.
Love’s labors lost indeed. But the subtitle of Omar’s account—“A True Story of Hope and Resilience in Afghanistan”—is not or at least not only ironic. The humane, inexhaustible imaginative enterprise that Shakespeare launched more than four hundred years ago in one small corner of the world is more powerful than all the oppressive forces that can be gathered against it. [Continue reading…]
Kensy Cooperrider writes: Some metaphors end up forgotten by all but the most dedicated historians, while others lead long, productive lives. It’s only a select few, though, that become so entwined with how we understand the world that we barely even recognize them as metaphors, seeing them instead as something real. Of course, why some fizzle and others flourish can be tricky to account for, but their career in science provides some clues.
Metaphors, as we all by now know, aren’t just ornamental linguistic flourishes — they’re basic building blocks of everyday reasoning. And they’re at their most potent when they recast a difficult-to-understand phenomenon as something familiar: The brain becomes a computer; the atom, a tiny solar system; space-time, a fabric. Metaphors that tap into something familiar are the ones that generally gain traction.
Charles Darwin gave us both kinds, big winners and total flops. Natural selection, his best-known metaphor, is still a fixture of evolutionary biology. Though it’s not always recognized as a metaphor today, that’s exactly what it was to Darwin and his contemporaries. After all, evolution was a foreign and unwieldy concept. So Darwin set out to make it accessible by comparing it to a method employed in farmyards around the world.
For years, Darwin — a fancy pigeon breeder — obsessed over what he called artificial selection, what cattle-breeders, gardeners, and crop-growers did to create new varieties of plant and animal: allowing just the ones with desirable traits to reproduce. Darwin’s first-hand experience with breeding set the stage for his now-famous metaphoric leap. [Continue reading…]