Our imaginative life today has access to the pre-linguistic, ancestral mind

Stephen T Asma writes: Richard Klein, Maurice Bloch and other prominent paleoanthropologists place the imagination quite late in the history of our species, thousands of years after the emergence of anatomically modern humans. In part, this theory reflects a bias that artistic faculties are a kind of evolutionary cheesecake – sweet desserts that emerge as byproducts of more serious cognitive adaptations such as language and logic. More importantly, it is premised on the relatively late appearance of cave art in the Upper Paleolithic period (c. 38,000 years ago). It is common for archaeologists to assume that imagination evolves late, after language, and that the cave paintings are a sign of modern minds at work, thinking and creating just as we do today.

Contrary to this interpretation, I want to suggest that imagination, properly understood, is one of the earliest human abilities, not a recent arrival. Thinking and communicating are vastly improved by language, it is true. But ‘thinking with imagery’ and even ‘thinking with the body’ must have preceded language by hundreds of thousands of years. It is part of our mammalian inheritance to read, store and retrieve emotionally coded representations of the world, and we do this via conditioned associations, not propositional coding.

Lions on the savanna, for example, learn and make predictions because experience forges strong associations between perception and feeling. Animals appear to use images (visual, auditory, olfactory memories) to navigate novel territories and problems. For early humans, a kind of cognitive gap opened up between stimulus and response – a gap that created the possibility of having multiple responses to a perception, rather than one immediate response. This gap was crucial for the imagination: it created an inner space in our minds. The next step was that early human brains began to generate information, rather than merely record and process it – we began to create representations of things that never were but might be. On this view, imagination extends back into the Pleistocene, at least, and likely emerged slowly in our Homo erectus cousins. [Continue reading…]

Bacteria use brainlike bursts of electricity to communicate

Gabriel Popkin writes: Bacteria have an unfortunate — and inaccurate — public image as isolated cells twiddling about on microscope slides. The more that scientists learn about bacteria, however, the more they see that this hermitlike reputation is deeply misleading, like trying to understand human behavior without referring to cities, laws or speech. “People were treating bacteria as … solitary organisms that live by themselves,” said Gürol Süel, a biophysicist at the University of California, San Diego. “In fact, most bacteria in nature appear to reside in very dense communities.”

The preferred form of community for bacteria seems to be the biofilm. On teeth, on pipes, on rocks and in the ocean, microbes glom together by the billions and build sticky organic superstructures around themselves. In these films, bacteria can divide labor: Exterior cells may fend off threats, while interior cells produce food. And like humans, who have succeeded in large part by cooperating with each other, bacteria thrive in communities. Antibiotics that easily dispatch free-swimming cells often prove useless against the same types of cells when they’ve hunkered down in a film.

As in all communities, cohabiting bacteria need ways to exchange messages. Biologists have known for decades that bacteria can use chemical cues to coordinate their behavior. The best-known example, elucidated by Bonnie Bassler of Princeton University and others, is quorum sensing, a process by which bacteria extrude signaling molecules until a high enough concentration triggers cells to form a biofilm or initiate some other collective behavior. [Continue reading…]
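The threshold logic of quorum sensing described above can be sketched in a few lines. The toy model below is my own illustration, not anything from the research: every number (secretion rate, decay, threshold) is invented, and real signaling dynamics are far richer.

```python
# Toy model of quorum sensing: each cell secretes signaling molecules at a
# fixed rate, some signal degrades each time step, and once the shared
# concentration crosses a threshold the population switches to collective
# (biofilm) behavior. All parameter values are illustrative only.

def quorum_sim(n_cells, secretion_rate=1.0, decay=0.1, threshold=500.0, steps=100):
    concentration = 0.0
    for step in range(steps):
        concentration += n_cells * secretion_rate   # every cell secretes
        concentration *= (1.0 - decay)              # signal degrades
        if concentration >= threshold:
            return step  # time step at which the quorum is reached
    return None  # population too sparse: quorum never reached

# A dense community crosses the threshold quickly; a sparse one never does.
print(quorum_sim(n_cells=100))
print(quorum_sim(n_cells=1))
```

The point the model makes is the one in the excerpt: no individual cell "decides" anything; the collective behavior emerges from concentration alone, which is why density matters so much.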

Is linguistics a science?

Arika Okrent writes: Science is a messy business, but just like everything with loose ends and ragged edges, we tend to understand it by resorting to ideal types. On the one hand, there’s the archetype of the scientific method: a means of accounting for observations, generating precise, testable predictions, and yielding new discoveries about the natural consequences of natural laws. On the other, there’s our ever-replenishing font of story archetypes: the accidental event that results in a sudden clarifying insight; the hero who pursues the truth in the face of resistance or even danger; the surprising fact that challenges the dominant theory and brings it toppling to the ground.

The interplay of these archetypes has produced a spirited, long-running controversy about the nature and origins of language. Recently, it’s been flung back into public awareness following the publication of Tom Wolfe’s book The Kingdom of Speech (2016).

In Wolfe’s breathless re-telling, the dominant scientific theory is Noam Chomsky’s concept of a ‘universal grammar’ – the idea that all languages share a deep underlying structure that’s almost certainly baked into our biology by evolution. The crucial hypothesis is that its core, essential feature is recursion, the capacity to embed phrases within phrases ad infinitum, and so express complex relations between ideas (such as ‘Tom says that Dan claims that Noam believes that…’). And the challenging fact is the discovery of an Amazonian language, Pirahã, that does not have recursion. The scientific debate plays out as a classic David-and-Goliath story, with Chomsky as a famous, ivory-tower intellectual whose grand armchair proclamations are challenged by a rugged, lowly field linguist and former Christian missionary named Daniel Everett.
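Recursion, in the sense this debate turns on, is easy to demonstrate mechanically. The sketch below is my own illustration (not the linguists'): a function that nests "X says that …" clauses to arbitrary depth, the kind of embedding Everett claims Pirahã does without.

```python
# Recursive embedding: a sentence may contain a clause that is itself a
# full sentence, to any depth -- the capacity at the center of the
# universal-grammar debate.

def embed(speakers, core="it is raining"):
    """Nest 'X says that ...' clauses around a core sentence."""
    if not speakers:                                  # base case: nothing left to embed
        return core
    head, rest = speakers[0], speakers[1:]
    return f"{head} says that {embed(rest, core)}"    # recursive step

print(embed(["Tom", "Dan", "Noam"]))
# A language without recursion would instead express each claim as a
# separate, unembedded sentence.
```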

Stories this ripe for dramatisation come along rarely in any branch of science, much less the relatively obscure field of theoretical linguistics. But the truth will always be more complicated than the idealisations we use to understand it. In this case, the details lend themselves so well to juicy, edifice-crumbling story arcs that a deeper, more consequential point tends to be overlooked. It concerns not Everett’s challenge to Chomsky’s theory, but Chomsky’s challenge to the scientific method itself.

This counter-attack takes the form of the Chomskyans’ response to Everett. They say that even if Pirahã has no recursion, it matters not one bit for the theory of universal grammar. The capacity is intrinsic, even if it’s not always exploited. As Chomsky and his colleagues put it in a co-authored paper, ‘our language faculty provides us with a toolkit for building languages, but not all languages use all the tools’. This looks suspiciously like defiance of a central feature of the scientific archetype, one first put forward by the philosopher Karl Popper: theories are not scientific unless they have the potential to be falsified. If you claim that recursion is the essential feature of language, and if the existence of a recursionless language does not debunk your claim, then what could possibly invalidate it?

In an interview with Edge.org in 2007, Everett said he emailed Chomsky: ‘What is a single prediction that universal grammar makes that I could falsify? How could I test it?’ According to Everett, Chomsky replied to say that universal grammar doesn’t make any predictions; it’s a field of study, like biology.

The nub of the disagreement here boils down to what exactly linguistics says about the world, and the appropriate archetypes we should apply to make it effective. So just what kinds of questions does linguistics want to answer? What counts as evidence? Is universal grammar in particular – and theoretical linguistics in general – a science at all? [Continue reading…]

Sending information without transmitting a signal

Joshua Roebke writes: We connect to each other through particles. Calls and texts ride flecks of light, Web sites and photographs load on electrons. All communication is, essentially, physical. Information is recorded and broadcast on actual objects, even those we cannot see.

Physicists also connect to the world when they communicate with it. They dispatch glints of light toward particles or atoms, and wait for this light to report back. The light interacts with the bits of matter, and how this interaction changes the light reveals a property or two of the bits—although this interaction often changes the bits, too. The term of art for such a candid affair is a measurement.

Particles even connect to each other using other particles. The force of electromagnetism between two electrons is conveyed by particles of light, and quarks huddle inside a proton because they exchange gluons. Physics is, essentially, the study of interactions.

Information is always conveyed through interactions, whether between particles or ourselves. We are compositions of particles who communicate with each other, and we learn about our surroundings by interacting with them. The better we understand such interactions, the better we understand the world and ourselves.

Physicists already know that interactions are local. As with city politics, the influence of particles is confined to their immediate precincts. Yet interactions remain difficult to describe. Physicists have to treat particles as individuals and add complex terms to their solitary existence to model their intimacies with other particles. The resulting equations are usually impossible to solve. So physicists have to approximate even for single particles, which can interact with themselves as a boat rolls in its own wake. Although physicists are meticulous, it is a wonder they ever succeed. Still, their contentions are the most accurate theories we have.

Quantum mechanics is the consummate theory of particles, so it naturally describes measurements and interactions. During the past few decades, as computers have nudged the quantum, the theory has been reframed to encompass information, too. What quantum mechanics implies for measurements and interactions is notoriously bizarre. Its implications for information are stranger still.

One of the strangest of these implications refutes the material basis of communication as well as common sense. Some physicists believe that we may be able to communicate without transmitting particles. In 2013 an amateur physicist named Hatim Salih even devised a protocol, alongside professionals, in which information is obtained from a place where particles never travel. Information can be disembodied. Communication may not be so physical after all. [Continue reading…]

Being bilingual makes you experience time differently

Quartz reports: A new study shows that the words we use to talk about time also shape our view of its passage. This, say researchers, indicates that abstract concepts like duration are relative rather than universal, and that they are influenced by language rather than being solely innate.

The work, published in the American Psychological Association’s Journal of Experimental Psychology: General on April 27, examined how Spanish- and Swedish-speaking bilinguals conceived of time. The researchers—from Stockholm University in Sweden and Lancaster University in the UK—found that their subjects (40 native Swedish speakers and 40 native Spanish speakers) tended to think about time in terms that correspond to each language’s descriptors when linguistically prompted in that particular language, but moved fluidly from one concept of time to the other otherwise. This was true regardless of their native language.

Different languages describe time differently. For example, Swedish and English generally refer to time in terms of physical distance (“a long time,” “a short break”). Meanwhile, languages like Spanish and Greek generally refer to time in terms of volume (“a big chunk of time,” “a small moment”). [Continue reading…]

Daniel Everett: Becoming human without words for colors, numbers, or time

 

Understanding exactly what Trump means

Deborah Tannen writes: At Thursday’s Senate hearing, Sen. James E. Risch (R-Idaho) sought former FBI director James B. Comey’s agreement that President Trump did not tell him to drop his investigation of fired national security adviser Michael Flynn: “He did not direct you to let it go.” Comey agreed, “Not in his words, no.” Risch pressed his point: “He did not order you to let it go?” Comey concurred: “Again, those words are not an order.” Yet later in the hearing, in response to Sen. Angus King (I-Maine) asking whether the president’s words were “a directive,” Comey said, “Yes.”

Was Comey contradicting himself? Based on decades of studying indirectness in conversation — and a lifetime of using language to communicate — I’d say no. Risch was talking about the message: the literal meaning of words spoken. King, and later Sen. Kamala D. Harris (D-Calif.), were referring to the metamessage: what it means to say those words in that way in that context. When people talk to each other, they glean meaning from metamessages. But messages come in handy when someone wants to deny a meaning that was obvious when the words were spoken.

The president’s “exact words,” according to Comey’s notes, were: “I hope you can see your way clear to letting this go, to letting Flynn go. He is a good guy. I hope you can let this go.” Risch cried literal meaning. Zeroing in on the word “hope,” he asked Comey if he knew of anyone being charged with a criminal offense because “they hoped for an outcome.” Though he confessed that he didn’t, Comey said, “I took it as, this is what he wants me to do.” Risch rested his case: “You may have taken it as a direction but that’s not what he said.” Donald Trump Jr., the president’s son, later made the same point in a tweet: “Hoping and telling are two very different things.”

Actually, they aren’t, when the speaker is in a position of power, as Harris noted. Referring to her experience as a prosecutor, she said, “When a robber held a gun to somebody’s head and said, ‘I hope you will give me your wallet,’ the word ‘hope’ was not the most operative word at that moment.” The gun gives the robber power to encourage another to make his hope a reality.

Trump Jr. also tweeted, “Knowing my father for 39 years when he ‘orders or tells’ you to do something there is no ambiguity, you will know exactly what he means.” He’s right. Comey knew exactly what he meant. [Continue reading…]

The autocrat’s language

Masha Gessen writes: In the early 1990s, Russian journalists were engaged in the project of reinventing journalism—which itself had been used to perform the opposite of conveying reliable information. Language was a problem. The language of politics had been pillaged, as had the language of values and even the language of feelings: after decades of performing revolutionary passion, people had become weary of the very idea of passion. So the new Russian journalists opted for language that was descriptive in the most direct way: we tried to stick to verbs and nouns, and only to things that could be directly observed. It was the journalistic equivalent of the hardware store: if the shape of a word could not be clearly described and its weight could not be measured, it could not be used. This kind of language is good for describing things that are in front of your eyes and terrible for conveying the contents of your mind or heart. It was constraining.

Writing in Russian was a challenging exercise akin to navigating a mine field: one misstep could discredit the entire enterprise. Compared to this, writing in English was freedom. But then things in Russia got worse. A new government came in, and did new damage to the language. Vladimir Putin declared a “dictatorship of the law.” His main ideologue advanced the idea of “managed democracy.” Temporary president Dmitry Medvedev said, “Freedom is better than unfreedom.” Now words did not mean their opposite anymore. They just meant nothing. The phrase “dictatorship of the law” is so incoherent as to render both “dictatorship” and “law” meaningless.

Donald Trump has an instinct for doing both of these kinds of violence to language. He is particularly adept at taking words and phrases that deal with power relationships and turning them into their opposite. This was, for example, how he used the phrase “safe space” when talking about vice-president-elect Mike Pence’s visit to the musical Hamilton. Pence, if you recall, was booed and then passionately—and respectfully—addressed by the cast of the show. Trump was tweeting that this should not have happened. Now, the phrase “safe space” was coined to describe a place where people who usually feel unsafe and powerless would feel exceptionally safe. Claiming that the second most powerful man in the world should be granted a “safe space” in public turns the concept precisely on its head.

Trump performed the exact same trick on the phrase “witch hunt,” which he claimed was being carried out by Democrats to avenge their electoral loss. Witch hunts cannot actually be carried out by losers, big or small: the agent of a witch hunt must have power. And, of course, he has seized and flipped the term “fake news” in much the same way. [Continue reading…]

The language of prairie dogs

Ferris Jabr writes: [Con] Slobodchikoff, an emeritus professor of biology at Northern Arizona University, has been analyzing the sounds of prairie dogs for more than 30 years. Not long after he started, he learned that prairie dogs had distinct alarm calls for different predators. Around the same time, separate researchers found that a few other species had similar vocabularies of danger. What Slobodchikoff claimed to discover in the following decades, however, was extraordinary: Beyond identifying the type of predator, prairie-dog calls also specified its size, shape, color and speed; the animals could even combine the structural elements of their calls in novel ways to describe something they had never seen before. No scientist had ever put forward such a thorough guide to the native tongue of a wild species or discovered one so intricate. Prairie-dog communication is so complex, Slobodchikoff says — so expressive and rich in information — that it constitutes nothing less than language.

That would be an audacious claim to make about even the most overtly intelligent species — say, a chimpanzee or a dolphin — let alone some kind of dirt hamster with a brain that barely weighs more than a grape. The majority of linguists and animal-communication experts maintain that language is restricted to a single species: ourselves. Perhaps because it is so ostensibly entwined with thought, with consciousness and our sense of self, language is the last bastion encircling human exceptionalism. To concede that we share language with other species is to finally and fully admit that we are different from other animals only in degree, not in kind. In many people’s minds, language is the “cardinal distinction between man and animal, a sheerly dividing line as abrupt and immovable as a cliff,” as Tom Wolfe argues in his book “The Kingdom of Speech,” published last year.

Slobodchikoff thinks that dividing line is an illusion. To him, the idea that a human might have a two-way conversation with another species, even a humble prairie dog, is not a pretense; it’s an inevitability. And the notion that animals of all kinds routinely engage in sophisticated discourse with one another — that the world’s ecosystems reverberate with elaborate animal idioms just waiting to be translated — is not Doctor Dolittle-inspired nonsense; it is fact. [Continue reading…]

 

Technology doesn’t make us better people

Nicholas Carr writes: Welcome to the global village. It’s a nasty place.

On Easter Sunday, a man in Cleveland filmed himself murdering a random 74-year-old and posted the video on Facebook. The social network took the grisly clip down within two or three hours, but not before users shared it on other websites — where people around the world can still view it.

Surely incidents like this aren’t what Mark Zuckerberg had in mind. In 2012, as his company was preparing to go public, the Facebook founder wrote an earnest letter to would-be shareholders explaining that his company was more than just a business. It was pursuing a “social mission” to make the world a better place by encouraging self-expression and conversation. “People sharing more,” the young entrepreneur wrote, “creates a more open culture and leads to a better understanding of the lives and perspectives of others.”

Earlier this year, Zuckerberg penned another public letter, expressing even grander ambitions. Facebook, he announced, is expanding its mission from “connecting friends and family” to building “a global community that works for everyone.” The ultimate goal is to turn the already vast social network into a sort of supranational state “spanning cultures, nations and regions.”

But the murder in Cleveland, and any similar incidents that inevitably follow, reveal the hollowness of Silicon Valley’s promise that digital networks would bring us together in a more harmonious world.

Whether he knows it or not, Zuckerberg is part of a long tradition in Western thought. Ever since the building of the telegraph system in the 19th century, people have believed that advances in communication technology would promote social harmony. The more we learned about each other, the more we would recognize that we’re all one. In an 1899 article celebrating the laying of transatlantic Western Union cables, a New York Times columnist expressed the popular assumption well: “Nothing so fosters and promotes a mutual understanding and a community of sentiment and interests as cheap, speedy, and convenient communication.”

The great networks of the 20th century — radio, telephone, TV — reinforced this sunny notion. Spanning borders and erasing distances, they shrank the planet. Guglielmo Marconi declared in 1912 that his invention of radio would “make war impossible, because it will make war ridiculous.” AT&T’s top engineer, J.J. Carty, predicted in a 1923 interview that the telephone system would “join all the peoples of the earth in one brotherhood.” In his 1962 book “The Gutenberg Galaxy,” the media theorist Marshall McLuhan gave us the memorable term “global village” to describe the world’s “new electronic interdependence.” Most people took the phrase optimistically, as a prophecy of inevitable social progress. What, after all, could be nicer than a village?

If our assumption that communication brings people together were true, we should today be seeing a planetary outbreak of peace, love, and understanding. Thanks to the Internet and cellular networks, humanity is more connected than ever. Of the world’s 7 billion people, 6 billion have access to a mobile phone — a billion and a half more, the United Nations reports, than have access to a working toilet. Nearly 2 billion are on Facebook, more than a billion upload and download YouTube videos, and billions more converse through messaging apps like WhatsApp and WeChat. With smartphone in hand, everyone becomes a media hub, transmitting and receiving ceaselessly.

Yet we live in a fractious time, defined not by concord but by conflict. Xenophobia is on the rise. Political and social fissures are widening. From the White House down, public discourse is characterized by vitriol and insult. We probably shouldn’t be surprised. [Continue reading…]

Literature’s evolution has reflected and spurred the growing complexity of society

Julie Sedivy writes: Reading medieval literature, it’s hard not to be impressed with how much the characters get done—as when we read about King Harold doing battle in one of the Sagas of the Icelanders, written in about 1230. The first sentence bristles with purposeful action: “King Harold proclaimed a general levy, and gathered a fleet, summoning his forces far and wide through the land.” By the end of the third paragraph, the king has launched his fleet against a rebel army, fought numerous battles involving “much slaughter in either host,” bound up the wounds of his men, dispensed rewards to the loyal, and “was supreme over all Norway.” What the saga doesn’t tell us is how Harold felt about any of this, whether his drive to conquer was fueled by a tyrannical father’s barely concealed contempt, or whether his legacy ultimately surpassed or fell short of his deepest hopes.

Jump ahead about 770 years in time, to the fiction of David Foster Wallace. In his short story “Forever Overhead,” the 13-year-old protagonist takes 12 pages to walk across the deck of a public swimming pool, wait in line at the high diving board, climb the ladder, and prepare to jump. But over these 12 pages, we are taken into the burgeoning, buzzing mind of a boy just erupting into puberty—our attention is riveted to his newly focused attention on female bodies in swimsuits, we register his awareness that others are watching him as he hesitates on the diving board, we follow his undulating thoughts about whether it’s best to do something scary without thinking about it or whether it’s foolishly dangerous not to think about it.

These examples illustrate Western literature’s gradual progression from narratives that relate actions and events to stories that portray minds in all their meandering, many-layered, self-contradictory complexities. I’d often wondered, when reading older texts: Weren’t people back then interested in what characters thought and felt? [Continue reading…]

How a bias toward English-language science can result in preventable crises

Ben Panko writes: Thirteen years ago, a deadly strain of avian flu known as H5N1 was tearing through Asia’s bird populations. In January 2004, Chinese scientists reported that pigs too had become infected with the virus—an alarming development, since pigs are susceptible to human viruses and could potentially act as a “mixing vessel” that would allow the virus to jump to humans. “Urgent attention should be paid to the pandemic preparedness of these two subtypes of influenza,” the scientists wrote in their study.

Yet at the time, little attention was paid outside of China—because the study was published only in Chinese, in a small Chinese journal of veterinary medicine.

It wasn’t until August of that year that the World Health Organization and the United Nations learned of the study’s results and rushed to have it translated. Those scientists and policy makers ran headlong into one of science’s biggest unsolved dilemmas: language. A new study in the journal PLOS Biology sheds light on how widespread the gulf can be between English-language science and any-other-language science, and how that gap can lead to situations like the avian flu case, or worse. [Continue reading…]

You should talk about politics this Thanksgiving – here’s why, and how

By Stacy Branham, University of Maryland, Baltimore County

After one of the most divisive presidential elections in American history, many of us may be anxious about dinner-table dialogue with family and friends this Thanksgiving. There is no denying that the way we communicate about politics has fundamentally changed with the proliferation of technology and social media. Twitter bots, fake news and echo chambers are just a few of the highlights from this election season. Much of how we’re conversing online can’t – and shouldn’t – be replicated around the family table. We are getting out of practice at conducting meaningful, respectful conversation.

There’s not a quick fix. We need more empathic communication – the slow, deep (inter)personal discourse that can nurture identity and build and strengthen relationships. Yet contemporary communication platforms can make it harder to build empathy with conversational partners. Even the phrase “conversational partners” seems unfitting in the world of 140-character limits, followers, likes and shares. In many ways, our devices help us talk at (@?) instead of with one another.

Literally meaning “in-feeling,” empathy is a process of internalizing another person’s perspective. Empathy-building is unselfish; you suspend your own sensibilities and try to fully imagine and embrace those of someone else. You can gain empathy by learning about other cultures from different media, by experiencing what others have gone through personally, or by having deep conversations with others.

My research into cross-cultural communications has taught me that empathy is not only the key to feeling connected – “I understand you” – but also the foundation for changing our narratives about one another – “now I see we are not so different.” That’s an important point to remember after such a difficult political experience. Building empathy requires communication, specifically talking to one another. But, not just any talking will suffice – especially not the type of talking promoted by today’s highly popular communication technologies.

[Read more…]

To identify risky drivers, insurer will track language use in social media

Financial Times reports: UK-based insurer Admiral has come up with a way to crunch through social media posts to work out who deserves a lower premium. People who seem cautious and deliberate in their choice of words are likely to pay a lot less than those with overconfident remarks. [Continue reading…]
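The article gives no details of Admiral's model. As a purely hypothetical sketch of the kind of word-choice scoring it alludes to, one could weigh cautious against overconfident vocabulary; the word lists and scoring rule below are invented for illustration and bear no relation to the insurer's actual system.

```python
# Hypothetical word-list scoring of social-media posts. Both word lists
# and the scoring rule are invented for illustration; any real
# underwriting model would be far more sophisticated (and is not public).
CAUTIOUS = {"maybe", "perhaps", "possibly", "might", "careful"}
OVERCONFIDENT = {"always", "never", "definitely", "best", "obviously"}

def risk_score(post):
    """Higher score = more overconfident language = higher premium."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    cautious = sum(w in CAUTIOUS for w in words)
    confident = sum(w in OVERCONFIDENT for w in words)
    return confident - cautious

print(risk_score("I always drive the best, obviously."))    # overconfident wording
print(risk_score("Maybe we should leave early, perhaps."))  # cautious wording
```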

The vulnerability of monolingual Americans in an English-speaking world

Ivan Krastev writes: In our increasingly Anglophone world, Americans have become nakedly transparent to English speakers everywhere, yet the world remains bafflingly and often frighteningly opaque to monolingual Americans. While the world devours America’s movies and follows its politics closely, Americans know precious little about how non-Americans think and live. Americans have never heard of other countries’ movie stars and have only the vaguest notions of what their political conflicts are about.

This gross epistemic asymmetry is a real weakness. When WikiLeaks revealed the secret cables of the American State Department or leaked the emails of the Clinton campaign, it became a global news sensation and a major embarrassment for American diplomacy. Leaking Chinese diplomatic cables or Russian officials’ emails could never become a worldwide human-interest story, simply because only a relative handful of non-Chinese or non-Russians could read them, let alone make sense of them. [Continue reading…]

Although I’m pessimistic about the prospects of the meek inheriting the earth, the bilingual are in a very promising position. And Anglo-Americans should never forget that this is after all a country with a Spanish name. As for where I stand personally, I’m with the bilingual camp in spirit even if my own claim to be bilingual is a bit tenuous — an English-speaker who understands American English but speaks British English; does that count?

Ethical shifts come with thinking in a different language

Julie Sedivy writes: What defines who we are? Our habits? Our aesthetic tastes? Our memories? If pressed, I would answer that if there is any part of me that sits at my core, that is an essential part of who I am, then surely it must be my moral center, my deep-seated sense of right and wrong.

And yet, like many other people who speak more than one language, I often have the sense that I’m a slightly different person in each of my languages — more assertive in English, more relaxed in French, more sentimental in Czech. Is it possible that, along with these differences, my moral compass also points in somewhat different directions depending on the language I’m using at the time?

Psychologists who study moral judgments have become very interested in this question. Several recent studies have focused on how people think about ethics in a non-native language — as might take place, for example, among a group of delegates at the United Nations using a lingua franca to hash out a resolution. The findings suggest that when people are confronted with moral dilemmas, they do indeed respond differently when considering them in a foreign language than when using their native tongue.

In a 2014 paper led by Albert Costa, volunteers were presented with a moral dilemma known as the “trolley problem”: imagine that a runaway trolley is careening toward a group of five people standing on the tracks, unable to move. You are next to a switch that can shift the trolley to a different set of tracks, thereby sparing the five people, but resulting in the death of one who is standing on the side tracks. Do you pull the switch?

Most people agree that they would. But what if the only way to stop the trolley is by pushing a large stranger off a footbridge into its path? People tend to be very reluctant to say they would do this, even though in both scenarios, one person is sacrificed to save five. But Costa and his colleagues found that posing the dilemma in a language that volunteers had learned as a foreign tongue dramatically increased their stated willingness to shove the sacrificial person off the footbridge, from fewer than 20% of respondents working in their native language to about 50% of those using the foreign one. [Continue reading…]


Evidence rebuts Chomsky’s theory of language learning

Paul Ibbotson and Michael Tomasello write: The idea that we have brains hardwired with a mental template for learning grammar — famously espoused by Noam Chomsky of the Massachusetts Institute of Technology — has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages — and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.

The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique hu­­­man ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.

This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature. [Continue reading…]


Beware the bad big wolf: why you need to put your adjectives in the right order

By Simon Horobin, University of Oxford

Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
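Forsyth’s claimed sequence amounts to a simple ordering rule: each adjective belongs to a category, and the categories have a fixed rank. A minimal sketch of that rule, assuming we already know each adjective’s category (the `CATEGORY` lookup below is a hypothetical assignment for the adjectives in Forsyth’s example — in practice, classifying adjectives is the hard part):

```python
# Forsyth's prescribed sequence of adjective categories, in rank order.
FORSYTH_ORDER = ["opinion", "size", "age", "shape", "colour",
                 "origin", "material", "purpose"]

# Hypothetical category assignments for the adjectives in Forsyth's example.
CATEGORY = {
    "lovely": "opinion", "little": "size", "old": "age",
    "rectangular": "shape", "green": "colour", "French": "origin",
    "silver": "material", "whittling": "purpose",
}

def order_adjectives(adjectives):
    """Sort adjectives into Forsyth's category sequence."""
    return sorted(adjectives, key=lambda a: FORSYTH_ORDER.index(CATEGORY[a]))

scrambled = ["green", "French", "lovely", "whittling",
             "old", "silver", "rectangular", "little"]
print(" ".join(order_adjectives(scrambled) + ["knife"]))
# → lovely little old rectangular green French silver whittling knife
```

Note that the rule only constrains relative order; it says nothing about how many attributive adjectives are tolerable in one string — which, as discussed below, is where Forsyth’s example strains credulity.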

But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.

More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.

[Read more…]


The race to save a dying language

Ross Perlin writes: In 2013, at a conference on endangered languages, a retired teacher named Linda Lambrecht announced the extraordinary discovery of a previously unknown language. Lambrecht – who is Chinese-Hawaiian, 71 years old, warm but no-nonsense – called it Hawaii Sign Language, or HSL. In front of a room full of linguists, she demonstrated that its core vocabulary – words such as “mother”, “pig” and “small” – was distinct from that of other sign languages.

The linguists were immediately convinced. William O’Grady, the chair of the linguistics department at the University of Hawaii, called it “the first time in 80 years that a new language has been discovered in the United States — and maybe the last time.” But the new language found 80 years ago was in remote Alaska, whereas HSL was hiding in plain sight in Honolulu, a metropolitan area of nearly a million people. It was the kind of discovery that made the world seem larger.

The last-minute arrival of recognition and support for HSL was a powerful, almost surreal vindication for Lambrecht, whose first language is HSL. For decades, it was stigmatised or ignored; now the language has acquired an agreed-upon name, an official “language code” from the International Organization for Standardization, the attention of linguists around the world, and a three-year grant from the Endangered Languages Documentation Programme at the School of Oriental and African Studies in London.

But just as linguists were substantiating its existence, HSL stood on the brink of extinction, remembered by just a handful of signers. Unless the language made a miraculous recovery, Lambrecht feared that her announcement might turn out to be HSL’s obituary.

Three years after announcing its existence, Lambrecht is still unearthing her language sign by sign. She may be the only person in the world who still uses HSL on a regular basis, signing into a camera while a linguist named James “Woody” Woodward and a handful of graduate students from the University of Hawaii document her every move. [Continue reading…]
