Rebecca Solnit writes: This past week was not a good week for women. In the United States, it was reported that a man who allegedly raped a 12-year-old girl was granted joint custody of the resultant eight-year-old boy being raised by his young mother.
Earlier in the week, the severed head and legs of Swedish journalist Kim Wall, who disappeared after entering inventor Peter Madsen’s submarine, were discovered near Copenhagen. A hard drive belonging to Madsen, Danish police said, was loaded with videos showing women being decapitated alive.
A Swedish model received rape threats for posing in an Adidas advertisement with unshaven legs. The University of Southern California’s dean of medicine was dumped after reports resurfaced that he had sexually harassed a young medical researcher in 2003. A number of men at liberal publications were revealed to have contacted Milo Yiannopoulos, urging him to attack women – “Please mock this fat feminist,” wrote a senior male staff writer at Vice’s women’s channel, since fired. And, of course, movie mogul Harvey Weinstein was described by the New York Times as a serial sexual harasser; his alleged offences, according to a TV journalist, included trapping her in a hallway, where he masturbated until he ejaculated into a potted plant.
This week, the New Yorker ran a follow-up story by Ronan Farrow (the biological son of Woody Allen, who has repudiated his father for his treatment of his sister), expanding the charges women have made against Weinstein to include sexual assault. He quotes one young woman who said “he forced me to perform oral sex on him” after she showed up for a meeting. She added, “I have nightmares about him to this day.” Weinstein denies any non-consensual sex.
Saturday 7 October was the first anniversary of the release of the tape in which the United States president boasted about sexually assaulting women; 11 women then came forward to accuse Donald Trump. And last week began with the biggest mass shooting in modern US history, carried out by a man reported to have routinely verbally abused his girlfriend: domestic violence is common in the past of mass shooters.
Underlying all these attacks is a lack of empathy, a will to dominate, and an entitlement to control, harm and even take the lives of others. Though there is a good argument that mental illness is not a sufficient explanation – and most mentally ill people are nonviolent – mass shooters and rapists seem to have a lack of empathy so extreme it constitutes a psychological disorder. At this point in history, it seems to be not just a defect from birth, but a characteristic many men are instilled with by the culture around them. It seems to be the precondition for causing horrific suffering and taking pleasure in it as a sign of one’s own power and superiority, in regarding others as worthless, as yours to harm or eliminate. [Continue reading…]
Tom Jacobs writes: President Donald Trump probably would not have been elected if not for the overwhelming support he enjoyed from evangelical Christians. This continues to puzzle and frustrate his opponents, who ask why those voters backed a man whose campaign was largely based on hatred and vilification.
A newly published study suggests an answer. It argues genuine piety can be a catalyst for compassion. But the shared rituals that create a cohesive congregation “may also produce hatred of others”—especially among those who lack deeply felt spiritual beliefs.
“Our data suggest that the social activities which accompany religion drive the hostility towards other groups, rather than the quality of one’s belief or the degree of devotion,” a research team led by Rod Lynch of the University of Missouri writes in the journal Evolutionary Psychological Science.
Building on research that dates back to the 1960s, Lynch and his colleagues remind us that religious people come in two varieties: true believers, and those who embrace a faith tradition as a way of fulfilling some secular need, such as peace of mind or connection to a community.
This distinction between “intrinsic” and “extrinsic” religiosity was laid out in the 1960s by the influential psychologist Gordon Allport, who reported that ethnic prejudice was associated only with the latter. Much later research found this to also be true of homophobia. [Continue reading…]
Quartz reports: A new study shows that the words we use to talk about time also shape our view of its passage. This, say researchers, indicates that abstract concepts like duration are relative rather than universal, and that they are shaped by language rather than solely innate.
The work, published in the American Psychological Association’s Journal of Experimental Psychology: General on April 27, examined how Spanish- and Swedish-speaking bilinguals conceived of time. The researchers—from Stockholm University in Sweden and Lancaster University in the UK—found that their subjects (40 native Swedish speakers and 40 native Spanish speakers) tended to think about time in the terms that correspond to each language’s descriptors when linguistically prompted in that particular language, but otherwise moved fluidly from one concept of time to the other. This was true regardless of their native language.
Different languages describe time differently. For example, Swedish and English generally refer to time in terms of physical distance (“a long time,” “a short break”). Meanwhile, languages like Spanish and Greek generally refer to time in terms of volume (“a big chunk of time,” “a small moment”). [Continue reading…]
Robert Sapolsky writes: As a kid, I saw the 1968 version of Planet of the Apes. As a future primatologist, I was mesmerized. Years later I discovered an anecdote about its filming: At lunchtime, the people playing chimps and those playing gorillas ate in separate groups.
It’s been said, “There are two kinds of people in the world: those who divide the world into two kinds of people and those who don’t.” In reality, there’s lots more of the former. And it can be vastly consequential when people are divided into Us and Them, ingroup and outgroup, “the people” (i.e., our kind) and the Others.
Humans universally make Us/Them dichotomies along lines of race, ethnicity, gender, language group, religion, age, socioeconomic status, and so on. And it’s not a pretty picture. We do so with remarkable speed and neurobiological efficiency; have complex taxonomies and classifications of ways in which we denigrate Thems; do so with a versatility that ranges from the minutest of microaggression to bloodbaths of savagery; and regularly decide what is inferior about Them based on pure emotion, followed by primitive rationalizations that we mistake for rationality. Pretty depressing.
But crucially, there is room for optimism. Much of that is grounded in something distinctly human, which is that we all carry multiple Us/Them divisions in our heads. A Them in one case can be an Us in another, and it can take only an instant for that identity to flip. Thus, there is hope that, with science’s help, clannishness and xenophobia can lessen, perhaps even enough that Hollywood-extra chimps and gorillas can break bread together. [Continue reading…]
Jerry Useem writes: If power were a prescription drug, it would come with a long list of known side effects. It can intoxicate. It can corrupt. It can even make Henry Kissinger believe that he’s sexually magnetic. But can it cause brain damage?
When various lawmakers lit into John Stumpf at a congressional hearing last fall, each seemed to find a fresh way to flay the now-former CEO of Wells Fargo for failing to stop some 5,000 employees from setting up phony accounts for customers. But it was Stumpf’s performance that stood out. Here was a man who had risen to the top of the world’s most valuable bank, yet he seemed utterly unable to read a room. Although he apologized, he didn’t appear chastened or remorseful. Nor did he seem defiant or smug or even insincere. He looked disoriented, like a jet-lagged space traveler just arrived from Planet Stumpf, where deference to him is a natural law and 5,000 a commendably small number. Even the most direct barbs—“You have got to be kidding me” (Sean Duffy of Wisconsin); “I can’t believe some of what I’m hearing here” (Gregory Meeks of New York)—failed to shake him awake.
What was going through Stumpf’s head? New research suggests that the better question may be: What wasn’t going through it?
The historian Henry Adams was being metaphorical, not medical, when he described power as “a sort of tumor that ends by killing the victim’s sympathies.” But that’s not far from where Dacher Keltner, a psychology professor at UC Berkeley, ended up after years of lab and field experiments. Subjects under the influence of power, he found in studies spanning two decades, acted as if they had suffered a traumatic brain injury—becoming more impulsive, less risk-aware, and, crucially, less adept at seeing things from other people’s point of view.
Sukhvinder Obhi, a neuroscientist at McMaster University, in Ontario, recently described something similar. Unlike Keltner, who studies behaviors, Obhi studies brains. And when he put the heads of the powerful and the not-so-powerful under a transcranial-magnetic-stimulation machine, he found that power, in fact, impairs a specific neural process, “mirroring,” that may be a cornerstone of empathy. Which gives a neurological basis to what Keltner has termed the “power paradox”: Once we have power, we lose some of the capacities we needed to gain it in the first place. [Continue reading…]
David Z. Hambrick writes: Physical similarities aside, we share a lot in common with our primate relatives. For example, as Jane Goodall famously documented, chimpanzees form lifelong bonds and show affection in much the same way as humans. Chimps can also solve novel problems, use objects as tools, and may possess “theory of mind”—an understanding that others may have different perspectives than oneself. They can even outperform humans in certain types of cognitive tasks.
These commonalities may not seem all that surprising given what we now know from the field of comparative genomics: We share nearly all of our DNA with chimpanzees and other primates. However, social and cognitive complexity is not unique to our closest evolutionary cousins. In fact, it is abundant in species with which we would seem to have very little in common—like the spotted hyena.
For more than three decades, the Michigan State University zoologist Kay Holekamp has studied the habits of the spotted hyena in Kenya’s Masai Mara National Reserve, once spending five years straight living in a tent among her oft-maligned subjects. One of the world’s longest-running studies of a wild mammal, this landmark project has revealed that spotted hyenas not only have social groups as complex as those of many primates, but are also capable of some of the same types of problem solving.
This research sheds light on one of science’s greatest mysteries—how intelligence has evolved across the animal kingdom. [Continue reading…]
Nicholas Carr writes: Welcome to the global village. It’s a nasty place.
On Easter Sunday, a man in Cleveland filmed himself murdering a random 74-year-old and posted the video on Facebook. The social network took the grisly clip down within two or three hours, but not before users shared it on other websites — where people around the world can still view it.
Surely incidents like this aren’t what Mark Zuckerberg had in mind. In 2012, as his company was preparing to go public, the Facebook founder wrote an earnest letter to would-be shareholders explaining that his company was more than just a business. It was pursuing a “social mission” to make the world a better place by encouraging self-expression and conversation. “People sharing more,” the young entrepreneur wrote, “creates a more open culture and leads to a better understanding of the lives and perspectives of others.”
Earlier this year, Zuckerberg penned another public letter, expressing even grander ambitions. Facebook, he announced, is expanding its mission from “connecting friends and family” to building “a global community that works for everyone.” The ultimate goal is to turn the already vast social network into a sort of supranational state “spanning cultures, nations and regions.”
But the murder in Cleveland, and any similar incidents that inevitably follow, reveal the hollowness of Silicon Valley’s promise that digital networks would bring us together in a more harmonious world.
Whether he knows it or not, Zuckerberg is part of a long tradition in Western thought. Ever since the building of the telegraph system in the 19th century, people have believed that advances in communication technology would promote social harmony. The more we learned about each other, the more we would recognize that we’re all one. In an 1899 article celebrating the laying of transatlantic Western Union cables, a New York Times columnist expressed the popular assumption well: “Nothing so fosters and promotes a mutual understanding and a community of sentiment and interests as cheap, speedy, and convenient communication.”
The great networks of the 20th century — radio, telephone, TV — reinforced this sunny notion. Spanning borders and erasing distances, they shrank the planet. Guglielmo Marconi declared in 1912 that his invention of radio would “make war impossible, because it will make war ridiculous.” AT&T’s top engineer, J.J. Carty, predicted in a 1923 interview that the telephone system would “join all the peoples of the earth in one brotherhood.” In his 1962 book “The Gutenberg Galaxy,” the media theorist Marshall McLuhan gave us the memorable term “global village” to describe the world’s “new electronic interdependence.” Most people took the phrase optimistically, as a prophecy of inevitable social progress. What, after all, could be nicer than a village?
If our assumption that communication brings people together were true, we should today be seeing a planetary outbreak of peace, love, and understanding. Thanks to the Internet and cellular networks, humanity is more connected than ever. Of the world’s 7 billion people, 6 billion have access to a mobile phone — a billion and a half more, the United Nations reports, than have access to a working toilet. Nearly 2 billion are on Facebook, more than a billion upload and download YouTube videos, and billions more converse through messaging apps like WhatsApp and WeChat. With smartphone in hand, everyone becomes a media hub, transmitting and receiving ceaselessly.
Yet we live in a fractious time, defined not by concord but by conflict. Xenophobia is on the rise. Political and social fissures are widening. From the White House down, public discourse is characterized by vitriol and insult. We probably shouldn’t be surprised. [Continue reading…]
Jim Davies writes: In a classic experiment published in 1959, students spent an hour doing repetitive, monotonous tasks, such as rotating square pegs a quarter turn, again and again. Then the experimenters asked the students to persuade someone else that this mind-numbing experience was in fact interesting. Some students got $1 ($9 today) to tell this fib while others got $20 ($176 today). In a survey at the end of the experiment, those paid only a trivial fee were more likely to describe the boring activity as engaging. They seemed to have persuaded themselves of their own lie.
According to the researchers, psychologists Merrill Carlsmith and Leon Festinger, this attitude shift was caused by “cognitive dissonance,” the discomfort we feel when we try to hold two contradictory ideas or beliefs at the same time. When faced with two opposing realities (“This is boring” and “I told someone it was interesting”), the well-paid students could externally justify their behavior (“I was paid to say that”). The poorly paid students, on the other hand, had to create an internal justification (“I must have said it was interesting for some good reason. Maybe I actually liked it”).
Scientists have uncovered more than 50 biases that, like this one, can mess with our thinking. For instance, there’s the “availability heuristic,” which makes us think something that’s easy to recall (because it’s emotional or because we’ve experienced it many times) is more common or probable than it really is. (Despite what you might think from watching CSI: Crime Scene Investigation, the world isn’t full of serial killers.) There’s also the “distinction bias,” which makes two options seem more different when considered simultaneously; the “denomination effect,” which makes us more likely to spend money when it’s in small bills or coins; and the “Dunning-Kruger effect,” which makes experts underestimate their abilities and laypeople overestimate theirs. [Continue reading…]
Tamsin Shaw writes: We are living in an age in which the behavioral sciences have become inescapable. The findings of social psychology and behavioral economics are being employed to determine the news we read, the products we buy, the cultural and intellectual spheres we inhabit, and the human networks, online and in real life, of which we are a part. Aspects of human societies that were formerly guided by habit and tradition, or spontaneity and whim, are now increasingly the intended or unintended consequences of decisions made on the basis of scientific theories of the human mind and human well-being.
The behavioral techniques that are being employed by governments and private corporations do not appeal to our reason; they do not seek to persuade us consciously with information and argument. Rather, these techniques change behavior by appealing to our nonrational motivations, our emotional triggers and unconscious biases. If psychologists could possess a systematic understanding of these nonrational motivations they would have the power to influence the smallest aspects of our lives and the largest aspects of our societies.
Michael Lewis’s The Undoing Project seems destined to be the most popular celebration of this ongoing endeavor to understand and correct human behavior. It recounts the complex friendship and remarkable intellectual partnership of Daniel Kahneman and Amos Tversky, the psychologists whose work has provided the foundation for the new behavioral science. It was their findings that first suggested we might understand human irrationality in a systematic way. When our thinking errs, they claimed, it does so predictably. Kahneman tells us that thanks to the various counterintuitive findings—drawn from surveys—that he and Tversky made together, “we now understand the marvels as well as the flaws of intuitive thought.”
Kahneman presented their new model of the mind to the general reader in Thinking, Fast and Slow (2011), where he characterized the human mind as the interrelated operation of two systems of thought: System One, which is fast and automatic, including instincts, emotions, innate skills shared with animals, as well as learned associations and skills; and System Two, which is slow and deliberative and allows us to correct for the errors made by System One.
Lewis’s tale of this intellectual revolution begins in 1955 with the twenty-one-year-old Kahneman devising personality tests for the Israeli army and discovering that optimal accuracy could be attained by devising tests that removed, as far as possible, the gut feelings of the tester. The testers were employing “System One” intuitions that skewed their judgment and could be avoided if tests were devised and implemented in ways that disallowed any role for individual judgment and bias. This is an especially captivating episode for Lewis, since his best-selling book, Moneyball (2003), told the analogous tale of Billy Beane, general manager of the Oakland Athletics baseball team, who used new forms of data analytics to override the intuitive judgments of baseball scouts in picking players.
The Undoing Project also applauds the story of the psychologist Lewis Goldberg, a colleague of Kahneman and Tversky in their days in Eugene, Oregon, who discovered that a simple algorithm could more accurately diagnose cancer than highly trained experts who were biased by their emotions and faulty intuitions. Algorithms—fixed rules for processing data—unlike the often difficult, emotional human protagonists of the book, are its uncomplicated heroes, quietly correcting for the subtle but consequential flaws in human thought.
The most influential of Kahneman and Tversky’s discoveries, however, is “prospect theory,” since this has provided the most important basis of the “biases and heuristics” approach of the new behavioral sciences. They looked at the way in which people make decisions under conditions of uncertainty and found that their behavior violated expected utility theory—a fundamental assumption of economic theory that holds that decision-makers reason instrumentally about how to maximize their gains. Kahneman and Tversky realized that they were not observing a random series of errors that occur when people attempted to do this. Rather, they identified a dozen “systematic violations of the axioms of rationality in choices between gambles.” These systematic errors make human irrationality predictable. [Continue reading…]
Elizabeth Kolbert writes: In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.
Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.
As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine — they’d been obtained from the Los Angeles County coroner’s office — the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.
In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well — significantly better than the average student — even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student — a conclusion that was equally unfounded.
“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.” [Continue reading…]
Daniel A. Yudkin and Jay Van Bavel write: During the first presidential debate, Hillary Clinton argued that “implicit bias is a problem for everyone, not just police.” Her comment moved to the forefront of public conversation an issue that scientists have been studying for decades: namely, that even well-meaning people frequently harbor hidden prejudices against members of other racial groups. Studies have shown that these subtle biases are widespread and associated with discrimination in legal, economic and organizational settings.
Critics of this notion, however, protest what they see as a character smear — a suggestion that everybody, deep down, is racist. Vice President-elect Mike Pence has said that an “accusation of implicit bias” in cases where a white police officer shoots a black civilian serves to “demean law enforcement.” Writing in National Review, David French claimed that the concept of implicit bias lets people “indict entire communities as bigoted.”
But implicit bias is not about bigotry per se. As new research from our laboratory suggests, implicit bias is grounded in a basic human tendency to divide the social world into groups. In other words, what may appear as an example of tacit racism may actually be a manifestation of a broader propensity to think in terms of “us versus them” — a prejudice that can apply, say, to fans of a different sports team. This doesn’t make the effects of implicit bias any less worrisome, but it does mean people should be less defensive about it. [Continue reading…]
We are a divided nation; that is an understatement. What’s more, we increasingly hear we are living in our own “bubble” or echo chamber that differing views cannot penetrate. To correct the problem, many are calling for people to reach out, to talk and above all, to listen. That is all well and good, but what are we supposed to talk about? We can’t hope to listen without a topic for finding common ground.
In my view, there are (at least) two prominent issues in this election that can serve as a bridge across our political divides. The first is that the political and economic system needs fixing because it favors those with special status or access. The second is that income inequality is reaching an intolerable level.
Might these two topics help mend the unpleasant Thanksgiving or Christmas dinners that many Americans are dreading? Instead of avoiding that unpleasantness, it may be a time to embrace it.
Whether you support Donald Trump or Hillary Clinton, fear might be the biggest factor driving you to the polls.
Over the weekend, pollster Peter Hart told NBC News that this has been “an election about fear.”
“Donald Trump’s message was the fear of what was happening to America,” he continued, “and Hillary Clinton’s was about the fear of Donald Trump.”
Indeed, Trump has made fear central to his campaign strategy. Using divisive and isolationist rhetoric, he has invoked images of immigrants and terrorists streaming into the country unaccounted for, of inner cities rife with poverty and crime.
Clinton, on the other hand, has used Trump’s words and actions to instill fears about what would happen to the country under a Trump presidency.
Given the fraught tone of the campaign, it’s no surprise that a poll from over the summer found that 81 percent of voters said they were afraid of one or both of the candidates winning.
For political candidates, why is it so effective to tap into voter fears? And what does the psychology research say about fear’s ability to influence behavior and decision-making?
The New York Times reports: The mixed martial arts fighter Ronda Rousey is “not a nice person.” The golf swing of the actor Samuel L. Jackson is “not athletic.” A lectern in the Oval Office “looks odd,” and the mobile carrier T-Mobile’s service “is terrible.”
These comments are not private thoughts, nor are they the result of an embarrassing hidden camera, an off-the-record comment or a document release. They are public statements made by Donald Trump to his 5.9 million Twitter followers.
We know this because we’ve read, tagged and quoted them all.
The end result is “Donald Trump’s Twitter Insults: The Complete List (So Far).” It’s not a sample of some insults, or just those about his political rivals — though plenty of those exist. It’s the full count — a 100 percent sample, in polling terms — representing our best effort to categorize more than 4,000 tweets Mr. Trump has made since he declared his candidacy in June 2015.
Of those, we found that one in every eight was a personal insult of some kind. [Continue reading…]
Julie Irwin writes: Research shows that one of the primary reasons to denigrate people is to signal membership in a group: They are out, so you are in. People are always looking to belong, and Trump may represent, for some people, a particularly attractive membership opportunity. He is clear about what “his kind” of people are — the winners, the big men on campus.
When he insults people as not having these qualities, he is providing an opportunity for others to affirm themselves by joining him in the insulting chorus. They can call back to him by being insulting to the losers, too. It becomes a signaling contest, and humans engage in this type of behavior all the time, insulting people while other people are watching, chirping on Twitter and at rallies, looking for their groups.
One of my favorite social psychology experiments makes the point: Students in fraternities and sororities who wanted to signal their loyalty were especially likely to denigrate people in other fraternities and sororities by judging them as “foolish” or “unintelligent” if the insults were public. The insults are not for the insulted but for the group calling out to them.
This process only works if it is linked with warmth within the group.
On Twitter, Trump is friendly and chatty with people who support him, especially if they try to get his attention by insulting nonbelievers. “Trump pummels his opponents — and the press,” read one recent tweet from someone named John to a few hundred followers — and Trump retweeted it to 5.75 million. He commonly quotes ordinary folks’ tweets and says “Thanks!!!!” to them as if they were his best friends. [Continue reading…]
Human memory does not operate like a video tape that can be rewound and rewatched, with every viewing revealing the same events in the same order. In fact, memories are reconstructed every time we recall them. Aspects of the memory can be altered, added or deleted altogether with each new recollection. This can lead to the phenomenon of false memory, where people have clear memories of an event that they never experienced.
False memory is surprisingly common, but a number of factors can increase its frequency. Recent research in my lab shows that being very interested in a topic can make you twice as likely to experience a false memory about that topic.
Previous research has indicated that experts in a few clearly defined fields, such as investments and American football, might be more likely to experience false memory in relation to their areas of expertise. Opinion as to the cause of this effect is divided. Some researchers have suggested that greater knowledge makes a person more likely to incorrectly recognise new information that is similar to previously experienced information. Another interpretation suggests that experts feel that they should know everything about their topic of expertise. According to this account, experts’ sense of accountability for their judgements causes them to “fill in the gaps” in their knowledge with plausible, but false, information.
To further investigate this, we asked 489 participants to rank seven topics from most to least interesting. The topics we used were football, politics, business, technology, film, science and pop music. The participants were then asked if they remembered the events described in four news items about the topic they selected as the most interesting, and four items about the topic selected as least interesting. In each case, three of the events depicted had really happened and one was fictional.
The results showed that being interested in a topic increased the frequency of accurate memories relating to that topic. Critically, it also increased the number of false memories – 25% of people experienced a false memory in relation to an interesting topic, compared with 10% in relation to a less interesting topic. Importantly, our participants were not asked to identify themselves as experts, and did not get to choose which topics they would answer questions about. This means that the increase in false memories is unlikely to be due to a sense of accountability for judgements about a specialist topic.
Susie Neilson writes: In The President’s Speech, a 1985 essay, the late neurologist and writer Oliver Sacks observes a group of people with aphasia, a language disorder, as they laugh uproariously at the television. The cause of their amusement is an unnamed actor-turned-president of the United States, presumably Ronald Reagan, addressing his audience: “There he was, the old Charmer, the Actor, with his practised rhetoric, his histrionisms, his emotional appeal… The President was, as always, moving—but he was moving them, apparently, mainly to laughter. What could they be thinking? Were they failing to understand him? Or did they, perhaps, understand him all too well?”
Aphasic patients have a heightened ability to interpret body language, tonal quality, and other non-verbal aspects of communication, owing to the disruption of their speech, writing, reading, or listening abilities. Each aphasic person may have disruptions in any or all of these areas. Usually, the damage comes from a stroke or other brain injury — many people become aphasic in the wake of combat, for example, or after car accidents. “The key,” says Darlene Williamson, a speech pathologist specializing in aphasia and president of the National Aphasia Association, “is intelligence remains intact.”
In this sense, Williamson says, having aphasia is akin to visiting a foreign country, where everyone is communicating in a language you are conversational in at best. “The more impaired your language is,” she says, “the harder you’re working to be sure that you’re comprehending what’s going on.” How do we do this? By paying more careful attention to the cues we can understand, Williamson says. [Continue reading…]
Science magazine reports: For years, cognitive scientist Lars Chittka felt a bit eclipsed by his colleagues at Queen Mary University of London. Their studies of apes, crows, and parrots were constantly revealing how smart these animals were. He worked on bees, and at the time, almost everyone assumed that the insects acted on instinct, not intelligence. “So there was a challenge for me: Could we get our small-brained bees to solve tasks that would impress a bird cognition researcher?” he recalls. Now, it seems he has succeeded at last.
Chittka’s team has shown that bumble bees can not only learn to pull a string to retrieve a reward, but they can also learn this trick from other bees, even though they have no experience with such a task in nature. The study “successfully challenges the notion that ‘big brains’ are necessary” for new skills to spread, says Christian Rutz, an evolutionary ecologist who studies bird cognition at the University of St. Andrews in the United Kingdom.
Many researchers have used string pulling to assess the smarts of animals, particularly birds and apes. So Chittka and his colleagues set up a low clear plastic table barely tall enough to lay three flat artificial blue flowers underneath. Each flower contained a well of sugar water in the center and had a string attached that extended beyond the table’s boundaries. The only way the bumble bee could get the sugar water was to pull the flower out from under the table by tugging on the string. [Continue reading…]
Martha C Nussbaum writes: There’s no emotion we ought to think harder and more clearly about than anger. Anger greets most of us every day – in our personal relationships, in the workplace, on the highway, on airline trips – and, often, in our political lives as well. Anger is both poisonous and popular. Even when people acknowledge its destructive tendencies, they still so often cling to it, seeing it as a strong emotion, connected to self-respect and manliness (or, for women, to the vindication of equality). If you react to insults and wrongs without anger you’ll be seen as spineless and downtrodden. When people wrong you, says conventional wisdom, you should use justified rage to put them in their place, exact a penalty. We could call this football politics, but we’d have to acknowledge right away that athletes, whatever their rhetoric, have to be disciplined people who know how to transcend anger in pursuit of a team goal.
If we think closely about anger, we can begin to see why it is a stupid way to run one’s life. A good place to begin is Aristotle’s definition: not perfect, but useful, and a starting point for a long Western tradition of reflection. Aristotle says that anger is a response to significant damage to something or someone one cares about, damage that the angry person believes to have been wrongfully inflicted. He adds that although anger is painful, it also contains within itself a hope for payback. So: significant damage, pertaining to one’s own values or circle of cares, and wrongfulness. All this seems both true and uncontroversial. More controversial, perhaps, is his idea (in which, however, all Western philosophers who write about anger concur) that the angry person wants some type of payback, and that this is a conceptual part of what anger is. In other words, if you don’t want some type of payback, your emotion is something else (grief, perhaps), but not really anger. [Continue reading…]