Category Archives: Psychology

Oliver Sacks and the neurology of identity

David Wallace-Wells writes: “To talk of diseases is a sort of Arabian Nights entertainment,” ran the epigraph to The Man Who Mistook His Wife for a Hat, Oliver Sacks’s fourth book and his first best seller—the one that made him famous, in 1985, as a Scheherazade of brain disorder. A sensitive bedside-manner neurologist, he had previously written three books, none of which had attracted much notice at all. The Man Who Mistook would mark the beginning of another career, and a much more public one, as perhaps the unlikeliest ambassador for brain science—a melancholic, savantlike physician disinterested in grand theories and transfixed by those neurological curiosities they failed to explain. Sacks was 52 years old and cripplingly withdrawn, a British alien living a lonely aquatic life on City Island, and for about ten years had been dealing with something like what he’d later call “the Lewis Thomas crisis,” after the physician and biologist who decided, in his fifties, to devote himself to writing essays and poetry.

“In those days, he was a complete recluse. I mean, there was nobody,” recalls Lawrence Weschler, a friend and essayist who once planned to write a biography of Sacks. “He was in a kind of rictus of shyness.” Sacks was struggling with writer’s block, wrestling with accusations of confabulation as well as the public’s general indifference to his first books, and would spend hours every day swimming in Long Island Sound, near the Throgs Neck Bridge, miles each way and often at night. “I feel I belong in the water — I feel we all belong in the water,” he’s said, and later echoed: “I cease to be a sort of obsessed intellect and a shaky body, and I just become a porpoise.”

W. H. Auden had become a friend, too, after he’d reviewed Sacks’s first book, the Victorian-ish survey Migraine, in 1971. Auden had also admired his second book, Awakenings, which detailed the miraculous recovery, then tragic relapse, of the near-catatonic patients given the experimental drug L-dopa. The book sold only modestly when published and attracted, he says, no medical reviews. Auden thought Sacks was capable of more. “You’re going to have to go beyond the clinical,” he told him. “Be metaphorical, be mythical, be whatever you need.”

The uncanny case studies of The Man Who Mistook were the mythic course that Sacks has followed ever since (his new book, Hallucinations, is another mesmerizing casebook of neurological marvels). Histories of such unfamiliar or forgotten disorders, tales of damage told in such picaresque and nondiagnostic ways, the stories of The Man Who Mistook struck early readers as landing oddly between science and fairy tale. (A man so unable to distinguish between objects he tried to lift his wife onto his head; a 49-year-old whose memory stalled at 19; a patient who thought Sacks was a delicatessen customer one minute and looked like Sigmund Freud the next.) “They come together at the intersection of fact and fable,” Sacks wrote in its grand preface. “But what facts! What fables! To what shall we compare them? We may not have any existing models, metaphors, or myths. Has the time perhaps come for new symbols, new myths?” [Continue reading…]


Intelligence and the stereotype threat

Annie Murphy Paul writes: We’ve all been there: you feel especially smart and funny when talking to a particular person, only to feel hopelessly unintelligent and inarticulate in the presence of another.

You’re not imagining things. Experiments show that when people report feeling comfortable with a conversational partner, they are judged by those partners and by observers as actually being more witty.

It’s just one example of the powerful influence that social factors can have on intelligence. As parents, teachers and students settle into the school year, this work should prompt us to think about intelligence not as a “lump of something that’s in our heads,” as the psychologist Joshua Aronson puts it, but as “a transaction among people.”

Mr. Aronson, an associate professor at New York University, has been a leader in investigating the effects of social forces on academic achievement. Along with the psychologist Claude Steele, he identified the phenomenon known as “stereotype threat.” Members of groups believed to be academically inferior — African-American and Latino students enrolled in college, or female students in math and science courses — score much lower on tests when reminded beforehand of their race or gender.

The pair’s experiments in the 1990s, and the dozens of studies by other researchers that followed, concluded that the performance of these students suffered because they were worried about confirming negative stereotypes about their group.

In a 1995 article in The Journal of Personality and Social Psychology, Professors Steele and Aronson found that black students performed comparably with white students when told that the test they were taking was “a laboratory problem-solving task.” Black students scored much lower, however, when they were instructed that the test was meant to measure their intellectual ability. In effect, the prospect of social evaluation suppressed these students’ intelligence.

Minorities aren’t the only ones vulnerable to stereotype threat. We all are. A group of people notably confident about their mathematical abilities — white male math and engineering majors who received high scores on the math portion of the SAT — did worse on a math test when told that the experiment was intended to investigate “why Asians appear to outperform other students on tests of math ability.” [Continue reading…]


The myth of leadership stress

The Los Angeles Times reports: Management consultants say 60% of senior executives experience high stress and anxiety on a regular basis, and a thriving industry of motivational speakers teaches business leaders how to manage their corrosive burden of stress. But just how uneasy lies the head that wears the crown?

Not so uneasy, it turns out.

A new study reveals that those who sit atop the nation’s political, military, business and nonprofit organizations are actually pretty chill. Compared with people of similar age, gender and ethnicity who haven’t made it to the top, leaders pronounced themselves less stressed and anxious. And their levels of cortisol, a hormone that circulates at high levels in the chronically stressed, told the same story.

The source of the leaders’ relative serenity was pretty simple: control.

Compared with workers who toil in lower echelons of the American economy, the leaders studied by a group of Harvard University researchers enjoyed control over their schedules, their daily living circumstances, their financial security, their enterprises and their lives.

“Leaders possess a particular psychological resource — a sense of control — that may buffer against stress,” the research team reported Monday in Proceedings of the National Academy of Sciences.

Though the finding appeared to fly in the face of conventional wisdom, it came as no surprise to those who have studied the role that social status plays in the well-being of our primate relatives. [Continue reading…]


The self illusion

A dewdrop seemingly captures a flower. Photo copyright Doug Benner

Jonah Lehrer talks to the psychologist Bruce Hood about his new book, The Self Illusion.

LEHRER: The title of The Self Illusion is literal. You argue that the self – this entity at the center of our personal universe – is actually just a story, a “constructed narrative.” Could you explain what you mean?

HOOD: The best stories make sense. They follow a logical path where one thing leads to another and provide the most relevant details and signposts along the way so that you get a sense of continuity and cohesion. This is what writers refer to as the narrative arc – a beginning, middle and an end. If a sequence of events does not follow a narrative, then it is incoherent and fragmented and so does not have meaning. Our brains think in stories. The same is true for the self, and I use a distinction that William James drew between the self as “I” and “me.” Our consciousness of the self in the here and now is the “I,” and most of the time we experience this as being an integrated and coherent individual – a bit like the character in the story. The self which we tell others about is autobiographical, or the “me,” which again is a coherent account of who we think we are based on past experiences, current events and aspirations for the future.

The neuroscience supports the claim that self is constructed. For example, Michael Gazzaniga demonstrated that split-brain patients, presented with inconsistent visual information, would readily confabulate an explanation to reconcile information unconsciously processed with information that was conscious. They would make up a story. Likewise, Oliver Sacks famously reported various patients who could confabulate accounts to make sense of their impairments. Ramachandran describes patients who are paralyzed but deny they have a problem. These are all extreme clinical cases, but the same is true of normal people. We can easily spot the inconsistencies in other people’s accounts of their self but we are less able to spot our own, and when those inconsistencies are made apparent by the consequences of our actions, we make the excuse, “I wasn’t myself last night” or “It was the wine talking!” Well, wine doesn’t talk, and if you were not your self, then who were you and who was being you?

LEHRER: The fragmented nature of the self is very much a theme of modernist literature. (Nietzsche said it first: “My hypothesis is the subject as multiplicity,” he wrote in a terse summary of his philosophy. Virginia Woolf echoed Nietzsche, writing in her diary that we are “splinters and mosaics; not, as they used to hold, immaculate, monolithic, consistent wholes.”) In your book, you argue that modern neuroscience has confirmed the “bundle theory” of the self proposed by Hume. Do you think they have also confirmed these artistic intuitions about the self? If so, how has science demonstrated this? Are we really just a collection of “splinters and mosaics”?

HOOD: Yes, absolutely. When I was first asked to write this book, I really could not see what the revelation was all about. We had to be a multitude – a complex system of evolved functions. Neuroscientists spend their time trying to reverse engineer the brain by trying to figure out the different functions we evolved through natural selection. So far, we have found that the brain is clearly a complex of interacting systems all the way up from the senses to the conceptual machinery of the mind – the output of the brain. From the very moment that input from the environment triggers a sensory receptor to set off a nerve impulse that becomes a chain reaction, we are nothing more than an extremely complicated processing system that has evolved to create rich re-presentations of the world around us. We have no direct contact with reality because everything we experience is an abstracted version of reality that has been through the processing machinery of our brains to produce experience. [Continue reading…]

This conversation has been sitting untouched in my ‘sources’ inbox for a while because it deals with a complex philosophical issue that doesn’t really lend itself to the on-the-fly nature of blogging. Even so, for better or worse, I’ll now venture forth and tease out at least one strand (and probably pick up a few others along the way).

“We have no direct contact with reality because everything we experience is an abstracted version of reality,” says Hood as he deconstructs self yet sustains subject-object dualism. This isn’t really correct (and I think Hood would readily concede this point) because the abstracted versions of reality constructed within and constantly transforming our brains are just as much a part of reality as the things these abstractions represent.

What distinguishes selves from the rest of reality is the dynamic relationship they possess with everything around them.

A rock has a relationship with its environment that is relatively simple and mostly passive. Under the influence of heat, moisture, and wind, it slowly erodes. It affects its immediate environment by casting a shadow and restricting the availability of oxygen, water, and light to the soil beneath it, but in the network of terrestrial phenomena it forms a simple node.

In contrast, human selves are fabulously complex nodes that mirror and interact with each other through a vast array of connections in constant flux. An image of these nodes of complexity is captured in a Buddhist metaphor called Indra’s net. Timothy Brook eloquently describes the idea:

When Indra fashioned the world, he made it as a web, and at every knot in the web is tied a pearl. Everything that exists, or has ever existed, every idea that can be thought about, every datum that is true — every dharma, in the language of Indian philosophy — is a pearl in Indra’s net. Not only is every pearl tied to every other pearl by virtue of the web on which they hang, but on the surface of every pearl is reflected every other jewel on the net. Everything that exists in Indra’s web implies all else that exists.

Beyond this feature of universal connectivity in which everything participates, our neural pearls are supercharged with complexity. This is where reality fizzes!

With the hubris of science in general, neuroscience is inclined to treat its exploration of self as a new frontier, but Buddhist philosophy has been mapping out this territory for about 2,500 years. As this theory of self was first exported to the West through translations and later through spiritual teachings, it often got expressed in pop culture as the idea that the self is non-existent — egolessness. But as Robert Thurman explains, it is not that we have no self but that self’s nature is relational. There is no self which exists outside the set of relationships within which it forms a complex, dynamic, ever-changing node.


Debunking the myth of intuition

In a DER SPIEGEL interview, Nobel Prize-winning psychologist Daniel Kahneman discusses the innate weakness of human thought, deceptive memories and the misleading power of intuition.

SPIEGEL: Professor Kahneman, you’ve spent your entire professional life studying the snares in which human thought can become entrapped. For example, in your book, you describe how easy it is to increase a person’s willingness to contribute money to the coffee fund.

Kahneman: You just have to make sure that the right picture is hanging above the cash box. If a pair of eyes is looking back at them from the wall, people will contribute twice as much as they do when the picture shows flowers. People who feel observed behave more morally.

SPIEGEL: And this also works if we don’t even pay attention to the photo on the wall?

Kahneman: All the more if you don’t notice it. The phenomenon is called “priming”: We aren’t aware that we have perceived a certain stimulus, but it can be proved that we still respond to it.

SPIEGEL: People in advertising will like that.

Kahneman: Of course, that’s where priming is in widespread use. An attractive woman in an ad automatically directs your attention to the name of the product. When you encounter it in the shop later on, it will already seem familiar to you.

SPIEGEL: Isn’t erotic association much more important?

Kahneman: Of course, there are other mechanisms of advertising that also act on the subconscious. But the main effect is simply that a name we see in a shop looks familiar — because, when it looks familiar, it looks good. There is a very good evolutionary explanation for that: If I encounter something many times, and it hasn’t eaten me yet, then I’m safe. Familiarity is a safety signal. That’s why we like what we know.

SPIEGEL: Can these insights also be applied to politics?

Kahneman: Of course. For example, one can show that anything that reminds people of their mortality makes them more obedient. [Continue reading…]
