Oliver Sacks, casting light on the interconnectedness of life

Michiko Kakutani writes: It’s no coincidence that so many of the qualities that made Oliver Sacks such a brilliant writer are the same qualities that made him an ideal doctor: keen powers of observation and a devotion to detail, deep reservoirs of sympathy, and an intuitive understanding of the fathomless mysteries of the human brain and the intricate connections between the body and the mind.

Dr. Sacks, who died on Sunday at 82, was a polymath and an ardent humanist, and whether he was writing about his patients, or his love of chemistry or the power of music, he leapfrogged among disciplines, shedding light on the strange and wonderful interconnectedness of life — the connections between science and art, physiology and psychology, the beauty and economy of the natural world and the magic of the human imagination.

In his writings, as he once said of his mentor, the great Soviet neuropsychologist and author A. R. Luria, “science became poetry.” [Continue reading…]

Over half of psychology studies fail reproducibility test

Nature reports: Don’t trust everything you read in the psychology literature. In fact, two thirds of it should probably be distrusted.

In the biggest project of its kind, Brian Nosek, a social psychologist and head of the Center for Open Science in Charlottesville, Virginia, and 269 co-authors repeated work reported in 98 original papers from three psychology journals, to see if they independently came up with the same results.

The studies they took on ranged from whether expressing insecurities perpetuates them to differences in how children and adults respond to fear stimuli, to effective ways to teach arithmetic. [Continue reading…]

Life’s stories

Julie Beck writes: In Paul Murray’s novel Skippy Dies, there’s a point where the main character, Howard, has an existential crisis. “‘It’s just not how I expected my life would be,’” he says.

“‘What did you expect?’” a friend responds.

“Howard ponders this. ‘I suppose—this sounds stupid, but I suppose I thought there’d be more of a narrative arc.’”

But it’s not stupid at all. Though perhaps the facts of someone’s life, presented end to end, wouldn’t much resemble a narrative to the outside observer, the way people choose to tell the stories of their lives, to others and — crucially — to themselves, almost always does have a narrative arc. In telling the story of how you became who you are, and of who you’re on your way to becoming, the story itself becomes a part of who you are.

“Life stories do not simply reflect personality. They are personality, or more accurately, they are important parts of personality, along with other parts, like dispositional traits, goals, and values,” writes Dan McAdams, a professor of psychology at Northwestern University, along with Erika Manczak, in a chapter for the APA Handbook of Personality and Social Psychology. [Continue reading…]

Move to prohibit psychologists from involvement in national security interrogations

The New York Times reports: The board of the American Psychological Association plans to recommend a tough ethics policy that would prohibit psychologists from involvement in all national security interrogations, potentially creating a new obstacle to the Obama administration’s efforts to detain and interrogate terrorism suspects outside of the traditional criminal justice system.

The board of the A.P.A., the nation’s largest professional organization for psychologists, is expected to recommend that members approve the ban at its annual meeting in Toronto next week, according to two members, Nadine Kaslow and Susan H. McDaniel, the group’s president-elect. The board’s proposal would make it a violation of the association’s ethical policies for psychologists to play a role in national security interrogations involving any military or intelligence personnel, even the noncoercive interrogations now conducted by the Obama administration. The board’s proposal must be voted on and approved by the members’ council to become a policy.

The board’s recommendation is a response to a report from earlier this month after an independent investigation into the involvement of prominent psychologists and association officials in the harsh interrogation programs operated by the C.I.A. and the Defense Department during the Bush administration. [Continue reading…]

Artificial neural networks on acid

Quartz reports: American sci-fi novelist Philip K. Dick once famously asked, Do Androids Dream of Electric Sheep? While he was on the right track, the answer appears to be, no, they don’t. They dream of dog-headed knights atop horses, of camel-birds and pig-snails, and of Dali-esque mutated landscapes.

Google’s image recognition software, which can detect, analyze, and even auto-caption images, uses artificial neural networks to simulate the human brain. In a process they’re calling “inceptionism,” Google engineers set out to see what these artificial networks “dream” of — what, if anything, do they see in a nondescript image of clouds, for instance? What does a fake brain that’s trained to detect images of dogs see when it’s shown a picture of a knight?

Google trains the software by feeding it millions of images, eventually teaching it to recognize specific objects within a picture. When it’s fed an image, it is asked to emphasize the object in the image that it recognizes. The network is made up of layers — the higher the layer, the more precise the interpretation. Eventually, in the final output layer, the network makes a “decision” as to what’s in the image.

But the networks aren’t restricted to only identifying images. Their training allows them to generate images as well. [Continue reading…]
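
As a rough, hedged illustration of the two ideas in the excerpt above (a layered network whose final layer makes the “decision,” and “dreaming” by nudging an input image so the feature layers respond more strongly), here is a minimal Python/PyTorch sketch. The tiny network, the layer being amplified, and the step count are illustrative assumptions; Google’s actual inceptionism work ran on large pretrained models such as GoogLeNet, not on code like this.

```python
# Minimal sketch (illustrative assumptions, not Google's pipeline):
# a toy layered classifier plus "inceptionism"-style generation via
# gradient ascent on the input image.
import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Lower layers pick up simple patterns; later layers build on them.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Final output layer: the network's "decision" over object classes.
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.pool(self.features(x)).flatten(1))

model = TinyClassifier().eval()
for p in model.parameters():      # freeze the network itself
    p.requires_grad_(False)

# Start from a nondescript image and repeatedly adjust its pixels so the
# feature layers respond more strongly (the "dreaming" step).
image = torch.rand(1, 3, 64, 64, requires_grad=True)
optimizer = torch.optim.Adam([image], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    loss = -model.features(image).norm()   # ascend on activation strength
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        image.clamp_(0.0, 1.0)             # keep pixel values valid
```

In the real system, amplifying higher layers is what yields the article’s dog-headed knights and camel-birds, while amplifying lower layers tends to produce ornamental strokes and textures.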

Even atheists intuitively believe in a creator

Tom Jacobs writes: Since the discoveries of Darwin, evidence has gradually mounted refuting the notion that the natural world is the product of a deity or other outside designer. Yet this idea remains firmly lodged in the human brain.

Just how firmly is the subject of newly published research, which finds even self-proclaimed atheists instinctively think of natural phenomena as being purposefully created.

The findings “suggest that there is a deeply rooted natural tendency to view nature as designed,” writes a research team led by Elisa Järnfelt of Newman University. They also provide evidence that, in the researchers’ words, “religious non-belief is cognitively effortful.” [Continue reading…]

Extreme athletes gain control through fear – and sometimes pay the price

By Tim Woodman, Bangor University; Lew Hardy, Bangor University, and Matthew Barlow, Bangor University

The death of famed “daredevil” climber and base jumper Dean Potter has once again raised the idea that all high-risk sportspeople are hedonistic thrill seekers. Our research into extreme athletes shows this view is simplistic and wrong.

It’s about attitudes to risk. In his famous Moon speech in 1962, John F Kennedy said:

Many years ago the great British explorer George Mallory, who was to die on Mount Everest, was asked [by a New York Times journalist] why did he want to climb it. He said, ‘Because it is there.’ Well, space is there, and we’re going to climb it, and the moon and the planets are there, and new hopes for knowledge and peace are there …

Humans have evolved through taking risks. In fact, most human actions can be conceptualised as containing an element of risk: as we take our first step, we risk falling down; as we try a new food, we risk being disgusted; as we ride a bicycle, we risk falling over; as we go on a date, we risk being rejected; and as we travel to the moon, we risk not coming back.

Human endeavour and risk are intertwined. So it is not surprising that despite the increasingly risk-averse society that we live in, many people crave danger and risk – a life less sanitised.

[Read more…]

The science of scarcity

Harvard Magazine: Toward the end of World War II, while thousands of Europeans were dying of hunger, 36 men at the University of Minnesota volunteered for a study that would send them to the brink of starvation. Allied troops advancing into German-occupied territories with supplies and food were encountering droves of skeletal people they had no idea how to safely renourish, and researchers at the university had designed a study they hoped might reveal the best methods of doing so. But first, their volunteers had to agree to starve.

The physical toll on these men was alarming: their metabolism slowed by 40 percent; sitting on atrophied muscles became painful; though their limbs were skeletal, their fluid-filled bellies looked curiously stout. But researchers also observed disturbing mental effects they hadn’t expected: obsessions about cookbooks and recipes developed; men with no previous interest in food thought — and talked — about nothing else. Overwhelming, uncontrollable thoughts had taken over, and as one participant later recalled, “Food became the one central and only thing really in one’s life.” There was no room left for anything else.

Though these odd behaviors were just a footnote in the original Minnesota study, to professor of economics Sendhil Mullainathan, who works on contemporary issues of poverty, they were among the most intriguing findings. Nearly 70 years after publication, that “footnote” showed something remarkable: scarcity had stolen more than flesh and muscle. It had captured the starving men’s minds.

Mullainathan is not a psychologist, but he has long been fascinated by how the mind works. As a behavioral economist, he looks at how people’s mental states and social and physical environments affect their economic actions. Research like the Minnesota study raised important questions: What happens to our minds — and our decisions — when we feel we have too little of something? Why, in the face of scarcity, do people so often make seemingly irrational, even counter-productive decisions? And if this is true in large populations, why do so few policies and programs take it into account?

In 2008, Mullainathan joined Eldar Shafir, Tod professor of psychology and public affairs at Princeton, to write a book exploring these questions. Scarcity: Why Having Too Little Means So Much (2013) presented years of findings from the fields of psychology and economics, as well as new empirical research of their own. Based on their analysis of the data, they sought to show that, just as food had possessed the minds of the starving volunteers in Minnesota, scarcity steals mental capacity wherever it occurs—from the hungry, to the lonely, to the time-strapped, to the poor.

That’s a phenomenon well-documented by psychologists: if the mind is focused on one thing, other abilities and skills — attention, self-control, and long-term planning — often suffer. Like a computer running multiple programs, Mullainathan and Shafir explain, our mental processors begin to slow down. We don’t lose any inherent capacities, just the ability to access the full complement ordinarily available for use.

But what’s most striking — and in some circles, controversial — about their work is not what they reveal about the effects of scarcity. It’s their assertion that scarcity affects anyone in its grip. Their argument: qualities often considered part of someone’s basic character — impulsive behavior, poor performance in school, poor financial decisions — may in fact be the products of a pervasive feeling of scarcity. And when that feeling is constant, as it is for people mired in poverty, it captures and compromises the mind.

This is one of scarcity’s most insidious effects, they argue: creating mindsets that rarely consider long-term best interests. “To put it bluntly,” says Mullainathan, “if I made you poor tomorrow, you’d probably start behaving in many of the same ways we associate with poor people.” And just like many poor people, he adds, you’d likely get stuck in the scarcity trap. [Continue reading…]

These are the memories you’re most likely to get wrong

Jennifer Talarico writes: It isn’t surprising that many Bostonians have vivid memories of the 2013 Marathon bombing, or that many New Yorkers have very clear memories about where they were and what they were doing on 9/11.

But many individuals who were not onsite for these attacks, or not even in Boston on Apr. 15, 2013 or in New York on Sept. 11, 2001, also have vivid memories of how they learned about these events. Why would people who were not immediately or directly affected have such a long-lasting sense of knowing exactly where they were and what they were doing when they heard the news?

These recollections are called flashbulb memories. In a flashbulb memory, we recall the experience of learning about an event, not the factual details of the event itself.

There might be an advantage to recalling the elements of important events that happen to us or to those close to us, but there appears to be little benefit to recalling our experience hearing this kind of news. So why does learning about a big event create such vivid memories? And just how accurate are flashbulb memories? [Continue reading…]

Searching the web creates an illusion of knowledge

Tom Jacobs writes: Surely you have noticed: A lot of people who have no idea what they are talking about are oddly certain of their superior knowledge. While this disconnect has been a problem throughout human history, new research suggests a ubiquitous feature of our high-tech world — the Internet — has made matters much worse.

In a series of studies, a Yale University research team led by psychologist Matthew Fisher shows that people who search for information on the Web emerge from the process with an inflated sense of how much they know — even regarding topics that are unrelated to the ones they Googled.

This illusion of knowledge appears to be “driven by the act of searching itself,” they write in the Journal of Experimental Psychology: General. Apparently conflating seeking information online with racking one’s brain, people consistently mistake “outsourced knowledge for internal knowledge.” [Continue reading…]

The enigma of survival

Sally Satel writes: The evil hour descended on David Morris in the summer of 2009. The former marine and war reporter was in a theater watching a movie with his then girlfriend and suddenly found himself pacing the lobby with no memory of having left his seat. Later, his girlfriend explained that Morris had fled after an explosion occurred onscreen.

He began having dreams of his buddies being ripped apart. When awake, he would imagine innocent items—an apple or a container of Chinese takeout—blowing up. Pathological vigilance took root: “Preparing for bed was like getting ready for a night patrol.” The dreams persisted. “Part of me,” he admits, “was ashamed of the dreams, of the realization that I was trapped inside a cliché: the veteran so obsessed with his own past that even his unconscious made love to it every night.”

Post-traumatic stress disorder is the subject of two new books, one by Morris and another by war reporter Mac McClelland. The symptoms are crippling: relentless nightmares, unbidden waking images, hyperarousal, sleeplessness, and phobias. As a diagnosis, it has existed colloquially for generations—“shell shock” is one name that survives in the modern idiom—and it has particular resonance because of this generation’s wars. (Most soldiers are spared it, though the public tends to think they are not. A 2012 poll found that most people believe that most post-9/11 veterans suffer from PTSD. The actual rate has been estimated at between two and 17 percent.)

Morris thinks the symptoms—a body and mind reacting in fear long after the threat to life and limb is gone—hardly encompass the experience of PTSD. Historically, we might have sought out not only shrinks but also “poetry, our families, or the clergy for solace post horror.” Profitably, Morris turns to everyone: the Greeks, the great poets of World War I, historians, anthropologists, and yes, psychiatrists and psychologists.

From such wide consultation comes a masterful synthesis. The Evil Hours interweaves memoir with a cultural history of war’s psychic aftermath. Morris chronicles the development of PTSD as an official diagnosis and its earlier incarnations in other wars. From Homer’s Odyssey to the venerated war poets, from the crusade for recognition by organized psychiatry to the modern science of fear and resilience, Morris gives a sweeping view of the condition, illuminated by meditation on sacrifice and danger and, in his words, “the enigma of survival.” [Continue reading…]

An integrated model of creativity and personality

Scott Barry Kaufman writes: Psychologists Guillaume Furst, Paolo Ghisletta and Todd Lubart present an integrative model of creativity and personality that is deeply grounded in past research on the personality of creative people.

Bringing together lots of different research threads over the years, they identified three “super-factors” of personality that predict creativity: Plasticity, Divergence, and Convergence.

Plasticity consists of the personality traits openness to experience, extraversion, high energy, and inspiration. The common factor here is high drive for exploration, and those high in this super-factor of personality tend to have a lot of dopamine — “the neuromodulator of exploration” — coursing through their brains. Prior research has shown a strong link between Plasticity and creativity, especially in the arts.

Divergence consists of non-conformity, impulsivity, low agreeableness, and low conscientiousness. People high in divergence may seem like jerks, but they are often just very independent thinkers. This super-factor is close to Hans Eysenck’s concept of “Psychoticism.” Throughout his life, Eysenck argued that these non-conforming characteristics were important contributors to high creative achievements.

Finally, Convergence consists of high conscientiousness, precision, persistence, and critical sense. While not typically included in discussions of creativity, these characteristics are also important contributors to the creative process. [Continue reading…]

Stoicism — a philosophy of gratitude

Lary Wallace writes: We do this to our philosophies. We redraft their contours based on projected shadows, or give them a cartoonish shape like a caricaturist emphasising all the wrong features. This is how Buddhism becomes, in the popular imagination, a doctrine of passivity and even laziness, while Existentialism becomes synonymous with apathy and futile despair. Something similar has happened to Stoicism, which is considered – when considered at all – a philosophy of grim endurance, of carrying on rather than getting over, of tolerating rather than transcending life’s agonies and adversities.

No wonder it’s not more popular. No wonder the Stoic sage, in Western culture, has never obtained the popularity of the Zen master. Even though Stoicism is far more accessible, not only does it lack the exotic mystique of Eastern practice; it’s also regarded as a philosophy of merely breaking even while remaining determinedly impassive. What this attitude ignores is the promise proffered by Stoicism of lasting transcendence and imperturbable tranquility.

It ignores gratitude, too. This is part of the tranquility, because it’s what makes the tranquility possible. Stoicism is, as much as anything, a philosophy of gratitude – and a gratitude, moreover, rugged enough to endure anything. Philosophers who pine for supreme psychological liberation have often failed to realise that they belong to a confederacy that includes the Stoics. [Continue reading…]

The art of not trying

John Tierney writes: Just be yourself.

The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.

But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?

It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.

He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.

Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]

The conception of perception shaped by context

How Darkness Visible shined a light

Peter Fulham writes: Twenty-five years ago, in December, 1989, Darkness Visible, William Styron’s account of his descent into the depths of clinical depression and back, appeared in Vanity Fair. The piece revealed in unsparing detail how Styron’s lifelong melancholy at once gave way to a seductive urge to end his own life. A few months later, he released the essay as a book, augmenting the article with a recollection of when the illness first took hold of him: in Paris, as he was about to accept the 1985 Prix mondial Cino Del Duca, the French literary award. By the author’s own acknowledgement, the response from readers was unprecedented. “This was just overwhelming. It was just by the thousands that the letters came in,” he told Charlie Rose. “I had not really realized that it was going to touch that kind of a nerve.”

Styron may have been startled by the outpouring of mail, but in many ways, it’s easy to understand. The academic research on mental illness at the time was relatively comprehensive, but no one to date had offered the kind of report that Styron gave to the public: a firsthand account of what it’s like to have the monstrous condition overtake you. He also exposed the inadequacy of the word itself, which is still used interchangeably to describe a case of the blues, rather than the tempestuous agony sufferers know too well.

Depression is notoriously hard to describe, but Styron managed to split the atom. “I’d feel the horror, like some poisonous fogbank, roll in upon my mind,” he wrote in one chapter. In another: “It is not an immediately identifiable pain, like that of a broken limb. It may be more accurate to say that despair… comes to resemble the diabolical discomfort of being imprisoned in a fiercely overheated room. And because no breeze stirs this cauldron… it is entirely natural that the victim begins to think ceaselessly of oblivion.”

As someone who has fought intermittently with the same illness since college, I found those sentences cathartic, just as I suspect they were for the many readers who wrote to Styron disclosing unequivocally that he had saved their lives. As brutal as depression can be, one of the main ways a person can restrain it is through solidarity. You are not alone, Styron reminded his readers, and the fog will lift. Patience is paramount. [Continue reading…]

Why moral character is the key to personal identity

Nina Strohminger writes: One morning after her accident, a woman I’ll call Kate awoke in a daze. She looked at the man next to her in bed. He resembled her husband, with the same coppery beard and freckles dusted across his shoulders. But this man was definitely not her husband.

Panicked, she packed a small bag and headed to her psychiatrist’s office. On the bus, there was a man she had been encountering with increasing frequency over the past several weeks. The man was clever; he was a spy. He always appeared in a different form: one day as a little girl in a sundress, another time as a bike courier who smirked at her knowingly. She explained these bizarre developments to her doctor, who was quickly becoming one of the last voices in this world she could trust. But as he spoke, her stomach sank with a dreaded realisation: this man, too, was an impostor.

Kate has Capgras syndrome, the unshakeable belief that someone – often a loved one, sometimes oneself – has been replaced with an exact replica. She also has Fregoli syndrome, the delusion that the same person is taking on a variety of shapes, like an actor donning an expert disguise. Capgras and Fregoli delusions offer hints about an extraordinary cognitive mechanism active in the healthy mind, a mechanism so exquisitely tuned that we are hardly ever aware of it. This mechanism ascribes to each person a unique identity, and then meticulously tracks and updates it. This mechanism is crucial to virtually every human interaction, from navigating a party to navigating a marriage. Without it, we quickly fall apart. [Continue reading…]

Gossip makes human society possible

Julie Beck writes: While gossiping is a behavior that has long been frowned upon, perhaps no one has frowned quite so intensely as the 16th- and 17th-century British. Back then, gossips, or “scolds,” were sometimes forced to wear a menacing iron cage on their heads, called the “branks” or “scold’s bridle.” These masks purportedly had iron spikes or bits that went in the mouth and prevented the wearer from speaking. (And of course, of course, this ghastly punishment seems to have been mostly for women who were talking too much.)

Today, people who gossip are still not very well-liked, though we tend to resist the urge to cage their heads. Progress. And yet the reflexive distaste people feel for gossip and those who gossip in general is often nowhere to be found when people find themselves actually faced with a juicy morsel about someone they know. Social topics—personal relationships, likes and dislikes, anecdotes about social activities—made up about two-thirds of all conversations in analyses done by evolutionary psychologist Robin Dunbar. The remaining one-third of conversation time was devoted to discussing everything else: sports, music, politics, etc.

“Language in freely forming natural conversations is principally used for the exchange of social information,” Dunbar writes. “That such topics are so overwhelmingly important to us suggests that this is a primary function of language.” He even goes so far as to say: “Gossip is what makes human society as we know it possible.”

In recent years, research on the positive effects of gossip has proliferated. Rather than just a means to humiliate people and make them cry in the bathroom, gossip is now being considered by scientists as a way to learn about cultural norms, bond with others, promote cooperation, and even, as one recent study found, allow individuals to gauge their own success and social standing. [Continue reading…]
