Why futurism has a cultural blindspot

Tom Vanderbilt writes: In early 1999, during the halftime of a University of Washington basketball game, a time capsule from 1927 was opened. Among the contents of this portal to the past were some yellowing newspapers, a Mercury dime, a student handbook, and a building permit. The crowd promptly erupted into boos. One student declared the items “dumb.”

Such disappointment in time capsules seems to be endemic, suggests William E. Jarvis in his book Time Capsules: A Cultural History. A headline from The Onion, he notes, sums it up: “Newly unearthed time capsule just full of useless old crap.” Time capsules, after all, exude a kind of pathos: They show us that the future was not quite as advanced as we thought it would be, nor did it come as quickly. The past, meanwhile, turns out not to be as radically distinct as we thought.

In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.

These observations apply neatly to technology. We don’t have the personal flying cars we predicted we would have. Coal, notes the historian David Edgerton in his book The Shock of the Old, was a bigger source of power at the dawn of the 21st century than in sooty 1900; steam was more significant in 1900 than in 1800.

But when it comes to culture we tend to believe not that the future will be very different than the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?

Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,” people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.” [Continue reading…]


Is there anything wrong with men who cry?

Sandra Newman writes: One of our most firmly entrenched ideas of masculinity is that men don’t cry. Although he might shed a discreet tear at a funeral, and it’s acceptable for him to well up when he slams his fingers in a car door, a real man is expected to quickly regain control. Sobbing openly is strictly for girls.

This isn’t just a social expectation; it’s a scientific fact. All the research to date finds that women cry significantly more than men. A meta-study by the German Society of Ophthalmology in 2009 found that women weep, on average, five times as often, and almost twice as long per episode. The discrepancy is such a commonplace that we tend to assume it’s biologically hard-wired; that, whether you like it or not, this is one gender difference that isn’t going away.

But actually, the gender gap in crying seems to be a recent development. Historical and literary evidence suggests that, in the past, not only did men cry in public, but no one saw it as feminine or shameful. In fact, male weeping was regarded as normal in almost every part of the world for most of recorded history. [Continue reading…]


The dangerous idea that life is a story

Galen Strawson writes: ‘Each of us constructs and lives a “narrative”,’ wrote the British neurologist Oliver Sacks, ‘this narrative is us’. Likewise the American cognitive psychologist Jerome Bruner: ‘Self is a perpetually rewritten story.’ And: ‘In the end, we become the autobiographical narratives by which we “tell about” our lives.’ Or a fellow American psychologist, Dan P McAdams: ‘We are all storytellers, and we are the stories we tell.’ And here’s the American moral philosopher J David Velleman: ‘We invent ourselves… but we really are the characters we invent.’ And, for good measure, another American philosopher, Daniel Dennett: ‘we are all virtuoso novelists, who find ourselves engaged in all sorts of behaviour… and we always put the best “faces” on it we can. We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character at the centre of that autobiography is one’s self.’

So say the narrativists. We story ourselves and we are our stories. There’s a remarkably robust consensus about this claim, not only in the humanities but also in psychotherapy. It’s standardly linked with the idea that self-narration is a good thing, necessary for a full human life.

I think it’s false – false that everyone stories themselves, and false that it’s always a good thing. These are not universal human truths – even when we confine our attention to human beings who count as psychologically normal, as I will here. They’re not universal human truths even if they’re true of some people, or even many, or most. The narrativists are, at best, generalising from their own case, in an all-too-human way. At best: I doubt that what they say is an accurate description even of themselves. [Continue reading…]


Oliver Sacks, casting light on the interconnectedness of life

Michiko Kakutani writes: It’s no coincidence that so many of the qualities that made Oliver Sacks such a brilliant writer are the same qualities that made him an ideal doctor: keen powers of observation and a devotion to detail, deep reservoirs of sympathy, and an intuitive understanding of the fathomless mysteries of the human brain and the intricate connections between the body and the mind.

Dr. Sacks, who died on Sunday at 82, was a polymath and an ardent humanist, and whether he was writing about his patients, or his love of chemistry or the power of music, he leapfrogged among disciplines, shedding light on the strange and wonderful interconnectedness of life — the connections between science and art, physiology and psychology, the beauty and economy of the natural world and the magic of the human imagination.

In his writings, as he once said of his mentor, the great Soviet neuropsychologist and author A. R. Luria, “science became poetry.” [Continue reading…]


Over half of psychology studies fail reproducibility test

Nature reports: Don’t trust everything you read in the psychology literature. In fact, two thirds of it should probably be distrusted.

In the biggest project of its kind, Brian Nosek, a social psychologist and head of the Center for Open Science in Charlottesville, Virginia, and 269 co-authors repeated work reported in 98 original papers from three psychology journals, to see if they independently came up with the same results.

The studies they took on ranged from whether expressing insecurities perpetuates them to differences in how children and adults respond to fear stimuli, to effective ways to teach arithmetic. [Continue reading…]


Life’s stories

Julie Beck writes: In Paul Murray’s novel Skippy Dies, there’s a point where the main character, Howard, has an existential crisis. “‘It’s just not how I expected my life would be,’” he says.

“‘What did you expect?’” a friend responds.

“Howard ponders this. ‘I suppose—this sounds stupid, but I suppose I thought there’d be more of a narrative arc.’”

But it’s not stupid at all. Though perhaps the facts of someone’s life, presented end to end, wouldn’t much resemble a narrative to the outside observer, the way people choose to tell the stories of their lives, to others and — crucially — to themselves, almost always does have a narrative arc. In telling the story of how you became who you are, and of who you’re on your way to becoming, the story itself becomes a part of who you are.

“Life stories do not simply reflect personality. They are personality, or more accurately, they are important parts of personality, along with other parts, like dispositional traits, goals, and values,” writes Dan McAdams, a professor of psychology at Northwestern University, along with Erika Manczak, in a chapter for the APA Handbook of Personality and Social Psychology. [Continue reading…]


Move to prohibit psychologists from involvement in national security interrogations

The New York Times reports: The board of the American Psychological Association plans to recommend a tough ethics policy that would prohibit psychologists from involvement in all national security interrogations, potentially creating a new obstacle to the Obama administration’s efforts to detain and interrogate terrorism suspects outside of the traditional criminal justice system.

The board of the A.P.A., the nation’s largest professional organization for psychologists, is expected to recommend that members approve the ban at its annual meeting in Toronto next week, according to two members, Nadine Kaslow and Susan H. McDaniel, the group’s president-elect. The board’s proposal would make it a violation of the association’s ethical policies for psychologists to play a role in national security interrogations involving any military or intelligence personnel, even the noncoercive interrogations now conducted by the Obama administration. The board’s proposal must be voted on and approved by the members’ council to become a policy.

The board’s recommendation is a response to a report from earlier this month after an independent investigation into the involvement of prominent psychologists and association officials in the harsh interrogation programs operated by the C.I.A. and the Defense Department during the Bush administration. [Continue reading…]


Artificial neural networks on acid

Quartz reports: American sci-fi novelist Philip K. Dick once famously asked, Do Androids Dream of Electric Sheep? While he was on the right track, the answer appears to be, no, they don’t. They dream of dog-headed knights atop horses, of camel-birds and pig-snails, and of Dali-esque mutated landscapes.

Google’s image recognition software, which can detect, analyze, and even auto-caption images, uses artificial neural networks to simulate the human brain. In a process they’re calling “inceptionism,” Google engineers set out to see what these artificial networks “dream” of — what, if anything, do they see in a nondescript image of clouds, for instance? What does a fake brain that’s trained to detect images of dogs see when it’s shown a picture of a knight?

Google trains the software by feeding it millions of images, eventually teaching it to recognize specific objects within a picture. When it’s fed an image, it is asked to emphasize the object in the image that it recognizes. The network is made up of layers — the higher the layer, the more precise the interpretation. Eventually, in the final output layer, the network makes a “decision” as to what’s in the image.

But the networks aren’t restricted to only identifying images. Their training allows them to generate images as well. [Continue reading…]
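
What the excerpt describes as asking the network to emphasize what it recognizes is, in engineering terms, gradient ascent on the input image: rather than updating the network’s weights, the image itself is nudged until the activations of a chosen layer grow stronger. Here is a minimal sketch of that idea, assuming PyTorch and torchvision are available; the GoogLeNet model, the layer choice, the step size, and the dream_step helper are all illustrative assumptions, not Google’s actual inceptionism code.

```python
# Illustrative sketch of "amplify whatever the chosen layer responds to".
# Assumes PyTorch + torchvision; this is NOT Google's original inceptionism code.
import torch
import torchvision.models as models

model = models.googlenet(pretrained=True).eval()  # a pretrained GoogLeNet as a stand-in

# Capture the activations of one intermediate layer with a forward hook.
activations = {}
def save_activation(module, inputs, output):
    activations["layer"] = output

model.inception4c.register_forward_hook(save_activation)  # layer choice is arbitrary here

def dream_step(img, step_size=0.01):
    """Nudge the image so the hooked layer's activations grow stronger."""
    img = img.clone().detach().requires_grad_(True)
    model(img)                              # forward pass fills activations["layer"]
    loss = activations["layer"].norm()      # "emphasize what you recognize"
    loss.backward()                         # gradient of that signal w.r.t. the image
    with torch.no_grad():
        img = img + step_size * img.grad / (img.grad.abs().mean() + 1e-8)
    return img.detach()

# Hypothetical usage: start from noise (or a photo of clouds) and iterate.
img = torch.rand(1, 3, 224, 224)
for _ in range(20):
    img = dream_step(img)
```

Real DeepDream-style implementations add input normalization, multi-scale (octave) processing, and jitter; those refinements are omitted here for brevity.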


Even atheists intuitively believe in a creator

Tom Jacobs writes: Since the discoveries of Darwin, evidence has gradually mounted refuting the notion that the natural world is the product of a deity or other outside designer. Yet this idea remains firmly lodged in the human brain.

Just how firmly is the subject of newly published research, which finds even self-proclaimed atheists instinctively think of natural phenomena as being purposefully created.

The findings “suggest that there is a deeply rooted natural tendency to view nature as designed,” writes a research team led by Elisa Järnefelt of Newman University. They also provide evidence that, in the researchers’ words, “religious non-belief is cognitively effortful.” [Continue reading…]


Extreme athletes gain control through fear – and sometimes pay the price

By Tim Woodman, Bangor University; Lew Hardy, Bangor University, and Matthew Barlow, Bangor University

The death of famed “daredevil” climber and BASE jumper Dean Potter has once again raised the idea that all high-risk sportspeople are hedonistic thrill seekers. Our research into extreme athletes shows this view is simplistic and wrong.

It’s about attitudes to risk. In his famous Moon speech in 1962, John F Kennedy said:

Many years ago the great British explorer George Mallory, who was to die on Mount Everest, was asked [by a New York Times journalist] why did he want to climb it. He said, ‘Because it is there.’ Well, space is there, and we’re going to climb it, and the moon and the planets are there, and new hopes for knowledge and peace are there …

Humans have evolved through taking risks. In fact, most human actions can be conceptualised as containing an element of risk: as we take our first step, we risk falling down; as we try a new food, we risk being disgusted; as we ride a bicycle, we risk falling over; as we go on a date, we risk being rejected; and as we travel to the moon, we risk not coming back.

Human endeavour and risk are intertwined. So it is not surprising that despite the increasingly risk-averse society that we live in, many people crave danger and risk – a life less sanitised.

[Read more…]


The science of scarcity

Harvard Magazine: Toward the end of World War II, while thousands of Europeans were dying of hunger, 36 men at the University of Minnesota volunteered for a study that would send them to the brink of starvation. Allied troops advancing into German-occupied territories with supplies and food were encountering droves of skeletal people they had no idea how to safely renourish, and researchers at the university had designed a study they hoped might reveal the best methods of doing so. But first, their volunteers had to agree to starve.

The physical toll on these men was alarming: their metabolism slowed by 40 percent; sitting on atrophied muscles became painful; though their limbs were skeletal, their fluid-filled bellies looked curiously stout. But researchers also observed disturbing mental effects they hadn’t expected: obsessions about cookbooks and recipes developed; men with no previous interest in food thought — and talked — about nothing else. Overwhelming, uncontrollable thoughts had taken over, and as one participant later recalled, “Food became the one central and only thing really in one’s life.” There was no room left for anything else.

Though these odd behaviors were just a footnote in the original Minnesota study, to professor of economics Sendhil Mullainathan, who works on contemporary issues of poverty, they were among the most intriguing findings. Nearly 70 years after publication, that “footnote” showed something remarkable: scarcity had stolen more than flesh and muscle. It had captured the starving men’s minds.

Mullainathan is not a psychologist, but he has long been fascinated by how the mind works. As a behavioral economist, he looks at how people’s mental states and social and physical environments affect their economic actions. Research like the Minnesota study raised important questions: What happens to our minds — and our decisions — when we feel we have too little of something? Why, in the face of scarcity, do people so often make seemingly irrational, even counter-productive decisions? And if this is true in large populations, why do so few policies and programs take it into account?

In 2008, Mullainathan joined Eldar Shafir, Tod professor of psychology and public affairs at Princeton, to write a book exploring these questions. Scarcity: Why Having Too Little Means So Much (2013) presented years of findings from the fields of psychology and economics, as well as new empirical research of their own. Based on their analysis of the data, they sought to show that, just as food had possessed the minds of the starving volunteers in Minnesota, scarcity steals mental capacity wherever it occurs—from the hungry, to the lonely, to the time-strapped, to the poor.

That’s a phenomenon well-documented by psychologists: if the mind is focused on one thing, other abilities and skills — attention, self-control, and long-term planning — often suffer. Like a computer running multiple programs, Mullainathan and Shafir explain, our mental processors begin to slow down. We don’t lose any inherent capacities, just the ability to access the full complement ordinarily available for use.

But what’s most striking — and in some circles, controversial — about their work is not what they reveal about the effects of scarcity. It’s their assertion that scarcity affects anyone in its grip. Their argument: qualities often considered part of someone’s basic character — impulsive behavior, poor performance in school, poor financial decisions — may in fact be the products of a pervasive feeling of scarcity. And when that feeling is constant, as it is for people mired in poverty, it captures and compromises the mind.

This is one of scarcity’s most insidious effects, they argue: creating mindsets that rarely consider long-term best interests. “To put it bluntly,” says Mullainathan, “if I made you poor tomorrow, you’d probably start behaving in many of the same ways we associate with poor people.” And just like many poor people, he adds, you’d likely get stuck in the scarcity trap. [Continue reading…]


These are the memories you’re most likely to get wrong

Jennifer Talarico writes: It isn’t surprising that many Bostonians have vivid memories of the 2013 Marathon bombing, or that many New Yorkers have very clear memories about where they were and what they were doing on 9/11.

But many individuals who were not onsite for these attacks, or not even in Boston on Apr. 15, 2013 or in New York on Sept. 11, 2001, also have vivid memories of how they learned about these events. Why would people who were not immediately or directly affected have such a long-lasting sense of knowing exactly where they were and what they were doing when they heard the news?

These recollections are called flashbulb memories. In a flashbulb memory, we recall the experience of learning about an event, not the factual details of the event itself.

There might be an advantage to recalling the elements of important events that happen to us or to those close to us, but there appears to be little benefit to recalling our experience hearing this kind of news. So why does learning about a big event create such vivid memories? And just how accurate are flashbulb memories? [Continue reading…]


Searching the web creates an illusion of knowledge

Tom Jacobs writes: Surely you have noticed: A lot of people who have no idea what they are talking about are oddly certain of their superior knowledge. While this disconnect has been a problem throughout human history, new research suggests a ubiquitous feature of our high-tech world — the Internet — has made matters much worse.

In a series of studies, a Yale University research team led by psychologist Matthew Fisher shows that people who search for information on the Web emerge from the process with an inflated sense of how much they know — even regarding topics that are unrelated to the ones they Googled.

This illusion of knowledge appears to be “driven by the act of searching itself,” they write in the Journal of Experimental Psychology: General. Apparently conflating seeking information online with racking one’s brain, people consistently mistake “outsourced knowledge for internal knowledge.” [Continue reading…]


The enigma of survival

Sally Satel writes: The evil hour descended on David Morris in the summer of 2009. The former marine and war reporter was in a theater watching a movie with his then girlfriend and suddenly found himself pacing the lobby with no memory of having left his seat. Later, his girlfriend explained that Morris had fled after an explosion occurred onscreen.

He began having dreams of his buddies being ripped apart. When awake, he would imagine innocent items—an apple or a container of Chinese takeout—blowing up. Pathological vigilance took root: “Preparing for bed was like getting ready for a night patrol.” The dreams persisted. “Part of me,” he admits, “was ashamed of the dreams, of the realization that I was trapped inside a cliché: the veteran so obsessed with his own past that even his unconscious made love to it every night.”

Post-traumatic stress disorder is the subject of two new books, one by Morris and another by war reporter Mac McClelland. The symptoms are crippling: relentless nightmares, unbidden waking images, hyperarousal, sleeplessness, and phobias. As a diagnosis, it has existed colloquially for generations—“shell shock” is one name that survives in the modern idiom—and it has particular resonance because of this generation’s wars. (Most soldiers are spared it, though the public tends to think they are not. A 2012 poll found that most people believe that most post-9/11 veterans suffer from PTSD. The actual rate has been estimated at between two and 17 percent.)

Morris thinks the symptoms—a body and mind reacting in fear long after the threat to life and limb is gone—hardly encompass the experience of PTSD. Historically, we might have sought out not only shrinks but also “poetry, our families, or the clergy for solace post horror.” Profitably, Morris turns to everyone: the Greeks, the great poets of World War I, historians, anthropologists, and yes, psychiatrists and psychologists.

From such wide consultation comes a masterful synthesis. The Evil Hours interweaves memoir with a cultural history of war’s psychic aftermath. Morris chronicles the development of PTSD as an official diagnosis and its earlier incarnations in other wars. From Homer’s Odyssey to the venerated war poets, from the crusade for recognition by organized psychiatry to the modern science of fear and resilience, Morris gives a sweeping view of the condition, illuminated by meditation on sacrifice and danger and, in his words, “the enigma of survival.” [Continue reading…]


An integrated model of creativity and personality

Scott Barry Kaufman writes: Psychologists Guillaume Furst, Paolo Ghisletta and Todd Lubart present an integrative model of creativity and personality that is deeply grounded in past research on the personality of creative people.

Bringing together lots of different research threads over the years, they identified three “super-factors” of personality that predict creativity: Plasticity, Divergence, and Convergence.

Plasticity consists of the personality traits openness to experience, extraversion, high energy, and inspiration. The common factor here is high drive for exploration, and those high in this super-factor of personality tend to have a lot of dopamine — “the neuromodulator of exploration” — coursing through their brains. Prior research has shown a strong link between Plasticity and creativity, especially in the arts.

Divergence consists of non-conformity, impulsivity, low agreeableness, and low conscientiousness. People high in divergence may seem like jerks, but they are often just very independent thinkers. This super-factor is close to Hans Eysenck’s concept of “Psychoticism.” Throughout his life, Eysenck argued that these non-conforming characteristics were important contributors to high creative achievements.

Finally, Convergence consists of high conscientiousness, precision, persistence, and critical sense. While not typically included in discussions of creativity, these characteristics are also important contributors to the creative process. [Continue reading…]


Stoicism — a philosophy of gratitude

Lary Wallace writes: We do this to our philosophies. We redraft their contours based on projected shadows, or give them a cartoonish shape like a caricaturist emphasising all the wrong features. This is how Buddhism becomes, in the popular imagination, a doctrine of passivity and even laziness, while Existentialism becomes synonymous with apathy and futile despair. Something similar has happened to Stoicism, which is considered – when considered at all – a philosophy of grim endurance, of carrying on rather than getting over, of tolerating rather than transcending life’s agonies and adversities.

No wonder it’s not more popular. No wonder the Stoic sage, in Western culture, has never obtained the popularity of the Zen master. Even though Stoicism is far more accessible, not only does it lack the exotic mystique of Eastern practice; it’s also regarded as a philosophy of merely breaking even while remaining determinedly impassive. What this attitude ignores is the promise proffered by Stoicism of lasting transcendence and imperturbable tranquility.

It ignores gratitude, too. This is part of the tranquility, because it’s what makes the tranquility possible. Stoicism is, as much as anything, a philosophy of gratitude – and a gratitude, moreover, rugged enough to endure anything. Philosophers who pine for supreme psychological liberation have often failed to realise that they belong to a confederacy that includes the Stoics. [Continue reading…]


The art of not trying

John Tierney writes: Just be yourself.

The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.

But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?

It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.

He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.

Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gatherer clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]


The conception of perception shaped by context