Category Archives: Psychology

Artificial neural networks on acid

Quartz reports: American sci-fi novelist Philip K. Dick once famously asked, Do Androids Dream of Electric Sheep? While he was on the right track, the answer appears to be, no, they don’t. They dream of dog-headed knights atop horses, of camel-birds and pig-snails, and of Dali-esque mutated landscapes.

Google’s image recognition software, which can detect, analyze, and even auto-caption images, uses artificial neural networks to simulate the human brain. In a process they’re calling “inceptionism,” Google engineers set out to see what these artificial networks “dream” of — what, if anything, do they see in a nondescript image of clouds, for instance? What does a fake brain that’s trained to detect images of dogs see when it’s shown a picture of a knight?

Google trains the software by feeding it millions of images, eventually teaching it to recognize specific objects within a picture. When it’s fed an image, it is asked to emphasize the object in the image that it recognizes. The network is made up of layers — the higher the layer, the more precise the interpretation. Eventually, in the final output layer, the network makes a “decision” as to what’s in the image.

But the networks aren’t restricted to only identifying images. Their training allows them to generate images as well. [Continue reading…]
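
The excerpt stops short of showing how “inceptionism” is actually done, but the underlying trick (run an image through the trained network, then nudge the pixels so that a chosen layer’s activations grow stronger) is simple enough to sketch. The sketch below is only an illustration under stated assumptions, not Google’s actual pipeline: the torchvision GoogLeNet stand-in, the inception4c layer choice, the step size, and the file names are all hypothetical.

    # Minimal "inceptionism"-style sketch: gradient ascent on a layer's activations.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    # A pretrained GoogLeNet stands in for Google's own image-recognition network.
    model = models.googlenet(weights="DEFAULT").eval()

    # Capture the activations of one intermediate ("higher") layer with a forward hook.
    activations = {}
    def save_activation(module, inputs, output):
        activations["layer"] = output
    model.inception4c.register_forward_hook(save_activation)  # hypothetical layer choice

    preprocess = T.Compose([T.Resize(224), T.ToTensor()])
    img = preprocess(Image.open("clouds.jpg")).unsqueeze(0)   # any nondescript photo
    img.requires_grad_(True)

    for _ in range(20):                        # a handful of gradient-ascent steps
        model(img)
        loss = activations["layer"].norm()     # "emphasize whatever you recognize"
        loss.backward()
        with torch.no_grad():
            img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
            img.grad.zero_()
            img.clamp_(0, 1)                   # keep pixels in a displayable range

    T.ToPILImage()(img.squeeze(0).detach()).save("dream.jpg")

In practice, hooking an early layer tends to amplify textures and edges, while deeper layers amplify object-like patterns, which is the layer hierarchy the excerpt describes: the higher the layer, the more object-like the “dream.”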

Even atheists intuitively believe in a creator

Tom Jacobs writes: Since the discoveries of Darwin, evidence has gradually mounted refuting the notion that the natural world is the product of a deity or other outside designer. Yet this idea remains firmly lodged in the human brain.

Just how firmly is the subject of newly published research, which finds even self-proclaimed atheists instinctively think of natural phenomena as being purposefully created.

The findings “suggest that there is a deeply rooted natural tendency to view nature as designed,” writes a research team led by Elisa Järnfelt of Newman University. They also provide evidence that, in the researchers’ words, “religious non-belief is cognitively effortful.” [Continue reading…]

Extreme athletes gain control through fear – and sometimes pay the price

By Tim Woodman, Bangor University; Lew Hardy, Bangor University, and Matthew Barlow, Bangor University

The death of famed “daredevil” climber and base jumper Dean Potter has once again raised the idea that all high-risk sportspeople are hedonistic thrill seekers. Our research into extreme athletes shows this view is simplistic and wrong.

It’s about attitudes to risk. In his famous Moon speech in 1962, John F Kennedy said:

Many years ago the great British explorer George Mallory, who was to die on Mount Everest, was asked [by a New York Times journalist] why did he want to climb it. He said, ‘Because it is there.’ Well, space is there, and we’re going to climb it, and the moon and the planets are there, and new hopes for knowledge and peace are there …

Humans have evolved through taking risks. In fact, most human actions can be conceptualised as containing an element of risk: as we take our first step, we risk falling down; as we try a new food, we risk being disgusted; as we ride a bicycle, we risk falling over; as we go on a date, we risk being rejected; and as we travel to the moon, we risk not coming back.

Human endeavour and risk are intertwined. So it is not surprising that despite the increasingly risk-averse society that we live in, many people crave danger and risk – a life less sanitised.

Continue reading

The science of scarcity

Harvard Magazine: Toward the end of World War II, while thousands of Europeans were dying of hunger, 36 men at the University of Minnesota volunteered for a study that would send them to the brink of starvation. Allied troops advancing into German-occupied territories with supplies and food were encountering droves of skeletal people they had no idea how to safely renourish, and researchers at the university had designed a study they hoped might reveal the best methods of doing so. But first, their volunteers had to agree to starve.

The physical toll on these men was alarming: their metabolism slowed by 40 percent; sitting on atrophied muscles became painful; though their limbs were skeletal, their fluid-filled bellies looked curiously stout. But researchers also observed disturbing mental effects they hadn’t expected: obsessions about cookbooks and recipes developed; men with no previous interest in food thought — and talked — about nothing else. Overwhelming, uncontrollable thoughts had taken over, and as one participant later recalled, “Food became the one central and only thing really in one’s life.” There was no room left for anything else.

Though these odd behaviors were just a footnote in the original Minnesota study, to professor of economics Sendhil Mullainathan, who works on contemporary issues of poverty, they were among the most intriguing findings. Nearly 70 years after publication, that “footnote” showed something remarkable: scarcity had stolen more than flesh and muscle. It had captured the starving men’s minds.

Mullainathan is not a psychologist, but he has long been fascinated by how the mind works. As a behavioral economist, he looks at how people’s mental states and social and physical environments affect their economic actions. Research like the Minnesota study raised important questions: What happens to our minds — and our decisions — when we feel we have too little of something? Why, in the face of scarcity, do people so often make seemingly irrational, even counter-productive decisions? And if this is true in large populations, why do so few policies and programs take it into account?

In 2008, Mullainathan joined Eldar Shafir, Tod professor of psychology and public affairs at Princeton, to write a book exploring these questions. Scarcity: Why Having Too Little Means So Much (2013) presented years of findings from the fields of psychology and economics, as well as new empirical research of their own. Based on their analysis of the data, they sought to show that, just as food had possessed the minds of the starving volunteers in Minnesota, scarcity steals mental capacity wherever it occurs—from the hungry, to the lonely, to the time-strapped, to the poor.

That’s a phenomenon well-documented by psychologists: if the mind is focused on one thing, other abilities and skills — attention, self-control, and long-term planning — often suffer. Like a computer running multiple programs, Mullainathan and Shafir explain, our mental processors begin to slow down. We don’t lose any inherent capacities, just the ability to access the full complement ordinarily available for use.

But what’s most striking — and in some circles, controversial — about their work is not what they reveal about the effects of scarcity. It’s their assertion that scarcity affects anyone in its grip. Their argument: qualities often considered part of someone’s basic character — impulsive behavior, poor performance in school, poor financial decisions — may in fact be the products of a pervasive feeling of scarcity. And when that feeling is constant, as it is for people mired in poverty, it captures and compromises the mind.

This is one of scarcity’s most insidious effects, they argue: creating mindsets that rarely consider long-term best interests. “To put it bluntly,” says Mullainathan, “if I made you poor tomorrow, you’d probably start behaving in many of the same ways we associate with poor people.” And just like many poor people, he adds, you’d likely get stuck in the scarcity trap. [Continue reading…]

These are the memories you’re most likely to get wrong

Jennifer Talarico writes: It isn’t surprising that many Bostonians have vivid memories of the 2013 Marathon bombing, or that many New Yorkers have very clear memories about where they were and what they were doing on 9/11.

But many individuals who were not onsite for these attacks, or not even in Boston on Apr. 15, 2013 or in New York on Sept. 11, 2001, also have vivid memories of how they learned about these events. Why would people who were not immediately or directly affected have such a long-lasting sense of knowing exactly where they were and what they were doing when they heard the news?

These recollections are called flashbulb memories. In a flashbulb memory, we recall the experience of learning about an event, not the factual details of the event itself.

There might be an advantage to recalling the elements of important events that happen to us or to those close to us, but there appears to be little benefit to recalling our experience hearing this kind of news. So why does learning about a big event create such vivid memories? And just how accurate are flashbulb memories? [Continue reading…]

Searching the web creates an illusion of knowledge

Tom Jacobs writes: Surely you have noticed: A lot of people who have no idea what they are talking about are oddly certain of their superior knowledge. While this disconnect has been a problem throughout human history, new research suggests a ubiquitous feature of our high-tech world — the Internet — has made matters much worse.

In a series of studies, a Yale University research team led by psychologist Matthew Fisher shows that people who search for information on the Web emerge from the process with an inflated sense of how much they know — even regarding topics that are unrelated to the ones they Googled.

This illusion of knowledge appears to be “driven by the act of searching itself,” they write in the Journal of Experimental Psychology: General. Apparently conflating seeking information online with racking one’s brain, people consistently mistake “outsourced knowledge for internal knowledge.” [Continue reading…]

The enigma of survival

Sally Satel writes: The evil hour descended on David Morris in the summer of 2009. The former marine and war reporter was in a theater watching a movie with his then girlfriend and suddenly found himself pacing the lobby with no memory of having left his seat. Later, his girlfriend explained that Morris had fled after an explosion occurred onscreen.

He began having dreams of his buddies being ripped apart. When awake, he would imagine innocent items—an apple or a container of Chinese takeout—blowing up. Pathological vigilance took root: “Preparing for bed was like getting ready for a night patrol.” The dreams persisted. “Part of me,” he admits, “was ashamed of the dreams, of the realization that I was trapped inside a cliché: the veteran so obsessed with his own past that even his unconscious made love to it every night.”

Post-traumatic stress disorder is the subject of two new books, one by Morris and another by war reporter Mac McClelland. The symptoms are crippling: relentless nightmares, unbidden waking images, hyperarousal, sleeplessness, and phobias. As a diagnosis, it has existed colloquially for generations—“shell shock” is one name that survives in the modern idiom—and it has particular resonance because of this generation’s wars. (Most soldiers are spared it, though the public tends to think they are not. A 2012 poll found that most people believe that most post-9/11 veterans suffer from PTSD. The actual rate has been estimated at between two and 17 percent.)

Morris thinks the symptoms—a body and mind reacting in fear long after the threat to life and limb is gone—hardly encompass the experience of PTSD. Historically, we might have sought out not only shrinks but also “poetry, our families, or the clergy for solace post horror.” Profitably, Morris turns to everyone: the Greeks, the great poets of World War I, historians, anthropologists, and yes, psychiatrists and psychologists.

From such wide consultation comes a masterful synthesis. The Evil Hours interweaves memoir with a cultural history of war’s psychic aftermath. Morris chronicles the development of PTSD as an official diagnosis and its earlier incarnations in other wars. From Homer’s Odyssey to the venerated war poets, from the crusade for recognition by organized psychiatry to the modern science of fear and resilience, Morris gives a sweeping view of the condition, illuminated by meditation on sacrifice and danger and, in his words, “the enigma of survival.” [Continue reading…]

An integrated model of creativity and personality

Scott Barry Kaufman writes: Psychologists Guillaume Furst, Paolo Ghisletta and Todd Lubart present an integrative model of creativity and personality that is deeply grounded in past research on the personality of creative people.

Bringing together lots of different research threads over the years, they identified three “super-factors” of personality that predict creativity: Plasticity, Divergence, and Convergence.

Plasticity consists of the personality traits openness to experience, extraversion, high energy, and inspiration. The common factor here is high drive for exploration, and those high in this super-factor of personality tend to have a lot of dopamine — “the neuromodulator of exploration” — coursing through their brains. Prior research has shown a strong link between Plasticity and creativity, especially in the arts.

Divergence consists of non-conformity, impulsivity, low agreeableness, and low conscientiousness. People high in divergence may seem like jerks, but they are often just very independent thinkers. This super-factor is close to Hans Eysenck’s concept of “Psychoticism.” Throughout his life, Eysenck argued that these non-conforming characteristics were important contributors to high creative achievements.

Finally, Convergence consists of high conscientiousness, precision, persistence, and critical sense. While not typically included in discussions of creativity, these characteristics are also important contributors to the creative process. [Continue reading…]

Stoicism — a philosophy of gratitude

Lary Wallace writes: We do this to our philosophies. We redraft their contours based on projected shadows, or give them a cartoonish shape like a caricaturist emphasising all the wrong features. This is how Buddhism becomes, in the popular imagination, a doctrine of passivity and even laziness, while Existentialism becomes synonymous with apathy and futile despair. Something similar has happened to Stoicism, which is considered – when considered at all – a philosophy of grim endurance, of carrying on rather than getting over, of tolerating rather than transcending life’s agonies and adversities.

No wonder it’s not more popular. No wonder the Stoic sage, in Western culture, has never obtained the popularity of the Zen master. Even though Stoicism is far more accessible, not only does it lack the exotic mystique of Eastern practice; it’s also regarded as a philosophy of merely breaking even while remaining determinedly impassive. What this attitude ignores is the promise proffered by Stoicism of lasting transcendence and imperturbable tranquility.

It ignores gratitude, too. This is part of the tranquility, because it’s what makes the tranquility possible. Stoicism is, as much as anything, a philosophy of gratitude – and a gratitude, moreover, rugged enough to endure anything. Philosophers who pine for supreme psychological liberation have often failed to realise that they belong to a confederacy that includes the Stoics. [Continue reading…]

The art of not trying

John Tierney writes: Just be yourself.

The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.

But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?

It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.

He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.

Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]

How Darkness Visible shined a light

Peter Fulham writes: Twenty-five years ago, in December, 1989, Darkness Visible, William Styron’s account of his descent into the depths of clinical depression and back, appeared in Vanity Fair. The piece revealed in unsparing detail how Styron’s lifelong melancholy at once gave way to a seductive urge to end his own life. A few months later, he released the essay as a book, augmenting the article with a recollection of when the illness first took hold of him: in Paris, as he was about to accept the 1985 Prix mondial Cino Del Duca, the French literary award. By the author’s own acknowledgement, the response from readers was unprecedented. “This was just overwhelming. It was just by the thousands that the letters came in,” he told Charlie Rose. “I had not really realized that it was going to touch that kind of a nerve.”

Styron may have been startled by the outpouring of mail, but in many ways, it’s easy to understand. The academic research on mental illness at the time was relatively comprehensive, but no one to date had offered the kind of report that Styron gave to the public: a firsthand account of what it’s like to have the monstrous condition overtake you. He also exposed the inadequacy of the word itself, which is still used interchangeably to describe a case of the blues, rather than the tempestuous agony sufferers know too well.

Depression is notoriously hard to describe, but Styron managed to split the atom. “I’d feel the horror, like some poisonous fogbank, roll in upon my mind,” he wrote in one chapter. In another: “It is not an immediately identifiable pain, like that of a broken limb. It may be more accurate to say that despair… comes to resemble the diabolical discomfort of being imprisoned in a fiercely overheated room. And because no breeze stirs this cauldron… it is entirely natural that the victim begins to think ceaselessly of oblivion.”

As someone who has fought intermittently with the same illness since college, I found those sentences cathartic, just as I suspect they were for the many readers who wrote to Styron disclosing unequivocally that he had saved their lives. As brutal as depression can be, one of the main ways a person can restrain it is through solidarity. You are not alone, Styron reminded his readers, and the fog will lift. Patience is paramount. [Continue reading…]

Why moral character is the key to personal identity

Nina Strohminger writes: One morning after her accident, a woman I’ll call Kate awoke in a daze. She looked at the man next to her in bed. He resembled her husband, with the same coppery beard and freckles dusted across his shoulders. But this man was definitely not her husband.

Panicked, she packed a small bag and headed to her psychiatrist’s office. On the bus, there was a man she had been encountering with increasing frequency over the past several weeks. The man was clever, he was a spy. He always appeared in a different form: one day as a little girl in a sundress, another time as a bike courier who smirked at her knowingly. She explained these bizarre developments to her doctor, who was quickly becoming one of the last voices in this world she could trust. But as he spoke, her stomach sank with a dreaded realisation: this man, too, was an impostor.

Kate has Capgras syndrome, the unshakeable belief that someone – often a loved one, sometimes oneself – has been replaced with an exact replica. She also has Fregoli syndrome, the delusion that the same person is taking on a variety of shapes, like an actor donning an expert disguise. Capgras and Fregoli delusions offer hints about an extraordinary cognitive mechanism active in the healthy mind, a mechanism so exquisitely tuned that we are hardly ever aware of it. This mechanism ascribes to each person a unique identity, and then meticulously tracks and updates it. This mechanism is crucial to virtually every human interaction, from navigating a party to navigating a marriage. Without it, we quickly fall apart. [Continue reading…]

Gossip makes human society possible

Julie Beck writes: While gossiping is a behavior that has long been frowned upon, perhaps no one has frowned quite so intensely as the 16th- and 17th-century British. Back then, gossips, or “scolds,” were sometimes forced to wear a menacing iron cage on their heads, called the “branks” or “scold’s bridle.” These masks purportedly had iron spikes or bits that went in the mouth and prevented the wearer from speaking. (And of course, of course, this ghastly punishment seems to have been mostly for women who were talking too much.)

Today, people who gossip are still not very well-liked, though we tend to resist the urge to cage their heads. Progress. And yet the reflexive distaste people feel for gossip and those who gossip in general is often nowhere to be found when people find themselves actually faced with a juicy morsel about someone they know. Social topics—personal relationships, likes and dislikes, anecdotes about social activities—made up about two-thirds of all conversations in analyses done by evolutionary psychologist Robin Dunbar. The remaining one-third of their time not spent talking about other people was devoted to discussing everything else: sports, music, politics, etc.

“Language in freely forming natural conversations is principally used for the exchange of social information,” Dunbar writes. “That such topics are so overwhelmingly important to us suggests that this is a primary function of language.” He even goes so far as to say: “Gossip is what makes human society as we know it possible.”

In recent years, research on the positive effects of gossip has proliferated. Rather than just a means to humiliate people and make them cry in the bathroom, gossip is now being considered by scientists as a way to learn about cultural norms, bond with others, promote cooperation, and even, as one recent study found, allow individuals to gauge their own success and social standing. [Continue reading…]

Denying problems when we don’t like the political solutions

Phys.org: A new study from Duke University finds that people will evaluate scientific evidence based on whether they view its policy implications as politically desirable. If they don’t, then they tend to deny the problem even exists.

“Logically, the proposed solution to a problem, such as an increase in government regulation or an extension of the free market, should not influence one’s belief in the problem. However, we find it does,” said co-author Troy Campbell, a Ph.D. candidate at Duke’s Fuqua School of Business. “The cure can be more immediately threatening than the problem.”

The study, “Solution Aversion: On the Relation Between Ideology and Motivated Disbelief,” appears in the November issue of the Journal of Personality and Social Psychology.

The researchers conducted three experiments (with samples ranging from 120 to 188 participants) on three different issues—climate change, air pollution that harms lungs, and crime.

“The goal was to test, in a scientifically controlled manner, the question: Does the desirability of a solution affect beliefs in the existence of the associated problem? In other words, does what we call ‘solution aversion’ exist?” Campbell said.

“We found the answer is yes. And we found it occurs in response to some of the most common solutions for popularly discussed problems.”

For climate change, the researchers conducted an experiment to examine why more Republicans than Democrats seem to deny its existence, despite strong scientific evidence that supports it.

One explanation, they found, may have more to do with conservatives’ general opposition to the most popular solution—increasing government regulation—than with any difference in fear of the climate change problem itself, as some have proposed. [Continue reading…]

Cognitive disinhibition: the kernel of genius and madness

Dean Keith Simonton writes: When John Forbes Nash, the Nobel Prize-winning mathematician, schizophrenic, and paranoid delusional, was asked how he could believe that space aliens had recruited him to save the world, he gave a simple response. “Because the ideas I had about supernatural beings came to me the same way that my mathematical ideas did. So I took them seriously.”

Nash is hardly the only so-called mad genius in history. Suicide victims like painters Vincent Van Gogh and Mark Rothko, novelists Virginia Woolf and Ernest Hemingway, and poets Anne Sexton and Sylvia Plath all offer prime examples. Even ignoring those great creators who did not kill themselves in a fit of deep depression, it remains easy to list persons who endured well-documented psychopathology, including the composer Robert Schumann, the poet Emily Dickinson, and Nash. Creative geniuses who have succumbed to alcoholism or other addictions are also legion.

Instances such as these have led many to suppose that creativity and psychopathology are intimately related. Indeed, the notion that creative genius might have some touch of madness goes back to Plato and Aristotle. But some recent psychologists argue that the whole idea is a pure hoax. After all, it is certainly no problem to come up with the names of creative geniuses who seem to have displayed no signs or symptoms of mental illness.

Opponents of the mad genius idea can also point to two solid facts. First, the number of creative geniuses in the entire history of human civilization is very large. Thus, even if these people were actually less prone to psychopathology than the average person, the number with mental illness could still be extremely large. Second, the permanent inhabitants of mental asylums do not usually produce creative masterworks. The closest exception that anyone might imagine is the notorious Marquis de Sade. Even in his case, his greatest (or rather most sadistic) works were written while he was imprisoned as a criminal rather than institutionalized as a lunatic.

So should we believe that creative genius is connected with madness or not? Modern empirical research suggests that we should because it has pinpointed the connection between madness and creativity clearly. The most important process underlying strokes of creative genius is cognitive disinhibition — the tendency to pay attention to things that normally should be ignored or filtered out by attention because they appear irrelevant. [Continue reading…]

We are all confident idiots

David Dunning writes: Last March, during the enormous South by Southwest music festival in Austin, Texas, the late-night talk show Jimmy Kimmel Live! sent a camera crew out into the streets to catch hipsters bluffing. “People who go to music festivals pride themselves on knowing who the next acts are,” Kimmel said to his studio audience, “even if they don’t actually know who the new acts are.” So the host had his crew ask festival-goers for their thoughts about bands that don’t exist.

“The big buzz on the street,” said one of Kimmel’s interviewers to a man wearing thick-framed glasses and a whimsical T-shirt, “is Contact Dermatitis. Do you think he has what it takes to really make it to the big time?”

“Absolutely,” came the dazed fan’s reply.

The prank was an installment of Kimmel’s recurring “Lie Witness News” feature, which involves asking pedestrians a variety of questions with false premises. In another episode, Kimmel’s crew asked people on Hollywood Boulevard whether they thought the 2014 film Godzilla was insensitive to survivors of the 1954 giant lizard attack on Tokyo; in a third, they asked whether Bill Clinton gets enough credit for ending the Korean War, and whether his appearance as a judge on America’s Got Talent would damage his legacy. “No,” said one woman to this last question. “It will make him even more popular.”

One can’t help but feel for the people who fall into Kimmel’s trap. Some appear willing to say just about anything on camera to hide their cluelessness about the subject at hand (which, of course, has the opposite effect). Others seem eager to please, not wanting to let the interviewer down by giving the most boringly appropriate response: I don’t know. But for some of these interviewees, the trap may be an even deeper one. The most confident-sounding respondents often seem to think they do have some clue—as if there is some fact, some memory, or some intuition that assures them their answer is reasonable. [Continue reading…]

The biology of deceit

Daniel N Jones writes: It’s the friend who betrays you, the lover living a secret life, the job applicant with the fabricated résumé, or the sham sales pitch too good to resist. From the time humans learnt to co‑operate, we also learnt to deceive each other. For deception to be effective, individuals must hide their true intentions. But deception is hardly limited to humans. There is a never-ending arms race between the deceiver and the deceived among most living things. By studying different patterns of deception across the species, we can learn to better defend ourselves from dishonesty in the human world.

My early grasp of human deception came from the work of my adviser, the psychologist Delroy Paulhus at the University of British Columbia in Canada, who studied what he called the dark triad of personality: psychopathy, recognised by callous affect and reckless deceit; narcissism, a sense of grandiose entitlement and self-centered overconfidence; and Machiavellianism, the cynical and strategic manipulation of others.

If you look at the animal world, it’s clear that dark traits run through species from high to low. Some predators are fast, mobile and wide-ranging, executing their deceptions on as many others as they can; they resemble human psychopaths. Others are slow, stalking their prey in a specific, strategic (almost Machiavellian) way. Given the parallels between humans and other animals, I began to conceive my Mimicry Deception Theory, which argues that long- and short-term deceptive strategies cut across species, often by mimicking other lifestyles or forms.

Much of the foundational work for this idea comes from the evolutionary biologist Robert Trivers, who noted that many organisms gain an evolutionary advantage through deception. [Continue reading…]
