Tom Bartlett writes: The former battery factory on the outskirts of Srebrenica, a small town in eastern Bosnia, has become a grim tourist attraction. Vans full of sightseers, mostly from other countries, arrive here daily to see the crumbling industrial structure, which once served as a makeshift United Nations outpost and temporary haven for Muslims under assault by Serb forces determined to seize the town and round up its residents. In July 1995 more than 8,000 Muslim men, from teenagers to the elderly, were murdered in and around Srebrenica, lined up behind houses, gunned down in soccer fields, hunted through the forest.
The factory is now a low-budget museum where you can watch a short film about the genocide and meet a survivor, a soft-spoken man in his mid-30s who has repeated the story of his escape and the death of his father and brother nearly every day here for the past five years. Visitors are then led to a cavernous room with display cases containing the personal effects of victims—a comb, two marbles, a handkerchief, a house key, a wedding ring, a pocket watch with a bullet hole—alongside water-stained photographs of the atrocity hung on cracked concrete walls. The English translations of the captions make for a kind of accidental poetry. “Frightened mothers with weeping children: where and how to go on … ?” reads one. “Endless sorrow for the dearest,” says another.
Across the street from the museum is a memorial bearing the names of the known victims, flanked by rows and rows of graves, each with an identical white marker. Nearby an old woman runs a tiny souvenir shop selling, among other items, baseball caps with the message “Srebrenica: Never Forget.”
This place is a symbol of the 1995 massacre, which, in turn, is a symbol of the entire conflict that followed the breakup of Yugoslavia. The killings here were a fraction of the total body count; The Bosnian Book of the Dead, published early this year, lists 96,000 who perished, though there were thousands more. It was the efficient brutality in Srebrenica that prompted the international community, after years of dithering and half measures, to take significant military action.
While that action ended the bloodshed, the reckoning is far from finished. Fragments of bone are still being sifted from the soil, sent for DNA analysis, and returned to families for burial. The general who led the campaign, Ratko Mladic, is on trial in The Hague after years on the run. In a recent proceeding, Mladic stared at a group of Srebrenica survivors in the gallery and drew a single finger across his throat. Around the same time, the president of Serbia issued a nonapology apology for the massacre, neglecting to call it genocide and using language so vague it seemed more insult than olive branch.
Standing near the memorial, surrounded by the dead, the driver of one of those tourist-filled vans, a Muslim who helped defend Sarajevo during a nearly four-year siege, briefly drops his sunny, professional demeanor. “How can you forgive when they say it didn’t happen?” he says. “The Nazis, they killed millions. They say, ‘OK, we are sorry.’ But the Serbs don’t do that.”
Some Serbs do acknowledge the genocide. According to a 2010 survey, though, most Serbs believe that whatever happened at Srebrenica has been exaggerated, even though it is among the most scientifically documented mass killings in history. They shrug it off as a byproduct of war or cling to conspiracy theories or complain about being portrayed as villains. The facts disappear in a swirl of doubts and denial. [Continue reading...]
BBC News reports: Smaller animals tend to perceive time as if it is passing in slow motion, a new study has shown.
This means that they can observe movement on a finer timescale than bigger creatures, allowing them to escape from larger predators.
Insects and small birds, for example, can see more information in one second than a larger animal such as an elephant can.
The work is published in the journal Animal Behaviour.
“The ability to perceive time on very small scales may be the difference between life and death for fast-moving organisms such as predators and their prey,” said lead author Kevin Healy, at Trinity College Dublin (TCD), Ireland.
The reverse was found in bigger animals, which may miss things that smaller creatures can rapidly spot. [Continue reading...]
William Saletan writes: To believe that the U.S. government planned or deliberately allowed the 9/11 attacks, you’d have to posit that President Bush intentionally sacrificed 3,000 Americans. To believe that explosives, not planes, brought down the buildings, you’d have to imagine an operation large enough to plant the devices without anyone getting caught. To insist that the truth remains hidden, you’d have to assume that everyone who has reviewed the attacks and the events leading up to them — the CIA, the Justice Department, the Federal Aviation Administration, the North American Aerospace Defense Command, the Federal Emergency Management Agency, scientific organizations, peer-reviewed journals, news organizations, the airlines, and local law enforcement agencies in three states — was incompetent, deceived, or part of the cover-up.
And yet, as Slate’s Jeremy Stahl points out, millions of Americans hold these beliefs. In a Zogby poll taken six years ago, only 64 percent of U.S. adults agreed that the attacks “caught US intelligence and military forces off guard.” More than 30 percent chose a different conclusion: that “certain elements in the US government knew the attacks were coming but consciously let them proceed for various political, military, and economic motives,” or that these government elements “actively planned or assisted some aspects of the attacks.”
How can this be? How can so many people, in the name of skepticism, promote so many absurdities?
The answer is that people who suspect conspiracies aren’t really skeptics. Like the rest of us, they’re selective doubters. They favor a worldview, which they uncritically defend. But their worldview isn’t about God, values, freedom, or equality. It’s about the omnipotence of elites. [Continue reading...]
Pacific Standard: When was the last time you engaged in unethical behavior? Be honest, now, and be specific: What time of day was it when you cheated on that test, lied to your spouse, or stole that item from the company break room?
If it was late afternoon or evening, you don’t have an excuse, exactly, but you certainly have company.
A newly published paper titled “The Morning Morality Effect” suggests we’re more likely to act unethically later in the day. It provides further evidence that self-control is a finite resource that gradually gets depleted, and can’t be easily accessed when our reserves are low. [Continue reading...]
Katrin Bennhold writes: From a comfortable couch in his London living room, Sean O’Callaghan had been watching the shaky televised images of terrified people running from militants in an upscale mall in Kenya. Some of those inside had been asked their religion. Muslims were spared, non-Muslims executed.
“God, this is one tough lot of jihadis,” said a friend, a fellow Irishman, shaking his head.
“But we used to do the same thing,” Mr. O’Callaghan replied.
There was the 1976 Kingsmill massacre. Catholic gunmen stopped a van with 12 workmen in County Armagh, Northern Ireland, freed the one Catholic among them and lined up the 11 Protestants and shot them one by one.
Mr. O’Callaghan, a former paramilitary with the Irish Republican Army, has particular insight into such coldblooded killing.
On a sunny August day in 1974, he walked into a bar in Omagh, Northern Ireland, drew a short-barreled pistol and shot a man bent over the racing pages at the end of the counter, a man he had been told was a notorious traitor to the Irish Catholic cause.
Historical parallels are inevitably flawed. But a recent flurry of horrific bloodletting — the attack in Nairobi that left 60 dead, the execution by Syrian jihadis of bound and blindfolded prisoners, an Egyptian soldier peering through his rifle sight and firing on the teenage daughter of a Muslim Brotherhood leader — raises a question as old as Cain and Abel: Do we all have it in us?
Many experts think we do. For Mr. O’Callaghan, it was a matter of focus.
“What you’re seeing in that moment,” he said in an interview last week, “is not a human being.”
It is dangerous to assume that it takes a monster to commit a monstrosity, said Herbert Kelman, professor emeritus of social ethics at Harvard. [Continue reading...]
Steven Shapin writes: In the movie “Groundhog Day,” the TV weatherman Phil Connors finds himself living the same day again and again. This has its advantages, as he has hundreds of chances to get things right. He can learn to speak French, to sculpt ice, to play jazz piano, and to become the kind of person with whom his beautiful colleague Rita might fall in love. But it’s a torment, too. An awful solitude flows from the fact that he’s the only one in Punxsutawney, Pennsylvania, who knows that something has gone terribly wrong with time. Nobody else seems to have any memory of all the previous iterations of the day. What is a new day for Rita is another of the same for Phil. Their realities are different—what passes between them in Phil’s world leaves no trace in hers—as are their senses of selfhood: Phil knows Rita as she cannot know him, because he knows her day after day after day, while she knows him only today. Time, reality, and identity are each curated by memory, but Phil’s and Rita’s memories work differently. From Phil’s point of view, she, and everyone else in Punxsutawney, is suffering from amnesia.
Amnesia comes in distinct varieties. In “retrograde amnesia,” a movie staple, victims are unable to retrieve some or all of their past knowledge — Who am I? Why does this woman say that she’s my wife? — but they can accumulate memories for everything that they experience after the onset of the condition. In the less cinematically attractive “anterograde amnesia,” memory of the past is more or less intact, but those who suffer from it can’t lay down new memories; every person encountered every day is met for the first time. In extremely unfortunate cases, retrograde and anterograde amnesia can occur in the same individual, who is then said to suffer from “transient global amnesia,” a condition that is, thankfully, temporary. Amnesias vary in their duration, scope, and originating events: brain injury, stroke, tumors, epilepsy, electroconvulsive therapy, and psychological trauma are common causes, while drug and alcohol use, malnutrition, and chemotherapy may play a part.
There isn’t a lot that modern medicine can do for amnesiacs. If cerebral bleeding or clots are involved, these may be treated, and occupational and cognitive therapy can help in some cases. Usually, either the condition goes away or amnesiacs learn to live with it as best they can — unless the notion of learning is itself compromised, along with what it means to have a life. Then, a few select amnesiacs disappear from systems of medical treatment and reappear as star players in neuroscience and cognitive psychology.
No star ever shone more brightly in these areas than Henry Gustav Molaison, a patient who, for more than half a century, until his death, in 2008, was known only as H.M., and who is now the subject of a book, “Permanent Present Tense” (Basic), by Suzanne Corkin, the neuroscientist most intimately involved in his case. [Continue reading...]
Robert W Merry writes: In 1972, Duke University professor James David Barber brought out a book that immediately was heralded as a seminal study of presidential character. Titled The Presidential Character: Predicting Performance in the White House, the book looked at qualities of temperament and personality in assessing how the country’s chief executives approached the presidency—and how that in turn contributed to their success or failure in the office.
Although there were flaws in Barber’s approach, particularly in his efforts to typecast the personalities of various presidents, the book does lay before us an interesting and worthy matrix for assessing how various presidents approach the job and the ultimate quality of their leadership. So let’s apply the Barber matrix to the presidential incumbent, Barack Obama.
Barber, who died in 2004, assessed presidents based on two indices: first, whether they were “positive” or “negative”; and, second, whether they were “active” or “passive.” The first index—the positive/negative one—assesses how presidents regarded themselves in relation to the challenges of the office; so, for example, did they embrace the job with a joyful optimism or regard it as a necessary martyrdom they must sustain in order to prove their own self-worth? The second index—active vs. passive—measures their degree of wanting to accomplish big things or retreat into a reactive governing mode.
These two indices produce four categories of presidents, to wit:
Active-Positive: These are presidents with big national ambitions who are self-confident, flexible, optimistic, joyful in the exercise of power, possessing a certain philosophical detachment toward what they regard as a great game.
Active-Negative: These are compulsive people with low self-esteem, seekers of power as a means of self-actualization, given to rigidity and pessimism, driven, sometimes overly aggressive. But they harbor big dreams for bringing about accomplishments of large historical dimension.
Passive-Positive: These are compliant presidents who react to events rather than initiating them. They want to be loved and are thus ingratiating—and easily manipulated. They are “superficially optimistic” and harbor generally modest ambitions for their presidential years. But they are healthy in both ego and self-esteem.
Passive-Negative: These are withdrawn people with low self-esteem and little zest for the give-and-take of politics and the glad-handing requirements of the game. They avoid conflict and take no joy in the uses of power. They tend to get themselves boxed up through a preoccupation with principles, rules and procedures. [Continue reading...]
Daniel Goleman writes: Turning a blind eye. Giving someone the cold shoulder. Looking down on people. Seeing right through them.
These metaphors for condescending or dismissive behavior are more than just descriptive. They suggest, to a surprisingly accurate extent, the social distance between those with greater power and those with less — a distance that goes beyond the realm of interpersonal interactions and may exacerbate the soaring inequality in the United States.
A growing body of recent research shows that people with the most social power pay scant attention to those with little such power. This tuning out has been observed, for instance, with strangers in a mere five-minute get-acquainted session, where the more powerful person shows fewer signals of paying attention, like nodding or laughing. Higher-status people are also more likely to express disregard, through facial expressions, and are more likely to take over the conversation and interrupt or look past the other speaker.
Bringing the micropolitics of interpersonal attention to the understanding of social power, researchers are suggesting, has implications for public policy.
Of course, in any society, social power is relative; any of us may be higher or lower in a given interaction, and the research shows the effect still prevails. Though the more powerful pay less attention to us than we do to them, in other situations we are relatively higher on the totem pole of status — and we, too, tend to pay less attention to those a rung or two down.
A prerequisite to empathy is simply paying attention to the person in pain. In 2008, social psychologists from the University of Amsterdam and the University of California, Berkeley, studied pairs of strangers telling one another about difficulties they had been through, like a divorce or death of a loved one. The researchers found that the differential expressed itself in the playing down of suffering. The more powerful were less compassionate toward the hardships described by the less powerful. [Continue reading...]
Adam Grant asks: What makes some men miserly and others generous? What motivated Bill Gates, for example, to make more than $28 billion in philanthropic gifts while many of his billionaire peers kept relatively tightfisted control over their personal fortunes?
New evidence reveals a surprising answer. The mere presence of female family members — even infants — can be enough to nudge men in the generous direction.
In a provocative new study, the researchers Michael Dahl, Cristian Dezso and David Gaddis Ross examined generosity and what inspires it in wealthy men. Rather than looking at large-scale charitable giving, they looked at why some male chief executives paid their employees more generously than others. The researchers tracked the wages that male chief executives at more than 10,000 Danish companies paid their employees over the course of a decade.
Interestingly, the chief executives paid their employees less after becoming fathers. On average, after chief executives had a child, they paid about $100 less in annual compensation per employee. To be a good provider, the researchers write, it’s all too common for a male chief executive to claim “his firm’s resources for himself and his growing family, at the expense of his employees.”
But there was a twist. When Professor Dahl’s team examined the data more closely, the changes in pay depended on the gender of the child that the chief executives fathered. They reduced wages after having a son, but not after having a daughter.
Daughters apparently soften fathers and evoke more caretaking tendencies. The speculation is that as we brush our daughters’ hair and take them to dance classes, we become gentler, more empathetic and more other-oriented. [Continue reading...]
Joe Moran writes: If I had to describe being shy, I’d say it was like coming late to a party when everyone else is about three glasses in. All human interaction, if it is to develop from small talk into meaningful conversation, draws on shared knowledge and tacit understandings. But if you’re shy, it feels like you just nipped out of the room when they handed out this information. W Compton Leith, a reclusive curator at the British Museum whose book Apologia Diffidentis (1908) is a pioneering anthropology of shy people, wrote that ‘they go through life like persons afflicted with a partial deafness; between them and the happier world there is as it were a crystalline wall which the pleasant low voices of confidence can never traverse’.
Shyness has no logic: it impinges randomly on certain areas of my life and not others. What for most people is the biggest social fear of all, public speaking, I find fairly easy. Lecturing is a performance that allows me simply to impersonate a ‘normal’, working human being. Q&As, however, are another matter: there the performance ends and I will be found out. That left-field question from the audience, followed by brain-freeze and a calamitous attempt at an answer that ties itself up in tortured syntax and dissolves into terrifying silence. Though this rarely happens to me in real life, it has occurred often enough to fuel my catastrophising imagination.
The historian Theodore Zeldin once wondered how different the history of the world might seem if you told it, not through the story of war, politics or economics, but through the development of emotions. ‘One way of tackling it might be to write the history of shyness,’ he mused. ‘Nations may be unable to avoid fighting each other because of the myths and paranoias that separate them: shyness is one of the counterparts to these barriers on an individual level.’ The history of shyness might well make a fascinating research project, but it would be hellishly difficult to write. Shyness is by its nature a subjective, nebulous state that leaves little concrete evidence behind, if only because people are often too uncomfortable with their shyness to speak or write about it. [Continue reading...]
Michael P Lynch writes: In the wake of continuing revelations of government spying programs and the recent Supreme Court ruling on DNA collection – both of which push the generally accepted boundaries against state intrusion on the person — the issue of privacy is foremost on the public mind. The frequent mantra, heard from both media commentators and government officials, is that we face a “trade-off” between safety and convenience on one hand and privacy on the other. We just need, we are told, to find the right balance.
This way of framing the issue makes sense if you understand privacy solely as a political or legal concept. And its political importance is certainly part of what makes privacy so important: what is private is what is yours alone to control, without interference from others or the state. But the concept of privacy also matters for another, deeper reason. It is intimately connected to what it is to be an autonomous person.
What makes your thoughts your thoughts? One answer is that you have what philosophers sometimes call “privileged access” to them. This means at least two things. First, you access them in a way I can’t. Even if I could walk a mile in your shoes, I can’t know what you feel in the same way you can: you see it from the inside so to speak. Second, you can, at least sometimes, control what I know about your thoughts. You can hide your true feelings from me, or let me have the key to your heart.
The idea that the mind is essentially private is a central element of the Cartesian concept of the self — a concept that has been largely abandoned, for a variety of reasons. Descartes not only held that my thoughts were private, he took them to be transparent — all thoughts were conscious. Freud cured us of that. Descartes also thought that the only way to account for my special access to my thoughts was to take thoughts to be made out of a different sort of stuff than my body — to take our minds, in short, to be non-physical, distinct from the brain. Contemporary neuroscience and psychology have convinced many of us otherwise.
But while Descartes’s overall view has been rightly rejected, there is something profoundly right about the connection between privacy and the self, something that recent events should cause us to appreciate. What is right about it, in my view, is that to be an autonomous person is to be capable of having privileged access (in the two senses defined above) to information about your psychological profile — your hopes, dreams, beliefs and fears. A capacity for privacy is a necessary condition of autonomous personhood. [Continue reading...]
Ethan Watters writes: Imagine for a moment that the American Psychiatric Association was about to compile a new edition of its Diagnostic and Statistical Manual of Mental Disorders. But instead of 2013, imagine, just for fun, that the year is 1880.
Transported to the world of the late 19th century, the psychiatric body would have virtually no choice but to include hysteria in the pages of its new volume. Women by the tens of thousands, after all, displayed the distinctive signs: convulsive fits, facial tics, spinal irritation, sensitivity to touch, and leg paralysis. Not a doctor in the Western world at the time would have failed to recognize the presentation. “The illness of our age is hysteria,” a French journalist wrote. “Everywhere one rubs elbows with it.”
Hysteria would have had to be included in our hypothetical 1880 DSM for the exact same reasons that attention deficit hyperactivity disorder is included in the just-released DSM-5. The disorder clearly existed in a population and could be reliably distinguished, by experts and clinicians, from other constellations of symptoms. There were no reliable medical tests to distinguish hysteria from other illnesses then; the same is true of the disorders listed in the DSM-5 today. Practically speaking, the criteria by which something is declared a mental illness are virtually the same now as they were over a hundred years ago.
The DSM determines which mental disorders are worthy of insurance reimbursement, legal standing, and serious discussion in American life. That its diagnoses are not more scientific is, according to several prominent critics, a scandal. In a major blow to the APA’s dominance over mental-health diagnoses, Thomas R. Insel, director of the National Institute of Mental Health, recently declared that his organization would no longer rely on the DSM as a guide to funding research. “The weakness is its lack of validity,” he wrote. “Unlike our definitions of ischemic heart disease, lymphoma, or AIDS, the DSM diagnoses are based on a consensus about clusters of clinical symptoms, not any objective laboratory measure. In the rest of medicine, this would be equivalent to creating diagnostic systems based on the nature of chest pain or the quality of fever.” As an alternative, Insel called for the creation of a new, rival classification system based on genetics, brain imaging, and cognitive science.
This idea — that we might be able to strip away all subjectivity from the diagnosis of mental illness and render psychiatry truly scientific — is intuitively appealing. But there are a couple of problems with it. The first is that the science simply isn’t there yet. A functional neuroscientific understanding of mental suffering is years, perhaps generations, away from our grasp. What are clinicians and patients to do until then? But the second, more telling problem with Insel’s approach lies in its assumption that it is even possible to strip culture from the study of mental illness. Indeed, from where I sit, the trouble with the DSM — both this one and previous editions — is not so much that it is insufficiently grounded in biology, but that it ignores the inescapable relationship between social cues and the shifting manifestations of mental illness. [Continue reading...]
An even more profound problem with the reductionism of neuroscience is that it fails to question the physicalism that has become the bedrock of the scientific outlook: the belief that if something is real, it must be observable through standardized instruments that enable different observers to agree on the characteristics of the same thing.
In reality, human experience and the human world are primarily constructed from non-physical entities: ideas.
Take for instance the idea of a road — something that at first glance might seem indisputably physical. Imagine a six-lane freeway, filled with fast moving traffic. What could be more physical than that mass of steel and concrete carved emphatically through the landscape?
In fact, a road is a highly abstract concept that exists only inside a human mind, and what allows us to drive on roads is the fact that, generally speaking, we share the same idea about what a road is — that it is a place which allows for the passage of wheeled vehicles and is not suitable for picnics or sunbathing; that in the absence of symbolic warnings its boundaries will remain parallel and its continuation will not terminate without warning; that the road’s users will conform to a code of behavior that makes individual actions generally predictable by, for instance, avoiding contact with adjacent vehicles, driving on the same side, and at similar speeds. To drive at night, when sensory input is reduced to a minimum — input from road markings whose location we mentally compute, from the narrow field of vision provided by headlights, from the lights of other vehicles whose standardized positions allow us to unconsciously compute their proximity — is to place full faith in the idea of the road.
Neuroscience has advanced to the point where it’s now possible to map in some detail the neural foundations of thought, but a thought can no more be reduced to the firing of neurons than a word can be reduced to the illumination of a configuration of pixels. Ideas do not have an atomic structure and are not bound by time or space. The cartography of the mind cannot be charted by any kind of imaging technology. Not only is such technology ineffective for this purpose; it is also redundant, since the mind is by its very nature open to self-scrutiny, given a certain amount of discipline and practice.
Monash University: The pain sensations of others can be felt by some people, just by witnessing their agony, according to new research.
A Monash University study into the phenomenon known as somatic contagion found that almost one in three people can feel pain simply by seeing others experience it. It identified two groups of people prone to this response: those who acquire it following trauma, injury (such as amputation) or chronic pain, and those with the condition present from birth, known as the congenital variant.
Presenting her findings at the Australian and New Zealand College of Anaesthetists’ annual scientific meeting in Melbourne earlier this week, Dr Melita Giummarra, from the School of Psychology and Psychiatry, said in some cases people suffered severe painful sensations in response to another person’s pain.
“My research is now beginning to differentiate between at least these two unique profiles of somatic contagion,” Dr Giummarra said.
“While the congenital variant appears to involve a blurring of the boundary between self and other, with heightened empathy, acquired somatic contagion involves reduced empathic concern for others, but increased personal distress.
“This suggests that the pain triggered corresponds to a focus on their own pain experience rather than that of others.”
Most people experience emotional discomfort when they witness pain in another person, and neuroimaging studies have shown that this is linked to activation in the parts of the brain that are also involved in the personal experience of pain.
Dr Giummarra said for some people the pain they ‘absorb’ mirrors the location and site of the pain in another they are witnessing and is generally localised.
“We know that the same regions of the brain are activated for these groups of people as when they experience their own pain. First in emotional regions, but then there is also sensory activation. It is vicarious – it literally triggers their pain,” Dr Giummarra said.
Dr Giummarra has developed a new tool to characterise the reactions people have to pain in others that is also sensitive to somatic contagion – the Empathy for Pain Scale.
Wired: For Rick Doblin, being invited to the Pentagon was an emotional experience. Growing up in the 60s, Doblin embraced the counterculture and protested the Vietnam war and the military-industrial complex behind it.
Yesterday he was at the Pentagon trying to persuade military medical officials to permit a clinical trial that would test MDMA, the active ingredient in the party drug Ecstasy, in conjunction with psychotherapy, in active duty soldiers with post-traumatic stress disorder.
“There’s been this history of conflict between psychedelics and the military, and we’re trying to say that’s not the only vision,” Doblin said. “There’s a way for us to come together.”
Doblin is the founder and director of the non-profit Multidisciplinary Association for Psychedelic Studies (MAPS), which is trying to get drugs like psilocybin, LSD, and MDMA approved for medical use. MAPS has already sponsored small clinical trials of MDMA-assisted psychotherapy for PTSD, first in survivors of sexual abuse and assault, and now in military veterans, police, and firefighters.
Doblin spoke with Wired about his military mission and what it says about shifting attitudes towards psychedelic drugs. [Continue reading...]