America’s huge appetite for conspiracy theories

“Conspiracy Theories and the Paranoid Style(s) of Mass Opinion,” a paper recently published in the American Journal of Political Science, finds that half of Americans consistently endorse at least one conspiracy theory.

Tom Jacobs writes: It’s easy to assume this represents widespread ignorance, but these findings suggest otherwise. Oliver and Wood report that, except for the Obama “birthers” and the 9/11 “truthers,” “respondents who endorse conspiracy theories are not less-informed about basic political facts than average citizens.”

So what does drive belief in these contrived explanations? The researchers argue the tendency to accept them is “derived from two innate psychological predispositions.”

The first, which has an evolutionary explanation, is an “unconscious cognitive bias to draw causal connections between seemingly related phenomena.” Jumping to conclusions based on weak evidence allows us to “project feelings of control in uncertain situations,” the researchers note.

The second is our “natural attraction towards melodramatic narratives as explanations for prominent events — particularly those that interpret history (in terms of) universal struggles between good and evil.”

Stories that fit that pattern “provide compelling explanations for otherwise confusing or ambiguous events,” they write, noting that “many predominant belief systems … draw heavily upon the idea of unseen, intentional forces shaping contemporary events.”

“For many Americans, complicated or nuanced explanations for political events are both cognitively taxing and have limited appeal,” write Oliver and Wood. “A conspiracy narrative may provide a more accessible and convincing account of political events.”

That said, they add, “Even highly engaged or ideological segments of the population can be swayed by the power of these narratives, particularly when they coincide with their other political views.”

How we understand what others think, believe, feel, and want

Nicholas Epley writes: One of the most amazing court cases you probably have never heard of had come down to this. Standing Bear, the reluctant chief of the Ponca tribe, rose on May 2, 1879, to address a packed audience in a Nebraska courtroom. At issue was the existence of a mind that many were unable to see.

Standing Bear’s journey to this courtroom had been excruciating. The U.S. government had decided several years earlier to force the 752 Ponca Native Americans off their lands along the fertile Niobrara River and move them to the desolate Indian Territory, in what is now northern Oklahoma. Standing Bear surrendered everything he owned, assembled his tribe, and began marching a six-hundred-mile “trail of tears.” If the walk didn’t kill them (as it did Standing Bear’s daughter), then the parched Indian Territory would. Left with meager provisions and fields of parched rock to farm, nearly a third of the Poncas died within the first year. This included Standing Bear’s son. As his son lay dying, Standing Bear promised to return his son’s bones to the tribe’s burial grounds so that his son could walk the afterlife with his ancestors, according to their religion. Desperate, Standing Bear decided to go home.

Carrying his son’s bones in a bag clutched to his chest, Standing Bear and twenty-seven others began their return in the dead of winter. Word spread of the group’s travel as they approached the Omaha Indian reservation, midway through their journey. The Omahas welcomed them with open arms, but U.S. officials welcomed them with open handcuffs. General George Crook was ordered by government officials to return the beleaguered Poncas to the Indian Territory.

Crook couldn’t bear the thought. “I’ve been forced many times by orders from Washington to do most inhuman things in dealings with the Indians,” he said, “but now I’m ordered to do a more cruel thing than ever before.” Crook was an honorable man who could no more disobey direct orders than he could fly, so instead he stalled, encouraging a newspaper editor from Omaha to enlist lawyers who would then sue General Crook (as the U.S. government’s representative) on Standing Bear’s behalf. The suit? To have the U.S. government recognize Standing Bear as a person, as a human being. [Continue reading...]

What, me? Biased?

Tom Jacobs writes: Pretty much all of us are prone to “bias blindness.” We can easily spot prejudice in others, but we’re oblivious to our own, insisting on our impartiality in spite of any and all evidence to the contrary.

Newly published research suggests this problem is actually worse than we thought. It finds that even when people use an evaluation strategy they concede is biased, they continue to insist their judgments are objective.

“Recognizing one’s bias is a critical first step in trying to correct for it,” writes a research team led by Emily Pronin and Katherine Hansen of Princeton University. “These experiments make clear how difficult that first step can be to reach.”

Although their findings have clear implications regarding political opinions, the researchers avoided such fraught topics and focused on art. In two experiments, participants (74 Princeton undergraduates in the first, 85 adults recruited online in the second) looked at a series of 80 paintings and rated the artistic merit of each on a one-to-nine scale. [Continue reading...]

The roots of America’s narcissism epidemic

Will Storr writes: For much of human history, our beliefs have been based on the assumption that people are fundamentally bad. Strip away a person’s smile and you’ll find a grotesque, writhing animal-thing. Human instincts have to be controlled, and religions have often been guides for containing the demons. Sigmund Freud held a similar view: Psychotherapy was his method of making the unconscious conscious, helping people restrain their bestial desires and accord with the moral laws of civilization.

In the middle of the 20th century, an alternative school of thought appeared. It was popularized by Carl Rogers, an influential psychotherapist at the University of Chicago, and it reversed the presumption of original sin. Rogers argued that people are innately decent. Children, he believed, should be raised in an environment of “unconditional positive regard”. They should be liberated from the inhibitions and restraints that prevented them from attaining their full potential.

It was a characteristically American idea — perhaps even the American idea. Underneath it all, people are good, and to get the best out of themselves, they just need to be free.

Economic change gave Rogers’s theory traction. It was the 1950s, and a nation of workmen was turning into a nation of salesmen. To make good in life, interpersonal sunniness was becoming essential. Meanwhile, rising divorce rates and the surge of women into the workplace were triggering anxieties about the lives of children born into the baby boom. Parents wanted to counteract the stresses of modern family life, and boosting their children’s self-esteem seemed like the solution.

By the early 1960s, wild thinkers in California were pushing Rogers’s idea even further. The “human potential movement” argued that most people were using just 10 percent of their intellectual capacity. It leaned on the work of Abraham Maslow, who studied exceptional people such as Albert Einstein and Eleanor Roosevelt and said there were five human needs, the most important of which was self-actualization—the realization of one’s maximum potential. Number two on the list was esteem.

At the close of the decade, the idea that self-esteem was the key to psychological riches finally exploded. The trigger was Nathaniel Branden, a handsome Canadian psychotherapist who had moved to Los Angeles as a disciple of the philosopher Ayn Rand. One of Rand’s big ideas was that moral good would arise when humans ruthlessly pursued their own self-interest. She and Branden began a tortuous love affair, and her theories had an intense impact on the young psychotherapist. In The Psychology of Self-Esteem, published in 1969, Branden argued that self-esteem “has profound effects on a man’s thinking processes, emotions, desires, values and goals. It is the single most significant key to his behavior.” It was an international bestseller, and it propelled the self-esteem movement out of the counterculture and into the mainstream.

The year that Branden published his book, a sixteen-year-old in Euclid, Ohio, named Roy Baumeister was grappling with his own self-esteem problem: his Dad. [Continue reading...]

Study: ‘Trolls’ online appear to be sadists in real life

The Register reports: A group of Canadian researchers has given the imprimatur of social-science recognition to a fact that many of us who spend time in internet comment forums have suspected: there’s a strong correlation between online trolling and sadism.

“Both trolls and sadists feel sadistic glee at the distress of others. Sadists just want to have fun … and the Internet is their playground!” write Erin Buckels, Paul Trapnell, and Delroy Paulhus of the Universities of Manitoba, Winnipeg, and British Columbia, respectively, in a paper to be published in the journal Personality and Individual Differences.

The researchers define trolling as “the practice of behaving in a deceptive, destructive, or disruptive manner in a social setting on the Internet with no apparent instrumental purpose,” referring to trolls as “agents of chaos on the Internet, exploiting ‘hot-button issues’ to make users appear overly emotional or foolish in some manner. If an unfortunate person falls into their trap, trolling intensifies for further, merciless amusement.”

The Canadian psychologists’ paper is entitled “Trolls just want to have fun”, which is not merely a bit of boffinary humor at the expense of Cyndi Lauper, but rather a reference to one of the researchers’ findings. “We found clear evidence,” they write, “that sadists tend to troll because they enjoy it.” [Continue reading...]

Douglas Hofstadter — Research on artificial intelligence is sidestepping the core question: how do people think?

Douglas Hofstadter is a cognitive scientist at Indiana University and the Pulitzer Prize-winning author of Gödel, Escher, Bach: An Eternal Golden Braid.

Popular Mechanics: You’ve said in the past that IBM’s Jeopardy-playing computer, Watson, isn’t deserving of the term artificial intelligence. Why?

Douglas Hofstadter: Well, artificial intelligence is a slippery term. It could refer to just getting machines to do things that seem intelligent on the surface, such as playing chess well or translating from one language to another on a superficial level — things that are impressive if you don’t look at the details. In that sense, we’ve already created what some people call artificial intelligence. But if you mean a machine that has real intelligence, that is thinking — that’s inaccurate. Watson is basically a text search algorithm connected to a database just like Google search. It doesn’t understand what it’s reading. In fact, read is the wrong word. It’s not reading anything because it’s not comprehending anything. Watson is finding text without having a clue as to what the text means. In that sense, there’s no intelligence there. It’s clever, it’s impressive, but it’s absolutely vacuous.

Do you think we’ll start seeing diminishing returns from a Watson-like approach to AI?

I can’t really predict that. But what I can say is that I’ve monitored Google Translate — which uses a similar approach — for many years. Google Translate is developing and it’s making progress because the developers are inventing new, clever ways of milking the quickness of computers and the vastness of its database. But it’s not making progress at all in the sense of understanding your text, and you can still see it falling flat on its face a lot of the time. And I know it’ll never produce polished [translated] text, because real translating involves understanding what is being said and then reproducing the ideas that you just heard in a different language. Translation has to do with ideas, it doesn’t have to do with words, and Google Translate is about words triggering other words.

So why are AI researchers so focused on building programs and computers that don’t do anything like thinking?

They’re not studying the mind and they’re not trying to find out the principles of intelligence, so research may not be the right word for what drives people in the field that today is called artificial intelligence. They’re doing product development.

I might say though, that 30 to 40 years ago, when the field was really young, artificial intelligence wasn’t about making money, and the people in the field weren’t driven by developing products. It was about understanding how the mind works and trying to get computers to do things that the mind can do. The mind is very fluid and flexible, so how do you get a rigid machine to do very fluid things? That’s a beautiful paradox and very exciting, philosophically. [Continue reading...]
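To make concrete what Hofstadter means by “words triggering other words,” here is a minimal, purely illustrative sketch in Python — not how Google Translate or Watson is actually implemented — of translation as lookup in a memorized phrase table. The table entries and the greedy longest-match rule are assumptions chosen for the example; the only point is that such a system maps strings to strings without any representation of what a sentence means.

```python
# Toy illustration only: translation as phrase-table lookup, with no model of
# meaning. The table below is a tiny hypothetical stand-in for the statistical
# correspondences a system of this kind would mine from parallel text.
PHRASE_TABLE = {
    "the spirit is willing": "l'esprit est fort",
    "the flesh is weak": "la chair est faible",
    "spirit": "esprit",
    "weak": "faible",
}

def toy_translate(sentence: str) -> str:
    """Greedily replace the longest known phrase at each position."""
    words = sentence.lower().split()
    out, i = [], 0
    while i < len(words):
        for j in range(len(words), i, -1):   # try the longest span first
            chunk = " ".join(words[i:j])
            if chunk in PHRASE_TABLE:
                out.append(PHRASE_TABLE[chunk])
                i = j
                break
        else:                                 # nothing matched: pass the word through
            out.append(words[i])
            i += 1
    return " ".join(out)

print(toy_translate("The spirit is willing but the flesh is weak"))
# -> "l'esprit est fort but la chair est faible"
# The output strings together memorized fragments; nothing in the program
# "understands" spirits, flesh, or weakness, which is Hofstadter's point.
```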

What makes humans capable of horrific violence?

Tom Bartlett writes: The former battery factory on the outskirts of Srebrenica, a small town in eastern Bosnia, has become a grim tourist attraction. Vans full of sightseers, mostly from other countries, arrive here daily to see the crumbling industrial structure, which once served as a makeshift United Nations outpost and temporary haven for Muslims under assault by Serb forces determined to seize the town and round up its residents. In July 1995 more than 8,000 Muslim men, from teenagers to the elderly, were murdered in and around Srebrenica, lined up behind houses, gunned down in soccer fields, hunted through the forest.

The factory is now a low-budget museum where you can watch a short film about the genocide and meet a survivor, a soft-spoken man in his mid-30s who has repeated the story of his escape and the death of his father and brother nearly every day here for the past five years. Visitors are then led to a cavernous room with display cases containing the personal effects of victims—a comb, two marbles, a handkerchief, a house key, a wedding ring, a pocket watch with a bullet hole—alongside water-stained photographs of the atrocity hung on cracked concrete walls. The English translations of the captions make for a kind of accidental poetry. “Frightened mothers with weeping children: where and how to go on … ?” reads one. “Endless sorrow for the dearest,” says another.

Across the street from the museum is a memorial bearing the names of the known victims, flanked by rows and rows of graves, each with an identical white marker. Nearby an old woman runs a tiny souvenir shop selling, among other items, baseball caps with the message “Srebrenica: Never Forget.”

This place is a symbol of the 1995 massacre, which, in turn, is a symbol of the entire conflict that followed the breakup of Yugoslavia. The killings here were a fraction of the total body count; The Bosnian Book of the Dead, published early this year, lists 96,000 who perished, though there were thousands more. It was the efficient brutality in Srebrenica that prompted the international community, after years of dithering and half measures, to take significant military action.

While that action ended the bloodshed, the reckoning is far from finished. Fragments of bone are still being sifted from the soil, sent for DNA analysis, and returned to families for burial. The general who led the campaign, Ratko Mladic, is on trial in The Hague after years on the run. In a recent proceeding, Mladic stared at a group of Srebrenica survivors in the gallery and drew a single finger across his throat. Around the same time, the president of Serbia issued a nonapology apology for the massacre, neglecting to call it genocide and using language so vague it seemed more insult than olive branch.

Standing near the memorial, surrounded by the dead, the driver of one of those tourist-filled vans, a Muslim who helped defend Sarajevo during a nearly four-year siege, briefly drops his sunny, professional demeanor. “How can you forgive when they say it didn’t happen?” he says. “The Nazis, they killed millions. They say, ‘OK, we are sorry.’ But the Serbs don’t do that.”

Some Serbs do acknowledge the genocide. According to a 2010 survey, though, most Serbs believe that whatever happened at Srebrenica has been exaggerated, even though it is among the most scientifically documented mass killings in history. They shrug it off as a byproduct of war or cling to conspiracy theories or complain about being portrayed as villains. The facts disappear in a swirl of doubts and denial. [Continue reading...]

Slow-motion world for small animals

[Photo: chipmunk]

BBC News reports: Smaller animals tend to perceive time as if it is passing in slow motion, a new study has shown.

This means that they can observe movement on a finer timescale than bigger creatures, allowing them to escape from larger predators.

Insects and small birds, for example, can see more information in one second than a larger animal such as an elephant.

The work is published in the journal Animal Behaviour.

“The ability to perceive time on very small scales may be the difference between life and death for fast-moving organisms such as predators and their prey,” said lead author Kevin Healy, at Trinity College Dublin (TCD), Ireland.

The reverse was found in bigger animals, which may miss things that smaller creatures can rapidly spot. [Continue reading...]

The psychology of conspiracy theories

William Saletan writes: To believe that the U.S. government planned or deliberately allowed the 9/11 attacks, you’d have to posit that President Bush intentionally sacrificed 3,000 Americans. To believe that explosives, not planes, brought down the buildings, you’d have to imagine an operation large enough to plant the devices without anyone getting caught. To insist that the truth remains hidden, you’d have to assume that everyone who has reviewed the attacks and the events leading up to them — the CIA, the Justice Department, the Federal Aviation Administration, the North American Aerospace Defense Command, the Federal Emergency Management Agency, scientific organizations, peer-reviewed journals, news organizations, the airlines, and local law enforcement agencies in three states — was incompetent, deceived, or part of the cover-up.

And yet, as Slate’s Jeremy Stahl points out, millions of Americans hold these beliefs. In a Zogby poll taken six years ago, only 64 percent of U.S. adults agreed that the attacks “caught US intelligence and military forces off guard.” More than 30 percent chose a different conclusion: that “certain elements in the US government knew the attacks were coming but consciously let them proceed for various political, military, and economic motives,” or that these government elements “actively planned or assisted some aspects of the attacks.”

How can this be? How can so many people, in the name of skepticism, promote so many absurdities?

The answer is that people who suspect conspiracies aren’t really skeptics. Like the rest of us, they’re selective doubters. They favor a worldview, which they uncritically defend. But their worldview isn’t about God, values, freedom, or equality. It’s about the omnipotence of elites. [Continue reading...]

Yes, I’m an ethical person — before lunch, anyway

Pacific Standard: When was the last time you engaged in unethical behavior? Be honest, now, and be specific: What time of day was it when you cheated on that test, lied to your spouse, or stole that item from the company break room?

If it was late afternoon or evening, you don’t have an excuse, exactly, but you certainly have company.

A newly published paper entitled “The Morning Morality Effect” suggests we’re more likely to act unethically later in the day. It provides further evidence that self-control is a finite resource that gradually gets depleted, and can’t be easily accessed when our reserves are low. [Continue reading...]

The deadly fury that demonization unleashes

Katrin Bennhold writes: From a comfortable couch in his London living room, Sean O’Callaghan had been watching the shaky televised images of terrified people running from militants in an upscale mall in Kenya. Some of those inside had been asked their religion. Muslims were spared, non-Muslims executed.

“God, this is one tough lot of jihadis,” said a friend, a fellow Irishman, shaking his head.

“But we used to do the same thing,” Mr. O’Callaghan replied.

There was the 1976 Kingsmill massacre. Catholic gunmen stopped a van with 12 workmen in County Armagh, Northern Ireland, freed the one Catholic among them and lined up the 11 Protestants and shot them one by one.

Mr. O’Callaghan, a former paramilitary with the Irish Republican Army, has particular insight into such coldblooded killing.

On a sunny August day in 1974, he walked into a bar in Omagh, Northern Ireland, drew a short-barreled pistol and shot a man bent over the racing pages at the end of the counter, a man he had been told was a notorious traitor to the Irish Catholic cause.

Historical parallels are inevitably flawed. But a recent flurry of horrific bloodletting — the attack in Nairobi that left 60 dead, the execution by Syrian jihadis of bound and blindfolded prisoners, an Egyptian soldier peering through his rifle sight and firing on the teenage daughter of a Muslim Brotherhood leader — raises a question as old as Cain and Abel: Do we all have it in us?

Many experts think we do. For Mr. O’Callaghan, it was a matter of focus.

“What you’re seeing in that moment,” he said in an interview last week, “is not a human being.”

It is dangerous to assume that it takes a monster to commit a monstrosity, said Herbert Kelman, professor emeritus of social ethics at Harvard. [Continue reading...]

Henry Gustave Molaison — the man who forgot everything

Steven Shapin writes: In the movie “Groundhog Day,” the TV weatherman Phil Connors finds himself living the same day again and again. This has its advantages, as he has hundreds of chances to get things right. He can learn to speak French, to sculpt ice, to play jazz piano, and to become the kind of person with whom his beautiful colleague Rita might fall in love. But it’s a torment, too. An awful solitude flows from the fact that he’s the only one in Punxsutawney, Pennsylvania, who knows that something has gone terribly wrong with time. Nobody else seems to have any memory of all the previous iterations of the day. What is a new day for Rita is another of the same for Phil. Their realities are different—what passes between them in Phil’s world leaves no trace in hers—as are their senses of selfhood: Phil knows Rita as she cannot know him, because he knows her day after day after day, while she knows him only today. Time, reality, and identity are each curated by memory, but Phil’s and Rita’s memories work differently. From Phil’s point of view, she, and everyone else in Punxsutawney, is suffering from amnesia.

Amnesia comes in distinct varieties. In “retrograde amnesia,” a movie staple, victims are unable to retrieve some or all of their past knowledge — Who am I? Why does this woman say that she’s my wife? — but they can accumulate memories for everything that they experience after the onset of the condition. In the less cinematically attractive “anterograde amnesia,” memory of the past is more or less intact, but those who suffer from it can’t lay down new memories; every person encountered every day is met for the first time. In extremely unfortunate cases, retrograde and anterograde amnesia can occur in the same individual, who is then said to suffer from “transient global amnesia,” a condition that is, thankfully, temporary. Amnesias vary in their duration, scope, and originating events: brain injury, stroke, tumors, epilepsy, electroconvulsive therapy, and psychological trauma are common causes, while drug and alcohol use, malnutrition, and chemotherapy may play a part.

There isn’t a lot that modern medicine can do for amnesiacs. If cerebral bleeding or clots are involved, these may be treated, and occupational and cognitive therapy can help in some cases. Usually, either the condition goes away or amnesiacs learn to live with it as best they can — unless the notion of learning is itself compromised, along with what it means to have a life. Then, a few select amnesiacs disappear from systems of medical treatment and reappear as star players in neuroscience and cognitive psychology.

No star ever shone more brightly in these areas than Henry Gustave Molaison, a patient who, for more than half a century, until his death, in 2008, was known only as H.M., and who is now the subject of a book, “Permanent Present Tense” (Basic), by Suzanne Corkin, the neuroscientist most intimately involved in his case. [Continue reading...]

The psychological parallels between Barack Obama and Richard Nixon

Robert W Merry writes: In 1972, Duke University professor James David Barber brought out a book that immediately was heralded as a seminal study of presidential character. Titled The Presidential Character: Predicting Performance in the White House, the book looked at qualities of temperament and personality in assessing how the country’s chief executives approached the presidency—and how that in turn contributed to their success or failure in the office.

Although there were flaws in Barber’s approach, particularly in his efforts to typecast the personalities of various presidents, it does indeed lay before us an interesting and worthy matrix for assessing how various presidents approach the job and the ultimate quality of their leadership. So let’s apply the Barber matrix to the presidential incumbent, Barack Obama.

Barber, who died in 2004, assessed presidents based on two indices: first, whether they were “positive” or “negative”; and, second, whether they were “active” or “passive.” The first index—the positive/negative one—assesses how presidents regarded themselves in relation to the challenges of the office; so, for example, did they embrace the job with a joyful optimism or regard it as a necessary martyrdom they must sustain in order to prove their own self-worth? The second index—active vs. passive—measures their degree of wanting to accomplish big things or retreat into a reactive governing mode.

These two indices produce four categories of presidents, to wit:

Active-Positive: These are presidents with big national ambitions who are self-confident, flexible, optimistic, joyful in the exercise of power, possessing a certain philosophical detachment toward what they regard as a great game.

Active-Negative: These are compulsive people with low self-esteem, seekers of power as a means of self-actualization, given to rigidity and pessimism, driven, sometimes overly aggressive. But they harbor big dreams for bringing about accomplishments of large historical dimension.

Passive-Positive: These are compliant presidents who react to events rather than initiating them. They want to be loved and are thus ingratiating—and easily manipulated. They are “superficially optimistic” and harbor generally modest ambitions for their presidential years. But they are healthy in both ego and self-esteem.

Passive-Negative: These are withdrawn people with low self-esteem and little zest for the give-and-take of politics and the glad-handing requirements of the game. They avoid conflict and take no joy in the uses of power. They tend to get themselves boxed up through a preoccupation with principles, rules and procedures. [Continue reading...]

Video: Noam Chomsky interview on artificial intelligence

The more power individuals acquire, the less human they become

Daniel Goleman writes: Turning a blind eye. Giving someone the cold shoulder. Looking down on people. Seeing right through them.

These metaphors for condescending or dismissive behavior are more than just descriptive. They suggest, to a surprisingly accurate extent, the social distance between those with greater power and those with less — a distance that goes beyond the realm of interpersonal interactions and may exacerbate the soaring inequality in the United States.

A growing body of recent research shows that people with the most social power pay scant attention to those with little such power. This tuning out has been observed, for instance, with strangers in a mere five-minute get-acquainted session, where the more powerful person shows fewer signals of paying attention, like nodding or laughing. Higher-status people are also more likely to express disregard, through facial expressions, and are more likely to take over the conversation and interrupt or look past the other speaker.

Bringing the micropolitics of interpersonal attention to the understanding of social power, researchers are suggesting, has implications for public policy.

Of course, in any society, social power is relative; any of us may be higher or lower in a given interaction, and the research shows the effect still prevails. Though the more powerful pay less attention to us than we do to them, in other situations we are relatively higher on the totem pole of status — and we, too, tend to pay less attention to those a rung or two down.

A prerequisite to empathy is simply paying attention to the person in pain. In 2008, social psychologists from the University of Amsterdam and the University of California, Berkeley, studied pairs of strangers telling one another about difficulties they had been through, like a divorce or death of a loved one. The researchers found that the differential expressed itself in the playing down of suffering. The more powerful were less compassionate toward the hardships described by the less powerful. [Continue reading...]

Why men need women

Adam Grant asks: What makes some men miserly and others generous? What motivated Bill Gates, for example, to make more than $28 billion in philanthropic gifts while many of his billionaire peers kept relatively tightfisted control over their personal fortunes?

New evidence reveals a surprising answer. The mere presence of female family members — even infants — can be enough to nudge men in the generous direction.

In a provocative new study, the researchers Michael Dahl, Cristian Dezso and David Gaddis Ross examined generosity and what inspires it in wealthy men. Rather than looking at large-scale charitable giving, they looked at why some male chief executives paid their employees more generously than others. The researchers tracked the wages that male chief executives at more than 10,000 Danish companies paid their employees over the course of a decade.

Interestingly, the chief executives paid their employees less after becoming fathers. On average, after chief executives had a child, they paid about $100 less in annual compensation per employee. To be a good provider, the researchers write, it’s all too common for a male chief executive to claim “his firm’s resources for himself and his growing family, at the expense of his employees.”

But there was a twist. When Professor Dahl’s team examined the data more closely, the changes in pay depended on the gender of the child that the chief executives fathered. They reduced wages after having a son, but not after having a daughter.

Daughters apparently soften fathers and evoke more caretaking tendencies. The speculation is that as we brush our daughters’ hair and take them to dance classes, we become gentler, more empathetic and more other-oriented. [Continue reading...]
