Kimberley Brownlee writes: Great loners are fascinating. Henry David Thoreau at Walden Pond, Buddhist monks in their hermitage, and fictional heroes such as Robinson Crusoe are all romantic figures of successful solitary survival. Their setting is the wilderness. Their apparent triumph is the outcome of grit, ingenuity and self-reliance.
One reason that such characters seem appealing is that, ironically, they are reassuring. They give the comforting impression that anyone could thrive in isolation as they do. This reassurance can be summed up in the declaration made by Henrik Ibsen’s Dr Stockmann at the end of An Enemy of the People (1882), after the locals have persecuted him for revealing that the town’s tourist baths are contaminated. Stockmann declares: ‘The strongest man in the world is he who stands most alone.’
The great loners embody an idea of freedom from the vagaries and stresses of social life. As human beings, we are vulnerable to each other’s moods, proclivities, ideologies, perceptions, knowledge and ignorance. We are vulnerable to our society’s conventions, policies and hierarchies. We need other people’s blessing and often their help in order to get resources. When we’re young and when we’re old, we are vulnerable enough that our lives are happy only if other people choose to care about us.
No wonder then that Robinson Crusoe is one of the best-known novels in history; there is solace in the hermit’s self-governing independence. But this romantic image of the eremitic life rests on a mistaken idea of both the great loners’ circumstances and the nature of social isolation. [Continue reading…]
Amanda Taub writes: One of the first things we learned about Omar Mateen, the gunman in the nightclub massacre in Orlando, Fla., was that his ex-wife said he had beaten her severely until she left him in 2009.
If it sounds familiar that a gunman in a mass shooting would have a history of domestic violence, it should.
In February, Cedric Ford shot 17 people at his Kansas workplace, killing three, only 90 minutes after being served with a restraining order sought by his ex-girlfriend, who said he had abused her. And Man Haron Monis, who holed up with hostages for 17 hours in a cafe in Sydney, Australia, in 2014, an episode that left two people dead and four wounded, had terrorized his ex-wife. He had threatened to harm her if she left him, and was eventually charged with organizing her murder.
When Everytown for Gun Safety, a gun control group, analyzed F.B.I. data on mass shootings from 2009 to 2015, it found that 57 percent of the cases included a spouse, former spouse or other family member among the victims — and that 16 percent of the attackers had previously been charged with domestic violence.
Social scientists have not settled on an explanation for this correlation, but their research reveals striking parallels between the factors that drive the two phenomena.
There is, of course, a tangle of factors behind every murder, especially terrorism inspired by foreign groups. But research on domestic violence hints at a question that often arises from seemingly inexplicable events like Mr. Mateen’s massacre of 49 people at an Orlando nightclub — what drives individuals to commit such mass attacks? — and sheds light on the psychology of violence. [Continue reading…]
Jay Michaelson writes: If Mateen turns out to have been repressing his sexuality, that makes the attack more about homophobia, not less. Now what we have is not a gay-hating radical Islamist but a self-hating gay man who found in Islamist ideology a way to express his animus toward everyone and everything.
Nor is this unique to Islam. When Christian fundamentalist Ted Haggard preached vitriolic sermons against homosexuality, it was because of — not despite — his furtive sex dates with a drug-dealing gay masseur. When former senator Larry Craig inveighed against the evils of equality, it was because of — not despite — his own shame around soliciting men for sex in public restrooms.
That is why study after study has shown that the more homophobic one is, the more likely one is to have repressed homosexual desires. If you’re battling your demons in private, you’re going to battle them in public too.
It is also why repressed gay people seek out fundamentalist religion in the first place. Religious fundamentalism sublimates the repressed sexual urge into religious zeal. In its Christian, Muslim, and Jewish forms, it insists that we are all struggling with evil urges that must be assiduously repressed. And its restrictions on sexual expression, coupled with often single sex male environments, work just fine for those who couldn’t express their sexuality in an “acceptable” way anyway.
Once again, that is as true for “celibate” Catholic priests as it is for ultra-Orthodox Jews as it is for born-again evangelicals as it is for newly radicalized Muslims. The logic of fundamentalist religion is the logic of repression and sublimation, of projecting one’s own inner struggles onto the screen of the theological.
I know this from personal experience. For 10 years, I lived my life as a closeted, Orthodox Jew. I wasn’t closeted because I was Orthodox; I was Orthodox because I was closeted. [Continue reading…]
Dave Cullen writes: My mantra after every mass-shooting tragedy is and always will be: don’t jump to conclusions too quickly, especially on motive. It’s healthy to discuss possibilities, particularly as the evidence piles up, but remember that early facts often turn out to point in the wrong direction. The media and even the president lunged way too quickly to assume that the Orlando massacre was an act of international terrorism, which it may or may not have been. Today we’re all asking a different question: Was the shooter gay? A rapidly accumulating set of evidence — including reports that he had been to Pulse in the years prior to the shooting — suggests that he was, or that he was still struggling with gay urges. But he might also just have been casing the place.
The idea that the killer, Omar Mateen, was gay himself may sound baffling, and much of the national media has treated it that way in the day or so since news of his bar-hopping habits surfaced. But it’s no surprise to most gay people. Many of my gay friends assumed as much from the beginning, and predicted this before the first scraps of evidence surfaced. It sounds as if he was a self-loathing gay — which is almost the same as saying he was just coming to terms with being gay. Has anyone ever discovered his gayness and not wanted to tear it out of himself?
Most of us haven’t just known that guy, we’ve been that guy. I never touched a man, or admitted how badly I wanted to, until I was 28. That initiated a seven-year bi phase, which I called “experimenting” for the first half, even after a friend said, “Experimenting? How many times do you need to rerun the experiment?” I had slept with at least a hundred men by then. Still not convinced. Hmmm. [Continue reading…]
Edward Mendelson writes: Every technological revolution coincides with changes in what it means to be a human being, in the kinds of psychological borders that divide the inner life from the world outside. Those changes in sensibility and consciousness never correspond exactly with changes in technology, and many aspects of today’s digital world were already taking shape before the age of the personal computer and the smartphone. But the digital revolution suddenly increased the rate and scale of change in almost everyone’s lives. Elizabeth Eisenstein’s exhilaratingly ambitious historical study The Printing Press as an Agent of Change (1979) may overstate its argument that the press was the initiating cause of the great changes in culture in the early sixteenth century, but her book pointed to the many ways in which new means of communication can amplify slow, preexisting changes into an overwhelming, transforming wave.
In The Changing Nature of Man (1956), the Dutch psychiatrist J.H. van den Berg described four centuries of Western life, from Montaigne to Freud, as a long inward journey. The inner meanings of thought and actions became increasingly significant, while many outward acts became understood as symptoms of inner neuroses rooted in everyone’s distant childhood past; a cigar was no longer merely a cigar. A half-century later, at the start of the digital era in the late twentieth century, these changes reversed direction, and life became increasingly public, open, external, immediate, and exposed.
Virginia Woolf’s serious joke that “on or about December 1910 human character changed” was a hundred years premature. Human character changed on or about December 2010, when everyone, it seemed, started carrying a smartphone. For the first time, practically anyone could be found and intruded upon, not only at some fixed address at home or at work, but everywhere and at all times. Before this, everyone could expect, in the ordinary course of the day, some time at least in which to be left alone, unobserved, unsustained and unburdened by public or familial roles. That era now came to an end.
Many probing and intelligent books have recently helped to make sense of psychological life in the digital age. Some of these analyze the unprecedented levels of surveillance of ordinary citizens, others the unprecedented collective choice of those citizens, especially younger ones, to expose their lives on social media; some explore the moods and emotions performed and observed on social networks, or celebrate the Internet as a vast aesthetic and commercial spectacle, even as a focus of spiritual awe, or decry the sudden expansion and acceleration of bureaucratic control.
The explicit common theme of these books is the newly public world in which practically everyone’s lives are newly accessible and offered for display. The less explicit theme is a newly pervasive, permeable, and transient sense of self, in which much of the experience, feeling, and emotion that used to exist within the confines of the self, in intimate relations, and in tangible unchanging objects — what William James called the “material self” — has migrated to the phone, to the digital “cloud,” and to the shape-shifting judgments of the crowd. [Continue reading…]
Keith Frankish writes: Do you think racial stereotypes are false? Are you sure? I’m not asking if you’re sure whether or not the stereotypes are false, but if you’re sure whether or not you think that they are. That might seem like a strange question. We all know what we think, don’t we?
Most philosophers of mind would agree, holding that we have privileged access to our own thoughts, which is largely immune from error. Some argue that we have a faculty of ‘inner sense’, which monitors the mind just as the outer senses monitor the world. There have been exceptions, however. The mid-20th-century behaviourist philosopher Gilbert Ryle held that we learn about our own minds, not by inner sense, but by observing our own behaviour, and that friends might know our minds better than we do. (Hence the joke: two behaviourists have just had sex and one turns to the other and says: ‘That was great for you, darling. How was it for me?’) And the contemporary philosopher Peter Carruthers proposes a similar view (though for different reasons), arguing that our beliefs about our own thoughts and decisions are the product of self-interpretation and are often mistaken.
Evidence for this comes from experimental work in social psychology. It is well established that people sometimes think they have beliefs that they don’t really have. For example, if offered a choice between several identical items, people tend to choose the one on the right. But when asked why they chose it, they confabulate a reason, saying they thought the item was a nicer colour or better quality. Similarly, if a person performs an action in response to an earlier (and now forgotten) hypnotic suggestion, they will confabulate a reason for performing it. What seems to be happening is that the subjects engage in unconscious self-interpretation. They don’t know the real explanation of their action (a bias towards the right, hypnotic suggestion), so they infer some plausible reason and ascribe it to themselves. They are not aware that they are interpreting, however, and make their reports as if they were directly aware of their reasons. [Continue reading…]
“The killing of a gorilla at the Cincinnati Zoo in order to save a child who fell in its enclosure has sparked nationwide outrage,” reports CBS News.
I share the outrage.
I happen to be among those who believe that the incarceration of wild animals for the entertainment of sightseers cannot be justified. It does little to elevate the consciousness of the people and even less to improve the well-being of the captives. The protection of endangered species requires first and foremost the protection of endangered habitats.
Upon seeing the news of the gorilla’s death, like many others, I also thought that if a four-year-old boy could even get into a situation like this, there had to be negligence on the part of the parents, bystanders, and/or the zoo operators. Likewise, the decision to shoot and kill the 17-year-old gorilla, Harambe (a Swahili name meaning “all pull together”), seemed very questionable.
Among the outraged voices showing up on Facebook, the most venomous attacks have been directed at Michelle Gregg, the boy’s mother.
The crappy mother should have gotten shot instead, not the poor innocent gorilla!
Michelle Gregg says, “God protected my child until the authorities were able to get to him.” No, Harambe protected your child after you & God failed to stop him from climbing into the enclosure! And innocent Harambe ended up dead for his efforts, shot with a bullet that would have been better spent on you, for failing to look after your own child and being the cause of all this!
The creator of a Facebook page, Justice For Harambe (which has already received over 60,000 likes), propagated the claim that Gregg was planning to sue the zoo, yet when asked to support this claim with evidence, said simply: “Educated guess.” The page’s stated objective is: “We wish to see charges brought against those responsible!!”
The outrage directed at Gregg has prompted a smaller wave of outrage coming from those who underline the fact that even when under the supervision of the most attentive of parents, small children do have a talent for slipping out of sight.
Meanwhile, the United Nations refugee agency announced on Sunday that at least 700 people are believed to have drowned in the Mediterranean this week as tens of thousands of refugees continue to seek safety in Europe.
The latest chapter in the worst humanitarian crisis since World War II has prompted very little outrage on this side of the Atlantic.
For observers of social media in the U.S., it’s hard to avoid concluding that the life of a gorilla is commonly regarded here as being more precious than the lives of countless human beings.
Although to some extent it’s heartening that this much concern is being shown about the premature death of a gorilla, it’s disturbing that over the last year and longer there has been such widespread indifference shown towards millions of people in desperate need.
Is there really such a compassion deficit in America, or does this reveal more about the psychology of rage?
My guess is that among those now seeking justice for Harambe, many had, prior to this weekend, taken little interest in the welfare of western lowland gorillas.
The guiding emotions here were outrage at what seemed like the unnecessary loss of an innocent life, and a certain sympathy with fellow primates which all children feel and most adults have learned to sublimate.
The great apes fascinate us because on some level we recognize them as kin. We don’t just look at them; we see them with reflective awareness looking at us.
Yet why would a sense of kinship be able to extend outside our own species while falling short among other members of the human race?
What is at play here seems to have less to do with who or what we identify with than it does with the pathways that facilitate our connections.
It turns out that in the age of social media, outrage has become such a potent force because it allows strangers to bond.
Teddy Wayne writes:
A 2013 study from Beihang University in Beijing of Weibo, a Twitter-like site, found that anger is the emotion that spreads most easily over social media. Joy came in a distant second. The main difference, said Ryan Martin, a psychology professor at the University of Wisconsin, Green Bay, who studies anger, is that although we tend to share the happiness only of people we are close to, we are willing to join in the rage of strangers. As the study suggests, outrage is lavishly rewarded on social media, whether through supportive comments, retweets or Facebook likes. People prone to Internet outrage are looking for validation, Professor Martin said. “They want to hear that others share it,” he said, “because they feel they’re vindicated and a little less lonely and isolated in their belief.”
Harambe’s death pulled strangers together in their shared anger. The sad and stern face of a silverback resonated across a population that, struggling to find common ground in what it can affirm, finds it much more easily in its discontent.
Something is undermining young people’s mental health, especially that of girls.
In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely as those in the 1980s to see a professional for mental health issues; more teens struggled to remember things in 2010-2012 than in the earlier period; and 73% more reported trouble sleeping than their peers in the 1980s. These so-called “somatic,” or “of-the-body,” symptoms strongly predict depression.
“It indicates a lot of suffering,” Twenge told Quartz.
It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand over bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. The number of kids who reported feeling depressed spiked in the 1980s and 1990s, fell after 2008, and has since started rising again.
Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and everyone aged 6-18 is seeking more mental health services and more medication.
The trend is not a uniquely American phenomenon: in the UK, the number of teenagers (aged 15-16) with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (only those in Poland and Macedonia were more unhappy).
“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.
Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.
Twenge has observed a notable shift away from internal, or intrinsic goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.
Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.
“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.
“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]
Cari Romm writes: Here’s a fun exercise: Take a minute and count up all your friends. Not just the close ones, or the ones you’ve seen recently — I mean every single person on this Earth that you consider a pal.
Got a number in your mind? Good. Now cut it in half.
Okay, yes, “fun” may have been a bit of a reach there. But this new, smaller number may actually be more accurate. As it turns out, we can be pretty terrible at knowing who our friends are: In what may be among the saddest pieces of social-psychology research published in quite some time, a study in the journal PLoS One recently made the case that as many as half the people we consider our friends don’t feel the same way. [Continue reading…]
Let’s suspend any questions about the validity of this research finding (even though a lot of scientific papers these days do seem geared more towards grabbing social media attention than towards advancing knowledge) and consider instead whether this should indeed be a cause of sadness.
If it turns out that most of us have half as many friends as we imagine, that sounds like a strong reason for an unwelcome boost in self-doubt and insecurity.
If our friends are the people we trust, does this mean that a lot of our trust is misplaced?
Maybe — but that’s not as bad as it sounds.
Trust is a gamble. If we actually had no doubt and could reliably know who was a friend and who was not, there would be no need for trust.
Trust is a relationship with the unknown, and since it’s inevitably going to extend too far or not far enough, it seems that human beings as social creatures are built to trust more rather than less.
So this proclivity to imagine that our network of friendships extends further than it really does is probably less a reason for sadness than a reason to be glad that, for most people, trust is stronger than fear.
Amanda Gefter writes: As we go about our daily lives, we tend to assume that our perceptions — sights, sounds, textures, tastes — are an accurate portrayal of the real world. Sure, when we stop and think about it — or when we find ourselves fooled by a perceptual illusion — we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Not so, says Donald D. Hoffman, a professor of cognitive science at the University of California, Irvine. Hoffman has spent the past three decades studying perception, artificial intelligence, evolutionary game theory and the brain, and his conclusion is a dramatic one: The world presented to us by our perceptions is nothing like reality. What’s more, he says, we have evolution itself to thank for this magnificent illusion, as it maximizes evolutionary fitness by driving truth to extinction.
Getting at questions about the nature of reality, and disentangling the observer from the observed, is an endeavor that straddles the boundaries of neuroscience and fundamental physics. On one side you’ll find researchers scratching their chins raw trying to understand how a three-pound lump of gray matter obeying nothing more than the ordinary laws of physics can give rise to first-person conscious experience. This is the aptly named “hard problem.” [Continue reading…]
Every single person is different. We all have different backgrounds, views, values and interests. And yet there is one universal feeling that we all experience at every single moment. Call it an “ego”, a “self” or just an “I” – it’s the idea that our thoughts and feelings are our own, and no one else has access to them in the same way. This may sound a bit like post-war French existentialism or psychoanalysis, but it’s actually a topic that’s being increasingly addressed by neuroscientists.
We were part of a team interested in finding out how this sense of self is expressed in the brain – and what happens when it dissolves. To do that, we used brain imaging and the psychedelic drug LSD.
Our sense of self is something so natural that we are not always fully aware of it. In fact, it is when it is disturbed that it becomes the most noticeable. This could be due to mental illnesses such as psychosis, when people might experience the delusional belief that their thoughts are no longer private, but can be accessed and even modified by other people. Or it could be due to the influence of psychedelic drugs such as LSD, when the user can feel that their ego is “dissolving” and they are becoming at one with the world. From a scientific point of view, these experiences of “ego death” or ego dissolution are also opportunities to search for this sense of self in the brain.
Our study, led by Enzo Tagliazucchi and published in Current Biology, set out to probe what is happening in the brain when our sense of self becomes altered by psychedelic drugs. We studied 15 healthy volunteers before and after taking LSD, which altered their normal sense of self and their relationship with the environment. These subjects were scanned while intoxicated and while receiving placebo using functional MRI, a technique which allows us to study the brain’s activity by measuring changes in blood flow. By contrasting the activity of the brain when receiving a placebo with its activity after taking LSD, we could start exploring the brain mechanisms involved in the normal experience of the self.
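The core of that design is a within-subject contrast: the same 15 people are scanned under both conditions, and each brain region's change in activity is compared against its variability across subjects. A toy sketch of that contrast logic, with synthetic numbers standing in for real fMRI signals (this is not the study's actual analysis pipeline):

```python
import numpy as np

# Within-subject contrast: the same subjects are measured under placebo
# and under LSD, so we analyze the per-subject difference in each region.
# All numbers are synthetic, for illustration only.
rng = np.random.default_rng(0)
n_subjects, n_regions = 15, 4

placebo = rng.normal(loc=1.0, scale=0.2, size=(n_subjects, n_regions))
# Pretend the drug raises activity in the first two regions only.
lsd = placebo + rng.normal(loc=[0.5, 0.4, 0.0, 0.0], scale=0.2,
                           size=(n_subjects, n_regions))

diff = lsd - placebo                        # per-subject change per region
mean = diff.mean(axis=0)
sem = diff.std(axis=0, ddof=1) / np.sqrt(n_subjects)
t = mean / sem                              # paired t-statistic per region

for region, t_val in enumerate(t):
    print(f"region {region}: t = {t_val:.2f}")
```

Pairing each subject with themselves cancels out stable individual differences in baseline activity, which is why drug studies with small samples favor this design over comparing separate groups.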
Tom Jacobs writes: The link between violent video games and aggressive behavior is old, if still troubling, news. But violence is not the only problematic aspect of these enormously popular entertainments: Many are also blatantly, unapologetically sexist.
Take the best-selling Grand Theft Auto series. In London’s Daily Telegraph, a reviewer called the fifth installment “relentlessly misanthropic,” adding that “the game often coerced me into actions that degraded women.”
Well, it turns out that sort of “fun” has real-life consequences. In a newly published study, a research team from Italy and the United States reports playing such games reduces some players’ compassion for women who have been victims of violence. [Continue reading…]
Researchers from Imperial College London, working with the Beckley Foundation, have for the first time visualised the effects of LSD on the brain: In a series of experiments, scientists have gained a glimpse into how the psychedelic compound affects brain activity. The team administered LSD (Lysergic acid diethylamide) to 20 healthy volunteers in a specialist research centre and used various leading-edge and complementary brain scanning techniques to visualise how LSD alters the way the brain works.
The findings, published in Proceedings of the National Academy of Sciences (PNAS), reveal what happens in the brain when people experience the complex visual hallucinations that are often associated with the LSD state. They also shed light on the brain changes that underlie the profound altered state of consciousness the drug can produce.
A major finding of the research is the discovery of what happens in the brain when people experience complex dreamlike hallucinations under LSD. Under normal conditions, information from our eyes is processed in a part of the brain at the back of the head called the visual cortex. However, when the volunteers took LSD, many additional brain areas – not just the visual cortex – contributed to visual processing.
Dr Robin Carhart-Harris, from the Department of Medicine at Imperial, who led the research, explained: “We observed brain changes under LSD that suggested our volunteers were ‘seeing with their eyes shut’ – albeit they were seeing things from their imagination rather than from the outside world. We saw that many more areas of the brain than normal were contributing to visual processing under LSD – even though the volunteers’ eyes were closed. Furthermore, the size of this effect correlated with volunteers’ ratings of complex, dreamlike visions.”
The study also revealed what happens in the brain when people report a fundamental change in the quality of their consciousness under LSD.
Dr Carhart-Harris explained: “Normally our brain consists of independent networks that perform separate specialised functions, such as vision, movement and hearing – as well as more complex things like attention. However, under LSD the separateness of these networks breaks down and instead you see a more integrated or unified brain.
“Our results suggest that this effect underlies the profound altered state of consciousness that people often describe during an LSD experience. It is also related to what people sometimes call ‘ego-dissolution’, which means the normal sense of self is broken down and replaced by a sense of reconnection with themselves, others and the natural world. This experience is sometimes framed in a religious or spiritual way – and seems to be associated with improvements in well-being after the drug’s effects have subsided.” [Continue reading…]
Amanda Feilding, executive director of the Beckley Foundation, in an address she will deliver to the Royal Society tomorrow, says: I think Albert Hofmann would have been delighted to have his “Problem child” celebrated at the Royal Society, as in his long lifetime the academic establishment never recognised his great contribution. But for the taboo surrounding this field, he would surely have won the Nobel Prize. That was the beginning of the modern psychedelic age, which has fundamentally changed society.
After the discovery of the effects of LSD, there was a burst of excitement in the medical and therapeutic worlds – over 1000 experimental and clinical studies were undertaken. Then, in the early 60s, LSD escaped from the labs and began to spread into the world at large. Fuelled by its transformational insights, a cultural evolution took place, whose effects are still felt today. It sparked a wave of interest in Eastern mysticism, healthy living, nurturing the environment, individual freedoms and new music and art among many other changes. Then the establishment panicked and turned to prohibition, partly motivated by American youth becoming disenchanted with fighting a war in far-off Vietnam.
Aghast at the global devastation caused by the war on drugs, I set up the Beckley Foundation in 1998. With the advent of brain imaging technology, I realised that one could correlate the subjective experience of altered states of consciousness, brought about by psychedelic substances, with empirical findings. I realised that only through the very best science investigating how psychedelics work in the brain could one overcome the misplaced taboo which had transformed them from the food of the gods to the work of the devil. [Continue reading…]
Just to be clear, as valuable as this research is, it is an exercise in map-making. The map should never be confused with the territory.
Regan Penaluna writes: On a recent Sunday, at my local Italian market, I considered the octopus. To eat the tentacle would be, in a way, like eating a brain — the eight arms of an octopus contain two-thirds of its half billion neurons. Delicious for some, yes — but for others, a jumping-off point for the philosophical question of other minds.
“I do think it feels like something to be an octopus,” says Peter Godfrey-Smith, a professor of philosophy at CUNY Graduate Center, who has spent almost a decade considering the idea. Stories of octopuses’ remarkable ability to solve puzzles, open bottles, and interact with aquarium caretakers suggest an affinity between their intelligence and our own. He wonders: What, if anything, is going on in its head — or as may be the case, its arms? The rest of its neurons are contained in lobes wrapping around its esophagus and sitting behind its eyes. This alien-like physiology is the result of almost 600 million years of evolution that separate us.
Since a 2008 dive off the coast of Sydney, Australia, where Godfrey-Smith encountered curious, 3-foot-long cuttlefish, he’s been fascinated by the minds of cephalopods, which have the largest nervous systems of all the invertebrates. He’s teamed up with scientists to uncover their secret lives and behaviors, publishing in scientific journals and also a blog, where you can follow his adventures with posts that blend “natural history and philosophy.” He has a book coming out at the end of the year called Other Minds, which digs into how the octopus helps us understand the evolution of subjective experience. “I think cephalopods have a special kind of otherness, because they are organized so differently from us and diverged evolutionarily from our line so long ago,” he says. “If they do have minds, theirs are the most other minds of all.” [Continue reading…]
The subject of narcissism has intrigued people for centuries, but social scientists now claim that it has become a modern “epidemic”. So what is it, what has led to its increase, and is there anything we can do about it?
In the beginning
The term narcissism originated more than 2,000 years ago, when Ovid wrote the legend of Narcissus. He tells the story of a beautiful Greek hunter who, one day, happens to see his reflection in a pool of water and falls in love with it. He becomes obsessed with its beauty, and is unable to leave his reflected image until he dies. After his death, the narcissus flower grows where he lay.
The concept of narcissism was popularised by the psychoanalyst Sigmund Freud through his work on the ego and its relationship to the outside world; this work became the starting point for many others developing theories on narcissism.
Melissa Dahl writes: The fire alarm goes off, and it’s apparently not a mistake or a drill: Just outside the door, smoke fills the hallway. Luckily, you happen to have a guide for such a situation: a little bot with a sign that literally reads EMERGENCY GUIDE ROBOT. But, wait — it’s taking you in the opposite direction of the way you came in, and it seems to want you to go down an unfamiliar hallway. Do you trust your own instinct and escape the way you came? Or do you trust the robot?
Probably, you will blindly follow the robot, according to the findings of a fascinating new study from the Georgia Institute of Technology. In an emergency situation — a fake one, though the test subjects didn’t know that — most people trusted the robot over their own instincts, even when the robot had shown earlier signs of malfunctioning. It’s a new wrinkle for researchers who study trust in human-robot interactions. Previously, this work had been focused on getting people to trust robots, such as Google’s driverless cars. Now this new research hints at another problem: How do you stop people from trusting robots too much? It’s a timely question, especially considering the news this week of the first crash caused by one of Google’s self-driving cars. [Continue reading…]
This piece has been taken down at the request of The Conversation.
Michael Schulson writes: In the 1970s, a young American anthropologist named Michael Dove set out for Indonesia, intending to solve an ethnographic mystery. Then a graduate student at Stanford, Dove had been reading about the Kantu’, a group of subsistence farmers who live in the tropical forests of Borneo. The Kantu’ practise the kind of shifting agriculture known to anthropologists as swidden farming, and to everyone else as slash-and-burn. Swidden farmers usually grow crops in nutrient-poor soil. They use fire to clear their fields, which they abandon at the end of each growing season.
Like other swidden farmers, the Kantu’ would establish new farming sites every year in which to grow rice and other crops. Unlike most other swidden farmers, the Kantu’ choose where to place these fields through a ritualised form of birdwatching. They believe that certain species of bird – the Scarlet-rumped Trogon, the Rufous Piculet, and five others – are the sons-in-law of God. The appearances of these birds guide the affairs of human beings. So, in order to select a site for cultivation, a Kantu’ farmer would walk through the forest until he spotted the right combination of omen birds. And there he would clear a field and plant his crops.
Dove figured that the birds must be serving as some kind of ecological indicator. Perhaps they gravitated toward good soil, or smaller trees, or some other useful characteristic of a swidden site. After all, the Kantu’ had been using bird augury for generations, and they hadn’t starved yet. The birds, Dove assumed, had to be telling the Kantu’ something about the land. But neither he, nor any other anthropologist, had any notion of what that something was. [Continue reading…]