Specifically, something is undermining young people’s mental health, especially girls.
In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely to see a professional for mental health issues as those in the 1980s; more teens struggled to remember things in 2010-2012 than in earlier decades; and 73% more reported trouble sleeping than their peers in the 1980s did. These so-called “somatic,” or “of-the-body,” symptoms strongly predict depression.
“It indicates a lot of suffering,” Twenge told Quartz.
It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand, with students seeking help for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. While the number of kids who reported feeling depressed spiked in the 1980s and 1990s, it started to fall after 2008. It has since begun rising again.
Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and young people aged 6-18 are seeking more mental health services and more medication.
The trend is not a uniquely American phenomenon: In the UK, the number of teenagers (15-16) with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (only those in Poland and Macedonia were unhappier).
“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.
Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.
Twenge has observed a notable shift away from internal, or intrinsic, goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.
Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.
“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.
“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]
Cari Romm writes: Here’s a fun exercise: Take a minute and count up all your friends. Not just the close ones, or the ones you’ve seen recently — I mean every single person on this Earth that you consider a pal.
Got a number in your mind? Good. Now cut it in half.
Okay, yes, “fun” may have been a bit of a reach there. But this new, smaller number may actually be more accurate. As it turns out, we can be pretty terrible at knowing who our friends are: In what may be among the saddest pieces of social-psychology research published in quite some time, a study in the journal PLoS One recently made the case that as many as half the people we consider our friends don’t feel the same way. [Continue reading…]
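The study’s core claim is easy to picture as a graph problem: the friendships people report form a directed graph, and a friendship is reciprocated only when the edge runs both ways. A minimal sketch of that idea, using invented names and nominations rather than the study’s data:

```python
# Friendship nominations as a directed graph: who names whom as a friend.
# (Hypothetical toy data, not the PLoS One study's.)
nominations = {
    "ana": {"ben", "carla"},
    "ben": {"ana", "carla"},
    "carla": set(),  # carla names no one back
}

def reciprocity(graph):
    """Fraction of reported friendships that the other person also reports."""
    edges = [(a, b) for a, friends in graph.items() for b in friends]
    mutual = sum(1 for a, b in edges if a in graph.get(b, set()))
    return mutual / len(edges)

print(reciprocity(nominations))  # half the reported friendships are mutual
```

Here ana and ben name each other, but both also name carla, who names neither back, so only half the reported friendships are reciprocated, which is roughly the proportion the study describes.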
Let’s set aside questions about the validity of this research finding (even though many scientific papers these days seem geared more toward grabbing social-media attention than advancing knowledge) and consider instead whether this should indeed be a cause of sadness.
If it turns out that most of us have half as many friends as we imagine, that sounds like a strong reason for an unwelcome boost in self-doubt and insecurity.
If our friends are the people we trust, does this mean that a lot of our trust is misplaced?
Maybe — but that’s not as bad as it sounds.
Trust is a gamble. If we actually had no doubt and could reliably know who was a friend and who was not, there would be no need for trust.
Trust is a relationship with the unknown, and since it’s inevitably going to extend too far or not far enough, it seems that human beings as social creatures are built to trust more rather than less.
So this proclivity to imagine that our net of friendships extends further than it really does is probably less a reason for sadness than a reason to be glad that, for most people, trust is stronger than fear.
Amanda Gefter writes: As we go about our daily lives, we tend to assume that our perceptions — sights, sounds, textures, tastes — are an accurate portrayal of the real world. Sure, when we stop and think about it — or when we find ourselves fooled by a perceptual illusion — we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Not so, says Donald D. Hoffman, a professor of cognitive science at the University of California, Irvine. Hoffman has spent the past three decades studying perception, artificial intelligence, evolutionary game theory and the brain, and his conclusion is a dramatic one: The world presented to us by our perceptions is nothing like reality. What’s more, he says, we have evolution itself to thank for this magnificent illusion, as it maximizes evolutionary fitness by driving truth to extinction.
Getting at questions about the nature of reality, and disentangling the observer from the observed, is an endeavor that straddles the boundaries of neuroscience and fundamental physics. On one side you’ll find researchers scratching their chins raw trying to understand how a three-pound lump of gray matter obeying nothing more than the ordinary laws of physics can give rise to first-person conscious experience. This is the aptly named “hard problem.” [Continue reading…]
Every single person is different. We all have different backgrounds, views, values and interests. And yet there is one universal feeling that we all experience at every single moment. Call it an “ego”, a “self” or just an “I” – it’s the idea that our thoughts and feelings are our own, and no one else has access to them in the same way. This may sound a bit like post-war French existentialism or psychoanalysis, but it’s actually a topic that’s being increasingly addressed by neuroscientists.
We were part of a team interested in finding out how this sense of self is expressed in the brain – and what happens when it dissolves. To do that, we used brain imaging and the psychedelic drug LSD.
Our sense of self is something so natural that we are not always fully aware of it. In fact, it is when it is disturbed that it becomes the most noticeable. This could be due to mental illnesses such as psychosis, when people might experience the delusional belief that their thoughts are no longer private, but can be accessed and even modified by other people. Or it could be due to the influence of psychedelic drugs such as LSD, when the user can feel that their ego is “dissolving” and they are becoming at one with the world. From a scientific point of view, these experiences of “ego death” or ego dissolution are also opportunities to search for this sense of self in the brain.
Our study, led by Enzo Tagliazucchi and published in Current Biology, set out to probe what happens in the brain when our sense of self is altered by psychedelic drugs. We studied 15 healthy volunteers before and after they took LSD, which altered their normal sense of self and of their relationship with the environment. The subjects were scanned while intoxicated and while receiving a placebo using functional MRI, a technique that allows us to study the brain’s activity by measuring changes in blood flow. By contrasting the brain’s activity when receiving a placebo with its activity after taking LSD, we could begin exploring the brain mechanisms involved in the normal experience of the self.
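In outline, the contrast described above is a subtraction: for each brain region (or voxel), average activity across subjects under LSD is compared with average activity under placebo. A toy sketch of that logic, with made-up region names and numbers rather than the study’s actual data:

```python
# Toy condition-contrast: mean signal per brain region under LSD vs placebo.
# One value per subject per region; all numbers are invented for illustration.
lsd     = {"visual_cortex": [1.9, 2.1, 2.0], "auditory": [1.1, 0.9, 1.0]}
placebo = {"visual_cortex": [1.0, 1.2, 0.8], "auditory": [1.0, 1.1, 0.9]}

def contrast(cond_a, cond_b):
    """Per-region difference of subject means: cond_a minus cond_b."""
    return {
        region: sum(cond_a[region]) / len(cond_a[region])
                - sum(cond_b[region]) / len(cond_b[region])
        for region in cond_a
    }

print(contrast(lsd, placebo))
```

A real fMRI analysis works on hundreds of thousands of voxels with statistical thresholding, but the underlying comparison (drug condition minus placebo condition) has this shape.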
Tom Jacobs writes: The link between violent video games and aggressive behavior is old, if still troubling, news. But violence is not the only problematic aspect of these enormously popular entertainments: Many are also blatantly, unapologetically sexist.
Take the best-selling Grand Theft Auto series. In London’s Daily Telegraph, a reviewer called the fifth installment “relentlessly misanthropic,” adding that “the game often coerced me into actions that degraded women.”
Well, it turns out that sort of “fun” has real-life consequences. In a newly published study, a research team from Italy and the United States reports playing such games reduces some players’ compassion for women who have been victims of violence. [Continue reading…]
Researchers from Imperial College London, working with the Beckley Foundation, have for the first time visualised the effects of LSD on the brain: In a series of experiments, scientists have gained a glimpse into how the psychedelic compound affects brain activity. The team administered LSD (lysergic acid diethylamide) to 20 healthy volunteers in a specialist research centre and used various leading-edge and complementary brain scanning techniques to visualise how LSD alters the way the brain works.
The findings, published in Proceedings of the National Academy of Sciences (PNAS), reveal what happens in the brain when people experience the complex visual hallucinations that are often associated with the LSD state. They also shed light on the brain changes that underlie the profound altered state of consciousness the drug can produce.
A major finding of the research is the discovery of what happens in the brain when people experience complex dreamlike hallucinations under LSD. Under normal conditions, information from our eyes is processed in a part of the brain at the back of the head called the visual cortex. However, when the volunteers took LSD, many additional brain areas – not just the visual cortex – contributed to visual processing.
Dr Robin Carhart-Harris, from the Department of Medicine at Imperial, who led the research, explained: “We observed brain changes under LSD that suggested our volunteers were ‘seeing with their eyes shut’ – albeit they were seeing things from their imagination rather than from the outside world. We saw that many more areas of the brain than normal were contributing to visual processing under LSD – even though the volunteers’ eyes were closed. Furthermore, the size of this effect correlated with volunteers’ ratings of complex, dreamlike visions.”
The study also revealed what happens in the brain when people report a fundamental change in the quality of their consciousness under LSD.
Dr Carhart-Harris explained: “Normally our brain consists of independent networks that perform separate specialised functions, such as vision, movement and hearing – as well as more complex things like attention. However, under LSD the separateness of these networks breaks down and instead you see a more integrated or unified brain.
“Our results suggest that this effect underlies the profound altered state of consciousness that people often describe during an LSD experience. It is also related to what people sometimes call ‘ego-dissolution’, which means the normal sense of self is broken down and replaced by a sense of reconnection with themselves, others and the natural world. This experience is sometimes framed in a religious or spiritual way – and seems to be associated with improvements in well-being after the drug’s effects have subsided.” [Continue reading…]
Amanda Feilding, executive director of the Beckley Foundation, in an address she will deliver to the Royal Society tomorrow, says: I think Albert Hofmann would have been delighted to have his “Problem child” celebrated at the Royal Society, as in his long lifetime the academic establishment never recognised his great contribution. But for the taboo surrounding this field, he would surely have won the Nobel Prize. That was the beginning of the modern psychedelic age, which has fundamentally changed society.
After the discovery of the effects of LSD, there was a burst of excitement in the medical and therapeutic worlds – over 1000 experimental and clinical studies were undertaken. Then, in the early 60s, LSD escaped from the labs and began to spread into the world at large. Fuelled by its transformational insights, a cultural evolution took place, whose effects are still felt today. It sparked a wave of interest in Eastern mysticism, healthy living, nurturing the environment, individual freedoms and new music and art among many other changes. Then the establishment panicked and turned to prohibition, partly motivated by American youth becoming disenchanted with fighting a war in far-off Vietnam.
Aghast at the global devastation caused by the war on drugs, I set up the Beckley Foundation in 1998. With the advent of brain imaging technology, I realised that one could correlate the subjective experience of altered states of consciousness, brought about by psychedelic substances, with empirical findings. I realised that only through the very best science investigating how psychedelics work in the brain could one overcome the misplaced taboo which had transformed them from the food of the gods to the work of the devil. [Continue reading…]
Just to be clear, as valuable as this research is, it is an exercise in map-making. The map should never be confused with the territory.
Regan Penaluna writes: On a recent Sunday, at my local Italian market, I considered the octopus. To eat the tentacle would be, in a way, like eating a brain — the eight arms of an octopus contain two-thirds of its half billion neurons. Delicious for some, yes — but for others, a jumping-off point for the philosophical question of other minds.
“I do think it feels like something to be an octopus,” says Peter Godfrey-Smith, a professor of philosophy at CUNY Graduate Center, who has spent almost a decade considering the idea. Stories of octopuses’ remarkable ability to solve puzzles, open bottles, and interact with aquarium caretakers, suggest an affinity between their intelligence and our own. He wonders: What, if anything, is going on in its head — or as may be the case, its arms? The rest of its neurons are contained in lobes wrapping around its esophagus and sitting behind its eyes. This alien-like physiology is the result of almost 600 million years of evolution that separate us.
Since a 2008 dive off the coast of Sydney, Australia, where Godfrey-Smith encountered curious, 3-foot-long cuttlefish, he’s been fascinated by the minds of cephalopods, which have the largest nervous systems of all the invertebrates. He’s teamed up with scientists to uncover their secret lives and behaviors, publishing in scientific journals and on a blog, where you can follow his adventures with posts that blend “natural history and philosophy.” He has a book coming out at the end of the year called Other Minds, which digs into how the octopus helps us understand the evolution of subjective experience. “I think cephalopods have a special kind of otherness, because they are organized so differently from us and diverged evolutionarily from our line so long ago,” he says. “If they do have minds, theirs are the most other minds of all.” [Continue reading…]
The subject of narcissism has intrigued people for centuries, but social scientists now claim that it has become a modern “epidemic”. So what is it, what has led to its increase, and is there anything we can do about it?
In the beginning
The term narcissism originated more than 2,000 years ago, when Ovid wrote the legend of Narcissus. He tells the story of a beautiful Greek hunter who, one day, happens to see his reflection in a pool of water and falls in love with it. He becomes obsessed with its beauty, and is unable to leave his reflected image until he dies. After his death, the flower narcissus grew where he lay.
The concept of narcissism was popularised by the psychoanalyst Sigmund Freud through his work on the ego and its relationship to the outside world; this work became the starting point for many others developing theories on narcissism.
Melissa Dahl writes: The fire alarm goes off, and it’s apparently not a mistake or a drill: Just outside the door, smoke fills the hallway. Luckily, you happen to have a guide for such a situation: a little bot with a sign that literally reads EMERGENCY GUIDE ROBOT. But, wait — it’s taking you in the opposite direction of the way you came in, and it seems to want you to go down an unfamiliar hallway. Do you trust your own instinct and escape the way you came? Or do you trust the robot?
Probably, you will blindly follow the robot, according to the findings of a fascinating new study from the Georgia Institute of Technology. In an emergency situation — a fake one, though the test subjects didn’t know that — most people trusted the robot over their own instincts, even when the robot had shown earlier signs of malfunctioning. It’s a new wrinkle for researchers who study trust in human-robot interactions. Previously, this work had been focused on getting people to trust robotics, such as Google’s driverless cars. Now this new research hints at another problem: How do you stop people from trusting robots too much? It’s a timely question, especially considering the news this week of the first crash caused by one of Google’s self-driving cars. [Continue reading…]
This piece has been taken down at the request of The Conversation.
Michael Schulson writes: In the 1970s, a young American anthropologist named Michael Dove set out for Indonesia, intending to solve an ethnographic mystery. Then a graduate student at Stanford, Dove had been reading about the Kantu’, a group of subsistence farmers who live in the tropical forests of Borneo. The Kantu’ practise the kind of shifting agriculture known to anthropologists as swidden farming, and to everyone else as slash-and-burn. Swidden farmers usually grow crops in nutrient-poor soil. They use fire to clear their fields, which they abandon at the end of each growing season.
Like other swidden farmers, the Kantu’ would establish new farming sites every year in which to grow rice and other crops. Unlike most other swidden farmers, the Kantu’ choose where to place these fields through a ritualised form of birdwatching. They believe that certain species of bird – the Scarlet-rumped Trogon, the Rufous Piculet, and five others – are the sons-in-law of God. The appearances of these birds guide the affairs of human beings. So, in order to select a site for cultivation, a Kantu’ farmer would walk through the forest until he spotted the right combination of omen birds. And there he would clear a field and plant his crops.
Dove figured that the birds must be serving as some kind of ecological indicator. Perhaps they gravitated toward good soil, or smaller trees, or some other useful characteristic of a swidden site. After all, the Kantu’ had been using bird augury for generations, and they hadn’t starved yet. The birds, Dove assumed, had to be telling the Kantu’ something about the land. But neither he, nor any other anthropologist, had any notion of what that something was. [Continue reading…]
Pacific Standard reports: They’re feared and often loathed, viewed as non-conformists who pose a threat to our nation’s moral compass. But if more were open about their inclinations, and engaged in congenial conversation with members of the mistrusting majority, that prejudice might start melting away.
It happened with gays and lesbians. Perhaps it’s time for atheists to give it a try.
That’s one implication of newly published research, which reports simply imagining a positive interaction with an atheist is enough to increase willingness to engage and cooperate with them. [Continue reading…]
Tim Lomas writes: [‘untranslatable’] words exert great fascination, not only in specialised fields like linguistics or anthropology (Wierzbicka, 1999), but also in popular culture. Part of the fascination seems to derive from the notion that such words offer ‘windows’ into other cultures, and thus potentially into new ways of being in the world. As Wierzbicka (1997, p. 5) puts it, ‘words with special, culture-specific meanings reflect and pass on not only ways of living characteristic of a given society, but also ways of thinking’. Thus, ‘untranslatable’ words are not only of interest to translators; after all, many such professionals argue that it can be difficult to find exact translations for most words, and that nearly all terms lose some specificity or nuance when rendered in another tongue (Hatim & Munday, 2004). Rather, ‘untranslatability’ reflects the notion that such words identify phenomena that have only been recognised by specific cultures. Perhaps the most famous example is Schadenfreude, a German term describing pleasure at the misfortunes of others. Such words are not literally untranslatable, of course, since their meaning can be conveyed in a sentence. Rather, they are deemed ‘untranslatable’ to the extent that other languages lack a single word/phrase for the phenomenon.
The significance of such words is much debated. A dominant theoretical notion here is ‘linguistic relativity’ (Hussein, 2012). First formulated by the German philosophers Herder (1744–1803) and Humboldt (1767–1835), it came to prominence with the linguist Sapir (1929) and his student Whorf (1940). Their so-called ‘Sapir-Whorf hypothesis’ holds that language plays a constitutive role in the way that people experience, understand and even perceive the world. As Whorf (1956, pp. 213–214) put it, ‘We dissect nature along lines laid out by our native languages … The world is presented as a kaleidoscopic flux of impressions which has to be organized … largely by the linguistic systems in our minds’. This hypothesis comes in various strengths. Its stronger form is linguistic determinism, where language inextricably constitutes and constrains thought. For instance, Whorf argued that the Hopi people had a different experience of time due to particularities in their grammar, such that they lacked a linear sense of past, present and future. This strong determinism has been criticised, e.g. by Pinker (1995), who argued that the Hopi experience of time was not particularly different to that of Western cultures. However, the milder form of the hypothesis, linguistic relativism, simply holds that language shapes thought and experience. This milder hypothesis is generally accepted by most anthropologists and other such scholars (Perlovsky, 2009). [Read more…]
Erica Wagner writes: Helen Maria Williams observed the French Revolution at first hand. A poet, essayist and novelist known for her support of radical causes, she entertained the likes of Thomas Paine and Mary Wollstonecraft in her salons. Among the things she perceived, in her accounts of political turmoil across the English Channel, were differences in national character when it came to expressing emotion.
“You will see Frenchmen bathed in tears at a tragedy,” she wrote in 1792. “An Englishman has quite as much sensibility to a generous or tender sentiment; but he thinks it would be unmanly to weep; and, though half choaked with emotion, he scorns to be overcome, contrives to gain the victory over his feelings, and throws into his countenance as much apathy as he can well wish.”
And so you would be forgiven for thinking that the stiff upper lip – the complete refusal of lachrymosity, no matter what disaster befalls us – has been paralysing the faces of Britishers since Stonehenge was raised on Salisbury Plain. But, as Thomas Dixon shows in his erudite and entertaining book Weeping Britannia, you would be wrong. Once upon a time and not so very long ago, this nation was given to paroxysms of sobbing at almost any opportunity. Dixon, a historian of emotions, philosophy, science and religion (phew!) at Queen Mary, University of London, asks what dried our tears and wonders whether the death of Diana, Princess of Wales, in 1997 unlocked the floodgates again.
Both he and Tiffany Watt Smith, in The Book of Human Emotions, offer a reminder that “emotion” is a pretty novel idea. [Continue reading…]
Princess Ojiaku writes: Sad music might make people feel vicarious unpleasant emotions, found a study published last year in Frontiers in Psychology. But this experience can ultimately be pleasurable because it allows a negative emotion to exist indirectly, and at a safe distance. Instead of feeling the depths of despair, people can feel nostalgia for a time when they were in a similar emotional state: a non-threatening way to remember a sadness.
People who are very empathetic are more likely to take pleasure in the emotional experience of sad music, according to another study in Frontiers in Psychology. Others enjoy sad songs because they help them return to an emotionally balanced state, according to a review in Frontiers in Human Neuroscience, published in 2015. And those more open to varied experiences might enjoy the songs because the unique emotions that come up when listening to the music fulfill their need for novelty in thoughts and feelings.
In fact, the quest for variety could also explain why some people crave dissonant or experimental music full of uneven, cacophonous or downright disorienting sound. Musical genres such as noise, no wave and experimental rock have a dedicated fan and artist base in thriving underground scenes. Research suggests that people who are more open to novelty are more likely to take pleasure in the uncommon elements found in these non-traditional types of music. A 2012 study in Psychology of Music found a positive correlation between openness to experience and liking jazz, a genre that often defies a traditional pop structure. [Continue reading…]
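The statistic behind a finding like “a positive correlation between openness to experience and liking jazz” is typically a Pearson correlation between two per-person scores. A minimal sketch with invented numbers (hypothetical openness scores and liking ratings, not the 2012 study’s data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: openness-to-experience vs. liking for non-traditional music.
openness = [2, 3, 5, 6, 8, 9]
liking   = [1, 3, 4, 6, 7, 9]
print(round(pearson(openness, liking), 3))
```

A value near +1 means the two scores rise together across people; published personality-and-music correlations are usually far more modest than this toy example.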