Specifically, something is undermining young people’s mental health, especially girls’.
In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely to see a professional for mental health issues as those in the 1980s; more teens in 2010-2012 struggled to remember things than in the 1980s; and 73% more reported trouble sleeping than their peers in that decade. These so-called “somatic,” or “of-the-body,” symptoms strongly predict depression.
“It indicates a lot of suffering,” Twenge told Quartz.
It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. The number of kids who reported feeling depressed spiked in the 1980s and 1990s and began to fall after 2008, but it has since started rising again:
Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and young people aged 6-18 are seeking more mental health services, and more medication.
The trend is not a uniquely American phenomenon: in the UK, the number of teenagers (aged 15-16) with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (only those in Poland and Macedonia were more unhappy).
“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.
Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.
Twenge has observed a notable shift away from internal, or intrinsic, goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.
Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.
“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.
“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]
Jenny Anderson writes: Many of us worry about what technology is doing to our kids. A cascade of reports shows that their addiction to iAnything is diminishing empathy, increasing bullying (pdf), and robbing them of time to play, and just be. So we parents set timers, lock away devices, and drone on about the importance of actual real-live human interaction. And then we check our phones.
Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.
A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”
Turkle explains the cost of too-much technology in stark terms: Our children can’t engage in conversation, or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.
Unfortunately, it seems we parents are the solution. (Newsflash: kids aren’t going to give up their devices out of worry about how screen time might affect their future ability to empathize.)
That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]
Douglas Starr writes: About a year and a half ago, Jessica Schneider was handed a flyer by one of her colleagues in the child-advocacy community. It advertised a training session, offered under the auspices of the Illinois Principals Association (I.P.A.), in how to interrogate students. Specifically, teachers and school administrators would be taught an abbreviated version of the Reid Technique, which is used across the country by police officers, private-security personnel, insurance-fraud investigators, and other people for whom getting at the truth is part of the job. Schneider, who is a staff attorney at the Chicago Lawyers’ Committee for Civil Rights Under Law, was alarmed. She knew that some psychologists and jurists have characterized the technique as coercive and liable to produce false confessions — especially when used with juveniles, who are highly suggestible. When she expressed her concerns to Brian Schwartz, the I.P.A.’s general counsel, he said that the association had been offering Reid training for many years and found it both popular and benign. To prove it, he invited Schneider to attend a session in January of 2015.
The training was led by Joseph Buckley, the president of John E. Reid and Associates, which is based in Chicago. Like the adult version of the Reid Technique, the school version involves three basic parts: an investigative component, in which you gather evidence; a behavioral analysis, in which you interview a suspect to determine whether he or she is lying; and a nine-step interrogation, a nonviolent but psychologically rigorous process that is designed, according to Reid’s workbook, “to obtain an admission of guilt.” Most of the I.P.A. session, Schneider told me, focussed on behavioral analysis. Buckley described to trainees how patterns of body language — including slumping, failing to look directly at the interviewer, offering “evasive” responses, and showing generally “guarded” behaviors — could supposedly reveal whether a suspect was lying. (Some of the cues were downright mythological — like, for instance, the idea that individuals look left when recalling the truth and right when trying to fabricate.) Several times during the session, Buckley showed videos of interrogations involving serious crimes, such as murder, theft, and rape. None of the videos portrayed young people being questioned for typical school misbehavior, nor did any of the Reid teaching materials refer to “students” or “kids.” They were always “suspects” or “subjects.”
Laura Nirider, a professor of law at Northwestern University and the project director of the Center on Wrongful Convictions of Youth, attended the same session as Schneider. She told me that about sixty people were there. “Everybody was on the edge of their seat: ‘So this is how we can learn to get the drop on little Billy for writing graffiti on the underside of the lunchroom table,’” she said. One vice-principal told Nirider that the first thing he does when he interrogates students is take away their cell phones, “so they can’t call their mothers.” [Continue reading…]
The phrase “Natural History” is linked in most people’s minds today with the institutions and programmes that use it: the various Natural History Museums, or the television programmes narrated so evocatively by the renowned naturalist Sir David Attenborough.
Used in its traditional sense, though, the phrase now has an almost archaic ring to it, perhaps recalling the Victorian obsession with collecting butterflies or beetles, rocks or fossils, or stuffed birds and animals, or perhaps the 18th-century best-seller, Gilbert White’s The Natural History of Selborne.
Natural history was once part of what was, equally archaically, called natural philosophy, encompassing enquiry into all aspects of the natural world we inhabit, from the tiniest creature to the largest, from molecules and materials to planets and stars in outer space. These days, we call it science. Natural history specifically strives to study and understand organisms within their environment, which these days equates to the disciplines of ecology or conservation.
In a recent article in the journal BioScience, a group of 17 scientists decry what they see as a shift away from this traditional learning (once a typical part of biology degrees) that taught students about organisms: where they live, what they eat, how they behave, their variety, and their relationships to the ecosystems in which they live.
Enticed partly by the promise of a course-specific career, and perhaps put off partly by poorly taught courses that can emphasise rote learning, students move into more exciting fields such as biotechnology or evolutionary developmental biology (“evo-devo”), where understanding a whole organism matters less than understanding the function of a particular organ or limb.
Gordon Brown writes: In Beirut — the troubled capital of a country with one of the worst histories of sectarian violence in the world — a unique experiment is underway.
Born out of a National Charter for Education on Living Together in Lebanon — which leaders of all major religions have signed — a common school curriculum on shared values is being taught in primary and secondary schools to Shiite, Sunni and Christian pupils.
The curriculum focuses on “the promotion of coexistence” by embracing “inclusive citizenship” and “religious diversity” and aims to ensure what the instigators call “liberation from the risks of . . . sectarianism.” But the new curriculum is more than an optimistic plea to love thy neighbor and an assertion of a golden rule common to all religions. It teaches pupils that they can celebrate differences without threatening coexistence.
The curriculum is designed for children starting at age 9 and includes four modules. The first tells the story of the global human family, asserting that all are equal in dignity. The second focuses on the rights and duties of citizenship, irrespective of religious or ethnic background. The third covers religious diversity, including the “refusal of any radicalism and religious or sectarian seclusion.” In the fourth, the emphasis shifts from the local to the need for global cultural diversity.
Of course, there is a long way to go before this experiment bears fruit, but the fact that it is happening today in Lebanon is of global significance because of the country’s decision to offer schooling to all Syrian refugee children.
Operating under a double-shift system — Lebanese children are taught in the morning, Syrian refugees in the afternoon — the public schools now house more refugee pupils — nearly 200,000 Syrian boys and girls — than local ones. [Continue reading…]
The Atlantic reports: It wasn’t long after the onset of the Great Recession that academics and headline writers began referring to recent college graduates as a “lost generation.” Faced with unemployment rates for their cohort higher than at any time since World War II, young Americans seemed doomed to a lifetime of lower earnings and savings. But even at the peak of pessimistic predictions, pundits had to acknowledge: Those with college degrees were relatively well-off compared to those without.
What, then, do you call an entire generation that never even finishes college? That’s the threat facing Syria’s young adults. In the years leading up to the current civil war, enrollment figures for Syrian tertiary education had been climbing steadily upward—from 12 percent of the college-age population in 2002, according to the UNESCO Institute for Statistics, to 26 percent in 2010, on the eve of the Syrian uprising. Now, the estimated 100,000 university-qualified refugees currently scattered throughout the Middle East and Europe must place their hopes in schools outside Syria—and that’s to say nothing of those still inside the country, where few educational institutions remain functional. In neighboring Turkey, Lebanon, and Jordan, all of which have been overwhelmed with refugees since the start of the conflict, only a fraction of students have found ways to continue their studies, despite the number of Syrian students in Turkish universities, for example, reportedly quadrupling in recent years. With professors and researchers displaced as well, Syria’s entire university infrastructure is at risk. [Continue reading…]
As the clock moves towards 12.45pm, I begin to anxiously await the flurry of emails that I’ve come to expect in advance of my 2pm class. The class is on law and human rights. Students email to say that a deterioration in the security situation means they must stay within the relative safety of their own area, their parents naturally apprehensive that travel across the West Bank could be dangerous.
This has become the everyday reality this semester for students attending Al Quds University and Al Quds (Bard) University, a partnership with the American liberal arts institution.
The university soon gives the call for all staff and students to evacuate. In an entirely depressing but ultimately predictable scenario, Palestinian students will not be able to take their classes in literature, law, biology or media. Those on site make their way to the agreed “safe” area with alcohol-drenched cotton balls handed out by the ever vigilant staff of the Palestinian Red Crescent to ward off the effects of the inevitable deluge of tear gas.
The university has tried to continue life as normal. On October 13, Al Quds university welcomed the president of India, Pranab Mukherjee on campus with great pomp and splendour to receive an honorary degree. Indian flags adorned the beautiful campus grounds and academics dressed in ceremonial gowns to applaud the visit of the world leader.
But there were also protests from students angry at recent violence against them in Jerusalem, who used this platform to draw attention to their ongoing suffering. Within 45 minutes of the Indian contingent leaving, Israeli forces stormed the campus and violently arrested eight students while causing significant damage to property, according to the student group Mojama’a Alanshita, which posted a video of some of the arrests on Facebook.
Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.
We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?
The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading…]
With the world’s focus firmly on the European response to the refugee crisis in recent weeks, attention has been diverted away from the humanitarian needs of the Middle East itself.
Only a minority of refugees have fled to Europe, with the majority of Syrians travelling across neighbouring borders to Jordan, Turkey and Lebanon. These movements of people have placed considerable pressure on already stretched public services, and children – one of the most vulnerable groups – are being severely affected.
Hundreds of thousands of them are at risk of becoming ill, malnourished, abused and exploited – and the vast majority have no access to education.
A significant proportion of the 13m children reported by UNICEF as deprived of an education in the Middle East are from Syria. With limited and interrupted education, what does the future hold for these children – and for the future of Syria?
The Guardian reports: A postgraduate student of counter-terrorism was falsely accused of being a terrorist after an official at Staffordshire University had spotted him reading a textbook entitled Terrorism Studies in the college library.
Mohammed Umar Farooq, who was enrolled in the terrorism, crime and global security master’s programme, told the Guardian that he was questioned about attitudes to homosexuality, Islamic State (Isis) and al-Qaida.
His replies, Farooq said, were largely academic, but he stressed his personal opposition to extremist views. However, the official reported the conversation in the library to security guards because it had raised “too many red flags”.
“I could not believe it. I was reading an academic textbook and minding my own business. At first I thought I’d just laugh it off as a joke,” said Farooq, who then instructed a lawyer to help him challenge and rebut the claims. [Continue reading…]
Filling classrooms to the brim with computers and tablets won’t necessarily help children get better grades. That’s the finding of a new report from the Organisation for Economic Co-operation and Development (OECD).
The report reviews the links between test results of 15-year-olds from 64 countries who took part in the OECD’s 2012 Programme for International Student Assessment (PISA) and how much the pupils used technology at home and school.
Pupils in 31 countries, not including the UK, also took part in extra online tests of digital reading, navigation and mathematics. The countries and cities that came top in these online tests were Singapore, South Korea, Hong Kong and Japan – which also perform well in paper-based tests. But pupils in these countries don’t necessarily spend a lot of time on computers in class.
The report also shows that in 2012, 96% of 15-year-old students in the 64 countries in the study reported that they have a computer at home, but only 72% reported that they used a desktop, laptop or tablet computer at school.
The OECD found that it was not the amount of digital technology used in schools that was linked with scores in the PISA tests, but what teachers ask pupils to do with computers or tablets that counts. There is also an increasing digital divide between school and home.
Greg Lukianoff and Jonathan Haidt write: Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law — or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia — and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Two terms have risen quickly from obscurity into common campus parlance. Microaggressions are small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless. For example, by some campus guidelines, it is a microaggression to ask an Asian American or Latino American “Where were you born?,” because this implies that he or she is not a real American. Trigger warnings are alerts that professors are expected to issue if something in a course might cause a strong emotional response. For example, some students have called for warnings that Chinua Achebe’s Things Fall Apart describes racial violence and that F. Scott Fitzgerald’s The Great Gatsby portrays misogyny and physical abuse, so that students who have been previously victimized by racism or domestic violence can choose to avoid these works, which they believe might “trigger” a recurrence of past trauma.
Some recent campus actions border on the surreal. In April, at Brandeis University, the Asian American student association sought to raise awareness of microaggressions against Asians through an installation on the steps of an academic hall. The installation gave examples of microaggressions such as “Aren’t you supposed to be good at math?” and “I’m colorblind! I don’t see race.” But a backlash arose among other Asian American students, who felt that the display itself was a microaggression. The association removed the installation, and its president wrote an e-mail to the entire student body apologizing to anyone who was “triggered or hurt by the content of the microaggressions.” [Continue reading…]
James McWilliams writes: In January 2010, while driving from Chicago to Minneapolis, Sam McNerney played an audiobook and had an epiphany. The book was Jonah Lehrer’s How We Decide, and the epiphany was that consciousness could reside in the brain. The quest for an empirical understanding of consciousness has long preoccupied neurobiologists. But McNerney was no neurobiologist. He was a twenty-year-old philosophy major at Hamilton College. The standard course work — ancient, modern, and contemporary philosophy — enthralled him. But after this drive, after he listened to Lehrer, something changed. “I had to rethink everything I knew about everything,” McNerney said.
Lehrer’s publisher later withdrew How We Decide for inaccuracies. But McNerney was mentally galvanized for good reason. He had stumbled upon what philosophers call the “Hard Problem” — the quest to understand the enigma of the gap between mind and body. Intellectually speaking, what McNerney experienced was like diving for a penny in a pool and coming up with a gold nugget.
The philosopher Thomas Nagel drew popular attention to the Hard Problem four decades ago in an influential essay titled “What Is It Like to Be a Bat?” Frustrated with the “recent wave of reductionist euphoria,” Nagel challenged the reductive conception of mind — the idea that consciousness resides as a physical reality in the brain — by highlighting the radical subjectivity of experience. His main premise was that “an organism has conscious mental states if and only if there is something that it is like to be that organism.”
If that idea seems elusive, consider it this way: A bat has consciousness only if there is something that it is like for that bat to be a bat. Sam has consciousness only if there is something it is like for Sam to be Sam. You have consciousness only if there is something that it is like for you to be you (and you know that there is). And here’s the key to all this: Whatever that “like” happens to be, according to Nagel, it necessarily defies empirical verification. You can’t put your finger on it. It resists physical accountability.
McNerney returned to Hamilton intellectually turbocharged. This was an idea worth pondering. “It took hold of me,” he said. “It chose me — I know you hear that a lot, but that’s how it felt.” He arranged to do research in cognitive science as an independent study project with Russell Marcus, a trusted professor. Marcus let him loose to write what McNerney calls “a seventy-page hodgepodge of psychological research and philosophy and everything in between.” Marcus remembered the project more charitably, as “a huge, ambitious, wide-ranging, smart, and engaging paper.” Once McNerney settled into his research, Marcus added, “it was like he had gone into a phone booth and come out as a super-student.”
When he graduated in 2011, McNerney was proud. “I pulled it off,” he said about earning a degree in philosophy. Not that he had any hard answers to any big problems, much less the Hard Problem. Not that he had a job. All he knew was that he “wanted to become the best writer and thinker I could be.”
So, as one does, he moved to New York City.
McNerney is the kind of young scholar adored by the humanities. He’s inquisitive, open-minded, thrilled by the world of ideas, and touched with a tinge of old-school transcendentalism. What Emerson said of Thoreau — “he declined to give up his large ambition of knowledge and action for any narrow craft or profession” — is certainly true of McNerney. [Continue reading…]
Celia Deane-Drummond, a professor of theology at the University of Notre Dame, writes: Geologists claim that we are now living in the age of the “Anthropocene”: a new geological era in which human domination of planet Earth is becoming indelibly written into the geological record.
Human actions are slowly but surely being inscribed in the material remains each generation leaves behind. The difference between the climate changes taking place in our present century and those at the dawn of human existence is that humanity is now affecting and instigating such changes.
We are constructing our world to such an extent that we have lost sight of both our origins and our futures, caught up in the micro and macro politics of the everyday, feasting on the products of our own creations.
It is against the backdrop of the Anthropocene that Pope Francis’ upcoming encyclical will be delivered on June 18. In it, the Pope will draw on the praise poem Canticle of the Creatures, which was first penned by the patron saint of ecologists, Saint Francis of Assisi.
Pope Francis will speak to the ambiguous loss in Western societies of knowing ourselves as creatures. The world that we inhabit may be dominated by human activity, but it is still God’s world first and foremost. Once we know that the Earth is a gift, this creates a different relationship with it compared with the Earth as material for our use.
But he will not romanticize the Earth. Instead, he will speak of the need for human responsibility. And there are likely to be three facets of that responsibility to act, especially on the part of richer, consumer-driven nations of the world.
- First, on behalf of the poor.
- Second, in building relationships of peace.
- Third, in service to creation.
The Earthly world is indeed our home but we have become estranged from it through our practices of domination. [Continue reading…]
Jeff Turrentine writes: Every now and then you come across a statement by a public official that is so ridiculous, so perfect in its unabashed wrongness, you have to read it a few times to fully appreciate it as a work of demagogic art.
My current favorite in this category comes courtesy of one Scott Weber, a member of the Park County School District #6 Board of Trustees in Cody, Wyoming. A couple of weeks ago, when he and his fellow board members were supposed to be voting on whether to purchase new textbooks and reading materials for the district, Weber put a stop to the vote by taking a bold stand in defense of climate denial, political cronyism, and intellectual closed-mindedness.
Here’s what he said about one of the reading materials the board was considering for purchase, as reported by the Casper Star-Tribune:
As a board member, I will not authorize any of the $300,000 allocated for this purchase to include supplemental booklets about “global whining.” … Our Wyoming schools are largely funded by coal, oil, natural gas, mining, ranching, etc. This junk science is against community and state standards.
This junk science is against community and state standards. Stop for a moment and give that sentence the attention it deserves. For thousands of years, going back to Aristotle, humanity’s greatest minds have sought to safeguard the precepts of the scientific method by keeping them away from the corrupting influence of political culture. Defending the integrity of science from powerful people is what got Galileo condemned and confined to house arrest. And yet, 400 years later, here we are: watching a public official tasked with guiding the educational trajectories of his community’s children rail against the accepted science on climate change—because its conclusions threaten to undermine the local political culture. [Continue reading…]
George Monbiot writes: To seek enlightenment, intellectual or spiritual; to do good; to love and be loved; to create and to teach: these are the highest purposes of humankind. If there is meaning in life, it lies here.
Those who graduate from the leading universities have more opportunity than most to find such purpose. So why do so many end up in pointless and destructive jobs? Finance, management consultancy, advertising, public relations, lobbying: these and other useless occupations consume thousands of the brightest students. To take such jobs at graduation, as many will in the next few weeks, is to amputate life close to its base.
I watched it happen to my peers. People who had spent the preceding years laying out exultant visions of a better world, of the grand creative projects they planned, of adventure and discovery, were suddenly sucked into the mouths of corporations dangling money like angler fish.
At first they said they would do it for a year or two, “until I pay off my debts”. Soon afterwards they added: “and my mortgage”. Then it became, “I just want to make enough not to worry any more”. A few years later, “I’m doing it for my family”. Now, in middle age, they reply, “What, that? That was just a student fantasy.” [Continue reading…]