NPR reports: If the children are the future, the future might be very ill-informed.
That’s one implication of a new study from Stanford researchers that evaluated students’ ability to assess information sources and described the results as “dismaying,” “bleak” and “[a] threat to democracy.”
As content creators and social media platforms grapple with the fake news crisis, the study highlights the other side of the equation: What it looks like when readers are duped.
The researchers at Stanford’s Graduate School of Education have spent more than a year assessing how well students across the country can evaluate online sources of information.
Middle school, high school and college students in 12 states were asked to evaluate the information presented in tweets, comments and articles. More than 7,800 student responses were collected.
In exercise after exercise, the researchers were “shocked” — their word, not ours — by how many students failed to effectively evaluate the credibility of that information. [Continue reading…]
The Irish Times reports: Teaching philosophy in schools, and promoting it in society, is urgently needed to enable citizens “to discriminate between truthful language and illusory rhetoric”, President Michael D Higgins has said.
Speaking at a function at Áras an Uachtaráin to mark World Philosophy Day, which fell this week, the President expressed concern about “an anti-intellectualism that has fed a populism among the insecure and the excluded”.
Amid claims that we have entered a “post-truth” society, he asked how we might together and individually contribute to a “reflective atmosphere in the classrooms, in our media, in our public space”.
“The dissemination, at all levels of society, of the tools, language and methods of philosophical enquiry can, I believe, provide a meaningful component in any concerted attempt at offering a long-term and holistic response to our current predicament.” [Continue reading…]
Daniel Lattier writes: Professors usually spend about three to six months (sometimes longer) researching and writing a 25-page article to submit to an academic journal. And most experience a twinge of excitement when, months later, they open a letter informing them that their article has been accepted for publication, and will, therefore, be read by…
Yes, you read that correctly. The numbers reported by recent studies are pretty bleak:
– 82 percent of articles published in the humanities are not even cited once.
– Of those articles that are cited, only 20 percent have actually been read.
– Half of academic papers are never read by anyone other than their authors, peer reviewers, and journal editors.
So what’s the reason for this madness? Why does the world continue to be subjected to just under 2 million academic journal articles each year?
Well, the main reason is money and job security. The goal of all professors is to get tenure, and right now, tenure continues to be awarded based in part on how many peer-reviewed publications they have. Tenure committees treat these publications as evidence that the professor is able to conduct mature research.
Sadly, however, many academic articles today are merely exercises in what one professor I knew called “creative plagiarism”: rearrangements of previous research with a new thesis appended to them.
Another reason is increased specialization in the modern era, which is in part due to the splitting up of universities into various disciplines and departments that each pursue their own logic. [Continue reading…]
Shawn Otto writes: Four years ago in Scientific American, I warned readers of a growing problem in American democracy. The article, entitled “Antiscience Beliefs Jeopardize U.S. Democracy,” charted how it had not only become acceptable, but often required, for politicians to embrace antiscience positions, and how those positions flew in the face of the core principles that the U.S. was founded on: That if anyone could discover the truth of something for him or herself using the tools of science, then no king, no pope and no wealthy lord was more entitled to govern the people than they were themselves. It was self-evident.
In the years since, the situation has gotten worse. We’ve seen the emergence of a “post-fact” politics, which has normalized the denial of scientific evidence that conflicts with the political, religious or economic agendas of authority. Much of this denial centers, now somewhat predictably, around climate change — but not all. If there is a single factor to consider as a barometer that evokes all others in this election, it is the candidates’ attitudes toward science.
Consider, for example, what has been occurring in Congress. Rep. Lamar Smith, the Texas Republican who chairs the House Committee on Science, Space and Technology, is a climate change denier. Smith has used his post to initiate a series of McCarthy-style witch-hunts, issuing subpoenas and demanding private correspondence and testimony from scientists, civil servants, government science agencies, attorneys general and nonprofit organizations whose work shows that global warming is happening, humans are causing it and that — surprise — energy companies sought to sow doubt about this fact.
Smith, who is a Christian Scientist and seems to revel in his role as the science community’s bête noire, is by no means alone. Climate denial has become a virtual Republican Party plank (and rejecting the Paris climate accord a literal one) with a wide majority of Congressional Republicans espousing it. Sen. Ted Cruz (R–Texas), chairman of the Senate’s Subcommittee on Space, Science and Competitiveness, took time off from his presidential campaign last December to hold hearings during the Paris climate summit showcasing well-known climate deniers repeating scientifically discredited talking points.
The situation around science has grown so partisan that Hillary Clinton turned the phrase “I believe in science” into the largest applause line of her convention speech accepting the Democratic Party nomination. Donald Trump, by contrast, is the first major party presidential nominee who is an outright climate denier, having called climate science a “hoax” numerous times. In his responses to the organization I helped found, ScienceDebate.org, which gets presidential candidates on the record on science, he told us that “there is still much that needs to be investigated in the field of ‘climate change,’” putting the term in scare quotes to cast doubt on its reality. When challenged on his hoax comments, campaign manager Kellyanne Conway affirmed that Trump doesn’t believe climate change is man-made. [Continue reading…]
Almost a year after a new set of Sustainable Development Goals for 2030 was finalised, the first report tracking global progress towards the goals for education and lifelong learning shows just how far there is still to go to make sure nobody is left behind.
The SDGs replaced the Millennium Development Goals, which reached the end of their 15-year focus in 2015. While the previous goal that focused on education had only one target – to achieve universal primary education – the equivalent SDG has seven, including on expanding secondary and university education.
So UNESCO’s 2016 Global Education Monitoring report is the first in a new era, bringing us the inaugural set of evidence to track progress to achieve these new targets.
UNESCO draws on 2014-15 data to show conclusively what we already know: that the world has failed to achieve universal primary education. In fact, on current trends, only 70% of children in low income countries will complete primary school in 2030, the year of the SDG deadline. The target to achieve universal primary education, which remains within the broader SDG on education, won’t happen until 2042. On the same trajectory, the new target for universal lower secondary education will come about in 2059 and universal upper secondary in 2084.
Less than a year into the new agenda of “leave no-one behind”, the data already predicts that we will be half a century late for the first of the 2030 education targets.
The future for many young people across the Middle East and North Africa looks bleak. The World Bank records that 54% of the working-age population in the Middle East and North Africa is unemployed, with little prospect of any positive immediate change. An average of 28.7% of 15 to 24-year-olds in the Middle East and 30.6% of those in North Africa are unemployed, according to the International Labour Organisation.
Much of the world’s response to this chronic problem has been to intervene financially: in the form of aid or debt restructuring. But by supporting economic recovery through new skills and new opportunities for generations of young people, the world can help Middle Eastern societies in a more sustainable and thoughtful way.
Reliance on public sector jobs
One of the biggest challenges the World Bank identifies in this broad region is that unemployment rates are the highest among the educated, with university graduates making up 30% of the region’s unemployed. They are slowly losing optimism and hope for a better life and future.
This is largely attributed to a reliance on the public sector to provide jobs that come with steady, albeit low, salaries but a high degree of job security. In many North African countries, and in Middle Eastern ones with large populations, there are few alternatives to waiting in line for a public-sector job. At one end of the spectrum lies the misery of violence and refugee camps in Jordan, Syria, Iraq and Lebanon; at the other, even for those who have excelled, a flat and static job market for the best and brightest. No wonder so many highly skilled people are fleeing to Europe in search of stability and new opportunities.
Jay Griffiths writes: In Mexico City, the cathedral – this stentorian thug of a cathedral – is sinking. Built to crush the indigenous temple beneath it, while its decrees pulverised indigenous thinking, Mexico City Metropolitan Cathedral is sinking under the weight of its own brutal imposition.
Walking nearby late one night, I was captivated by music. Closer, now, and I came upon an indigenous, pre-Hispanic ceremony being danced on the pavement hard by the cathedral. Copal tree resin was burning, marigolds were scattered like living coins, people in feather headdresses and jaguar masks danced to flutes, drums, rattles and shell-bells. While each cathedral column was a Columbus colonising the site, the ceremony seemed to say: We’re still here.
A young man watched me awhile, as I was taking notes, and then approached me smiling.
‘Do you understand Nahuatl?’ he asked.
‘Do you want me to explain?’
He spent an hour gently unfurling each word. Abjectly poor, his worn-out shoes no longer even covered his feet and his clothes were rags, but he shone with an inner wealth, a light that was his gift, to respect the connections of the world, between people, animals, plants and the elements. He spoke of the importance of not losing the part of ourselves that touches the heart of the Earth; of listening within, and also to the natural world. Two teachers. No one has ever said it better.
‘Your spirit is your maestro interno. Your spirit brought you here. You have your gift and destiny to complete in this world. You have to align yourself in the right direction and carry on.’ And he melted away, leaving me with tears in my eyes as if I had heard a lodestar singing its own quiet truthsong.
A few days earlier, I’d been invited to the Centre for Indigenous Arts in Papantla, in the Mexican state of Veracruz, 300km east of Mexico City. The centre was celebrating the anniversary of its founding (in 2006), and promoting indigenous education: decolonised schooling. Not by chance, it is 12 October, the day when, in 1492, Christopher Columbus arrived in the so-called New World. Here, they come not to praise Columbus but to bury his legacy because – as an act of pointed protest – this date is now widely honoured as the day of indigenous resistance. [Continue reading…]
Nicholas Tampio writes: According to the grit narrative, children in the United States are lazy, entitled and unprepared to compete in the global economy. Schools have contributed to the problem by neglecting socio-emotional skills. The solution, then, is for schools to impart the dispositions that enable American children to succeed in college and careers. According to this story, politicians, policymakers, corporate executives and parents agree that kids need more grit.
The person who has arguably done more than anyone else to elevate the concept of grit in academic and popular conversations is Angela Duckworth, professor at the Positive Psychology Center at the University of Pennsylvania. In her new book, Grit: The Power of Passion and Perseverance, she explains the concept of grit and how people can cultivate it in themselves and others.
According to Duckworth, grit is the ability to overcome any obstacle in pursuit of a long-term project: ‘To be gritty is to hold fast to an interesting and purposeful goal. To be gritty is to invest, day after week after year, in challenging practice. To be gritty is to fall down seven times and rise eight.’ Duckworth names musicians, athletes, coaches, academics and business people who succeed because of grit. Her book will be a boon for policymakers who want schools to inculcate and measure grit.
There is a time and place for grit. However, praising grit as such makes no sense because it can often lead to stupid or mean behaviour. Duckworth’s book is filled with gritty people doing things that they, perhaps, shouldn’t. [Continue reading…]
Specifically, something is undermining young people’s mental health, especially girls.
In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely to see a professional for mental health issues as those in the 1980s; more teens struggled to remember things in 2010-2012 compared to the earlier period; and 73% more reported trouble sleeping compared to their peers in the 1980s. These so-called “somatic” or “of-the-body” symptoms strongly predict depression.
“It indicates a lot of suffering,” Twenge told Quartz.
It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. While the number of kids who reported feeling depressed spiked in the 1980s and 1990s, it started to fall after 2008. It has started rising again.
Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and children aged 6-18 are seeking more mental health services, and more medication.
The trend is not a uniquely American phenomenon: In the UK, the number of teenagers (15-16) with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (those in Poland and Macedonia were the only ones who were more unhappy).
“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.
Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.
Twenge has observed a notable shift away from internal, or intrinsic goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.
Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.
“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.
“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]
Jenny Anderson writes: Many of us worry what technology is doing to our kids. A cascade of reports show that their addiction to iAnything is diminishing empathy, increasing bullying (pdf), robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.
Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.
A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”
Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation, or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.
Unfortunately, it seems we parents are the solution. (Newsflash, kids aren’t going to give up their devices because they are worried about how it may influence their future ability to empathize.)
That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]
Douglas Starr writes: About a year and a half ago, Jessica Schneider was handed a flyer by one of her colleagues in the child-advocacy community. It advertised a training session, offered under the auspices of the Illinois Principals Association (I.P.A.), in how to interrogate students. Specifically, teachers and school administrators would be taught an abbreviated version of the Reid Technique, which is used across the country by police officers, private-security personnel, insurance-fraud investigators, and other people for whom getting at the truth is part of the job. Schneider, who is a staff attorney at the Chicago Lawyers’ Committee for Civil Rights Under Law, was alarmed. She knew that some psychologists and jurists have characterized the technique as coercive and liable to produce false confessions — especially when used with juveniles, who are highly suggestible. When she expressed her concerns to Brian Schwartz, the I.P.A.’s general counsel, he said that the association had been offering Reid training for many years and found it both popular and benign. To prove it, he invited Schneider to attend a session in January of 2015.
The training was led by Joseph Buckley, the president of John E. Reid and Associates, which is based in Chicago. Like the adult version of the Reid Technique, the school version involves three basic parts: an investigative component, in which you gather evidence; a behavioral analysis, in which you interview a suspect to determine whether he or she is lying; and a nine-step interrogation, a nonviolent but psychologically rigorous process that is designed, according to Reid’s workbook, “to obtain an admission of guilt.” Most of the I.P.A. session, Schneider told me, focussed on behavioral analysis. Buckley described to trainees how patterns of body language — including slumping, failing to look directly at the interviewer, offering “evasive” responses, and showing generally “guarded” behaviors — could supposedly reveal whether a suspect was lying. (Some of the cues were downright mythological — like, for instance, the idea that individuals look left when recalling the truth and right when trying to fabricate.) Several times during the session, Buckley showed videos of interrogations involving serious crimes, such as murder, theft, and rape. None of the videos portrayed young people being questioned for typical school misbehavior, nor did any of the Reid teaching materials refer to “students” or “kids.” They were always “suspects” or “subjects.”
Laura Nirider, a professor of law at Northwestern University and the project director of the Center on Wrongful Convictions of Youth, attended the same session as Schneider. She told me that about sixty people were there. “Everybody was on the edge of their seat: ‘So this is how we can learn to get the drop on little Billy for writing graffiti on the underside of the lunchroom table,’” she said. One vice-principal told Nirider that the first thing he does when he interrogates students is take away their cell phones, “so they can’t call their mothers.” [Continue reading…]
The phrase “Natural History” is linked in most people’s minds today with places that use the phrase: the various Natural History Museums, or television programmes narrated so evocatively by renowned naturalist Sir David Attenborough.
As times have changed, the phrase, used in its traditional sense, now has an almost archaic ring to it, perhaps recalling the Victorian obsession with collecting butterflies or beetles, rocks or fossils, or stuffed birds and animals, or perhaps the 18th-century best-seller, Gilbert White’s The Natural History of Selborne.
Once natural history was part of what was equally archaically called natural philosophy, encompassing the enquiry into all aspects of the natural world that we inhabit, from the tiniest creature to the largest, to molecules and materials, to planets and stars in outer space. These days, we call it science. Natural history specifically strives to study and understand organisms within their environment, which would these days equate to the disciplines of ecology or conservation.
In a recent article in the journal BioScience, a group of 17 scientists decry what they see as a shift away from this traditional learning (once a typical part of biology degrees) that taught students about organisms: where they live, what they eat, how they behave, their variety, and their relationships to the ecosystems in which they live.
Enticed partly by the promise of a course-specific career, and perhaps driven partly by poorly taught courses that can emphasise rote learning, students move into more exciting fields such as biotechnology or evolutionary developmental biology (“evo-devo”), where understanding the function of a particular organ or limb matters more than understanding the whole organism.