Harvard Magazine: Toward the end of World War II, while thousands of Europeans were dying of hunger, 36 men at the University of Minnesota volunteered for a study that would send them to the brink of starvation. Allied troops advancing into German-occupied territories with supplies and food were encountering droves of skeletal people they had no idea how to safely renourish, and researchers at the university had designed a study they hoped might reveal the best methods of doing so. But first, their volunteers had to agree to starve.
The physical toll on these men was alarming: their metabolism slowed by 40 percent; sitting on atrophied muscles became painful; though their limbs were skeletal, their fluid-filled bellies looked curiously stout. But researchers also observed disturbing mental effects they hadn’t expected: obsessions about cookbooks and recipes developed; men with no previous interest in food thought — and talked — about nothing else. Overwhelming, uncontrollable thoughts had taken over, and as one participant later recalled, “Food became the one central and only thing really in one’s life.” There was no room left for anything else.
Though these odd behaviors were just a footnote in the original Minnesota study, to professor of economics Sendhil Mullainathan, who works on contemporary issues of poverty, they were among the most intriguing findings. Nearly 70 years after publication, that “footnote” showed something remarkable: scarcity had stolen more than flesh and muscle. It had captured the starving men’s minds.
Mullainathan is not a psychologist, but he has long been fascinated by how the mind works. As a behavioral economist, he looks at how people’s mental states and social and physical environments affect their economic actions. Research like the Minnesota study raised important questions: What happens to our minds — and our decisions — when we feel we have too little of something? Why, in the face of scarcity, do people so often make seemingly irrational, even counter-productive decisions? And if this is true in large populations, why do so few policies and programs take it into account?
In 2008, Mullainathan joined Eldar Shafir, Tod professor of psychology and public affairs at Princeton, to write a book exploring these questions. Scarcity: Why Having Too Little Means So Much (2013) presented years of findings from the fields of psychology and economics, as well as new empirical research of their own. Based on their analysis of the data, they sought to show that, just as food had possessed the minds of the starving volunteers in Minnesota, scarcity steals mental capacity wherever it occurs—from the hungry, to the lonely, to the time-strapped, to the poor.
That’s a phenomenon well-documented by psychologists: if the mind is focused on one thing, other abilities and skills — attention, self-control, and long-term planning — often suffer. Like a computer running multiple programs, Mullainathan and Shafir explain, our mental processors begin to slow down. We don’t lose any inherent capacities, just the ability to access the full complement ordinarily available for use.
But what’s most striking — and in some circles, controversial — about their work is not what they reveal about the effects of scarcity. It’s their assertion that scarcity affects anyone in its grip. Their argument: qualities often considered part of someone’s basic character — impulsive behavior, poor performance in school, poor financial decisions — may in fact be the products of a pervasive feeling of scarcity. And when that feeling is constant, as it is for people mired in poverty, it captures and compromises the mind.
This is one of scarcity’s most insidious effects, they argue: creating mindsets that rarely consider long-term best interests. “To put it bluntly,” says Mullainathan, “if I made you poor tomorrow, you’d probably start behaving in many of the same ways we associate with poor people.” And just like many poor people, he adds, you’d likely get stuck in the scarcity trap. [Continue reading…]
Jennifer Talarico writes: It isn’t surprising that many Bostonians have vivid memories of the 2013 Marathon bombing, or that many New Yorkers have very clear memories about where they were and what they were doing on 9/11.
But many individuals who were not onsite for these attacks, or not even in Boston on Apr. 15, 2013 or in New York on Sept. 11, 2001, also have vivid memories of how they learned about these events. Why would people who were not immediately or directly affected have such a long-lasting sense of knowing exactly where they were and what they were doing when they heard the news?
These recollections are called flashbulb memories. In a flashbulb memory, we recall the experience of learning about an event, not the factual details of the event itself.
There might be an advantage to recalling the elements of important events that happen to us or to those close to us, but there appears to be little benefit to recalling our experience hearing this kind of news. So why does learning about a big event create such vivid memories? And just how accurate are flashbulb memories? [Continue reading…]
Tom Jacobs writes: Surely you have noticed: A lot of people who have no idea what they are talking about are oddly certain of their superior knowledge. While this disconnect has been a problem throughout human history, new research suggests a ubiquitous feature of our high-tech world — the Internet — has made matters much worse.
In a series of studies, a Yale University research team led by psychologist Matthew Fisher shows that people who search for information on the Web emerge from the process with an inflated sense of how much they know — even regarding topics that are unrelated to the ones they Googled.
This illusion of knowledge appears to be “driven by the act of searching itself,” they write in the Journal of Experimental Psychology: General. Apparently conflating seeking information online with racking one’s brain, people consistently mistake “outsourced knowledge for internal knowledge.” [Continue reading…]
Sally Satel writes: The evil hour descended on David Morris in the summer of 2009. The former marine and war reporter was in a theater watching a movie with his then girlfriend and suddenly found himself pacing the lobby with no memory of having left his seat. Later, his girlfriend explained that Morris had fled after an explosion occurred onscreen.
He began having dreams of his buddies being ripped apart. When awake, he would imagine innocent items—an apple or a container of Chinese takeout—blowing up. Pathological vigilance took root: “Preparing for bed was like getting ready for a night patrol.” The dreams persisted. “Part of me,” he admits, “was ashamed of the dreams, of the realization that I was trapped inside a cliché: the veteran so obsessed with his own past that even his unconscious made love to it every night.”
Post-traumatic stress disorder is the subject of two new books, one by Morris and another by war reporter Mac McClelland. The symptoms are crippling: relentless nightmares, unbidden waking images, hyperarousal, sleeplessness, and phobias. As a diagnosis, it has existed colloquially for generations—“shell shock” is one name that survives in the modern idiom—and it has particular resonance because of this generation’s wars. (Most soldiers are spared it, though the public tends to think they are not. A 2012 poll found that most people believe that most post-9/11 veterans suffer from PTSD. The actual rate has been estimated at between two and 17 percent.)
Morris thinks the symptoms—a body and mind reacting in fear long after the threat to life and limb is gone—hardly encompass the experience of PTSD. Historically, we might have sought out not only shrinks but also “poetry, our families, or the clergy for solace post horror.” Profitably, Morris turns to everyone: the Greeks, the great poets of World War I, historians, anthropologists, and yes, psychiatrists and psychologists.
From such wide consultation comes a masterful synthesis. The Evil Hours interweaves memoir with a cultural history of war’s psychic aftermath. Morris chronicles the development of PTSD as an official diagnosis and its earlier incarnations in other wars. From Homer’s Odyssey to the venerated war poets, from the crusade for recognition by organized psychiatry to the modern science of fear and resilience, Morris gives a sweeping view of the condition, illuminated by meditation on sacrifice and danger and, in his words, “the enigma of survival.” [Continue reading…]
Scott Barry Kaufman writes: Psychologists Guillaume Furst, Paolo Ghisletta and Todd Lubart present an integrative model of creativity and personality that is deeply grounded in past research on the personality of creative people.
Bringing together lots of different research threads over the years, they identified three “super-factors” of personality that predict creativity: Plasticity, Divergence, and Convergence.
Plasticity consists of the personality traits openness to experience, extraversion, high energy, and inspiration. The common factor here is high drive for exploration, and those high in this super-factor of personality tend to have a lot of dopamine — “the neuromodulator of exploration” — coursing through their brains. Prior research has shown a strong link between Plasticity and creativity, especially in the arts.
Divergence consists of non-conformity, impulsivity, low agreeableness, and low conscientiousness. People high in divergence may seem like jerks, but they are often just very independent thinkers. This super-factor is close to Hans Eysenck’s concept of “Psychoticism.” Throughout his life, Eysenck argued that these non-conforming characteristics were important contributors to high creative achievements.
Finally, Convergence consists of high conscientiousness, precision, persistence, and critical sense. While not typically included in discussions of creativity, these characteristics are also important contributors to the creative process. [Continue reading…]
Lary Wallace writes: We do this to our philosophies. We redraft their contours based on projected shadows, or give them a cartoonish shape like a caricaturist emphasising all the wrong features. This is how Buddhism becomes, in the popular imagination, a doctrine of passivity and even laziness, while Existentialism becomes synonymous with apathy and futile despair. Something similar has happened to Stoicism, which is considered – when considered at all – a philosophy of grim endurance, of carrying on rather than getting over, of tolerating rather than transcending life’s agonies and adversities.
No wonder it’s not more popular. No wonder the Stoic sage, in Western culture, has never obtained the popularity of the Zen master. Even though Stoicism is far more accessible, not only does it lack the exotic mystique of Eastern practice; it’s also regarded as a philosophy of merely breaking even while remaining determinedly impassive. What this attitude ignores is the promise proffered by Stoicism of lasting transcendence and imperturbable tranquility.
It ignores gratitude, too. This is part of the tranquility, because it’s what makes the tranquility possible. Stoicism is, as much as anything, a philosophy of gratitude – and a gratitude, moreover, rugged enough to endure anything. Philosophers who pine for supreme psychological liberation have often failed to realise that they belong to a confederacy that includes the Stoics. [Continue reading…]
John Tierney writes: Just be yourself.
The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.
But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?
It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.
He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.
Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]
Peter Fulham writes: Twenty-five years ago, in December, 1989, Darkness Visible, William Styron’s account of his descent into the depths of clinical depression and back, appeared in Vanity Fair. The piece revealed in unsparing detail how Styron’s lifelong melancholy at once gave way to a seductive urge to end his own life. A few months later, he released the essay as a book, augmenting the article with a recollection of when the illness first took hold of him: in Paris, as he was about to accept the 1985 Prix mondial Cino Del Duca, the French literary award. By the author’s own acknowledgement, the response from readers was unprecedented. “This was just overwhelming. It was just by the thousands that the letters came in,” he told Charlie Rose. “I had not really realized that it was going to touch that kind of a nerve.”
Styron may have been startled by the outpouring of mail, but in many ways, it’s easy to understand. The academic research on mental illness at the time was relatively comprehensive, but no one to date had offered the kind of report that Styron gave to the public: a firsthand account of what it’s like to have the monstrous condition overtake you. He also exposed the inadequacy of the word itself, which is still used interchangeably to describe a case of the blues, rather than the tempestuous agony sufferers know too well.
Depression is notoriously hard to describe, but Styron managed to split the atom. “I’d feel the horror, like some poisonous fogbank, roll in upon my mind,” he wrote in one chapter. In another: “It is not an immediately identifiable pain, like that of a broken limb. It may be more accurate to say that despair… comes to resemble the diabolical discomfort of being imprisoned in a fiercely overheated room. And because no breeze stirs this cauldron… it is entirely natural that the victim begins to think ceaselessly of oblivion.”
As someone who has fought intermittently with the same illness since college, I found those sentences cathartic, just as I suspect they were for the many readers who wrote to Styron disclosing unequivocally that he had saved their lives. As brutal as depression can be, one of the main ways a person can restrain it is through solidarity. You are not alone, Styron reminded his readers, and the fog will lift. Patience is paramount. [Continue reading…]
Nina Strohminger writes: One morning after her accident, a woman I’ll call Kate awoke in a daze. She looked at the man next to her in bed. He resembled her husband, with the same coppery beard and freckles dusted across his shoulders. But this man was definitely not her husband.
Panicked, she packed a small bag and headed to her psychiatrist’s office. On the bus, there was a man she had been encountering with increasing frequency over the past several weeks. The man was clever, he was a spy. He always appeared in a different form: one day as a little girl in a sundress, another time as a bike courier who smirked at her knowingly. She explained these bizarre developments to her doctor, who was quickly becoming one of the last voices in this world she could trust. But as he spoke, her stomach sank with a dreaded realisation: this man, too, was an impostor.
Kate has Capgras syndrome, the unshakeable belief that someone – often a loved one, sometimes oneself – has been replaced with an exact replica. She also has Fregoli syndrome, the delusion that the same person is taking on a variety of shapes, like an actor donning an expert disguise. Capgras and Fregoli delusions offer hints about an extraordinary cognitive mechanism active in the healthy mind, a mechanism so exquisitely tuned that we are hardly ever aware of it. This mechanism ascribes to each person a unique identity, and then meticulously tracks and updates it. This mechanism is crucial to virtually every human interaction, from navigating a party to navigating a marriage. Without it, we quickly fall apart. [Continue reading…]
Julie Beck writes: While gossiping is a behavior that has long been frowned upon, perhaps no one has frowned quite so intensely as the 16th- and 17th-century British. Back then, gossips, or “scolds,” were sometimes forced to wear a menacing iron cage on their heads, called the “branks” or “scold’s bridle.” These masks purportedly had iron spikes or bits that went in the mouth and prevented the wearer from speaking. (And of course, of course, this ghastly punishment seems to have been mostly for women who were talking too much.)
Today, people who gossip are still not very well-liked, though we tend to resist the urge to cage their heads. Progress. And yet the reflexive distaste people feel for gossip and those who gossip in general is often nowhere to be found when people find themselves actually faced with a juicy morsel about someone they know. Social topics—personal relationships, likes and dislikes, anecdotes about social activities—made up about two-thirds of all conversations in analyses done by evolutionary psychologist Robin Dunbar. The remaining third of conversation time, not spent talking about other people, was devoted to discussing everything else: sports, music, politics, etc.
“Language in freely forming natural conversations is principally used for the exchange of social information,” Dunbar writes. “That such topics are so overwhelmingly important to us suggests that this is a primary function of language.” He even goes so far as to say: “Gossip is what makes human society as we know it possible.”
In recent years, research on the positive effects of gossip has proliferated. Rather than just a means to humiliate people and make them cry in the bathroom, gossip is now being considered by scientists as a way to learn about cultural norms, bond with others, promote cooperation, and even, as one recent study found, allow individuals to gauge their own success and social standing. [Continue reading…]
Phys.org: A new study from Duke University finds that people will evaluate scientific evidence based on whether they view its policy implications as politically desirable. If they don’t, then they tend to deny the problem even exists.
“Logically, the proposed solution to a problem, such as an increase in government regulation or an extension of the free market, should not influence one’s belief in the problem. However, we find it does,” said co-author Troy Campbell, a Ph.D. candidate at Duke’s Fuqua School of Business. “The cure can be more immediately threatening than the problem.”
The study, “Solution Aversion: On the Relation Between Ideology and Motivated Disbelief,” appears in the November issue of the Journal of Personality and Social Psychology.
The researchers conducted three experiments (with samples ranging from 120 to 188 participants) on three different issues—climate change, air pollution that harms lungs, and crime.
“The goal was to test, in a scientifically controlled manner, the question: Does the desirability of a solution affect beliefs in the existence of the associated problem? In other words, does what we call ‘solution aversion’ exist?” Campbell said.
“We found the answer is yes. And we found it occurs in response to some of the most common solutions for popularly discussed problems.”
For climate change, the researchers conducted an experiment to examine why more Republicans than Democrats seem to deny its existence, despite strong scientific evidence that supports it.
One explanation, they found, may have more to do with conservatives’ general opposition to the most popular solution—increasing government regulation—than with any difference in fear of the climate change problem itself, as some have proposed. [Continue reading…]
Dean Keith Simonton writes: When John Forbes Nash, the Nobel Prize-winning mathematician, schizophrenic, and paranoid delusional, was asked how he could believe that space aliens had recruited him to save the world, he gave a simple response. “Because the ideas I had about supernatural beings came to me the same way that my mathematical ideas did. So I took them seriously.”
Nash is hardly the only so-called mad genius in history. Suicide victims like painters Vincent Van Gogh and Mark Rothko, novelists Virginia Woolf and Ernest Hemingway, and poets Anne Sexton and Sylvia Plath all offer prime examples. Even ignoring those great creators who did not kill themselves in a fit of deep depression, it remains easy to list persons who endured well-documented psychopathology, including the composer Robert Schumann, the poet Emily Dickinson, and Nash. Creative geniuses who have succumbed to alcoholism or other addictions are also legion.
Instances such as these have led many to suppose that creativity and psychopathology are intimately related. Indeed, the notion that creative genius might have some touch of madness goes back to Plato and Aristotle. But some recent psychologists argue that the whole idea is a pure hoax. After all, it is certainly no problem to come up with the names of creative geniuses who seem to have displayed no signs or symptoms of mental illness.
Opponents of the mad genius idea can also point to two solid facts. First, the number of creative geniuses in the entire history of human civilization is very large. Thus, even if these people were actually less prone to psychopathology than the average person, the number with mental illness could still be extremely large. Second, the permanent inhabitants of mental asylums do not usually produce creative masterworks. The closest exception that anyone might imagine is the notorious Marquis de Sade. Even in his case, his greatest (or rather most sadistic) works were written while he was imprisoned as a criminal rather than institutionalized as a lunatic.
So should we believe that creative genius is connected with madness or not? Modern empirical research suggests that we should because it has pinpointed the connection between madness and creativity clearly. The most important process underlying strokes of creative genius is cognitive disinhibition — the tendency to pay attention to things that normally should be ignored or filtered out by attention because they appear irrelevant. [Continue reading…]
David Dunning writes: Last March, during the enormous South by Southwest music festival in Austin, Texas, the late-night talk show Jimmy Kimmel Live! sent a camera crew out into the streets to catch hipsters bluffing. “People who go to music festivals pride themselves on knowing who the next acts are,” Kimmel said to his studio audience, “even if they don’t actually know who the new acts are.” So the host had his crew ask festival-goers for their thoughts about bands that don’t exist.
“The big buzz on the street,” said one of Kimmel’s interviewers to a man wearing thick-framed glasses and a whimsical T-shirt, “is Contact Dermatitis. Do you think he has what it takes to really make it to the big time?”
“Absolutely,” came the dazed fan’s reply.
The prank was an installment of Kimmel’s recurring “Lie Witness News” feature, which involves asking pedestrians a variety of questions with false premises. In another episode, Kimmel’s crew asked people on Hollywood Boulevard whether they thought the 2014 film Godzilla was insensitive to survivors of the 1954 giant lizard attack on Tokyo; in a third, they asked whether Bill Clinton gets enough credit for ending the Korean War, and whether his appearance as a judge on America’s Got Talent would damage his legacy. “No,” said one woman to this last question. “It will make him even more popular.”
One can’t help but feel for the people who fall into Kimmel’s trap. Some appear willing to say just about anything on camera to hide their cluelessness about the subject at hand (which, of course, has the opposite effect). Others seem eager to please, not wanting to let the interviewer down by giving the most boringly appropriate response: I don’t know. But for some of these interviewees, the trap may be an even deeper one. The most confident-sounding respondents often seem to think they do have some clue—as if there is some fact, some memory, or some intuition that assures them their answer is reasonable. [Continue reading…]
Daniel N Jones writes: It’s the friend who betrays you, the lover living a secret life, the job applicant with the fabricated résumé, or the sham sales pitch too good to resist. From the time humans learnt to co‑operate, we also learnt to deceive each other. For deception to be effective, individuals must hide their true intentions. But deception is hardly limited to humans. There is a never-ending arms race between the deceiver and the deceived among most living things. By studying different patterns of deception across the species, we can learn to better defend ourselves from dishonesty in the human world.
My early grasp of human deception came from the work of my adviser, the psychologist Delroy Paulhus at the University of British Columbia in Canada, who studied what he called the dark triad of personality: psychopathy, recognised by callous affect and reckless deceit; narcissism, a sense of grandiose entitlement and self-centered overconfidence; and Machiavellianism, the cynical and strategic manipulation of others.
If you look at the animal world, it’s clear that dark traits run through species from high to low. Some predators are fast, mobile and wide-ranging, executing their deceptions on as many others as they can; they resemble human psychopaths. Others are slow, stalking their prey in a specific, strategic (almost Machiavellian) way. Given the parallels between humans and other animals, I began to conceive my Mimicry Deception Theory, which argues that long- and short-term deceptive strategies cut across species, often by mimicking other lifestyles or forms.
Much of the foundational work for this idea comes from the evolutionary biologist Robert Trivers, who noted that many organisms gain an evolutionary advantage through deception. [Continue reading…]
Daniel A. Gross writes: One icy night in March 2010, 100 marketing experts piled into the Sea Horse Restaurant in Helsinki, with the modest goal of making a remote and medium-sized country a world-famous tourist destination. The problem was that Finland was known as a rather quiet country, and since 2008, the Country Brand Delegation had been looking for a national brand that would make some noise.
Over drinks at the Sea Horse, the experts puzzled over the various strengths of their nation. Here was a country with exceptional teachers, an abundance of wild berries and mushrooms, and a vibrant cultural capital the size of Nashville, Tennessee. These things fell a bit short of a compelling national identity. Someone jokingly suggested that nudity could be named a national theme — it would emphasize the honesty of Finns. Someone else, less jokingly, proposed that perhaps quiet wasn’t such a bad thing. That got them thinking.
A few months later, the delegation issued a slick “Country Brand Report.” It highlighted a host of marketable themes, including Finland’s renowned educational system and school of functional design. One key theme was brand new: silence. As the report explained, modern society often seems intolerably loud and busy. “Silence is a resource,” it said. It could be marketed just like clean water or wild mushrooms. “In the future, people will be prepared to pay for the experience of silence.”
People already do. In a loud world, silence sells. Noise-canceling headphones retail for hundreds of dollars; the cost of some weeklong silent meditation courses can run into the thousands. Finland saw that it was possible to quite literally make something out of nothing.
In 2011, the Finnish Tourist Board released a series of photographs of lone figures in the wilderness, with the caption “Silence, Please.” An international “country branding” consultant, Simon Anholt, proposed the playful tagline “No talking, but action.” And a Finnish watch company, Rönkkö, launched its own new slogan: “Handmade in Finnish silence.”
“We decided, instead of saying that it’s really empty and really quiet and nobody is talking about anything here, let’s embrace it and make it a good thing,” explains Eva Kiviranta, who manages social media for VisitFinland.com.
Silence is a peculiar starting point for a marketing campaign. After all, you can’t weigh, record, or export it. You can’t eat it, collect it, or give it away. The Finland campaign raises the question of just what the tangible effects of silence really are. Science has begun to pipe up on the subject. In recent years researchers have highlighted the peculiar power of silence to calm our bodies, turn up the volume on our inner thoughts, and attune our connection to the world. Their findings begin where we might expect: with noise.
The word “noise” comes from a Latin root meaning either queasiness or pain. According to the historian Hillel Schwartz, there’s even a Mesopotamian legend in which the gods grow so angry at the clamor of earthly humans that they go on a killing spree. (City-dwellers with loud neighbors may empathize, though hopefully not too closely.)
Dislike of noise has produced some of history’s most eager advocates of silence, as Schwartz explains in his book Making Noise: From Babel to the Big Bang and Beyond. In 1859, the British nurse and social reformer Florence Nightingale wrote, “Unnecessary noise is the most cruel absence of care that can be inflicted on sick or well.” Every careless clatter or banal bit of banter, Nightingale argued, can be a source of alarm, distress, and loss of sleep for recovering patients. She even quoted a lecture that identified “sudden noises” as a cause of death among sick children. [Continue reading…]
Jillian Hinchliffe and Seth Frey write: Although [Stephen] Booth is now retired [from the University of California, Berkeley], his work [on Shakespeare] couldn’t be more relevant. In the study of the human mind, old disciplinary boundaries have begun to dissolve and fruitful new relationships between the sciences and humanities have sprung up in their place. When it comes to the cognitive science of language, Booth may be the most prescient literary critic who ever put pen to paper. In his fieldwork in poetic experience, he unwittingly anticipated several language-processing phenomena that cognitive scientists have only recently begun to study. Booth’s work not only provides one of the most original and penetrating looks into the nature of Shakespeare’s genius, it has profound implications for understanding the processes that shape how we think.
Until the early decades of the 20th century, Shakespeare criticism fell primarily into two areas: textual, which grapples with the numerous variants of published works in order to produce an edition as close as possible to the original, and biographical. Scholarship took a more political turn beginning in the 1960s, providing new perspectives from various strains of feminist, Marxist, structuralist, and queer theory. Booth is resolutely dismissive of most of these modes of study. What he cares about is poetics. Specifically, how poetic language operates on and in audiences of a literary work.
Close reading, the school that flourished mid-century and with which Booth’s work is most nearly affiliated, has never gone completely out of style. But Booth’s approach is even more minute—microscopic reading, according to fellow Shakespeare scholar Russ McDonald. And as the microscope opens up new worlds, so does Booth’s critical lens. What makes him radically different from his predecessors is that he doesn’t try to resolve or collapse his readings into any single interpretation. That people are so hung up on interpretation, on meaning, Booth maintains, is “no more than habit.” Instead, he revels in the uncertainty caused by the myriad currents of phonetic, semantic, and ideational patterns at play. [Continue reading…]
Simultaneously using mobile phones, laptops and other media devices could be changing the structure of our brains, according to new University of Sussex research.
A study published today (24 September) in PLOS ONE reveals that people who frequently use several media devices at the same time have lower grey-matter density in one particular region of the brain compared to those who use just one device occasionally.
The research supports earlier studies showing connections between high media-multitasking activity and poor attention in the face of distractions, along with emotional problems such as depression and anxiety.
But neuroscientists Kep Kee Loh and Dr Ryota Kanai point out that their study reveals a link rather than causality and that a long-term study needs to be carried out to understand whether high concurrent media usage leads to changes in the brain structure, or whether those with less-dense grey matter are more attracted to media multitasking. [Continue reading…]