Power causes brain damage

Jerry Useem writes: If power were a prescription drug, it would come with a long list of known side effects. It can intoxicate. It can corrupt. It can even make Henry Kissinger believe that he’s sexually magnetic. But can it cause brain damage?

When various lawmakers lit into John Stumpf at a congressional hearing last fall, each seemed to find a fresh way to flay the now-former CEO of Wells Fargo for failing to stop some 5,000 employees from setting up phony accounts for customers. But it was Stumpf’s performance that stood out. Here was a man who had risen to the top of the world’s most valuable bank, yet he seemed utterly unable to read a room. Although he apologized, he didn’t appear chastened or remorseful. Nor did he seem defiant or smug or even insincere. He looked disoriented, like a jet-lagged space traveler just arrived from Planet Stumpf, where deference to him is a natural law and 5,000 a commendably small number. Even the most direct barbs—“You have got to be kidding me” (Sean Duffy of Wisconsin); “I can’t believe some of what I’m hearing here” (Gregory Meeks of New York)—failed to shake him awake.

What was going through Stumpf’s head? New research suggests that the better question may be: What wasn’t going through it?

The historian Henry Adams was being metaphorical, not medical, when he described power as “a sort of tumor that ends by killing the victim’s sympathies.” But that’s not far from where Dacher Keltner, a psychology professor at UC Berkeley, ended up after years of lab and field experiments. Subjects under the influence of power, he found in studies spanning two decades, acted as if they had suffered a traumatic brain injury—becoming more impulsive, less risk-aware, and, crucially, less adept at seeing things from other people’s point of view.

Sukhvinder Obhi, a neuroscientist at McMaster University, in Ontario, recently described something similar. Unlike Keltner, who studies behaviors, Obhi studies brains. And when he put the heads of the powerful and the not-so-powerful under a transcranial-magnetic-stimulation machine, he found that power, in fact, impairs a specific neural process, “mirroring,” that may be a cornerstone of empathy. Which gives a neurological basis to what Keltner has termed the “power paradox”: Once we have power, we lose some of the capacities we needed to gain it in the first place. [Continue reading…]

Daniel Everett: Becoming human without words for colors, numbers, or time

Without cultural appropriation, there would be no culture

Kenan Malik writes: What is cultural appropriation, and why is it so controversial? Susan Scafidi, a law professor at Fordham University, defines it as “taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission.” This can include the “unauthorized use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”

Appropriation suggests theft, and a process analogous to the seizure of land or artifacts. In the case of culture, however, what is called appropriation is not theft but messy interaction. Writers and artists necessarily engage with the experiences of others. Nobody owns a culture, but everyone inhabits one, and in inhabiting a culture, one finds the tools for reaching out to other cultures.

Critics of cultural appropriation insist that they are opposed not to cultural engagement, but to racism. They want to protect marginalized cultures and ensure that such cultures speak for themselves, not simply be seen through the eyes of more privileged groups.

Certainly, cultural engagement does not take place on a level playing field. Racism and inequality shape the ways in which people imagine others. Yet it is difficult to see how creating gated cultures helps promote social justice. [Continue reading…]

Cultures, unlike nations, have no borders. For that reason, cultures have historically been at their most vibrant in the places where they meet and interact.

The notion that cultural interaction requires permission seems to me like one that would only make sense to someone who feels culturally deprived.

That a leading proponent of this concept is a lawyer, not an artist, seems no coincidence, since law so often attaches greater value to claims of ownership than to anything else — which brings to mind Proudhon’s famous and relevant dictum: property is theft.

Consider jazz, a genuinely American cultural creation. It has inspired musicians around the world, who have appropriated it and sustained its organic growth in such a way that its American roots remain traceable even as its reach has expanded without limit. Jazz was made in America and now belongs to the world, and in that transaction no permission was sought or required.

Life-giving chemical compound found orbiting infant stars in space

AFP reports: Two teams of astronomers said Thursday that they have for the first time detected a key chemical building block of life swirling around infant stars that resemble our sun before its planets formed.

The molecule, methyl isocyanate, “plays an essential role in the formation of proteins, which are basic ingredients for life,” said Victor Rivilla, a scientist at the Astrophysics Observatory in Florence, Italy, and co-author of a study published in Monthly Notices of the Royal Astronomical Society.

The findings could offer clues on how chemicals sparked into living matter on Earth several billion years ago.

At the very least, they show that elements crucial for the emergence of life “were very likely already available at the earliest stage of solar system formation,” said Niels Ligterink, a researcher at Leiden Observatory in the Netherlands and lead author of a second study in the same journal. [Continue reading…]

Oldest Homo sapiens bones ever found shake foundations of the human story

The Guardian reports: Fossils recovered from an old mine on a desolate mountain in Morocco have rocked one of the most enduring foundations of the human story: that Homo sapiens arose in a cradle of humankind in East Africa 200,000 years ago.

Archaeologists unearthed the bones of at least five people at Jebel Irhoud, a former barite mine 100km west of Marrakesh, in excavations that lasted years. They knew the remains were old, but were stunned when dating tests revealed that a tooth and stone tools found with the bones were about 300,000 years old.

“My reaction was a big ‘wow’,” said Jean-Jacques Hublin, a senior scientist on the team at the Max Planck Institute for Evolutionary Anthropology in Leipzig. “I was expecting them to be old, but not that old.”

Hublin said the extreme age of the bones makes them the oldest known specimens of modern humans and poses a major challenge to the idea that the earliest members of our species evolved in a “Garden of Eden” in East Africa one hundred thousand years later.

“This gives us a completely different picture of the evolution of our species. It goes much further back in time, but also the very process of evolution is different to what we thought,” Hublin told the Guardian. “It looks like our species was already present probably all over Africa by 300,000 years ago. If there was a Garden of Eden, it might have been the size of the continent.” [Continue reading…]

America is awash in the wrong kinds of stories

Virginia Postrel writes: One of the rare feel-good stories of our current political moment is also terribly sad. On a train in Portland, Oregon, three very different men tried to protect two young women, one wearing a hijab, from a ranting white supremacist who turned out to be carrying a knife. The action cost two of the men their lives, while the third is still in the hospital.

“America is about a Republican, a Democrat, and an autistic poet putting their lives on the line to protect young women from a different faith and culture simply because it is the right thing to do. You want diversity and tolerance? We just saw it,” writes Michael Cannon in an especially good appreciation, concluding “America is already great — and so long as we continue to produce men such as Rick Best, Taliesin Namkai-Meche, and Micah Fletcher, it always will be.”

Cultures are held together by stories. We define who we are — as individuals, families, organizations, and nations — by the stories we tell about ourselves. These stories express hopes, fears, and values. They create coherence out of complexity by emphasizing some things and ignoring others. Their moral worth lies not in their absolute truth or falsehood — all narratives simplify reality — but in the aspirations they express and the cultural character they shape. [Continue reading…]

The inflated debate over cosmic inflation

Amanda Gefter writes: On the morning of Dec. 7, 1979, a 32-year-old Alan Guth woke up with an idea. It had come into his head the previous night, but now, in the light of a California day, he could see the shape of the thing, and was itching to work through the math. He hopped on his bike and rode to his office at the Stanford Linear Accelerator Center. His excitement got him there in record time: 9 minutes, 32 seconds. At his desk, Guth neatly carried out the calculations in his notebook, forming the numbers and symbols in tight, careful lines. Then, at the top of a fresh page, he wrote in all caps: SPECTACULAR REALIZATION.

A year later and some 6,000 miles away, in Moscow, in the middle of the night, Andrei Linde, having read Guth’s paper, had his own spectacular realization. He had been working on his own idea and now he saw how to bring it to life by fixing the difficulties that plagued Guth’s theory. He woke his sleeping wife. “I think I know how the universe was created.”

Guth and Linde had worked out the beginnings of the theory of cosmic inflation. The theory would go through several incarnations over the next few decades, as kinks were worked out and details honed. But the core idea was spectacularly simple: In the earliest fraction of a second of time, a small patch of universe expanded faster than the speed of light, doubling its size again and again, growing a million trillion trillion times bigger in the blink of an eye. A little patch of world, about the size of a dime, grew into our entire observable universe.
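
As a back-of-the-envelope aside (my arithmetic, not the article’s): the quoted growth factor of a million trillion trillion is $10^{30}$, which repeated doubling reaches in roughly a hundred steps,

$$2^{n} = 10^{30} \quad\Rightarrow\quad n = \frac{30}{\log_{10} 2} \approx 100 \text{ doublings},$$

all of which, on this account, are packed into the first tiny fraction of a second.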

What began as a radical notion has now become standard wisdom among physicists—except, notably, Paul Steinhardt, Anna Ijjas, and Avi Loeb. The three physicists recently wrote a scathing article in Scientific American arguing that it’s time to abandon inflation and look for a competing idea. (What idea, you ask? Steinhardt, conveniently, has one that he’s been pushing for decades.) Inflation is too unlikely to occur, too flexible to be confirmed or rejected experimentally, and too messy in its implications, the threesome argued. It “cannot be evaluated using the scientific method.”

It’s not surprising, then, that Guth and Linde—along with physicists David Kaiser and Yasunori Nomura—published a terse response in Scientific American earlier this month defending their theory. What is more surprising, perhaps, is that 29 more of the world’s leading physicists signed it—including four Nobel laureates and a Fields medalist.

In the media flurry that followed, the disagreement between these groups of physicists was presented as a straight debate, of the kind that often occurs in science when there are multiple interpretations of data. But describing an equivalence between the opinions of Steinhardt, Ijjas, and Loeb on the one hand, and nearly the entire cosmology community on the other, is a mistake.

The long list of signatories to the recent rebuttal letter in Scientific American puts the lie to the claim that the community is divided. When Ed Witten, Steven Weinberg, Leonard Susskind, Frank Wilczek, Juan Maldacena, Eva Silverstein, Sir Martin Rees, and Stephen Hawking (to name a few) write a letter saying you’ve gotten something wrong … well it’s probably worth considering.

The rebuttal letter also challenges us to understand more clearly why so many scientists are passionate about inflation. What is it about this theory that has the greatest minds in the known universe leaping to its defense? [Continue reading…]

What hyenas can tell us about the origins of intelligence

David Z. Hambrick writes: Physical similarities aside, we share a lot in common with our primate relatives. For example, as Jane Goodall famously documented, chimpanzees form lifelong bonds and show affection in much the same way as humans. Chimps can also solve novel problems, use objects as tools, and may possess “theory of mind”—an understanding that others may have different perspectives than oneself. They can even outperform humans in certain types of cognitive tasks.

These commonalities may not seem all that surprising given what we now know from the field of comparative genomics: We share nearly all of our DNA with chimpanzees and other primates. However, social and cognitive complexity is not unique to our closest evolutionary cousins. In fact, it is abundant in species with which we would seem to have very little in common—like the spotted hyena.

For more than three decades, the Michigan State University zoologist Kay Holekamp has studied the habits of the spotted hyena in Kenya’s Masai Mara National Reserve, once spending five years straight living in a tent among her oft-maligned subjects. One of the world’s longest-running studies of a wild mammal, this landmark project has revealed that spotted hyenas not only have social groups as complex as those of many primates, but are also capable of some of the same types of problem solving.

This research sheds light on one of science’s greatest mysteries—how intelligence has evolved across the animal kingdom. [Continue reading…]

The thoughts of a spiderweb

Joshua Sokol writes: Millions of years ago, a few spiders abandoned the kind of round webs that the word “spiderweb” calls to mind and started to focus on a new strategy. Before, they would wait for prey to become ensnared in their webs and then walk out to retrieve it. Then they began building horizontal nets to use as a fishing platform. Now their modern descendants, the cobweb spiders, dangle sticky threads below, wait until insects walk by and get snagged, and reel their unlucky victims in.

In 2008, the researcher Hilton Japyassú prompted 12 species of orb spiders collected from all over Brazil to go through this transition again. He waited until the spiders wove an ordinary web. Then he snipped its threads so that the silk drooped to where crickets wandered below. When a cricket got hooked, not all the orb spiders could fully pull it up, as a cobweb spider does. But some could, and all at least began to reel it in with their two front legs.

Their ability to recapitulate the ancient spiders’ innovation got Japyassú, a biologist at the Federal University of Bahia in Brazil, thinking. When the spider was confronted with a problem to solve that it might not have seen before, how did it figure out what to do? “Where is this information?” he said. “Where is it? Is it in her head, or does this information emerge during the interaction with the altered web?”

In February, Japyassú and Kevin Laland, an evolutionary biologist at the University of Saint Andrews, proposed a bold answer to the question. They argued in a review paper, published in the journal Animal Cognition, that a spider’s web is at least an adjustable part of its sensory apparatus, and at most an extension of the spider’s cognitive system.

This would make the web a model example of extended cognition, an idea first proposed by the philosophers Andy Clark and David Chalmers in 1998 to apply to human thought. In accounts of extended cognition, processes like checking a grocery list or rearranging Scrabble tiles in a tray are close enough to memory-retrieval or problem-solving tasks that happen entirely inside the brain that proponents argue they are actually part of a single, larger, “extended” mind.

Among philosophers of mind, that idea has racked up citations, including supporters and critics. And by its very design, Japyassú’s paper, which aims to export extended cognition as a testable idea to the field of animal behavior, is already stirring up antibodies among scientists. “I got the impression that it was being very careful to check all the boxes for hot topics and controversial topics in animal cognition,” said Alex Jordan, a collective behavioral scientist at the Max Planck Institute in Konstanz, Germany (who nonetheless supports the idea).

While many disagree with the paper’s interpretations, the study shouldn’t be mistaken for a piece of philosophy. Japyassú and Laland propose ways to test their ideas in concrete experiments that involve manipulating the spider’s web — tests that other researchers are excited about. “We can break that machine; we can snap strands; we can reduce the way that animal is able to perceive the system around it,” Jordan said. “And that generates some very direct and testable hypotheses.” [Continue reading…]

Consciousness is not a thing

Karl Friston writes: I have a confession. As a physicist and psychiatrist, I find it difficult to engage with conversations about consciousness. My biggest gripe is that the philosophers and cognitive scientists who tend to pose the questions often assume that the mind is a thing, whose existence can be identified by the attributes it has or the purposes it fulfils.

But in physics, it’s dangerous to assume that things ‘exist’ in any conventional sense. Instead, the deeper question is: what sorts of processes give rise to the notion (or illusion) that something exists? For example, Isaac Newton explained the physical world in terms of massive bodies that respond to forces. However, with the advent of quantum physics, the real question turned out to be the very nature and meaning of the measurements upon which the notions of mass and force depend – a question that’s still debated today.

As a consequence, I’m compelled to treat consciousness as a process to be understood, not as a thing to be defined. Simply put, my argument is that consciousness is nothing more and nothing less than a natural process such as evolution or the weather. My favourite trick to illustrate the notion of consciousness as a process is to replace the word ‘consciousness’ with ‘evolution’ – and see if the question still makes sense. For example, the question What is consciousness for? becomes What is evolution for? Scientifically speaking, of course, we know that evolution is not for anything. It doesn’t perform a function or have reasons for doing what it does – it’s an unfolding process that can be understood only on its own terms. Since we are all the product of evolution, the same would seem to hold for consciousness and the self. [Continue reading…]

The language of prairie dogs

Ferris Jabr writes: [Con] Slobodchikoff, an emeritus professor of biology at Northern Arizona University, has been analyzing the sounds of prairie dogs for more than 30 years. Not long after he started, he learned that prairie dogs had distinct alarm calls for different predators. Around the same time, separate researchers found that a few other species had similar vocabularies of danger. What Slobodchikoff claimed to discover in the following decades, however, was extraordinary: Beyond identifying the type of predator, prairie-dog calls also specified its size, shape, color and speed; the animals could even combine the structural elements of their calls in novel ways to describe something they had never seen before. No scientist had ever put forward such a thorough guide to the native tongue of a wild species or discovered one so intricate. Prairie-dog communication is so complex, Slobodchikoff says — so expressive and rich in information — that it constitutes nothing less than language.

That would be an audacious claim to make about even the most overtly intelligent species — say, a chimpanzee or a dolphin — let alone some kind of dirt hamster with a brain that barely weighs more than a grape. The majority of linguists and animal-communication experts maintain that language is restricted to a single species: ourselves. Perhaps because it is so ostensibly entwined with thought, with consciousness and our sense of self, language is the last bastion encircling human exceptionalism. To concede that we share language with other species is to finally and fully admit that we are different from other animals only in degree, not in kind. In many people’s minds, language is the “cardinal distinction between man and animal, a sheerly dividing line as abrupt and immovable as a cliff,” as Tom Wolfe argues in his book “The Kingdom of Speech,” published last year.

Slobodchikoff thinks that dividing line is an illusion. To him, the idea that a human might have a two-way conversation with another species, even a humble prairie dog, is not a pretense; it’s an inevitability. And the notion that animals of all kinds routinely engage in sophisticated discourse with one another — that the world’s ecosystems reverberate with elaborate animal idioms just waiting to be translated — is not Doctor Dolittle-inspired nonsense; it is fact. [Continue reading…]

Roger Penrose on why consciousness does not compute

Steve Paulson writes: Once you start poking around in the muck of consciousness studies, you will soon encounter the specter of Sir Roger Penrose, the renowned Oxford physicist with an audacious—and quite possibly crackpot—theory about the quantum origins of consciousness. He believes we must go beyond neuroscience and into the mysterious world of quantum mechanics to explain our rich mental life. No one quite knows what to make of this theory, developed with the American anesthesiologist Stuart Hameroff, but conventional wisdom goes something like this: Their theory is almost certainly wrong, but since Penrose is so brilliant (“One of the very few people I’ve met in my life who, without reservation, I call a genius,” physicist Lee Smolin has said), we’d be foolish to dismiss their theory out of hand.

Penrose, 85, is a mathematical physicist who made his name decades ago with groundbreaking work in general relativity and then, working with Stephen Hawking, helped conceptualize black holes and gravitational singularities, points of infinite density like the one out of which the universe may have formed. He also invented “twistor theory,” a new way to connect quantum mechanics with the structure of spacetime. His discovery of certain geometric forms known as “Penrose tiles”—an ingenious design of non-repeating patterns—led to new directions of study in mathematics and crystallography.

The breadth of Penrose’s interests is extraordinary, which is evident in his recent book Fashion, Faith and Fantasy in the New Physics of the Universe—a dense 500-page tome that challenges some of the trendiest but still unproven theories in physics, from the multiple dimensions of string theory to cosmic inflation in the first moment of the Big Bang. He considers these theories to be fanciful and implausible. [Continue reading…]

Mountains of the mind: ‘I’ve become part of the landscape and it’s become part of me’

Kevin Rushby writes: We begin in darkness and head up towards the light. It is that time just before the dawn when it’s neither day nor night. Down near Lake Coniston, I can hear an owl and a curlew calling, both claiming the hour for themselves. “I like to come this early,” says Sion. “There’s no one else around. I can’t handle crowds. I get confused.”

It’s 4.30am and I am with Sion Jair, 67, and his partner, Wendy Kolbe, 63, and we are heading up the Old Man of Coniston, an 803-metre Lake District fell noted for its sharp ascent and great panoramas of southern lakeland. Or at least we hope so: there are some clouds massing in the east.

For Sion, this has become a daily ritual, adopted seven years ago when a visit to the doctor changed his life for ever. “I had been feeling permanently tired, and suffering some memory problems. It meant I couldn’t get out walking, you see, and when I can’t walk, I really shut down.”

After tests, the doctor diagnosed chronic anaemia from vitamin B12 deficiency. Injections usually sort that out, but Sion reacted badly to the shots and, without them, was given three years to live. Determined not to give in, he set about walking in earnest, covering around 10 miles a day. “Eventually, it worked. I reckon it cured me of the chronic fatigue,” he says.

But there was another blow. The anaemia had been masking signs of dementia. Given the particular type of condition he was suffering from, he was warned that he could expect periods of total memory loss, mood swings and eventually the inability to look after himself. Sion had become one of the estimated 25 million people worldwide suffering this progressive neurodegenerative disease, as feared now as the Black Death was in its day.

“It was quite scary,” says Sion, adding, in something of an understatement, “I didn’t like the idea.”

Sion’s response was typical of him: he walked even more. Not just the Old Man, but other fells, too: Scafell Pike, Helvellyn, Blencathra, Dollywaggon Pike – all the greats. “I’ve done them so often, I know them blindfolded.” And all this he did without any technological intermediaries, smartphone or GPS – just the steady rhythm of his feet. On one occasion he did 12 peaks and 28 miles in 22 hours, raising cash for his three favourite charities: the Alzheimer’s Society, Mountain Rescue and the Great North Air Ambulance. He also walked in Wales – he walked the Snowdon horseshoe more than 200 times – and Scotland, but it was in Coniston that he found his walking mantra. I suppose you could call it his Coniston Old Man-tra. [Continue reading…]

Literature’s evolution has reflected and spurred the growing complexity of society

Julie Sedivy writes: Reading medieval literature, it’s hard not to be impressed with how much the characters get done—as when we read about King Harold doing battle in one of the Sagas of the Icelanders, written in about 1230. The first sentence bristles with purposeful action: “King Harold proclaimed a general levy, and gathered a fleet, summoning his forces far and wide through the land.” By the end of the third paragraph, the king has launched his fleet against a rebel army, fought numerous battles involving “much slaughter in either host,” bound up the wounds of his men, dispensed rewards to the loyal, and “was supreme over all Norway.” What the saga doesn’t tell us is how Harold felt about any of this, whether his drive to conquer was fueled by a tyrannical father’s barely concealed contempt, or whether his legacy ultimately surpassed or fell short of his deepest hopes.

Jump ahead about 770 years in time, to the fiction of David Foster Wallace. In his short story “Forever Overhead,” the 13-year-old protagonist takes 12 pages to walk across the deck of a public swimming pool, wait in line at the high diving board, climb the ladder, and prepare to jump. But over these 12 pages, we are taken into the burgeoning, buzzing mind of a boy just erupting into puberty—our attention is riveted to his newly focused attention on female bodies in swimsuits, we register his awareness that others are watching him as he hesitates on the diving board, we follow his undulating thoughts about whether it’s best to do something scary without thinking about it or whether it’s foolishly dangerous not to think about it.

These examples illustrate Western literature’s gradual progression from narratives that relate actions and events to stories that portray minds in all their meandering, many-layered, self-contradictory complexities. I’d often wondered, when reading older texts: Weren’t people back then interested in what characters thought and felt? [Continue reading…]

On the happy life

Massimo Pigliucci writes: Lucius Annaeus Seneca is a towering and controversial figure of antiquity. He lived from 4 BCE to 65 CE, was a Roman senator and political adviser to the emperor Nero, and experienced exile but came back to Rome to become one of the wealthiest citizens of the Empire. He tried to steer Nero toward good governance, but in the process became his indirect accomplice in murderous deeds. In the end, he was ‘invited’ to commit suicide by the emperor, and did so with dignity, in the presence of his friends.

Seneca wrote a number of tragedies that directly inspired William Shakespeare, but was also one of the main exponents of the Stoic school of philosophy, which has made a surprising comeback in recent years. Stoicism teaches us that the highest good in life is the pursuit of the four cardinal virtues of practical wisdom, temperance, justice and courage – because they are the only things that always do us good and can never be used for ill. It also tells us that the key to a serene life is the realisation that some things are under our control and others are not: under our control are our values, our judgments, and the actions we choose to perform. Everything else lies outside of our control, and we should focus our attention and efforts only on the first category.

Seneca wrote a series of philosophical letters to his friend Lucilius when he was nearing the end of his life. The letters were clearly meant for publication, and represent a sort of philosophical testament for posterity. I chose letter 92, ‘On the Happy Life’, because it encapsulates both the basic tenets of Stoic philosophy and some really good advice that is still valid today.

The first thing to understand about this letter is the title itself: ‘happy’ here does not have the vague modern connotation of feeling good, but is the equivalent of the Greek word eudaimonia, recently adopted also by positive psychologists, and which is best understood as a life worth living. [Continue reading…]

Trees have their own songs

Ed Yong writes: Just as birders can identify birds by their melodious calls, David George Haskell can distinguish trees by their sounds. The task is especially easy when it rains, as it so often does in the Ecuadorian rainforest. Depending on the shapes and sizes of their leaves, the different plants react to falling drops by producing “a splatter of metallic sparks” or “a low, clean, woody thump” or “a speed-typist’s clatter.” Every species has its own song. Train your ears (and abandon the distracting echoes of a plastic rain jacket) and you can carry out a botanical census through sound alone.

“I’ve taught ornithology to students for many years,” says Haskell, a natural history writer and professor of biology at Sewanee. “And I challenge my students: Okay, now that you’ve learned the songs of 100 birds, your task is to learn the sounds of 20 trees. Can you tell an oak from a maple by ear? I have them go out, pour their attention into their ears, and harvest sounds. It’s an almost meditative experience. And from that, you realize that trees sound different, and they have amazing sounds coming from them. Our unaided ears can hear how a maple tree changes its voice as the soft leaves of early spring change into the dying ones of autumn.”

This acoustic world is open to everyone, but most of us never enter it. It just seems so counter-intuitive—not to mention a little hokey—to listen to trees. But Haskell does listen, and he describes his experiences with sensuous prose in his enchanting new book The Songs of Trees. A kind of naturalist-poet, Haskell makes a habit of returning to the same places and paying “repeated sensory attention” to them. “I like to sit down and listen, and turn off the apps that come pre-installed in my body,” he says. Humans may be a visual species, but “sound reveals things that are hidden from our eyes because the vibratory energy of the world comes around barriers and through the ground. Through sound, we come to know the place.” [Continue reading…]
