Category Archives: Attention to the Unseen

Physicists prove surprising rule of threes

Natalie Wolchover writes: More than 40 years after a Soviet nuclear physicist proposed an outlandish theory that trios of particles can arrange themselves in an infinite nesting-doll configuration, experimentalists have reported strong evidence that this bizarre state of matter is real.

In 1970, Vitaly Efimov was manipulating the equations of quantum mechanics in an attempt to calculate the behavior of sets of three particles, such as the protons and neutrons that populate atomic nuclei, when he discovered a law that pertained not only to nuclear ingredients but also, under the right conditions, to any trio of particles in nature.

While most forces act between pairs, such as the north and south poles of a magnet or a planet and its sun, Efimov identified an effect that requires three components to spring into action. Together, the components form a state of matter similar to Borromean rings, an ancient symbol of three interconnected circles in which no two are directly linked. The so-called Efimov “trimer” could consist of a trio of protons, a triatomic molecule or any other set of three particles, as long as their properties were tuned to the right values. And in a surprising flourish, this hypothetical state of matter exhibited an unheard-of feature: the ability to range in size from practically infinitesimal to infinite. [Continue reading…]
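
The nesting-doll structure follows a precise rule: for three identical bosons, each successive Efimov state is larger than the last by a universal factor of about 22.7 (e raised to the power π/s₀, with s₀ ≈ 1.00624), while its binding energy shrinks by that factor squared. Here is a minimal Python sketch of the ladder, with the lowest state’s size and energy set to arbitrary units purely for illustration:

```python
import math

# Efimov's universal scaling for three identical bosons: each successive
# trimer state is larger by a factor of exp(pi / s0), with s0 ~ 1.00624,
# i.e. the famous factor of ~22.7; binding energies shrink by its square.
s0 = 1.00624
scale = math.exp(math.pi / s0)          # ~22.7

r0 = 1.0   # size of the lowest trimer, arbitrary units (assumption)
e0 = 1.0   # its binding energy, arbitrary units (assumption)
for n in range(5):
    size = r0 * scale**n
    energy = e0 / scale**(2 * n)
    print(f"state {n}: size ~ {size:12.1f}   binding energy ~ {energy:.2e}")
```

A few rungs up, the trimer is already hundreds of thousands of times larger than the ground state, which is the sense in which these states range from the practically infinitesimal toward the infinite.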

Slaves of productivity

Quinn Norton writes: We dream now of making Every Moment Count, of achieving flow and never leaving, creating one project that must be better than the last, of working harder and smarter. We multitask, we update, and we conflate status with long hours worked in no-paid-overtime systems for the nebulous and fantastic status of being Too Important to have Time to Ourselves, time to waste. But this incarnation of the American dream is all about doing, and nothing about doing anything good, or even thinking about what one was doing beyond how to do more of it more efficiently. It was not even the surrenders to hedonism and debauchery or greed our literary dreams have recorded before. It is a surrender to nothing, to a nothingness of lived accounting.

This moment’s goal of productivity, with its all-consuming practice and unattainable horizon, is perfect for our current corporate world. Productivity never asks what it builds, just how much of it can be piled up before we leave or die. It is irrelevant to pleasure. It’s agnostic about the fate of humanity. It’s not even selfish, because production negates the self. Self can only be a denominator, holding up a dividing bar like a caryatid trying to hold up a stone roof.

I am sure this started with the Industrial Revolution, but what has swept through this generation is more recent. This idea of productivity started in the 1980s, with the lionizing of the hardworking greedy. There’s a critique of late capitalism to be had for sure, but what really devastated my generation was the spiritual malaise inherent in Taylorism’s perfectly mechanized human labor. But Taylor had never seen a robot or a computer perfect his methods of being human. By the 1980s, we had. In the age of robots we reinvented the idea of being robots ourselves. We wanted to program our minds and bodies and have them obey clocks and routines. In this age of the human robot, of the materialist mind, being efficient took the pre-eminent spot, beyond goodness or power or wisdom or even cruel greed. [Continue reading…]

Denying problems when we don’t like the political solutions

Phys.org: A new study from Duke University finds that people will evaluate scientific evidence based on whether they view its policy implications as politically desirable. If they don’t, then they tend to deny the problem even exists.

“Logically, the proposed solution to a problem, such as an increase in government regulation or an extension of the free market, should not influence one’s belief in the problem. However, we find it does,” said co-author Troy Campbell, a Ph.D. candidate at Duke’s Fuqua School of Business. “The cure can be more immediately threatening than the problem.”

The study, “Solution Aversion: On the Relation Between Ideology and Motivated Disbelief,” appears in the November issue of the Journal of Personality and Social Psychology.

The researchers conducted three experiments (with samples ranging from 120 to 188 participants) on three different issues—climate change, air pollution that harms lungs, and crime.

“The goal was to test, in a scientifically controlled manner, the question: Does the desirability of a solution affect beliefs in the existence of the associated problem? In other words, does what we call ‘solution aversion’ exist?” Campbell said.

“We found the answer is yes. And we found it occurs in response to some of the most common solutions for popularly discussed problems.”

For climate change, the researchers conducted an experiment to examine why more Republicans than Democrats seem to deny its existence, despite strong scientific evidence that supports it.

One explanation, they found, may have more to do with conservatives’ general opposition to the most popular solution—increasing government regulation—than with any difference in fear of the climate change problem itself, as some have proposed. [Continue reading…]

The complex, varied, ever-changing and context-dependent microbiome

Ed Yong writes: In the late 17th century, the Dutch naturalist Anton van Leeuwenhoek looked at his own dental plaque through a microscope and saw a world of tiny cells “very prettily a-moving.” He could not have predicted that a few centuries later, the trillions of microbes that share our lives — collectively known as the microbiome — would rank among the hottest areas of biology.

These microscopic partners help us by digesting our food, training our immune systems and crowding out other harmful microbes that could cause disease. In return, everything from the food we eat to the medicines we take can shape our microbial communities — with important implications for our health. Studies have found that changes in our microbiome accompany medical problems from obesity to diabetes to colon cancer.

As these correlations have unfurled, so has the hope that we might fix these ailments by shunting our bugs toward healthier states. The gigantic probiotics industry certainly wants you to think that, although there is little evidence that swallowing a few billion yogurt-borne bacteria has more than a small impact on the trillions in our guts. The booming genre of microbiome diet books — self-help manuals for the bacterial self — peddles a similar line, even though our knowledge of microbe-manipulating menus is still in its infancy.

This quest for a healthy microbiome has led some people to take measures that are far more extreme than simply spooning up yogurt. [Continue reading…]

In a multiverse, what are the odds?

Natalie Wolchover and Peter Byrne write: If modern physics is to be believed, we shouldn’t be here. The meager dose of energy infusing empty space, which at higher levels would rip the cosmos apart, is a trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion times tinier than theory predicts. And the minuscule mass of the Higgs boson, whose relative smallness allows big structures such as galaxies and humans to form, falls roughly 100 quadrillion times short of expectations. Dialing up either of these constants even a little would render the universe unlivable.

To account for our incredible luck, leading cosmologists like Alan Guth and Stephen Hawking envision our universe as one of countless bubbles in an eternally frothing sea. This infinite “multiverse” would contain universes with constants tuned to any and all possible values, including some outliers, like ours, that have just the right properties to support life. In this scenario, our good luck is inevitable: A peculiar, life-friendly bubble is all we could expect to observe.

Many physicists loathe the multiverse hypothesis, deeming it a cop-out of infinite proportions. But as attempts to paint our universe as an inevitable, self-contained structure falter, the multiverse camp is growing.

The problem remains how to test the hypothesis. Proponents of the multiverse idea must show that, among the rare universes that support life, ours is statistically typical. The exact dose of vacuum energy, the precise mass of our underweight Higgs boson, and other anomalies must have high odds within the subset of habitable universes. If the properties of this universe still seem atypical even in the habitable subset, then the multiverse explanation fails.

But infinity sabotages statistical analysis. In an eternally inflating multiverse, where any bubble that can form does so infinitely many times, how do you measure “typical”? [Continue reading…]
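
The sabotage is easy to see in miniature. Even among the ordinary integers, the “fraction” of members with a given property depends on the order in which you count, and an infinite multiverse supplies no preferred order. Here is a toy Python sketch, with the property “is even” standing in for “is a universe like ours”:

```python
# A toy version of the multiverse "measure problem": in an infinite
# collection, the frequency of a property depends on enumeration order.

def natural_order(k):
    """First k integers in the usual order: 1, 2, 3, 4, ..."""
    return list(range(1, k + 1))

def odd_heavy_order(k):
    """Interleave two odds per even: 1, 3, 2, 5, 7, 4, ..."""
    seq, odd, even = [], 1, 2
    while len(seq) < k:
        seq += [odd, odd + 2, even]
        odd += 4
        even += 2
    return seq[:k]

for order in (natural_order, odd_heavy_order):
    sample = order(300000)
    frac = sum(1 for n in sample if n % 2 == 0) / len(sample)
    print(f"{order.__name__}: fraction even ~ {frac:.3f}")
```

Counted the usual way, half the integers are even; counted with two odds for every even, only a third are. Both orderings eventually visit exactly the same numbers, and that, in caricature, is the measure problem.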

Cognitive disinhibition: the kernel of genius and madness

Dean Keith Simonton writes: When John Forbes Nash, the Nobel Prize-winning mathematician, schizophrenic, and paranoid delusional, was asked how he could believe that space aliens had recruited him to save the world, he gave a simple response. “Because the ideas I had about supernatural beings came to me the same way that my mathematical ideas did. So I took them seriously.”

Nash is hardly the only so-called mad genius in history. Suicide victims like painters Vincent van Gogh and Mark Rothko, novelists Virginia Woolf and Ernest Hemingway, and poets Anne Sexton and Sylvia Plath all offer prime examples. Even ignoring those great creators who did not kill themselves in a fit of deep depression, it remains easy to list persons who endured well-documented psychopathology, including the composer Robert Schumann, the poet Emily Dickinson, and Nash. Creative geniuses who have succumbed to alcoholism or other addictions are also legion.

Instances such as these have led many to suppose that creativity and psychopathology are intimately related. Indeed, the notion that creative genius might have some touch of madness goes back to Plato and Aristotle. But some recent psychologists argue that the whole idea is a pure hoax. After all, it is certainly no problem to come up with the names of creative geniuses who seem to have displayed no signs or symptoms of mental illness.

Opponents of the mad genius idea can also point to two solid facts. First, the number of creative geniuses in the entire history of human civilization is very large. Thus, even if these people were actually less prone to psychopathology than the average person, the number with mental illness could still be extremely large. Second, the permanent inhabitants of mental asylums do not usually produce creative masterworks. The closest exception that anyone might imagine is the notorious Marquis de Sade. Even in his case, his greatest (or rather most sadistic) works were written while he was imprisoned as a criminal rather than institutionalized as a lunatic.

So should we believe that creative genius is connected with madness or not? Modern empirical research suggests that we should because it has pinpointed the connection between madness and creativity clearly. The most important process underlying strokes of creative genius is cognitive disinhibition — the tendency to pay attention to things that normally should be ignored or filtered out by attention because they appear irrelevant. [Continue reading…]

The quantum edge

Johnjoe McFadden writes: The point of the most famous thought-experiment in quantum physics is that the quantum world is different from our familiar one. Imagine, suggested the Austrian physicist Erwin Schrödinger, that we seal a cat inside a box. The cat’s fate is linked to the quantum world through a poison that will be released only if a single radioactive atom decays. Quantum mechanics says that the atom must exist in a peculiar state called ‘superposition’ until it is observed, a state in which it has both decayed and not decayed. Furthermore, because the cat’s survival depends on what the atom does, it would appear that the cat must also exist as a superposition of a live and a dead cat until somebody opens the box and observes it. After all, the cat’s life depends on the state of the atom, and the state of the atom has not yet been decided.

Yet nobody really believes that a cat can be simultaneously dead and alive. There is a profound difference between fundamental particles, such as atoms, which do weird quantum stuff (existing in two states at once, occupying two positions at once, tunnelling through impenetrable barriers etc) and familiar classical objects, such as cats, that apparently do none of these things. Why don’t they? Simply put, because the weird quantum stuff is very fragile.

Quantum mechanics insists that all particles are also waves. But if you want to see strange quantum effects, the waves all have to line up, so that the peaks and troughs coincide. Physicists call this property coherence: it’s rather like musical notes being in tune. If the waves don’t line up, the peaks and troughs cancel each other out, destroying coherence, and you won’t see anything odd. When you’re dealing only with a single particle’s wave, on the other hand, it’s easy to keep it ‘in tune’ – it has to line up only with itself. But lining up the waves of hundreds, millions or trillions of particles is pretty much impossible. And so the weirdness gets cancelled out inside big objects. That’s why there doesn’t seem to be anything very indeterminate about a cat.
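
The cancellation is just the arithmetic of adding waves. Summing N unit waves with aligned phases gives an amplitude of N; with random phases, peaks and troughs offset one another and the total grows only like the square root of N. A quick numerical sketch, assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # number of contributing waves

# Coherent case: all phases aligned, so the peaks reinforce.
coherent = np.abs(np.sum(np.exp(1j * np.zeros(N))))

# Incoherent case: random phases, so peaks and troughs cancel.
phases = rng.uniform(0, 2 * np.pi, N)
incoherent = np.abs(np.sum(np.exp(1j * phases)))

print(f"aligned phases: amplitude ~ {coherent:.0f}  (= N)")
print(f"random phases:  amplitude ~ {incoherent:.0f}  (~ sqrt(N) ~ {np.sqrt(N):.0f})")
```

For a million waves the incoherent sum comes out near a thousand, a factor of a thousand below the coherent one, and the gap only widens for the trillions of trillions of particles in a cat.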

Nevertheless, wrote Schrödinger in What Is Life? (1944), some of life’s most fundamental building blocks must, like unobserved radioactive atoms, be quantum entities able to perform counterintuitive tricks. Indeed, he went on to propose that life is different from the inanimate world precisely because it inhabits a borderland between the quantum and classical world: a region we might call the quantum edge. [Continue reading…]

We are all confident idiots

David Dunning writes: Last March, during the enormous South by Southwest music festival in Austin, Texas, the late-night talk show Jimmy Kimmel Live! sent a camera crew out into the streets to catch hipsters bluffing. “People who go to music festivals pride themselves on knowing who the next acts are,” Kimmel said to his studio audience, “even if they don’t actually know who the new acts are.” So the host had his crew ask festival-goers for their thoughts about bands that don’t exist.

“The big buzz on the street,” said one of Kimmel’s interviewers to a man wearing thick-framed glasses and a whimsical T-shirt, “is Contact Dermatitis. Do you think he has what it takes to really make it to the big time?”

“Absolutely,” came the dazed fan’s reply.

The prank was an installment of Kimmel’s recurring “Lie Witness News” feature, which involves asking pedestrians a variety of questions with false premises. In another episode, Kimmel’s crew asked people on Hollywood Boulevard whether they thought the 2014 film Godzilla was insensitive to survivors of the 1954 giant lizard attack on Tokyo; in a third, they asked whether Bill Clinton gets enough credit for ending the Korean War, and whether his appearance as a judge on America’s Got Talent would damage his legacy. “No,” said one woman to this last question. “It will make him even more popular.”

One can’t help but feel for the people who fall into Kimmel’s trap. Some appear willing to say just about anything on camera to hide their cluelessness about the subject at hand (which, of course, has the opposite effect). Others seem eager to please, not wanting to let the interviewer down by giving the most boringly appropriate response: I don’t know. But for some of these interviewees, the trap may be an even deeper one. The most confident-sounding respondents often seem to think they do have some clue—as if there is some fact, some memory, or some intuition that assures them their answer is reasonable. [Continue reading…]

Cooperation is what makes us human

Kat McGowan writes: Tales about the origins of our species always start off like this: A small band of hunter-gatherers roams the savannah, loving, warring, and struggling for survival under the African sun. They do not start like this: A fat guy falls off a New York City subway platform onto the tracks.

But what happens next is a quintessential story of who we are as human beings.

On Feb. 17, 2013, around 2:30 a.m., Garrett O’Hanlon, a U.S. Air Force Academy cadet third class, was out celebrating his 22nd birthday in New York City. He and his sister were in the subway waiting for a train when a sudden silence came over the platform, followed by a shriek. People pointed down to the tracks.

O’Hanlon turned and saw a man sprawled facedown on the tracks. “The next thing that happened, I was on the tracks, running toward him,” he says. “I honestly didn’t have a thought process.”

O’Hanlon grabbed the unconscious man by the shoulders, lifting his upper body off the tracks, but struggled to move him. He was deadweight. According to the station clock, the train would arrive in less than two minutes. From the platform, O’Hanlon’s sister was screaming at him to save himself.

Suddenly other arms were there: Personal trainer Dennis Codrington Jr. and his friend Matt Foley had also jumped down to help. “We grabbed him, one by the legs, one by the shoulders, one by the chest,” O’Hanlon says. They got the man to the edge of the platform, where a dozen or more people muscled him up and over. More hands seized the rescuers’ arms and shoulders, helping them up to safety as well.

In the aftermath of the rescue, O’Hanlon says he has been surprised that so many people have asked him why he did it. “I get stunned by the question,” he says. In his view, anybody else would’ve done the same thing. “I feel like it’s a normal reaction,” he says. “To me that’s just what people do.”

More precisely, it is something only people do, according to developmental psychologist Michael Tomasello, codirector of the Max Planck Institute for Evolutionary Anthropology.

For decades Tomasello has explored what makes humans distinctive. His conclusion? We cooperate. Many species, from ants to orcas to our primate cousins, cooperate in the wild. But Tomasello has identified a special form of cooperation. In his view, humans alone are capable of shared intentionality—they intuitively grasp what another person is thinking and act toward a common goal, as the subway rescuers did. This supremely human cognitive ability, Tomasello says, launched our species on its extraordinary trajectory. It forged language, tools, and cultures—stepping-stones to our colonization of every corner of the planet. [Continue reading…]

How we use memory to look at the future

Virginia Hughes writes: Over the past few decades, researchers have worked to uncover the details of how the brain organizes memories. Much remains a mystery, but scientists have identified a key event: the formation of an intense brain wave called a “sharp-wave ripple” (SWR). This process is the brain’s version of an instant replay — a sped-up version of the neural activity that occurred during a recent experience. These ripples are a strikingly synchronous neural symphony, the product of tens of thousands of cells firing over just 100 milliseconds. Any more activity than that could trigger a seizure.

Now researchers have begun to realize that SWRs may be involved in much more than memory formation. Recently, a slew of high-profile rodent studies have suggested that the brain uses SWRs to anticipate future events. A recent experiment, for example, finds that SWRs connect to activity in the prefrontal cortex, a region at the front of the brain that is involved in planning for the future.

Studies such as this one have begun to illuminate the complex relationship between memory and the decision-making process. Until a few years ago, most studies on SWRs focused only on their role in creating and consolidating memories, said Loren Frank, a neuroscientist at the University of California, San Francisco. “None of them really dealt with this issue of: How does the animal actually pull [the memory] back up again? How does it actually use this to figure out what to do?” [Continue reading…]

The faster we go, the more time we lose

Mark C. Taylor writes: “Sleeker. Faster. More Intuitive” (The New York Times); “Welcome to a world where speed is everything” (Verizon FiOS); “Speed is God, and time is the devil” (chief of Hitachi’s portable-computer division). In “real” time, life speeds up until time itself seems to disappear—fast is never fast enough, everything has to be done now, instantly. To pause, delay, stop, slow down is to miss an opportunity and to give an edge to a competitor. Speed has become the measure of success—faster chips, faster computers, faster networks, faster connectivity, faster news, faster communications, faster transactions, faster deals, faster delivery, faster product cycles, faster brains, faster kids. Why are we so obsessed with speed, and why can’t we break its spell?

The cult of speed is a modern phenomenon. In “The Futurist Manifesto” in 1909, Filippo Tommaso Marinetti declared, “We say that the splendor of the world has been enriched by a new beauty: the beauty of speed.” The worship of speed reflected and promoted a profound shift in cultural values that occurred with the advent of modernity and modernization. With the emergence of industrial capitalism, the primary values governing life became work, efficiency, utility, productivity, and competition. When Frederick Winslow Taylor took his stopwatch to the factory floor in the early 20th century to increase workers’ efficiency, he began a high-speed culture of surveillance so memorably depicted in Charlie Chaplin’s Modern Times. Then, as now, efficiency was measured by the maximization of rapid production through the programming of human behavior.

With the transition from mechanical to electronic technologies, speed increased significantly. The invention of the telegraph, telephone, and stock ticker liberated communication from the strictures imposed by the physical means of conveyance. Previously, messages could be sent no faster than people, horses, trains, or ships could move. By contrast, immaterial words, sounds, information, and images could be transmitted across great distances at very high speed. During the latter half of the 19th century, railway and shipping companies established transportation networks that became the backbone of national and international information networks. When the trans-Atlantic cable (1858) and transcontinental railroad (1869) were completed, the foundation for the physical infrastructure of today’s digital networks was in place.

Fast-forward 100 years. During the latter half of the 20th century, information, communications, and networking technologies expanded rapidly, and transmission speed increased exponentially. But more than data and information were moving faster. Moore’s Law, according to which the speed of computer chips doubles every two years, now seems to apply to life itself. Plugged in 24/7/365, we are constantly struggling to keep up but are always falling further behind. The faster we go, the less time we seem to have. As our lives speed up, stress increases, and anxiety trickles down from managers to workers, and parents to children. [Continue reading…]

The biology of deceit

Daniel N Jones writes: It’s the friend who betrays you, the lover living a secret life, the job applicant with the fabricated résumé, or the sham sales pitch too good to resist. From the time humans learnt to co‑operate, we also learnt to deceive each other. For deception to be effective, individuals must hide their true intentions. But deception is hardly limited to humans. There is a never-ending arms race between the deceiver and the deceived among most living things. By studying different patterns of deception across the species, we can learn to better defend ourselves from dishonesty in the human world.

My early grasp of human deception came from the work of my adviser, the psychologist Delroy Paulhus at the University of British Columbia in Canada, who studied what he called the dark triad of personality: psychopathy, recognised by callous affect and reckless deceit; narcissism, a sense of grandiose entitlement and self-centered overconfidence; and Machiavellianism, the cynical and strategic manipulation of others.

If you look at the animal world, it’s clear that dark traits run through species from high to low. Some predators are fast, mobile and wide-ranging, executing their deceptions on as many others as they can; they resemble human psychopaths. Others are slow, stalking their prey in a specific, strategic (almost Machiavellian) way. Given the parallels between humans and other animals, I began to conceive my Mimicry Deception Theory, which argues that long- and short-term deceptive strategies cut across species, often by mimicking other lifestyles or forms.

Much of the foundational work for this idea comes from the evolutionary biologist Robert Trivers, who noted that many organisms gain an evolutionary advantage through deception. [Continue reading…]

The grand illusion of time

Jim Holt writes: It was Albert Einstein who initiated the revolution in our understanding of time. In 1905, Einstein proved that time, as it had been understood by physicist and plain man alike, was a fiction. Our idea of time, Einstein realized, is abstracted from our experience with rhythmic phenomena: heartbeats, planetary rotations and revolutions, the swinging of pendulums, the ticking of clocks. Time judgments always come down to judgments of what happens at the same time — of simultaneity. “If, for instance, I say, ‘That train arrives here at seven o’clock,’ I mean something like this: ‘The pointing of the small hand of my watch to seven and the arrival of the train are simultaneous events,’” Einstein wrote. If the events in question are distant from each other, judgments of simultaneity can be made only by sending light signals back and forth. Einstein proved that whether an observer deems two events at different locations to be happening “at the same time” depends on his state of motion. Suppose, for example, that Jones is walking uptown on Fifth Avenue and Smith is walking downtown. Their relative motion results in a discrepancy of several days in what they would judge to be happening “now” in the Andromeda galaxy at the moment they pass each other on the sidewalk. For Smith, the space fleet launched to destroy life on earth is already on its way; for Jones, the Andromedan council of tyrants has not even decided whether to send the fleet.
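
That discrepancy of several days can be checked on the back of an envelope. In special relativity, an observer moving at speed v finds that “now” at a distance d shifts by vd/c² relative to an observer at rest. A short Python sketch, taking Andromeda at 2.5 million light-years and assuming a brisk walking speed of 1.4 meters per second:

```python
# Relativity of simultaneity: how far "now" at Andromeda shifts
# for a walker, using the hyperplane-of-simultaneity tilt dt = v*d/c**2.

c = 2.998e8                      # speed of light, m/s
ly = 9.461e15                    # one light-year, m
d = 2.5e6 * ly                   # distance to Andromeda, m
v = 1.4                          # walking speed, m/s (assumption)

dt = v * d / c**2                # shift for one walker vs. standing still
total = 2 * dt                   # walkers heading in opposite directions
print(f"shift per walker: {dt / 86400:.1f} days")
print(f"Smith vs. Jones:  {total / 86400:.1f} days")
```

Each walker’s “now” on Andromeda shifts by about four days relative to someone standing still, so Smith and Jones, passing in opposite directions, disagree by roughly eight and a half days.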

What Einstein had shown was that there is no universal “now.” Whether two events are simultaneous is relative to the observer. And once simultaneity goes by the board, the very division of moments into “past,” “present,” and “future” becomes meaningless. Events judged to be in the past by one observer may still lie in the future of another; therefore, past and present must be equally definite, equally “real.” In place of the fleeting present, we are left with a vast frozen timescape — a four-dimensional “block universe.” Over here, you are being born; over there, you are celebrating the turn of the millennium; and over yonder, you’ve been dead for a while. Nothing is “flowing” from one event to another. As the mathematician Hermann Weyl memorably put it, “The objective world simply is; it does not happen.” [Continue reading…]

The healing power of silence

Daniel A. Gross writes: One icy night in March 2010, 100 marketing experts piled into the Sea Horse Restaurant in Helsinki, with the modest goal of making a remote and medium-sized country a world-famous tourist destination. The problem was that Finland was known as a rather quiet country, and since 2008, the Country Brand Delegation had been looking for a national brand that would make some noise.

Over drinks at the Sea Horse, the experts puzzled over the various strengths of their nation. Here was a country with exceptional teachers, an abundance of wild berries and mushrooms, and a vibrant cultural capital the size of Nashville, Tennessee. These things fell a bit short of a compelling national identity. Someone jokingly suggested that nudity could be named a national theme — it would emphasize the honesty of Finns. Someone else, less jokingly, proposed that perhaps quiet wasn’t such a bad thing. That got them thinking.

A few months later, the delegation issued a slick “Country Brand Report.” It highlighted a host of marketable themes, including Finland’s renowned educational system and school of functional design. One key theme was brand new: silence. As the report explained, modern society often seems intolerably loud and busy. “Silence is a resource,” it said. It could be marketed just like clean water or wild mushrooms. “In the future, people will be prepared to pay for the experience of silence.”

People already do. In a loud world, silence sells. Noise-canceling headphones retail for hundreds of dollars; the cost of some weeklong silent meditation courses can run into the thousands. Finland saw that it was possible to quite literally make something out of nothing.

In 2011, the Finnish Tourist Board released a series of photographs of lone figures in the wilderness, with the caption “Silence, Please.” An international “country branding” consultant, Simon Anholt, proposed the playful tagline “No talking, but action.” And a Finnish watch company, Rönkkö, launched its own new slogan: “Handmade in Finnish silence.”

“We decided, instead of saying that it’s really empty and really quiet and nobody is talking about anything here, let’s embrace it and make it a good thing,” explains Eva Kiviranta, who manages social media for VisitFinland.com.

Silence is a peculiar starting point for a marketing campaign. After all, you can’t weigh, record, or export it. You can’t eat it, collect it, or give it away. The Finland campaign raises the question of just what the tangible effects of silence really are. Science has begun to pipe up on the subject. In recent years researchers have highlighted the peculiar power of silence to calm our bodies, turn up the volume on our inner thoughts, and attune our connection to the world. Their findings begin where we might expect: with noise.

The word “noise” comes from a Latin root meaning either queasiness or pain. According to the historian Hillel Schwartz, there’s even a Mesopotamian legend in which the gods grow so angry at the clamor of earthly humans that they go on a killing spree. (City-dwellers with loud neighbors may empathize, though hopefully not too closely.)

Dislike of noise has produced some of history’s most eager advocates of silence, as Schwartz explains in his book Making Noise: From Babel to the Big Bang and Beyond. In 1859, the British nurse and social reformer Florence Nightingale wrote, “Unnecessary noise is the most cruel absence of care that can be inflicted on sick or well.” Every careless clatter or banal bit of banter, Nightingale argued, can be a source of alarm, distress, and loss of sleep for recovering patients. She even quoted a lecture that identified “sudden noises” as a cause of death among sick children. [Continue reading…]

Beyond the Bell Curve, a new universal law

Natalie Wolchover writes: Imagine an archipelago where each island hosts a single tortoise species and all the islands are connected — say by rafts of flotsam. As the tortoises interact by dipping into one another’s food supplies, their populations fluctuate.

In 1972, the biologist Robert May devised a simple mathematical model that worked much like the archipelago. He wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others. By indexing chance interactions between species as random numbers in a matrix, he calculated the critical “interaction strength” — a measure of the number of flotsam rafts, for example — needed to destabilize the ecosystem. Below this critical point, all species maintained steady populations. Above it, the populations shot toward zero or infinity.

Little did May know, the tipping point he discovered was one of the first glimpses of a curiously pervasive statistical law.

The law appeared in full form two decades later, when the mathematicians Craig Tracy and Harold Widom proved that the critical point in the kind of model May used was the peak of a statistical distribution. Then, in 1999, Jinho Baik, Percy Deift and Kurt Johansson discovered that the same statistical distribution also describes variations in sequences of shuffled integers — a completely unrelated mathematical abstraction. Soon the distribution appeared in models of the wriggling perimeter of a bacterial colony and other kinds of random growth. Before long, it was showing up all over physics and mathematics.

“The big question was why,” said Satya Majumdar, a statistical physicist at the University of Paris-Sud. “Why does it pop up everywhere?” [Continue reading…]
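
The distribution Tracy and Widom identified is now known as the Tracy–Widom law, and it is easy to meet numerically: build large random symmetric matrices of the general kind May used and watch where the largest eigenvalue, the quantity that governs stability in such models, falls. A minimal sketch assuming NumPy, with matrix size and trial count chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(42)

def largest_eigenvalue(n):
    """Largest eigenvalue of a GOE-like random symmetric matrix."""
    a = rng.normal(size=(n, n))
    h = (a + a.T) / np.sqrt(2)        # symmetrize; off-diagonals ~ N(0, 1)
    return np.linalg.eigvalsh(h)[-1]  # eigvalsh returns ascending order

n, trials = 200, 300
lam = np.array([largest_eigenvalue(n) for _ in range(trials)])

# Center at the spectral edge 2*sqrt(n) and rescale by n**(1/6):
# the result should fluctuate like the Tracy-Widom (beta = 1) law.
scaled = (lam - 2 * np.sqrt(n)) * n**(1 / 6)
print(f"mean ~ {scaled.mean():.2f}, std ~ {scaled.std():.2f}")
print("Tracy-Widom beta=1 reference: mean ~ -1.21, std ~ 1.27")
```

The largest eigenvalue piles up at the spectral edge near 2√n, and its rescaled fluctuations trace the lopsided Tracy–Widom curve rather than the familiar bell curve.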

What Shakespeare can teach science about language and the limits of the human mind

Jillian Hinchliffe and Seth Frey write: Although [Stephen] Booth is now retired [from the University of California, Berkeley], his work [on Shakespeare] couldn’t be more relevant. In the study of the human mind, old disciplinary boundaries have begun to dissolve and fruitful new relationships between the sciences and humanities have sprung up in their place. When it comes to the cognitive science of language, Booth may be the most prescient literary critic who ever put pen to paper. In his fieldwork in poetic experience, he unwittingly anticipated several language-processing phenomena that cognitive scientists have only recently begun to study. Booth’s work not only provides one of the most original and penetrating looks into the nature of Shakespeare’s genius, it has profound implications for understanding the processes that shape how we think.

Until the early decades of the 20th century, Shakespeare criticism fell primarily into two areas: textual, which grapples with the numerous variants of published works in order to produce an edition as close as possible to the original, and biographical. Scholarship took a more political turn beginning in the 1960s, providing new perspectives from various strains of feminist, Marxist, structuralist, and queer theory. Booth is resolutely dismissive of most of these modes of study. What he cares about is poetics. Specifically, how poetic language operates on and in audiences of a literary work.

Close reading, the school that flourished mid-century and with which Booth’s work is most nearly affiliated, has never gone completely out of style. But Booth’s approach is even more minute—microscopic reading, according to fellow Shakespeare scholar Russ McDonald. And as the microscope opens up new worlds, so does Booth’s critical lens. What makes him radically different from his predecessors is that he doesn’t try to resolve or collapse his readings into any single interpretation. That people are so hung up on interpretation, on meaning, Booth maintains, is “no more than habit.” Instead, he revels in the uncertainty caused by the myriad currents of phonetic, semantic, and ideational patterns at play. [Continue reading…]

35,000-year-old Indonesian cave paintings suggest art came out of Africa

The Guardian reports: Paintings of wild animals and hand markings left by adults and children on cave walls in Indonesia are at least 35,000 years old, making them some of the oldest artworks known.

The rock art was originally discovered in caves on the island of Sulawesi in the 1950s, but dismissed as younger than 10,000 years old because scientists thought older paintings could not possibly survive in a tropical climate.

But fresh analysis of the pictures by an Australian-Indonesian team has stunned researchers by dating one hand marking to at least 39,900 years old, and two paintings of animals, a pig-deer or babirusa, and another animal, probably a wild pig, to at least 35,400 and 35,700 years ago respectively.

The work reveals that rather than Europe being at the heart of an explosion of creative brilliance when modern humans arrived from Africa, the early settlers of Asia were creating their own artworks at the same time or even earlier.

Archaeologists have not ruled out that the different groups of colonising humans developed their artistic skills independently of one another, but an enticing alternative is that the modern human ancestors of both were artists before they left the African continent.

“Our discovery on Sulawesi shows that cave art was made at opposite ends of the Pleistocene Eurasian world at about the same time, suggesting these practices have deeper origins, perhaps in Africa before our species left this continent and spread across the globe,” said Dr Maxime Aubert, an archaeologist at the University of Wollongong. [Continue reading…]

When digital nature replaces nature

Diane Ackerman writes: Last summer, I watched as a small screen in a department store window ran a video of surfing in California. That simple display mesmerized high-heeled, pin-striped, well-coiffed passersby who couldn’t take their eyes off the undulating ocean and curling waves that dwarfed the human riders. Just as our ancient ancestors drew animals on cave walls and carved animals from wood and bone, we decorate our homes with animal prints and motifs, give our children stuffed animals to clutch, cartoon animals to watch, animal stories to read. Our lives trumpet, stomp, and purr with animal tales, such as The Bat Poet, The Velveteen Rabbit, Aesop’s Fables, The Wind in the Willows, The Runaway Bunny, and Charlotte’s Web. I first read these wondrous books as a grown-up, when both the adult and the kid in me were completely spellbound. We call each other by “pet” names, wear animal-print clothes. We ogle plants and animals up close on screens of one sort or another. We may not worship or hunt the animals we see, but we still regard them as necessary physical and spiritual companions. It seems the more we exile ourselves from nature, the more we crave its miracle waters. Yet technological nature can’t completely satisfy that ancient yearning.

What if, through novelty and convenience, digital nature replaces biological nature? Gradually, we may grow used to shallower and shallower experiences of nature. Studies show that we’ll suffer. Richard Louv writes of widespread “nature deficit disorder” among children who mainly play indoors — an oddity quite new in the history of humankind. He documents an upswell in attention disorders, obesity, depression, and lack of creativity. A San Diego fourth-grader once told him: “I like to play indoors because that’s where all the electrical outlets are.” Adults suffer equally. It’s telling that hospital patients with a view of trees heal faster than those gazing at city buildings and parking lots. In studies conducted by Peter H. Kahn and his colleagues at the University of Washington, office workers in windowless cubicles were given flat-screen views of nature. They reaped the benefits of greater health, happiness, and efficiency than those without virtual windows. But they weren’t as happy, healthy, or creative as people given real windows with real views of nature.

As a species, we’ve somehow survived large and small ice ages, genetic bottlenecks, plagues, world wars, and all manner of natural disasters, but I sometimes wonder if we’ll survive our own ingenuity. At first glance, it seems like we may be living in sensory overload. The new technology, for all its boons, also bedevils us with speed demons, alluring distractors, menacing highjinks, cyber-bullies, thought-nabbers, calm-frayers, and a spiky wad of miscellaneous news. Some days it feels like we’re drowning in a twittering bog of information. But, at exactly the same time, we’re living in sensory poverty, learning about the world without experiencing it up close, right here, right now, in all its messy, majestic, riotous detail. Like seeing icebergs without the cold, without squinting in the Antarctic glare, without the bracing breaths of dry air, without hearing the chorus of lapping waves and shrieking gulls. We lose the salty smell of the cold sea, the burning touch of ice. If, reading this, you can taste those sensory details in your mind, is that because you’ve experienced them in some form before, as actual experience? If younger people never experience them, can they respond to words on the page in the same way?

The farther we distance ourselves from the spell of the present, explored by all our senses, the harder it will be to understand and protect nature’s precarious balance, let alone the balance of our own human nature. [Continue reading…]
