Science’s embarrassing fossil fuel problem

Alice Bell writes: An investigation by Greenpeace and the Climate Investigations Centre, reported in the Guardian and New York Times this weekend, showed that Willie Soon — an apparently ‘scientific’ voice for climate scepticism — had accepted more than $1.2 million from the fossil-fuel industry over the past 14 years.

As Suzanne Goldenberg’s report stresses, although those seeking to delay action to curb carbon emissions were keen to cite and fund Soon’s Harvard-Smithsonian credentials, he did not enjoy the same sort of recognition from the scientific community. He did not receive grants from Nasa or the National Science Foundation, for example — the sorts of institutions that funded his colleagues at the Center for Astrophysics. Moreover, it appears that Soon violated ethical guidelines of the journals that published his work by not disclosing such funding. It seems to be a story of someone working outside the usual codes of modern science.

But Soon is not a singular aberration in the story of science’s relationship with the fossil fuel industry. It goes deeper than that.

Science and engineering is suffused with oil, gas and, yes, even coal. We must look this squarely in the eye if we’re going to tackle climate change.

The fossil fuel industry is sometimes labelled anti-science, but that’s far from the truth. It loves science — or at least particular bits of science — indeed it needs science. The fossil fuel industry needs the science and engineering community to train staff, to gather information and help develop new techniques. Science and engineering also provides the industry with cultural credibility and can open up powerful political spaces within which to lobby. [Continue reading…]


Too many worlds

Philip Ball writes: In July 2011, participants at a conference on the placid shore of Lake Traunsee in Austria were polled on what they thought the meeting was about. You might imagine that this question would have been settled in advance, but since the broad theme was quantum theory, perhaps a degree of uncertainty was to be expected. The title of the conference was ‘Quantum Physics and the Nature of Reality’. The poll, completed by 33 of the participating physicists, mathematicians and philosophers, posed a range of unresolved questions about the relationship between those two things, one of which was: ‘What is your favourite interpretation of quantum mechanics?’

The word ‘favourite’ speaks volumes. Isn’t science supposed to be decided by experiment and observation, free from personal preferences? But experiments in quantum physics have been obstinately silent on what it means. All we can do is develop hunches, intuitions and, yes, cherished ideas. Of these, the survey offered no fewer than 11 to choose from (as well as ‘other’ and ‘none’).

The most popular (supported by 42 per cent of the very small sample) was basically the view put forward by Niels Bohr, Werner Heisenberg and their colleagues in the early days of quantum theory. Today it is known as the Copenhagen Interpretation. More on that below. You might not recognise most of the other alternatives, such as Quantum Bayesianism, Relational Quantum Mechanics, and Objective Collapse (which is not, as you might suppose, just saying ‘what the hell’). Maybe you haven’t heard of the Copenhagen Interpretation either. But in third place (18 per cent) was the Many Worlds Interpretation (MWI), and I suspect you do know something about that, since the MWI is the one with all the glamour and publicity. It tells us that we have multiple selves, living other lives in other universes, quite possibly doing all the things that we dream of but will never achieve (or never dare). Who could resist such an idea?

Yet resist we should. We should resist not just because MWI is unlikely to be true, or even because, since no one knows how to test it, the idea is perhaps not truly scientific at all. Those are valid criticisms, but the main reason we should hold out is that it is incoherent, both philosophically and logically. There could be no better contender for Wolfgang Pauli’s famous put-down: it is not even wrong. [Continue reading…]


Ancient planets are almost as old as the universe

New Scientist reports: The Old Ones were already ancient when the Earth was born. Five small planets orbit an 11.2 billion-year-old star, making them about 80 per cent as old as the universe itself. That means our galaxy started building rocky planets earlier than we thought.

“Now that we know that these planets can be twice as old as Earth, this opens the possibility for the existence of ancient life in the galaxy,” says Tiago Campante at the University of Birmingham in the UK.

NASA’s Kepler space telescope spotted the planets around an orange dwarf star called Kepler 444, which is 117 light years away and about 25 per cent smaller than the sun.

Orange dwarfs are considered good candidates for hosting alien life because they can stay stable for up to 30 billion years, compared to the sun’s 10 billion years, the time it takes these stars to consume all their hydrogen. For context, the universe is currently 13.8 billion years old.
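A quick back-of-the-envelope check of the figures quoted above, using only the numbers reported in the excerpt (a rough sketch, not anything from the underlying study):

```python
# Sanity-check of the ages and lifetimes quoted in the article.
# All values are taken from the excerpt above, not from the study itself.
star_age_yr = 11.2e9        # Kepler 444
universe_age_yr = 13.8e9

print(f"Kepler 444 is about {star_age_yr / universe_age_yr:.0%} "
      "as old as the universe")        # roughly 80 per cent

sun_lifetime_yr = 10e9                 # time to consume its hydrogen
orange_dwarf_lifetime_yr = 30e9
print(f"An orange dwarf can burn hydrogen roughly "
      f"{orange_dwarf_lifetime_yr / sun_lifetime_yr:.0f} times as long as the sun")
```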

Since, as far as we know, life begins by chance, older planets would have had more time to allow life to get going and evolve. But it was unclear whether planets around such an old star could be rocky – life would have a harder time on gassy planets without a solid surface. [Continue reading…]


Trying to read scrolls that can’t be read

The Economist: In 1752 Camillo Paderni, an artist who had been put in charge of the growing pile of antiquities being dug up at Herculaneum, a seaside town near Naples, wrote to a certain Dr Mead, who then wrote to the Royal Society in London reporting that “there were found many volumes of papyrus but turned to a sort of charcoal, and so brittle, that being touched, it fell to ashes. Yet by His Majesty’s orders he made many trials to open them, but all to no purpose; excepting some scraps containing some words.”

The excavation at Herculaneum — which, like nearby Pompeii, was buried in 79AD under ash from Mount Vesuvius — had uncovered a literary time capsule. What came to be called the Villa of the Papyri contained a library of perhaps 2,000 books, the only such collection known to have been preserved from antiquity.

Actually reading these scrolls has, however, proved both tricky and destructive — until now. A paper just published in Nature Communications by Vito Mocella, of the Institute for Microelectronics and Microsystems in Naples, describes a way to decipher them without unrolling them.


Did physics get sucked down a wormhole?

Bryan Appleyard writes: The greatest story of our time may also be the greatest mistake. This is the story of our universe from the Big Bang to now with its bizarre, Dickensian cast of characters – black holes, tiny vibrating strings, the warped space-time continuum, trillions of companion universes and particles that wink in and out of existence.

It is the story told by a long list of officially accredited geniuses from Isaac Newton to Stephen Hawking. It is also the story that is retold daily in popular science fiction from Star Trek to the latest Hollywood sci-fi blockbuster Interstellar. Thanks to the movies, the physicist standing in front of a vast blackboard covered in equations became our age’s symbol of genius. The universe is weird, the TV shows and films tell us, and almost anything can happen.

But it is a story that many now believe is pointless, wrong and riddled with wishful thinking and superstition.

“Stephen Hawking,” says philosopher Roberto Mangabeira Unger, “is not part of the solution, he is part of the problem.”

The equations on the blackboard may be the problem. Mathematics, the language of science, may have misled the scientists.

“The idea,” says physicist Lee Smolin, “that the truth about nature can be wrestled from pure thought through mathematics is overdone… The idea that mathematics is prophetic and that mathematical structure and beauty are a clue to how nature ultimately works is just wrong.”

And in an explosive essay published last week in the science journal Nature, astrophysicists George Ellis and Joe Silk say that the wild claims of theoretical physicists are threatening the authority of science itself.

“This battle for the heart and soul of physics,” they write, “is opening up at a time when scientific results — in topics from climate change to the theory of evolution — are being questioned by some politicians and religious fundamentalists. Potential damage to public confidence in science and to the nature of fundamental physics needs to be contained by deeper dialogue between scientists and philosophers…. The imprimatur of science should be awarded only to a theory that is testable. Only then can we defend science from attack.”

Unger and Smolin have also just gone into print with a monumental book – The Singular Universe and the Reality of Time – which systematically takes apart contemporary physics and exposes much of it as, in Unger’s words, “an inferno of allegorical fabrication.” The book says it is time to return to real science which is tested against nature rather than constructed out of mathematics. Physics should no longer be seen as the ultimate science, underwriting all others. The true queen of the sciences should be history – the biography of the cosmos. [Continue reading…]

Before any physicists stop by to question whether I really understand what a wormhole is, I will without hesitation make it clear: I have no idea. It just seems like a suitable metaphor — better, say, than rabbit hole.


The Pillars of Creation


NBC News: In celebration of its upcoming 25th anniversary in April, the Hubble Space Telescope has returned to the site of what may be its most famous image, the wispy columns of the Eagle Nebula, and produced a stunning new picture. “The Pillars of Creation,” located 6,500 light-years away in the nebula (also catalogued as M16), were photographed in visible and near-infrared light with the Hubble’s upgraded equipment, and the result is as astonishing now as the original was in 1995. Hubble went online in 1990.


Why has progress stalled?

Michael Hanlon writes: We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.

The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.

Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvements upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.

There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.

Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. [Continue reading…]


E.O. Wilson talks about the threat to Earth’s biodiversity


Thousands of Einstein documents now accessible online

The New York Times reports: They have been called the Dead Sea Scrolls of physics. Since 1986, the Princeton University Press and the Hebrew University of Jerusalem, to whom Albert Einstein bequeathed his copyright, have been engaged in a mammoth effort to study some 80,000 documents he left behind.

Starting on Friday, when Digital Einstein is introduced, anyone with an Internet connection will be able to share in the letters, papers, postcards, notebooks and diaries that Einstein left scattered in Princeton and in other archives, attics and shoeboxes around the world when he died in 1955.

The Einstein Papers Project, currently edited by Diana Kormos-Buchwald, a professor of physics and the history of science at the California Institute of Technology, has already published 13 volumes in print out of a projected 30. [Continue reading…]


What is it like to be a bee?


In the minds of many humans, empathy is the signature of humanity and yet if this empathy extends further and includes non-humans we may be suspected of indulging in anthropomorphism — a sentimental projection of our own feelings into places where similar feelings supposedly cannot exist.

But the concept of anthropomorphism is itself a strange idea since it seems to invalidate what should be one of the most basic assumptions we can reasonably make about living creatures: that without the capacity to suffer, nothing would survive.

Just as the deadening of sensation makes people more susceptible to injury, an inability to feel pain would impede any creature’s need to avoid harm.

The seemingly suicidal draw of the moth to a flame is the exception rather than the rule. Moreover the insect is driven by a mistake, not a death wish. It is drawn towards the light, not the heat, oblivious that the two are one.

If humans indulge in projections about the feelings of others — human and non-human — perhaps we more commonly engage in negative projections: choosing to assume that feelings are absent where it would cause us discomfort to be attuned to their presence.

Our inclination is to avoid feeling too much and thus we construct neat enclosures for our concerns.

These enclosures shut out the feelings of strangers and then by extension seal away boundless life from which we have become even more estranged.

Heather Swan writes: It was a warm day in early spring when I had my first long conversation with the entomologist and science studies scholar Sainath Suryanarayanan. We met over a couple of hives I had recently inherited. One was thriving. Piles of dead bees filled the other. Parts of the comb were covered with mould and oozing something that looked like molasses.

Having recently attended a class for hobby beekeepers with Marla Spivak, an entomologist at the University of Minnesota, I was aware of the many different diseases to which bees are susceptible. American foulbrood, which was a mean one, concerned me most. Beekeepers recommended burning all of your equipment if you discovered it in your hives. Some of these bees were alive, but obviously in low spirits, and I didn’t want to destroy them unnecessarily. I called Sainath because I thought he could help me with the diagnosis.

Beekeeping, these days, is riddled with risks. New viruses, habitat loss, pesticides and mites all contribute to creating a deadly labyrinth through which nearly every bee must travel. Additionally, in 2004, mysterious bee disappearances began to plague thousands of beekeepers. Seemingly healthy bees started abandoning their homes. This strange disappearing act became known as colony collapse disorder (CCD).

Since then, the world has seen the decline of many other pollinating species, too. Because honeybees and other pollinators are responsible for pollinating at least one-third of all the food we eat, this is a serious problem globally. Diagnosing bee problems is not simple, but some answers are emerging. A ubiquitous class of pesticides called neonicotinoids has been implicated in pollinator decline, which has fuelled conversations among beekeepers, scientists, policy-makers and growers. A beekeeper facing a failing hive now has to consider not only the health of the hive itself, but also the health of the landscape around the hive. Dead bees lead beekeepers down a path of many questions. And some beekeepers have lost so many hives, they feel like giving up.

When we met at my troubled hives, Sainath brought his own hive tool and veil. He had already been down a path of many questions about bee deaths, one that started in his youth with a fascination for observing insects. When he was 14, he began his ‘Amateur Entomologist’s Record’, where he kept taxonomic notes on such things as wing textures, body shapes, colour patterns and behaviours. But the young scientist’s approach occasionally slipped to include his exuberance, describing one moment as ‘a stupefying experience!’ All this led him to study biology and chemistry in college, then to work on the behavioural ecology of paper wasps during his doctoral studies, and eventually to Minnesota to help Spivak investigate the role of pesticides in CCD.

Sainath had spent several years doing lab and field experiments with wasps and bees, but ultimately wanted to shift from traditional practices in entomology to research that included human/insect relationships. It was Sainath who made me wonder about the role of emotion in science – both in the scientists themselves and in the subjects of their experiments. I had always thought of emotion as something excised from science, but this was impossible for some scientists. What was the role of empathy in experimentation? How do we, with our human limitations, understand something as radically different from us as the honeybee? Did bees have feelings, too? If so, what did that mean for the scientist? For the science? [Continue reading…]


Researchers announce major advance in image-recognition software

The New York Times reports: Two groups of scientists, working independently, have created artificial intelligence software capable of recognizing and describing the content of photographs and videos with far greater accuracy than ever before, sometimes even mimicking human levels of understanding.

Until now, so-called computer vision has largely been limited to recognizing individual objects. The new software, described on Monday by researchers at Google and at Stanford University, teaches itself to identify entire scenes: a group of young men playing Frisbee, for example, or a herd of elephants marching on a grassy plain.

The software then writes a caption in English describing the picture. Compared with human observations, the researchers found, the computer-written descriptions are surprisingly accurate.

The advances may make it possible to better catalog and search for the billions of images and hours of video available online, which are often poorly described and archived. At the moment, search engines like Google rely largely on written language accompanying an image or video to ascertain what it contains. [Continue reading…]


Hydrogen cars about to go on sale. Their only emission: water

The New York Times reports: Remember the hydrogen car?

A decade ago, President George W. Bush espoused the environmental promise of cars running on hydrogen, the universe’s most abundant element. “The first car driven by a child born today,” he said in his 2003 State of the Union speech, “could be powered by hydrogen, and pollution-free.”

That changed under Steven Chu, the Nobel Prize-winning physicist who was President Obama’s first Secretary of Energy. “We asked ourselves, ‘Is it likely in the next 10 or 15, 20 years that we will convert to a hydrogen-car economy?’” Dr. Chu said then. “The answer, we felt, was ‘no.’ ” The administration slashed funding for hydrogen fuel cell research.

Attention shifted to battery electric vehicles, particularly those made by the headline-grabbing Tesla Motors.

The hydrogen car, it appeared, had died. And many did not mourn its passing, particularly those who regarded the auto companies’ interest in hydrogen technology as a stunt to signal that they cared about the environment while selling millions of highly profitable gas guzzlers.

Except the companies, including General Motors, Honda, Toyota, Daimler and Hyundai, persisted.

After many years and billions of dollars of research and development, hydrogen cars are headed to the showrooms. [Continue reading…]


Wonder and the ends of inquiry

Lorraine Daston writes: Science and wonder have a long and ambivalent relationship. Wonder is a spur to scientific inquiry but also a reproach and even an inhibition to inquiry. As philosophers never tire of repeating, only those ignorant of the causes of things wonder: the solar eclipse that terrifies illiterate peasants is no wonder to the learned astronomer who can explain and predict it. Romantic poets accused science of not just neutralizing wonder but of actually killing it. Modern popularizations of science make much of wonder — but expressions of that passion are notably absent in professional publications. This love-hate relationship between wonder and science started with science itself.

Wonder always comes at the beginning of inquiry. “For it is owing to their wonder that men both now begin and at first began to philosophize,” explains Aristotle; Descartes made wonder “the first of the passions,” and the only one without a contrary, opposing passion. In these and many other accounts of wonder, both soul and senses are ambushed by a puzzle or a surprise, something that catches us unawares and unprepared. Wonder widens the eyes, opens the mouth, stops the heart, freezes thought. Above all, at least in classical accounts like those of Aristotle and Descartes, wonder both diagnoses and cures ignorance. It reveals that there are more things in heaven and earth than have been dreamt of in our philosophy; ideally, it also spurs us on to find an explanation for the marvel.

Therein lies the paradox of wonder: it is the beginning of inquiry (Descartes remarks that people deficient in wonder “are ordinarily quite ignorant”), but the end of inquiry also puts an end to wonder. [Continue reading…]


Slaves of productivity

Quinn Norton writes: We dream now of making Every Moment Count, of achieving flow and never leaving, creating one project that must be better than the last, of working harder and smarter. We multitask, we update, and we conflate status with long hours worked in no-paid-overtime systems for the nebulous and fantastic status of being Too Important to have Time to Ourselves, time to waste. But this incarnation of the American dream is all about doing, and nothing about doing anything good, or even thinking about what one was doing beyond how to do more of it more efficiently. It was not even the surrenders to hedonism and debauchery or greed our literary dreams have recorded before. It is a surrender to nothing, to a nothingness of lived accounting.

This moment’s goal of productivity, with its all-consuming practice and unattainable horizon, is perfect for our current corporate world. Productivity never asks what it builds, just how much of it can be piled up before we leave or die. It is irrelevant to pleasure. It’s agnostic about the fate of humanity. It’s not even selfish, because production negates the self. Self can only be a denominator, holding up a dividing bar like a caryatid trying to hold up a stone roof.

I am sure this started with the Industrial Revolution, but what has swept through this generation is more recent. This idea of productivity started in the 1980s, with the lionizing of the hardworking greedy. There’s a critique of late capitalism to be had for sure, but what really devastated my generation was the spiritual malaise inherent in Taylorism’s perfectly mechanized human labor. But Taylor had never seen a robot or a computer perfect his methods of being human. By the 1980s, we had. In the age of robots we reinvented the idea of being robots ourselves. We wanted to program our minds and bodies and have them obey clocks and routines. In this age of the human robot, of the materialist mind, being efficient took the pre-eminent spot, beyond goodness or power or wisdom or even cruel greed. [Continue reading…]


Getting beyond debates over science

Paul Voosen writes: Last year, as the summer heat broke, a congregation of climate scientists and communicators gathered at the headquarters of the American Association for the Advancement of Science, a granite edifice erected in the heart of Washington, to wail over their collective futility.

Year by year, the evidence for human-caused global warming has grown more robust. Greenhouse gases load the air and sea. Temperatures rise. Downpours strengthen. Ice melts. Yet the American public seems, from cursory glances at headlines and polls, more divided than ever on the basic existence of climate change, in spite of scientists’ many, many warnings. Their message, the attendees fretted, simply wasn’t getting through.

This worry wasn’t just about climate change, but also stem cells. Genetically modified food. Vaccines. Nuclear power. And, of course, evolution: Challenging scientific reality seems to be an increasingly common feature of American life. Some researchers have gone so far as to accuse one political party, the Republicans, of making “science denial” a bedrock principle. The authority attributed to scientists for a century is crumbling.

It is a disturbing story. It is also, in many ways, a fairy tale. So says Dan M. Kahan, a law professor at Yale University who, over the past decade, has run an insurgent research campaign into how the public understands science. Through a magpie synthesis of psychology, risk perception, anthropology, political science, and communication research, leavened with heavy doses of empiricism and idol bashing, he has exposed the tribal biases that mediate our encounters with scientific knowledge. It’s a dynamic he calls cultural cognition. [Continue reading…]


The faster we go, the more time we lose

Mark C. Taylor writes: “Sleeker. Faster. More Intuitive” (The New York Times); “Welcome to a world where speed is everything” (Verizon FiOS); “Speed is God, and time is the devil” (chief of Hitachi’s portable-computer division). In “real” time, life speeds up until time itself seems to disappear—fast is never fast enough, everything has to be done now, instantly. To pause, delay, stop, slow down is to miss an opportunity and to give an edge to a competitor. Speed has become the measure of success—faster chips, faster computers, faster networks, faster connectivity, faster news, faster communications, faster transactions, faster deals, faster delivery, faster product cycles, faster brains, faster kids. Why are we so obsessed with speed, and why can’t we break its spell?

The cult of speed is a modern phenomenon. In “The Futurist Manifesto” in 1909, Filippo Tommaso Marinetti declared, “We say that the splendor of the world has been enriched by a new beauty: the beauty of speed.” The worship of speed reflected and promoted a profound shift in cultural values that occurred with the advent of modernity and modernization. With the emergence of industrial capitalism, the primary values governing life became work, efficiency, utility, productivity, and competition. When Frederick Winslow Taylor took his stopwatch to the factory floor in the early 20th century to increase workers’ efficiency, he began a high-speed culture of surveillance so memorably depicted in Charlie Chaplin’s Modern Times. Then, as now, efficiency was measured by the maximization of rapid production through the programming of human behavior.

With the transition from mechanical to electronic technologies, speed increased significantly. The invention of the telegraph, telephone, and stock ticker liberated communication from the strictures imposed by the physical means of conveyance. Previously, messages could be sent no faster than people, horses, trains, or ships could move. By contrast, immaterial words, sounds, information, and images could be transmitted across great distances at very high speed. During the latter half of the 19th century, railway and shipping companies established transportation networks that became the backbone of national and international information networks. When the trans-Atlantic cable (1858) and transcontinental railroad (1869) were completed, the foundation for the physical infrastructure of today’s digital networks was in place.

Fast-forward 100 years. During the latter half of the 20th century, information, communications, and networking technologies expanded rapidly, and transmission speed increased exponentially. But more than data and information were moving faster. Moore’s Law, according to which the speed of computer chips doubles every two years, now seems to apply to life itself. Plugged in 24/7/365, we are constantly struggling to keep up but are always falling further behind. The faster we go, the less time we seem to have. As our lives speed up, stress increases, and anxiety trickles down from managers to workers, and parents to children. [Continue reading…]


Beyond the Bell Curve, a new universal law

Natalie Wolchover writes: Imagine an archipelago where each island hosts a single tortoise species and all the islands are connected — say by rafts of flotsam. As the tortoises interact by dipping into one another’s food supplies, their populations fluctuate.

In 1972, the biologist Robert May devised a simple mathematical model that worked much like the archipelago. He wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others. By indexing chance interactions between species as random numbers in a matrix, he calculated the critical “interaction strength” — a measure of the number of flotsam rafts, for example — needed to destabilize the ecosystem. Below this critical point, all species maintained steady populations. Above it, the populations shot toward zero or infinity.
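May’s argument is easy to sketch numerically. The snippet below is an illustrative reconstruction, not code from the article or from May’s 1972 paper: it fills a “community matrix” with random interaction strengths, adds a fixed self-regulation term on the diagonal, and checks whether every eigenvalue has a negative real part, which is the linear-stability condition. The species count, connectance and interaction spread are arbitrary values chosen for illustration; instability should appear roughly where sigma * sqrt(N * C) exceeds the self-regulation strength.

```python
# Illustrative sketch of May's random-matrix stability argument (not his original code).
import numpy as np

rng = np.random.default_rng(0)

def is_stable(n_species=250, connectance=0.25, sigma=0.1, d=1.0):
    """True if a random community matrix is linearly stable (all eigenvalue real parts < 0)."""
    A = rng.normal(0.0, sigma, size=(n_species, n_species))
    A *= rng.random((n_species, n_species)) < connectance   # keep only a fraction of interactions
    np.fill_diagonal(A, -d)                                  # each species regulates itself
    return np.linalg.eigvals(A).real.max() < 0

for sigma in (0.05, 0.10, 0.15, 0.20):
    frac = np.mean([is_stable(sigma=sigma) for _ in range(20)])
    print(f"sigma={sigma:.2f}  stable fraction={frac:.2f}  "
          f"sigma*sqrt(N*C)={sigma * np.sqrt(250 * 0.25):.2f}")
```

Sweeping the interaction spread this way shows the abrupt switch the article describes: nearly every random ecosystem is stable below the critical point and nearly none is above it.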

Little did May know, the tipping point he discovered was one of the first glimpses of a curiously pervasive statistical law.

The law appeared in full form two decades later, when the mathematicians Craig Tracy and Harold Widom proved that the critical point in the kind of model May used was the peak of a statistical distribution. Then, in 1999, Jinho Baik, Percy Deift and Kurt Johansson discovered that the same statistical distribution also describes variations in sequences of shuffled integers — a completely unrelated mathematical abstraction. Soon the distribution appeared in models of the wriggling perimeter of a bacterial colony and other kinds of random growth. Before long, it was showing up all over physics and mathematics.
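For a rough numerical feel for the distribution itself (again an illustrative sketch with arbitrary sizes, not anything from the article): the Tracy-Widom law governs the fluctuations of the largest eigenvalue of a large random symmetric matrix, so repeatedly sampling that eigenvalue and rescaling it should give a skewed, distinctly non-Gaussian spread whose mean sits near the known Tracy-Widom value for this matrix class.

```python
# Sampling the largest eigenvalue of a random symmetric (GOE) matrix; after the
# standard centring and rescaling, its fluctuations follow Tracy-Widom rather
# than the familiar bell curve.
import numpy as np

rng = np.random.default_rng(1)

def scaled_top_eigenvalue(n=200):
    """Largest eigenvalue of an n x n GOE matrix, with the usual Tracy-Widom centring/scaling."""
    M = rng.normal(size=(n, n))
    H = (M + M.T) / np.sqrt(2)                  # symmetric Gaussian matrix
    lam_max = np.linalg.eigvalsh(H)[-1]
    return (lam_max - 2 * np.sqrt(n)) * n ** (1 / 6)

samples = np.array([scaled_top_eigenvalue() for _ in range(500)])
# For the GOE, the Tracy-Widom mean is about -1.21 and the spread about 1.27;
# the sample statistics should land in that neighbourhood, finite-size effects aside.
print(f"sample mean {samples.mean():.2f}, sample std {samples.std():.2f}")
```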

“The big question was why,” said Satya Majumdar, a statistical physicist at the University of Paris-Sud. “Why does it pop up everywhere?” [Continue reading…]


When digital nature replaces nature

Diane Ackerman writes: Last summer, I watched as a small screen in a department store window ran a video of surfing in California. That simple display mesmerized high-heeled, pin-striped, well-coiffed passersby who couldn’t take their eyes off the undulating ocean and curling waves that dwarfed the human riders. Just as our ancient ancestors drew animals on cave walls and carved animals from wood and bone, we decorate our homes with animal prints and motifs, give our children stuffed animals to clutch, cartoon animals to watch, animal stories to read. Our lives trumpet, stomp, and purr with animal tales, such as The Bat Poet, The Velveteen Rabbit, Aesop’s Fables, The Wind in the Willows, The Runaway Bunny, and Charlotte’s Web. I first read these wondrous books as a grown-up, when both the adult and the kid in me were completely spellbound. We call each other by “pet” names, wear animal-print clothes. We ogle plants and animals up close on screens of one sort or another. We may not worship or hunt the animals we see, but we still regard them as necessary physical and spiritual companions. It seems the more we exile ourselves from nature, the more we crave its miracle waters. Yet technological nature can’t completely satisfy that ancient yearning.

What if, through novelty and convenience, digital nature replaces biological nature? Gradually, we may grow used to shallower and shallower experiences of nature. Studies show that we’ll suffer. Richard Louv writes of widespread “nature deficit disorder” among children who mainly play indoors — an oddity quite new in the history of humankind. He documents an upswell in attention disorders, obesity, depression, and lack of creativity. A San Diego fourth-grader once told him: “I like to play indoors because that’s where all the electrical outlets are.” Adults suffer equally. It’s telling that hospital patients with a view of trees heal faster than those gazing at city buildings and parking lots. In studies conducted by Peter H. Kahn and his colleagues at the University of Washington, office workers in windowless cubicles were given flat-screen views of nature. They enjoyed greater health, happiness, and efficiency than those without virtual windows. But they weren’t as happy, healthy, or creative as people given real windows with real views of nature.

As a species, we’ve somehow survived large and small ice ages, genetic bottlenecks, plagues, world wars, and all manner of natural disasters, but I sometimes wonder if we’ll survive our own ingenuity. At first glance, it seems like we may be living in sensory overload. The new technology, for all its boons, also bedevils us with speed demons, alluring distractors, menacing highjinks, cyber-bullies, thought-nabbers, calm-frayers, and a spiky wad of miscellaneous news. Some days it feels like we’re drowning in a twittering bog of information. But, at exactly the same time, we’re living in sensory poverty, learning about the world without experiencing it up close, right here, right now, in all its messy, majestic, riotous detail. Like seeing icebergs without the cold, without squinting in the Antarctic glare, without the bracing breaths of dry air, without hearing the chorus of lapping waves and shrieking gulls. We lose the salty smell of the cold sea, the burning touch of ice. If, reading this, you can taste those sensory details in your mind, is that because you’ve experienced them in some form before, as actual experience? If younger people never experience them, can they respond to words on the page in the same way?

The farther we distance ourselves from the spell of the present, explored by all our senses, the harder it will be to understand and protect nature’s precarious balance, let alone the balance of our own human nature. [Continue reading…]
