Space.com reports: It is the announcement no one wanted to hear: The most exciting astronomical discovery of 2014 has vanished. Two groups of scientists announced today (Jan. 30) that a tantalizing signal — which some scientists claimed was “smoking gun” evidence of dramatic cosmic expansion just after the birth of the universe — was actually caused by something much more mundane: interstellar dust.
In the cosmic inflation announcement, which was unveiled in March 2014, scientists with the BICEP2 experiment claimed to have found patterns in light left over from the Big Bang that indicated that space had rapidly inflated at the beginning of the universe, about 13.8 billion years ago. The discovery also supposedly confirmed the existence of gravitational waves, theoretical ripples in space-time.
But in a statement today, scientists with the European Space Agency said that data from the agency’s Planck space observatory has revealed that interstellar dust caused more than half of the signal detected by the Antarctica-based BICEP2 experiment. The Planck spacecraft observations were not yet available last March when the BICEP2 science team made its announcement. [Continue reading…]
A conversation with Adam Curtis
Jon Ronson writes: I’ve known Adam Curtis for nearly 20 years. We’re friends. We see movies together, and once even went to Romania on a mini-break to attend an auction of Nicolae Ceausescu’s belongings. But it would be wrong to characterise our friendship as frivolous. Most of the time when we’re together I’m just intensely cross-questioning him about some new book idea I have.
Sometimes Adam will say something that seems baffling and wrong at the time, but makes perfect sense a few years later. I could give you lots of examples, but here’s one: I’m about to publish a book – So You’ve Been Publicly Shamed – about how social media is evolving into a cold and conservative place, a giant echo chamber where what we believe is constantly reinforced by people who believe the same thing, and when people step out of line in the smallest ways we destroy them. Adam was warning me about Twitter’s propensity to turn this way six years ago, when it was still a Garden of Eden. Sometimes talking to Adam feels like finding the results of some horse race of the future, where the long-shot horse wins.
I suppose it’s no surprise that Adam would notice this stuff about social media so early on. It’s what his films are almost always about – power and social control. However, people don’t only enjoy them for the subject matter, but for how they look, too – his wonderful, strange use of archive.
His new film, Bitter Lake, is his most experimental yet. And I think it’s his best. It’s still journalism: it’s about our relationship with Afghanistan, and how we don’t know what to do, and so we just repeat the mistakes of the past. But he’s allowed his use of archive to blossom crazily. Fifty percent of the film has no commentary. Instead, he’s created this dreamlike, fantastical collage from historical footage and raw, unedited news footage. Sometimes it’s just a shot of a man walking down a road in some Afghan town, and you don’t know why he’s chosen it, and then something happens and you think, ‘Ah!’ (Or, more often, ‘Oh God.’) It might be something small and odd. Or it might be something huge and terrible.
Nightmarish things happen in Bitter Lake. There are shots of people dying. It’s a film that could never be on TV. It’s too disturbing. And it’s too long as well – nearly two and a half hours. And so he’s putting it straight onto BBC iPlayer. I think, with this film, he’s invented a whole new way of telling a nonfiction story.
VICE asked the two of us to have an email conversation about his work. We started just before Christmas, and carried on until after the New Year. [Continue reading…]
Ancient planets are almost as old as the universe
New Scientist reports: The Old Ones were already ancient when the Earth was born. Five small planets orbit an 11.2 billion-year-old star, making them about 80 per cent as old as the universe itself. That means our galaxy started building rocky planets earlier than we thought.
“Now that we know that these planets can be twice as old as Earth, this opens the possibility for the existence of ancient life in the galaxy,” says Tiago Campante at the University of Birmingham in the UK.
NASA’s Kepler space telescope spotted the planets around an orange dwarf star called Kepler 444, which is 117 light years away and about 25 per cent smaller than the sun.
Orange dwarfs are considered good candidates for hosting alien life because they can stay stable for up to 30 billion years, the time it takes them to consume all their hydrogen, compared to the sun’s 10 billion years. For context, the universe is currently 13.8 billion years old.
Since, as far as we know, life begins by chance, older planets would have had more time to allow life to get going and evolve. But it was unclear whether planets around such an old star could be rocky – life would have a harder time on gassy planets without a solid surface. [Continue reading…]
Trying to read scrolls that can’t be read
The Economist: In 1752 Camillo Paderni, an artist who had been put in charge of the growing pile of antiquities being dug up at Herculaneum, a seaside town near Naples, wrote to a certain Dr Mead, who then wrote to the Royal Society in London reporting that “there were found many volumes of papyrus but turned to a sort of charcoal, and so brittle, that being touched, it fell to ashes. Yet by His Majesty’s orders he made many trials to open them, but all to no purpose; excepting some scraps containing some words.”
The excavation at Herculaneum — which, like nearby Pompeii, was buried in 79AD under ash from Mount Vesuvius — had uncovered a literary time capsule. What came to be called the Villa of the Papyri contained a library of perhaps 2,000 books, the only such collection known to have been preserved from antiquity.
Actually reading these scrolls has, however, proved both tricky and destructive — until now. A paper just published in Nature Communications, by Vito Mocella of the Institute for Microelectronics and Microsystems, in Naples, describes a way to decipher them without unrolling them.
A battle for the heart and soul of physics has opened up
George Ellis and Joe Silk write: This year, debates in physics circles took a worrying turn. Faced with difficulties in applying fundamental theories to the observed Universe, some researchers called for a change in how theoretical physics is done. They began to argue — explicitly — that if a theory is sufficiently elegant and explanatory, it need not be tested experimentally, breaking with centuries of philosophical tradition of defining scientific knowledge as empirical. We disagree. As the philosopher of science Karl Popper argued: a theory must be falsifiable to be scientific.
Chief among the ‘elegance will suffice’ advocates are some string theorists. Because string theory is supposedly the ‘only game in town’ capable of unifying the four fundamental forces, they believe that it must contain a grain of truth even though it relies on extra dimensions that we can never observe. Some cosmologists, too, are seeking to abandon experimental verification of grand hypotheses that invoke imperceptible domains such as the kaleidoscopic multiverse (comprising myriad universes), the ‘many worlds’ version of quantum reality (in which observations spawn parallel branches of reality) and pre-Big Bang concepts.
These unprovable hypotheses are quite different from those that relate directly to the real world and that are testable through observations — such as the standard model of particle physics and the existence of dark matter and dark energy. As we see it, theoretical physics risks becoming a no-man’s-land between mathematics, physics and philosophy that does not truly meet the requirements of any.
The issue of testability has been lurking for a decade. String theory and multiverse theory have been criticized in popular books and articles, including some by one of us (G.E.). In March, theorist Paul Steinhardt wrote in this journal that the theory of inflationary cosmology is no longer scientific because it is so flexible that it can accommodate any observational result. Theorist and philosopher Richard Dawid and cosmologist Sean Carroll have countered those criticisms with a philosophical case to weaken the testability requirement for fundamental physics.
We applaud the fact that Dawid, Carroll and other physicists have brought the problem out into the open. But the drastic step that they are advocating needs careful debate. This battle for the heart and soul of physics is opening up at a time when scientific results — in topics from climate change to the theory of evolution — are being questioned by some politicians and religious fundamentalists. Potential damage to public confidence in science and to the nature of fundamental physics needs to be contained by deeper dialogue between scientists and philosophers. [Continue reading…]
The invasion of America
Claudio Saunt writes: Between 1776 and the present, the United States seized some 1.5 billion acres from North America’s native peoples, an area 25 times the size of the United Kingdom. Many Americans are only vaguely familiar with the story of how this happened. They perhaps recognise Wounded Knee and the Trail of Tears, but few can recall the details and even fewer think that those events are central to US history.
Their tenuous grasp of the subject is regrettable if unsurprising, given that the conquest of the continent is both essential to understanding the rise of the United States and deplorable. Acre by acre, the dispossession of native peoples made the United States a transcontinental power. To visualise this story, I created ‘The Invasion of America’, an interactive time-lapse map of the nearly 500 cessions that the United States carved out of native lands on its westward march to the shores of the Pacific. [Continue reading…]
The strange inevitability of evolution
Philip Ball writes: Is the natural world creative? Just take a look around it. Look at the brilliant plumage of tropical birds, the diverse pattern and shape of leaves, the cunning stratagems of microbes, the dazzling profusion of climbing, crawling, flying, swimming things. Look at the “grandeur” of life, the “endless forms most beautiful and most wonderful,” as Darwin put it. Isn’t that enough to persuade you?
Ah, but isn’t all this wonder simply the product of the blind fumbling of Darwinian evolution, that mindless machine which takes random variation and sieves it by natural selection? Well, not quite. You don’t have to be a benighted creationist, nor even a believer in divine providence, to argue that Darwin’s astonishing theory doesn’t fully explain why nature is so marvelously, endlessly inventive. “Darwin’s theory surely is the most important intellectual achievement of his time, perhaps of all time,” says evolutionary biologist Andreas Wagner of the University of Zurich. “But the biggest mystery about evolution eluded his theory. And he couldn’t even get close to solving it.”
What Wagner is talking about is how evolution innovates: as he puts it, “how the living world creates.” Natural selection supplies an incredibly powerful way of pruning variation into effective solutions to the challenges of the environment. But it can’t explain where all that variation came from. As the biologist Hugo de Vries wrote in 1905, “natural selection may explain the survival of the fittest, but it cannot explain the arrival of the fittest.” Over the past several years, Wagner and a handful of others have been starting to understand the origins of evolutionary innovation. Thanks to their findings so far, we can now see not only how Darwinian evolution works but why it works: what makes it possible. [Continue reading…]
The Pillars of Creation
NBC News: In celebration of its upcoming 25th anniversary in April, the Hubble Space Telescope has returned to the site of what may be its most famous image, the wispy columns of the Eagle Nebula, and produced a stunning new picture. “The Pillars of Creation,” located 6,500 light-years away in the Eagle Nebula (M16), were photographed in visible and near-infrared light with the Hubble’s upgraded equipment, and the result is as astonishing now as the original was in 1995. Hubble went online in 1990.
Friluftsliv, shinrin-yoku, hygge, wabi-sabi, kaizen, gemütlichkeit, and jugaad?
Starre Vartan writes about cultural concepts most of us have never heard of: Friluftsliv translates directly from Norwegian as “free air life,” which doesn’t quite do it justice. Coined relatively recently, in 1859, it is the concept that being outside is good for human beings’ minds and spirits. “It is a term in Norway that is used often to describe a way of life that is spent exploring and appreciating nature,” Anna Stoltenberg, culture coordinator for Sons of Norway, a U.S.-based Norwegian heritage group, told MNN. Beyond that, the definition isn’t strict: it can include sleeping outside, hiking, taking photographs or meditating, playing or dancing outside, for adults or kids. It doesn’t require any special equipment, includes all four seasons, and needn’t cost much money. Practicing friluftsliv could be as simple as making a commitment to walking in a natural area five days a week, or doing a day-long hike once a month.
Shinrin-yoku is a Japanese term that means “forest bathing,” and unlike the Norwegian term above, this one seems a perfect language fit (though it describes a pretty similar idea). The idea is that spending time in the forest and natural areas is good preventative medicine, since it lowers stress, which causes or exacerbates some of our most intractable health issues. As MNN’s Catie Leary details, this isn’t just a nice idea — there’s science behind it: “The ‘magic’ behind forest bathing boils down to the naturally produced allelochemic substances known as phytoncides, which are kind of like pheromones for plants. Their job is to help ward off pesky insects and slow the growth of fungi and bacteria. When humans are exposed to phytoncides, these chemicals are scientifically proven to lower blood pressure, relieve stress and boost the growth of cancer-fighting white blood cells. Some common examples of plants that give off phytoncides include garlic, onion, pine, tea tree and oak, which makes sense considering their potent aromas.” [Continue reading…]
Neil Postman: The man who predicted Fox News, the internet, Stephen Colbert and reality TV
Scott Timberg writes: These days, even the kind of educated person who might have once disdained TV and scorned electronic gadgets debates plot turns from “Game of Thrones” and carries an app-laden iPhone. The few left concerned about the effects of the Internet are dismissed as Luddites or killjoys who are on the wrong side of history. A new kind of consensus has shaped up as Steve Jobs becomes the new John Lennon, Amanda Palmer the new Liz Phair, and Elon Musk’s rebel cool graces magazine covers. Conservatives praise Silicon Valley for its entrepreneurial energy; a Democratic president steers millions of dollars of funding to Amazon.
It seems like a funny era for the work of a cautionary social critic, one often dubious about the wonders of technology – including television — whose most famous book came out three decades ago. But the neoliberal post-industrial world now looks chillingly like the one Neil Postman foresaw in books like “Amusing Ourselves to Death” and “Technopoly: The Surrender of Culture to Technology.” And the people asking the important questions about where American society is going are taking a page from him.
“Amusing Ourselves” didn’t argue that regular TV shows were bad or dangerous. It insisted instead that the medium would reshape every other sphere with which it engaged: By using the methods of entertainment, TV would trivialize what the book jacket calls “politics, education, religion, and journalism.”
“It just blew me away,” says D.C.-based politics writer Matt Bai, who read the 1985 book “Amusing Ourselves to Death” while trying to figure out how the press and media became obsessed with superficiality beginning in the ’80s. “So much of what I’d been thinking about was pioneered so many years before,” says Bai – whose recent book, “All the Truth Is Out: The Week Politics Went Tabloid,” looks at the 1987 Gary Hart sex scandal that effectively ended the politician’s career. “It struck me as incredibly relevant … And the more I reported the book, the more relevant it became.”
Bai isn’t alone. While he’s hardly a household name, Postman has become an important guide to the world of the Internet, though most of his work was written before its advent. Astra Taylor, a documentary filmmaker and Occupy activist, turned to his books while she was plotting out what became “The People’s Platform: Taking Back Power and Culture in the Digital Age.” Douglas Rushkoff — a media theorist whose book “Present Shock: When Everything Happens Now” is one of the most lucid guides to our bewildering age — is indebted to his work. Michael Harris’ recent “The End of Absence” is as well. And Jaron Lanier, the virtual-reality inventor and author (“Who Owns the Future?”) who’s simultaneously critic and tech-world insider, sees Postman as an essential figure whose work becomes more crucial every year.
“There’s this kind of dialogue around technology where people dump on each other for ‘not getting it,’” Lanier says. “Postman does not seem to be vulnerable to that accusation: He was old-fashioned but he really transcended that. I don’t remember him saying, ‘When I was a kid, things were better.’ He called on fundamental arguments in very broad terms – the broad arc of human history and ethics.” [Continue reading…]
Why has progress stalled?
Michael Hanlon writes: We live in a golden age of technological, medical, scientific and social progress. Look at our computers! Look at our phones! Twenty years ago, the internet was a creaky machine for geeks. Now we can’t imagine life without it. We are on the verge of medical breakthroughs that would have seemed like magic only half a century ago: cloned organs, stem-cell therapies to repair our very DNA. Even now, life expectancy in some rich countries is improving by five hours a day. A day! Surely immortality, or something very like it, is just around the corner.
The notion that our 21st-century world is one of accelerating advances is so dominant that it seems churlish to challenge it. Almost every week we read about ‘new hopes’ for cancer sufferers, developments in the lab that might lead to new cures, talk of a new era of space tourism and super-jets that can fly round the world in a few hours. Yet a moment’s thought tells us that this vision of unparalleled innovation can’t be right, that many of these breathless reports of progress are in fact mere hype, speculation – even fantasy.
Yet there once was an age when speculation matched reality. It spluttered to a halt more than 40 years ago. Most of what has happened since has been merely incremental improvement upon what came before. That true age of innovation – I’ll call it the Golden Quarter – ran from approximately 1945 to 1971. Just about everything that defines the modern world either came about, or had its seeds sown, during this time. The Pill. Electronics. Computers and the birth of the internet. Nuclear power. Television. Antibiotics. Space travel. Civil rights.
There is more. Feminism. Teenagers. The Green Revolution in agriculture. Decolonisation. Popular music. Mass aviation. The birth of the gay rights movement. Cheap, reliable and safe automobiles. High-speed trains. We put a man on the Moon, sent a probe to Mars, beat smallpox and discovered the double-spiral key of life. The Golden Quarter was a unique period of less than a single human generation, a time when innovation appeared to be running on a mix of dragster fuel and dilithium crystals.
Today, progress is defined almost entirely by consumer-driven, often banal improvements in information technology. [Continue reading…]
E.O. Wilson talks about the threat to Earth’s biodiversity
John Muir’s last stand
Tom Butler and Eileen Crist write: In this centennial year of Muir’s death, it is disturbing, but not surprising, that the man and his legacy are suffering the slings and arrows of critics. These attacks are concurrent with an ongoing assault on traditional conservation ideas and tactics from some academics, think tanks, and practitioners affiliated with large nonprofits. This body of thinkers, variously called “new conservationists,” “eco-pragmatists,” or “postmodern greens,” has articulated a set of views about where they think conservation should go in the so-called Anthropocene, the new epoch of human dominion. Wilderness preservation is not on their wish list this Christmas, though corporate partnerships are.
The postmodern greens aim to reorient conservation’s primary focus away from establishing protected areas intended to help prevent human-caused extinctions and to sustain large-scale natural ecosystems. Instead, they advocate sustainable management of the biosphere to support human aspirations, particularly for a growing global economy. If some species go extinct that may be regrettable, goes their thinking, but the bottom line is that nature is resilient. As long as “working landscapes” (places we manipulate to produce commodities) are managed well enough to sustain “ecosystem services” (things like water filtration, soil health, and crop pollination), human welfare can be supported without lots of new protected areas (habitat for other species) getting in the way of economic growth.
Some of the most prominent of these new conservationists have warned against critiquing the techno-industrial growth economy that is everywhere gobbling up wild nature. “Instead of scolding capitalism,” they write, “conservationists should partner with corporations in a science-based effort to integrate the value of nature’s benefits into their operations and cultures.” [Continue reading…]
How civilization has given humans brittle bones
Nicholas St. Fleur writes: Somewhere in a dense forest of ash and elm trees, a hunter readies his spear for the kill. He hurls his stone-tipped weapon at his prey, an unsuspecting white-tailed deer he has tracked since morning. The crude projectile pierces the animal’s hide, killing it and giving the hunter food to bring back to his family many miles away. Such was survival circa 5,000 B.C. in ancient North America.
But today, the average person barely has to lift a finger, let alone throw a spear, to quell their appetite. The next meal is a mere online order away. And according to anthropologists, this convenient, sedentary way of life is making our bones weak. Ahead lies a future of fractures, breaks, and osteoporosis. But for some anthropologists, the key to preventing aches in our bones lies in better understanding the skeletons of our hunter-gatherer ancestors.
“Over the vast majority of human prehistory, our ancestors engaged in far more activity over longer distances than we do today,” said Brian Richmond, an anthropologist from the American Museum of Natural History in New York, in a statement. “We cannot fully understand human health today without knowing how our bodies evolved to work in the past, so it is important to understand how our skeletons evolved within the context of those high levels of activity.”
For thousands of years, Native American hunter-gatherers trekked on strenuous ventures for food. And for those same thousands of years, dense skeletons supported their movements. But about 6,000 years later, with the advent of agriculture, the bones and joints of Native Americans became less rigid and more fragile. Similar transitions occurred across the world as populations shifted from foraging to farming, according to two new papers published Monday in the Proceedings of the National Academy of Sciences. [Continue reading…]
An integrated model of creativity and personality
Scott Barry Kaufman writes: Psychologists Guillaume Furst, Paolo Ghisletta and Todd Lubart present an integrative model of creativity and personality that is deeply grounded in past research on the personality of creative people.
Bringing together lots of different research threads over the years, they identified three “super-factors” of personality that predict creativity: Plasticity, Divergence, and Convergence.
Plasticity consists of the personality traits openness to experience, extraversion, high energy, and inspiration. The common factor here is high drive for exploration, and those high in this super-factor of personality tend to have a lot of dopamine — “the neuromodulator of exploration” — coursing through their brains. Prior research has shown a strong link between Plasticity and creativity, especially in the arts.
Divergence consists of non-conformity, impulsivity, low agreeableness, and low conscientiousness. People high in divergence may seem like jerks, but they are often just very independent thinkers. This super-factor is close to Hans Eysenck’s concept of “Psychoticism.” Throughout his life, Eysenck argued that these non-conforming characteristics were important contributors to high creative achievements.
Finally, Convergence consists of high conscientiousness, precision, persistence, and critical sense. While not typically included in discussions of creativity, these characteristics are also important contributors to the creative process. [Continue reading…]
Stoicism — a philosophy of gratitude
Lary Wallace writes: We do this to our philosophies. We redraft their contours based on projected shadows, or give them a cartoonish shape like a caricaturist emphasising all the wrong features. This is how Buddhism becomes, in the popular imagination, a doctrine of passivity and even laziness, while Existentialism becomes synonymous with apathy and futile despair. Something similar has happened to Stoicism, which is considered – when considered at all – a philosophy of grim endurance, of carrying on rather than getting over, of tolerating rather than transcending life’s agonies and adversities.
No wonder it’s not more popular. No wonder the Stoic sage, in Western culture, has never obtained the popularity of the Zen master. Even though Stoicism is far more accessible, not only does it lack the exotic mystique of Eastern practice; it’s also regarded as a philosophy of merely breaking even while remaining determinedly impassive. What this attitude ignores is the promise proffered by Stoicism of lasting transcendence and imperturbable tranquility.
It ignores gratitude, too. This is part of the tranquility, because it’s what makes the tranquility possible. Stoicism is, as much as anything, a philosophy of gratitude – and a gratitude, moreover, rugged enough to endure anything. Philosophers who pine for supreme psychological liberation have often failed to realise that they belong to a confederacy that includes the Stoics. [Continue reading…]
Birds can hear sounds hundreds of miles away
The Atlantic: In April, a massive thunderstorm unleashed a series of tornadoes that tore through the central and southern United States. The 84 twisters decimated homes and buildings, causing more than $1 billion in damage across 17 states. In the wake of the natural disaster, 35 people lost their lives.
Now, scientists say a peculiar event took place just two days before the storm: Flocks of songbirds fled the area en masse. Many golden-winged warblers had just finished a 1,500-mile migration to Tennessee when they suddenly flew south on a 900-mile exodus to Florida and Cuba. At that time, the storm was somewhere between 250 and 560 miles away. The researchers said that the birds somehow knew about the impending storm.
“At the same time that meteorologists on The Weather Channel were telling us this storm was headed in our direction, the birds were apparently already packing their bags and evacuating the area,” Henry Streby, a population ecologist from the University of California, Berkeley, said in a statement. He and his research team had been examining the birds’ migratory patterns when they made their discovery.
Initially, the team was studying whether warblers, which weigh about as much as four dimes, could carry half-gram geo-locators over long distances. After retrieving data from five of the 20 tagged birds, the team noticed the birds were nowhere near the path they’d expected. Why, the researchers wondered, would these tiny birds travel so far from their already-grueling migratory route? Upon further inspection, the scientists found that the dates when the birds broke from their usual pattern coincided with the beginnings of the storm. In a paper published today in the journal Current Biology, the team suggests that the birds made their “evacuation migration” because their keen sense of hearing alerted them to the incoming natural disaster. [Continue reading…]
The art of not trying
John Tierney writes: Just be yourself.
The advice is as maddening as it is inescapable. It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself.
But when you’re nervous, how can you be yourself? How can you force yourself to relax? How can you try not to try?
It makes no sense, but the paradox is essential to civilization, according to Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.
He calls it the paradox of wu wei, the Chinese term for “effortless action.” Pronounced “ooo-way,” it has similarities to the concept of flow, that state of effortless performance sought by athletes, but it applies to a lot more than sports. Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.
Dr. Slingerland, a professor of Asian studies at the University of British Columbia, argues that the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gatherer clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good. [Continue reading…]