Rewriting Earth’s creation story

Rebecca Boyle writes: Humanity’s trips to the moon revolutionized our view of this planet. As seen from another celestial body, Earth seemed more fragile and more precious; the iconic Apollo 8 image of Earth rising above the lunar surface helped launch the modern environmental movement. The moon landings made people want to take charge of Earth’s future. They also changed our view of its past.

Earth is constantly remaking itself, and over the eons it has systematically erased its origin story, subsuming and cannibalizing its earliest rocks. Much of what we think we know about the earliest days of Earth therefore comes from the geologically inactive moon, which scientists use like a time capsule.

Ever since Apollo astronauts toted chunks of the moon back home, the story has sounded something like this: After coalescing from grains of dust that swirled around the newly ignited sun, the still-cooling Earth would have been covered in seas of magma, punctured by inky volcanoes spewing sulfur and liquid rock. The young planet was showered in asteroids and larger structures called planetesimals, one of which sheared off a portion of Earth and formed the moon. Just as things were finally settling down, about a half-billion years after the solar system formed, the Earth and moon were again bombarded by asteroids whose onslaught might have liquefied the young planet — and sterilized it.

Geologists named this epoch the Hadean, after the Greek version of the underworld. Only after the so-called Late Heavy Bombardment quieted some 3.9 billion years ago did Earth finally start to morph into the Edenic, cloud-covered, watery world we know.

But as it turns out, the Hadean may not have been so hellish. New analysis of Earth and moon rocks suggests that instead of a roiling ball of lava, baby Earth was a world with continents, oceans of water, and maybe even an atmosphere. It might not have been bombarded by asteroids at all, or at least not in the large quantities scientists originally thought. The Hadean might have been downright hospitable, raising questions about how long ago life could have arisen on this planet. [Continue reading…]

Tardigrades: The most fascinating animals known to science

Brian Resnick writes: Paul Bartels gets a rush every time he discovers a new species of tardigrade, the phylum of microscopic animals best known for being both strangely cute and able to survive the vacuum of space.

“The first paper I wrote describing a new species, there was a maternal-paternal feeling — like I just gave birth to this new thing,” he tells me on a phone call.

The rush comes, in part, because tardigrades are the most fascinating animals known to science, able to survive in just about every environment imaginable. “There are some ecosystems in the Antarctic called nunataks where the wind blows away snow and ice, exposing outcroppings of rocks, and the only things that live on them are lichens and tardigrades,” says Bartels, an invertebrate zoologist at Warren Wilson College in North Carolina.

Pick up a piece of moss, and you’ll find tardigrades. In the soil: tardigrades. The ocean: You get it. They live on every continent, in every climate, and in every latitude. Their extreme resilience has allowed them to conquer the entire planet.

And though biologists have known about tardigrades since the dawn of the microscope, they’re only just beginning to understand how these remarkable organisms are able to survive anywhere. [Continue reading…]

Why neuroscientists need to study the crow

Grigori Guitchounts writes: The animals of neuroscience research are an eclectic bunch, and for good reason. Different model organisms — like zebrafish larvae, C. elegans worms, fruit flies, and mice — give researchers the opportunity to answer specific questions. The first two, for example, have transparent bodies, which let scientists easily peer into their brains; the last two have eminently tweakable genomes, which allow scientists to isolate the effects of specific genes. For cognition studies, researchers have relied largely on primates and, more recently, rats, which I use in my own work. But the time is ripe for this exclusive club of research animals to accept a new, avian member: the corvid family.

Corvids, such as crows, ravens, and magpies, are among the most intelligent birds on the planet — the list of their cognitive achievements goes on and on — yet neuroscientists have not scrutinized their brains for one simple reason: They don’t have a neocortex. The obsession with the neocortex in neuroscience research is not unwarranted; what’s unwarranted is the notion that the neocortex alone is responsible for sophisticated cognition. Because birds lack this structure—the most recently evolved portion of the mammalian brain, crucial to human intelligence—neuroscientists have largely and unfortunately neglected the neural basis of corvid intelligence.

This makes them miss an opportunity for an important insight. Having diverged from mammals more than 300 million years ago, avian brains have had plenty of time to develop along remarkably different lines (instead of a cortex with its six layers of neatly arranged neurons, birds evolved groups of neurons densely packed into clusters called nuclei). So, any computational similarities between corvid and primate brains — which are so different neurally — would indicate the development of common solutions to shared evolutionary problems, like creating and storing memories, or learning from experience. If neuroscientists want to know how brains produce intelligence, looking solely at the neocortex won’t cut it; they must study how corvid brains achieve the same clever behaviors that we see in ourselves and other mammals. [Continue reading…]

Walking improves creativity

Olivia Goldhill writes: For centuries, great thinkers have instinctively stepped out the door and begun walking, or at the very least pacing, when they needed to boost creativity. Charles Dickens routinely walked for 30 miles a day, while the philosopher Friedrich Nietzsche declared, “All truly great thoughts are conceived while walking.”

But in recent years, as lives have become increasingly sedentary, the idea has been put to the test. The precise physiology is unknown, but professors and therapists are turning what was once an unquestioned instinct into a certainty: Walking influences our thinking, and somehow improves creativity.

Last year, researchers at Stanford found that people perform better on creative divergent thinking tests during and immediately after walking. The effect was similar regardless of whether participants took a stroll outside or stayed inside, walking on a treadmill and staring at a wall. The act of walking itself, rather than the sights encountered on a saunter, was key to improving creativity, they found. [Continue reading…]

The social practice of self-betrayal in career-driven America

nyc (1)

Talbot Brewer writes: I don’t know how careers are seen in other countries, but in the United States we are exhorted to view them as the primary locus of self-realization. The question before you when you are trying to choose a career is to figure out “What Color is Your Parachute?” (the title of a guide to job searches that has been a perennial best seller for most of my lifetime). The aim, to quote the title of another top-selling guide to career choices, is to “Do What You Are.”

These titles tell us something about what Americans expect to find in a career: themselves, in the unlikely form of a marketable commodity. But why should we expect that the inner self waiting to be born corresponds to some paid job or profession? Are we really all in possession of an inner lawyer, an inner beauty products placement specialist, or an inner advertising executive, just waiting for the right job opening? Mightn’t this script for our biographies serve as easily to promote self-limitation or self-betrayal as to further self-actualization?

We spend a great deal of our youth shaping ourselves into the sort of finished product that potential employers will be willing to pay dearly to use. Beginning at a very early age, schooling practices and parental guidance and approval are adjusted, sometimes only semi-consciously, so as to inculcate the personal capacities and temperament demanded by the corporate world. The effort to sculpt oneself for this destiny takes a more concerted form in high school and college. We choose courses of study, and understand the importance of success in these studies, largely with this end in view.

Even those who rebel against these forces of acculturation are deeply shaped by them. What we call “self-destructive” behavior in high school might perhaps be an understandable result of being dispirited by the career prospects that are recommended to us as sufficient motivation for our studies. As a culture we have a curious double-mindedness about such reactions. It is hard to get through high school in the United States without being asked to read J.D. Salinger’s The Catcher in the Rye — the story of one Holden Caulfield’s angst-ridden flight from high school, fueled by a pervasive sense that the adult world is irredeemably phony. The ideal high school student is supposed to find a soul-mate in Holden and write an insightful paper about his telling cultural insights, submitted on time in twelve-point type with double spacing and proper margins and footnotes, so as to ensure the sort of grade that will keep the student on the express train to the adult world whose irredeemable phoniness he has just skillfully diagnosed. [Continue reading…]

Atheists in America

Emma Green writes: In general, Americans do not like atheists. In studies, they say they feel coldly toward nonbelievers; it’s estimated that more than half of the population say they’d be less likely to vote for a presidential candidate who didn’t believe in God.

This kind of deep-seated suspicion is a long-standing tradition in the U.S. In his new book, Village Atheists, the Washington University in St. Louis professor Leigh Eric Schmidt writes about the country’s early “infidels” — one of many fraught terms nonbelievers have used to describe themselves in history — and the conflicts they went through. While the history of atheists is often told as a grand tale of battling ideas, Schmidt set out to tell stories of “mundane materiality,” chronicling the lived experiences of atheists and freethinkers in 19th- and 20th-century America.

His findings both confirm and challenge stereotypes around atheists today. While it’s true that the number of nonbelievers in the United States is growing, it’s still small — roughly 3 percent of U.S. adults self-identify as atheists. And while more and more Americans say they’re not part of any particular religion, they’ve historically been in good company: At the end of the 19th century, Schmidt estimated, around a tenth of Americans may have been unaffiliated with any church or religious institution.

As the visibility and number of American atheists has changed over time, the group has gone through its own struggles over identity. Even today, atheists are significantly more likely to be white, male, and highly educated than the rest of the population, a demographic fact perhaps tied to the long legacy of misogyny and marginalization of women within the movement. At times, nonbelievers have advocated on behalf of minority religious rights and defended immigrants. But they’ve also been among the most vocal American nativists, rallying against Mormons, Catholics, and evangelical Protestants alike.

Schmidt and I discussed the history of atheists in the United States, from the suspicion directed toward them to the suspicions they have cast on others. Our conversation has been edited and condensed for clarity. [Continue reading…]

How U.S. history makes people and places disappear

Aileen McGraw writes: When Lauret Edith Savoy first heard the word “colored” at five years old, she saw herself as exactly that — full of veins as blue as the sky. Not long after, she learned another definition, steeped in racism. “Words full of spit showed that I could be hated for being ‘colored,’” she writes. “By the age of eight I wondered if I should hate in return.” Out of this painful history, Savoy has created something rich and productive — a body of work that examines the complex relationships between land, identity, and history.

Today, Savoy, who is of African American, Euro-American, and Native American descent, works as a geologist, a writer, and a professor of environmental studies at Mount Holyoke College. Her writing — described by New York Magazine’s “Vulture” as John McPhee meets James Baldwin — straddles science and the humanities.

Her most recent book Trace: Memory, History, Race, and the American Landscape explores the tendency of U.S. history to erase or rewrite — both literally and in memory — the stories of marginalized or dispossessed people and places that have been deemed unworthy, unsavory, or shameful. In eight densely researched, ruminative essays, Savoy uses her own family histories to trace moments in American history that have been largely forgotten: for example, the history of segregated Army nurses, like her mother, during World War II, or that of Charles Drew, the African-American physician who developed the first blood bank and was fired for trying to end the federally sanctioned policy of segregating blood. Savoy approaches the “environment” in the broadest sense: “Not just as surroundings; not just as the air, water, and land on which we depend, or that we pollute; not just as global warming — but as sets of circumstances, conditions, and contexts in which we live and die — in which each of us is intimately part.”

Nautilus recently spoke to Savoy over email about this relationship between landscape and identity, the meaning of biodiversity, and the power of the stories we tell. [Continue reading…]

England’s forgotten Muslim history

Jerry Brotton writes: Britain is divided as never before. The country has turned its back on Europe, and its female ruler has her sights set on trade with the East. As much as this sounds like Britain today, it also describes the country in the 16th century, during the golden age of its most famous monarch, Queen Elizabeth I.

One of the more surprising aspects of Elizabethan England is that its foreign and economic policy was driven by a close alliance with the Islamic world, a fact conveniently ignored today by those pushing the populist rhetoric of national sovereignty.

From the moment of her accession to the throne in 1558, Elizabeth began seeking diplomatic, commercial and military ties with Muslim rulers in Iran, Turkey and Morocco — and with good reasons. In 1570, when it became clear that Protestant England would not return to the Catholic faith, the pope excommunicated Elizabeth and called for her to be stripped of her crown. Soon, the might of Catholic Spain was against her, an invasion imminent. English merchants were prohibited from trading with the rich markets of the Spanish Netherlands. Economic and political isolation threatened to destroy the newly Protestant country.

Elizabeth responded by reaching out to the Islamic world. Spain’s only rival was the Ottoman Empire, ruled by Sultan Murad III, which stretched from North Africa through Eastern Europe to the Indian Ocean. The Ottomans had been fighting the Hapsburgs for decades, conquering parts of Hungary. Elizabeth hoped that an alliance with the sultan would provide much needed relief from Spanish military aggression, and enable her merchants to tap into the lucrative markets of the East. For good measure she also reached out to the Ottomans’ rivals, the shah of Persia and the ruler of Morocco. [Continue reading…]

Ethical shifts come with thinking in a different language

Julie Sedivy writes: What defines who we are? Our habits? Our aesthetic tastes? Our memories? If pressed, I would answer that if there is any part of me that sits at my core, that is an essential part of who I am, then surely it must be my moral center, my deep-seated sense of right and wrong.

And yet, like many other people who speak more than one language, I often have the sense that I’m a slightly different person in each of my languages — more assertive in English, more relaxed in French, more sentimental in Czech. Is it possible that, along with these differences, my moral compass also points in somewhat different directions depending on the language I’m using at the time?

Psychologists who study moral judgments have become very interested in this question. Several recent studies have focused on how people think about ethics in a non-native language — as might take place, for example, among a group of delegates at the United Nations using a lingua franca to hash out a resolution. The findings suggest that when people are confronted with moral dilemmas, they do indeed respond differently when considering them in a foreign language than when using their native tongue.

In a 2014 paper led by Albert Costa, volunteers were presented with a moral dilemma known as the “trolley problem”: imagine that a runaway trolley is careening toward a group of five people standing on the tracks, unable to move. You are next to a switch that can shift the trolley to a different set of tracks, thereby sparing the five people, but resulting in the death of one who is standing on the side tracks. Do you pull the switch?

Most people agree that they would. But what if the only way to stop the trolley is by pushing a large stranger off a footbridge into its path? People tend to be very reluctant to say they would do this, even though in both scenarios, one person is sacrificed to save five. But Costa and his colleagues found that posing the dilemma in a language that volunteers had learned as a foreign tongue dramatically increased their stated willingness to shove the sacrificial person off the footbridge, from fewer than 20% of respondents working in their native language to about 50% of those using the foreign one. [Continue reading…]

What to do about Liberia’s island colony of abandoned lab chimps?

By Ben Garrod, Anglia Ruskin University

The story of Liberia’s former research chimpanzees is both well-known and contentious. A non-profit blood bank, the New York Blood Center (NYBC), set up a virus-testing laboratory in the country in 1974, and wild chimpanzees were trapped in their forests and housed within the “Vilab II” facility. They were subjected to medical experiments and were intentionally infected with hepatitis and other pathogens to help develop a range of vaccines.

By 2005, the director of Vilab II, Alfred M Prince, announced that all research had been terminated and that the NYBC had started to make “lifetime care” arrangements for the chimpanzees through an endowment. Over the next ten years, the chimps were “retired” to a series of small islands in a river estuary, receiving food, water and necessary captive care (at a cost of around US$20,000 a month).

Then, in March 2015, the NYBC withdrew its help and financial support and disowned Prince’s commitments. The move left about 85 chimps to fend for themselves. Escape is impossible, as chimpanzees cannot swim well, and many are suspected to have died from a lack of food and water.

Although the Liberian government owns the chimps as a legal technicality, the day-to-day management of the animals and the experiments was carried out by the NYBC, and the technicality in no way absolves it of ultimate responsibility. Yet the NYBC has used it to distance itself from calls to continue funding the chimps’ care. In a statement last year it said it had had “unproductive discussions” with the Liberian government and that it “never had any obligation for care for the chimps, contractual or otherwise”. It has also said that it can “no longer sustain diverting millions of dollars away from our lifesaving mission”.

Understandably, animal rights groups are vocally opposing the blood bank’s actions.

[Read more…]

Nature is being renamed ‘natural capital’ – but is it really the planet that will profit?

By Sian Sullivan, Bath Spa University

The four-yearly World Conservation Congress of the International Union for Conservation of Nature (IUCN) has just taken place in Hawai’i. The congress is the largest global meeting on nature’s conservation. This year, a controversial motion was debated on incorporating the language and mechanisms of “natural capital” into IUCN policy.

But what is “natural capital”? And why use it to refer to “nature”?

Motion 63 on “Natural Capital”, adopted at the congress, proposes the development of a “natural capital charter” as a framework “for the application of natural capital approaches and mechanisms”. In “noting that concepts and language of natural capital are becoming widespread within conservation circles and IUCN”, the motion reflects IUCN’s adoption of “a substantial policy position” on natural capital. Eleven programmed sessions scheduled for the congress included “natural capital” in the title. Many are associated with the recent launch of the global Natural Capital Protocol, which brings together business leaders to create a world where business both enhances and conserves nature.

At least one congress session discussed possible “unforeseen impacts of natural capital on broader issues of equitability, ethics, values, rights and social justice”. This draws on widespread concerns around the metaphor that nature-is-as-capital-is. Critics worry about the emphasis on economic, as opposed to ecological, language and models, and a corresponding marginalisation of non-economic values that elicit care for the natural world.

[Read more…]

Sugar industry funded research as early as the 1960s to cover up health hazards, report says

The Associated Press reports: The sugar industry began funding research that cast doubt on sugar’s role in heart disease — in part by pointing the finger at fat — as early as the 1960s, according to an analysis of newly uncovered documents.

The analysis published Monday in the journal JAMA Internal Medicine is based on correspondence between a sugar trade group and researchers at Harvard University, and is the latest example showing how food and beverage makers attempt to shape public understanding of nutrition.

In 1964, the group now known as the Sugar Assn. internally discussed a campaign to address “negative attitudes toward sugar” after studies began emerging linking sugar with heart disease, according to documents dug up from public archives. The following year the group approved “Project 226,” which entailed paying Harvard researchers today’s equivalent of $48,900 for an article reviewing the scientific literature, supplying materials they wanted reviewed, and receiving drafts of the article.

The resulting article published in 1967 concluded there was “no doubt” that reducing cholesterol and saturated fat was the only dietary intervention needed to prevent heart disease. The researchers overstated the consistency of the literature on fat and cholesterol while downplaying studies on sugar, according to the analysis. [Continue reading…]
