Author Archives: Attention to the Unseen

NASA’s overlooked duty to look inward

Elisa Gabbert writes: In 1942, not long after the attack on Pearl Harbor, the poet Archibald MacLeish wrote an essay called “The Image of Victory,” in which he asked what winning the Second World War, the “airman’s war,” would mean for posterity. MacLeish believed that pilots could do more than bring victory; by literally rising above the conflicts on the ground, they could also reshape our very understanding of the planet. “Never in all their history have men been able truly to conceive of the world as one: a single sphere, a globe, having the qualities of a globe, a round earth in which all the directions eventually meet, in which there is no center because every point, or none, is center — an equal earth which all men occupy as equals,” he wrote. The airplane, he felt, was both an engine of perspective and a symbol of unity.

MacLeish could not, perhaps, have imagined the sight of a truly whole Earth. But, twenty-six years after his essay appeared, the three-man crew of Apollo 8 reached the highest vantage point in history, becoming the first humans to witness Earth rising over the surface of the moon. The most iconic photograph of our planet, popularly known as “The Blue Marble,” was taken by their successors on Apollo 17, in 1972. In it, Earth appears in crisp focus, brightly lit, as in studio portraiture, against a black backdrop. The picture clicked with the cultural moment. As the neuroscientist Gregory Petsko observed, in 2011, in an essay on the consciousness-shifting power of images, it became a symbol of the budding environmentalist movement. “Our whole planet suddenly, in this image, seemed tiny, vulnerable, and incredibly lonely against the vast blackness of the cosmos,” Petsko wrote. “Regional conflict and petty differences could be dismissed as trivial compared with environmental dangers that threatened all of humanity.” Apollo 17 marked America’s last mission to the moon, and the last time that humans left Earth’s orbit.

It was always part of NASA’s mission to look inward, not just outward. The National Aeronautics and Space Act of 1958, which established the agency, claimed as its first objective “the expansion of human knowledge of phenomena in the atmosphere and space.” NASA’s early weather satellites were followed, in the seventies and eighties, by a slew of more advanced instruments, which supplied data on the ozone layer, crops and vegetation, and even insect infestations. They allowed scientists to recognize and measure the symptoms of climate change, and their decades’ worth of data helped the Intergovernmental Panel on Climate Change conclude, in 2007, that global warming is “very likely” anthropogenic. According to a report released last month by NASA’s inspector general, the agency’s Earth Science Division helps commercial, government, and military organizations around the world locate areas at risk for storm-related flooding, predict malaria outbreaks, develop wildfire models, assess air quality, identify remote volcanoes whose toxic emissions contribute to acid rain, and determine the precise length of a day. [Continue reading…]


How a guy from a Montana trailer park overturned 150 years of biology

Ed Yong writes: In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

Throughout his undergraduate and postgraduate work, Spribille became an expert on the organisms that had grabbed his attention during his time in the Montana forests — lichens.

You’ve seen lichens before, but unlike Spribille, you may have ignored them. They grow on logs, cling to bark, smother stones. At first glance, they look messy and undeserving of attention. On closer inspection, they are astonishingly beautiful. They can look like flecks of peeling paint, or coralline branches, or dustings of powder, or lettuce-like fronds, or wriggling worms, or cups that a pixie might drink from. They’re also extremely tough. They grow in the most inhospitable parts of the planet, where no plant or animal can survive.

Lichens have an important place in biology. In the 1860s, scientists thought that they were plants. But in 1868, a Swiss botanist named Simon Schwendener revealed that they’re composite organisms, consisting of fungi that live in partnership with microscopic algae. This “dual hypothesis” was met with indignation: it went against the impetus to put living things in clear and discrete buckets. The backlash only collapsed when Schwendener and others, with good microscopes and careful hands, managed to tease the two partners apart.

Schwendener wrongly thought that the fungus had “enslaved” the alga, but others showed that the two cooperate. The alga uses sunlight to make nutrients for the fungus, while the fungus provides minerals, water, and shelter. This kind of mutually beneficial relationship was unheard of, and required a new word. Two Germans, Albert Frank and Anton de Bary, provided the perfect one — symbiosis, from the Greek for ‘together’ and ‘living’. [Continue reading…]


Arabic translators did far more than just preserve Greek philosophy

By Peter Adamson, Aeon, November 4, 2016

In European antiquity, philosophers largely wrote in Greek. Even after the Roman conquest of the Mediterranean and the demise of paganism, philosophy was strongly associated with Hellenic culture. The leading thinkers of the Roman world, such as Cicero and Seneca, were steeped in Greek literature; Cicero even went to Athens to pay homage to the home of his philosophical heroes. Tellingly, the emperor Marcus Aurelius went so far as to write his Meditations in Greek. Cicero, and later Boethius, did attempt to initiate a philosophical tradition in Latin. But during the early Middle Ages, most of Greek thought was accessible in Latin only partially and indirectly.

Elsewhere, the situation was better. In the eastern part of the Roman Empire, the Greek-speaking Byzantines could continue to read Plato and Aristotle in the original. And philosophers in the Islamic world enjoyed an extraordinary degree of access to the Hellenic intellectual heritage. In 10th-century Baghdad, readers of Arabic had about the same degree of access to Aristotle that readers of English do today.

This was thanks to a well-funded translation movement that unfolded during the Abbasid caliphate, beginning in the second half of the eighth century. Sponsored at the highest levels, even by the caliph and his family, this movement sought to import Greek philosophy and science into Islamic culture. The Abbasid empire had the resources to do so, not just financially but also culturally. From late antiquity to the rise of Islam, Greek had survived as a language of intellectual activity among Christians, especially in Syria. So when Muslim aristocrats decided to have Greek science and philosophy translated into Arabic, it was to Christians that they turned. Sometimes, a Greek work might even be translated first into Syriac, and only then into Arabic. It was an immense challenge. Greek is not a Semitic language, so the translators were moving from one language group to another: more like translating Finnish into English than Latin into English. And there was, at first, no established terminology for expressing philosophical ideas in Arabic.

[Continue reading…]


The case against sugar

Gary Taubes writes: ‘Virtually zero.’ That’s a reasonable estimate of the probability that public health authorities in the foreseeable future will successfully curb the worldwide epidemics of obesity and diabetes, at least according to Margaret Chan, the director general of the World Health Organization (WHO) – a person who should know. Virtually zero is the likelihood, Chan said at the National Academy of Medicine’s annual meeting in October, that she and her many colleagues worldwide will successfully prevent ‘a bad situation’ from ‘getting much worse’. That Chan also described these epidemics as a ‘slow-motion disaster’ suggests the critical nature of the problem: ‘population-wide’ explosions in the prevalence of obesity along with increases in the occurrence of diabetes that frankly strain the imagination: a disease that leads to blindness, kidney failure, amputation, heart disease and premature death, and that was virtually non-existent in hospital inpatient records from the mid-19th century, now afflicts one in 11 Americans; in some populations, as many as one in two adults are diabetic.

In the midst of such a public health crisis, the obvious question to ask is why. Many reasons can be imagined for any public health failure, but we have no precedents for a failure of this magnitude. As such, the simplest explanation is that we’re not targeting the right agent of disease; that our understanding of the aetiology of both obesity and diabetes is somehow flawed, perhaps tragically so.

Researchers in harder sciences have a name for such situations: ‘pathological science’, defined by the Nobel Laureate chemist Irving Langmuir in 1953 as ‘the science of things that aren’t so’. Where experimental investigation is prohibitively expensive or impossible to do, mistaken assumptions, misconceived paradigms and pathological science can survive indefinitely. Whether this is the case with the current epidemics is an all-too-regrettable possibility: perhaps we’ve simply misconceived the reality of the link between diet, lifestyle and the related disorders of obesity and diabetes? As the Oxford scholar Robert Burton suggested in The Anatomy of Melancholy (1621), in cases in which the cures are ‘imperfect, lame, and to no purpose’ it’s quite possible that the causes are misunderstood. [Continue reading…]


Humans have been altering Earth for millennia, but only now are we wise to what we’re doing

David Grinspoon writes: As a planetary astrobiologist, I am focused on the major transitions in planetary evolution and the evolving relationship between planets and life. The scientific community is converging on the idea that we have entered a new epoch of Earth history, one in which the net activity of humans has become an agent of global change as powerful as the great forces of nature that shape continents and propel the evolution of species. This concept has garnered a lot of attention, and justly so. Thinking about the new epoch – often called the Anthropocene, or the age of humanity – challenges us to look at ourselves in the mirror of deep time, measured not in centuries or even in millennia, but over millions and billions of years. And yet much of the recent discussion and debate over the Anthropocene still does not come to terms with its full meaning and importance.

Various markers have been proposed for the starting date of the Anthropocene, such as the rise in CO2, isotopes from nuclear tests, the ‘Columbian exchange’ of species between hemispheres when Europeans colonised the Americas, or more ancient human modifications of the landscape or climate. The question in play here is: when did our world gain a quality that is uniquely human? Many species have had a major influence on the globe, but they don’t each get their own planetary transition in the geologic timescale. When did humans begin changing things in a way that no other species has ever changed Earth before? Making massive changes in landscapes is not unique to us. Beavers do plenty of that, for example, when they build dams, alter streams, cut down forests and create new meadows. Even changing global climate and initiating mass extinction is not a human first. Photosynthetic bacteria did that some 2.5 billion years ago.

What distinguishes humans from other world-changing organisms must be related to our great cleverness and adaptability; the power that comes from communicating, planning and working in social groups; transmitting knowledge from one generation to the next; and applying these skills toward altering our surroundings and expanding our habitable domains. However, people have been engaged in these activities for tens of thousands of years, and have produced many of the environmental modifications that have been proposed as markers of the Anthropocene’s beginning. Therefore, those definitions strike me as incomplete. Until now, the people causing the disturbances had no way of recognising or even conceiving of a global change. Yes, humans have been altering our planet for millennia, but there is something going on now that was not happening when we started doing all that world-changing. [Continue reading…]
