Author Archives: Attention to the Unseen

China is at the forefront of manipulating DNA to create a new class of superhumans

G. Owen Schaefer writes: Would you want to alter your future children’s genes to make them smarter, stronger, or better looking? As the state of science brings prospects like these closer to reality, an international debate has been raging over the ethics of enhancing human capacities with biotechnologies such as so-called smart pills, brain implants, and gene editing. This discussion has only intensified in the past year with the advent of the CRISPR-Cas9 gene editing tool, which raises the specter of tinkering with our DNA to improve traits like intelligence, athleticism, and even moral reasoning.

So are we on the brink of a brave new world of genetically enhanced humanity? Perhaps. And there’s an interesting wrinkle: It’s reasonable to believe that any seismic shift toward genetic enhancement will not be centered in Western countries like the US or the UK, where many modern technologies are pioneered. Instead, genetic enhancement is more likely to emerge out of China.

Numerous surveys of Western populations have found significant opposition to many forms of human enhancement. For example, a recent Pew study of 4,726 Americans found that most would not want to use a brain chip to improve their memory, and a plurality views such interventions as morally unacceptable. [Continue reading…]


A unified theory of evolution requires input from Darwin and Lamarck


Michael Skinner writes: The unifying theme for much of modern biology is based on Charles Darwin’s theory of evolution, the process of natural selection by which nature selects the fittest, best-adapted organisms to reproduce, multiply and survive. The process is also called adaptation, and traits most likely to help an individual survive are considered adaptive. As organisms change and new variants thrive, species emerge and evolve. In the 1850s, when Darwin described this engine of natural selection, the underlying molecular mechanisms were unknown. But over the past century, advances in genetics and molecular biology have outlined a modern, neo-Darwinian theory of how evolution works: DNA sequences randomly mutate, and organisms with the specific sequences best adapted to the environment multiply and prevail. Those are the species that dominate a niche, until the environment changes and the engine of evolution fires up again.

But this explanation for evolution turns out to be incomplete, suggesting that other molecular mechanisms also play a role in how species evolve. One problem with Darwin’s theory is that, while species do evolve more adaptive traits (called phenotypes by biologists), the rate of random DNA sequence mutation turns out to be too slow to explain many of the changes observed. Scientists, well aware of the issue, have proposed a variety of genetic mechanisms to compensate: genetic drift, in which small groups of individuals undergo dramatic genetic change; or epistasis, in which one set of genes suppresses another, to name just two.

Yet even with such mechanisms in play, genetic mutation rates for complex organisms such as humans are dramatically lower than the frequency of change for a host of traits, from adjustments in metabolism to resistance to disease. The rapid emergence of trait variety is difficult to explain just through classic genetics and neo-Darwinian theory. To quote the prominent evolutionary biologist Jonathan B L Bard, who was paraphrasing T S Eliot: ‘Between the phenotype and genotype falls the shadow.’

And the problems with Darwin’s theory extend out of evolutionary science into other areas of biology and biomedicine. For instance, if genetic inheritance determines our traits, then why do identical twins with the same genes generally develop different diseases? And why does only a small percentage (often less than 1 per cent) of people with many specific diseases share a common genetic mutation? If the rate of mutation is random and steady, then why have many diseases increased more than 10-fold in frequency in only a couple of decades? How is it that hundreds of environmental contaminants can alter disease onset, but not DNA sequences? In evolution and biomedicine, the rate of phenotypic trait divergence is far more rapid than the rate of genetic variation and mutation – but why?

Part of the explanation can be found in some concepts that Jean-Baptiste Lamarck proposed 50 years before Darwin published his work. Lamarck’s theory, long relegated to the dustbin of science, held, among other things, ‘that the environment can directly alter traits, which are then inherited by generations to come’. [Continue reading…]


The moment when science went modern


Lorraine Daston writes: The history of science is punctuated by not one, not two, but three modernities: the first, in the seventeenth century, known as “the Scientific Revolution”; the second, circa 1800, often referred to as “the second Scientific Revolution”; and the third, in the first quarter of the twentieth century, when relativity theory and quantum mechanics not only overturned the achievements of Galileo and Newton but also challenged our deepest intuitions about space, time, and causation.

Each of these moments transformed science, both as a body of knowledge and as a social and political force. The first modernity of the seventeenth century displaced the Earth from the center of the cosmos, showered Europeans with new discoveries, from new continents to new planets, created new forms of inquiry such as field observation and the laboratory experiment, added prediction to explanation as an ideal toward which science should strive, and unified the physics of heaven and earth in Newton’s magisterial synthesis that served as the inspiration for the political reformers and revolutionaries of the Enlightenment. The second modernity of the early nineteenth century unified light, heat, electricity, magnetism, and gravitation into the single, fungible currency of energy, put that energy to work by creating the first science-based technologies to become gigantic industries (e.g., the manufacture of dyestuffs from coal tar derivatives), turned science into a salaried profession and allied it with state power in every realm, from combating epidemics to waging wars. The third modernity, of the early twentieth century, toppled the certainties of Newton and Kant, inspired the avant-garde in the arts, and paved the way for what were probably the two most politically consequential inventions of the last hundred years: the mass media and the atomic bomb.

The aftershocks of all three of these earthquakes of modernity are still reverberating today: in heated debates, from Saudi Arabia to Sri Lanka to Senegal, about the significance of the Enlightenment for human rights and intellectual freedom; in the assessment of how science-driven technology and industrialization may have altered the climate of the entire planet; in anxious negotiations about nuclear disarmament and utopian visions of a global polity linked by the worldwide Net. No one denies the world-shaking and world-making significance of any of these three moments of scientific modernity.

Yet from the perspective of the scientists themselves, the experience of modernity coincides with none of these seismic episodes. The most unsettling shift in scientific self-understanding — about what science was and where it was going — began in the middle decades of the nineteenth century, reaching its climax circa 1900. It was around that time that scientists began to wonder uneasily about whether scientific progress was compatible with scientific truth. If advances in knowledge were never-ending, could any scientific theory or empirical result count as real knowledge — true forever and always? Or was science, like the monarchies of Europe’s anciens régimes and the boundaries of its states and principalities, doomed to perpetual revision and revolution? [Continue reading…]


Digging our own graves in deep time

By David Farrier, Aeon, October 31, 2016

Late one summer night in 1949, the British archaeologist Jacquetta Hawkes went out into her small back garden in north London, and lay down. She sensed the bedrock covered by its thin layer of soil, and felt the hard ground pressing her flesh against her bones. Shimmering through the leaves and out beyond the black lines of her neighbours’ chimney pots were the stars, beacons ‘whose light left them long before there were eyes on this planet to receive it’, as she put it in A Land (1951), her classic book of imaginative nature writing.

We are accustomed to the idea of geology and astronomy speaking the secrets of ‘deep time’, the immense arc of non-human history that shaped the world as we perceive it. Hawkes’s lyrical meditation mingles the intimate and the eternal, the biological and the inanimate, the domestic with a sense of deep time that is very much of its time. The state of the topsoil was a matter of genuine concern in a country wearied by wartime rationing, while land itself rises into focus just as Britain is rethinking its place in the world. But in lying down in her garden, Hawkes also lies on the far side of a fundamental boundary. A Land was written at the cusp of the Holocene; we, on the other hand, read it in the Anthropocene.

The Anthropocene, or era of the human, denotes how industrial civilisation has changed the Earth in ways that are comparable with deep-time processes. The planet’s carbon and nitrogen cycles, ocean chemistry and biodiversity – each one the product of millions of years of slow evolution – have been radically and permanently disrupted by human activity. The development of agriculture 10,000 years ago, and the Industrial Revolution in the middle of the 19th century, have both been proposed as start dates for the Anthropocene. But a consensus has gathered around the Great Acceleration – the sudden and dramatic jump in consumption that began around 1950, followed by a huge rise in global population, an explosion in the use of plastics, and the collapse of agricultural diversity.

[Continue reading…]


To identify risky drivers, insurer will track language use in social media

Financial Times reports: UK-based insurer Admiral has come up with a way to crunch through social media posts to work out who deserves a lower premium. People who seem cautious and deliberate in their choice of words are likely to pay a lot less than those with overconfident remarks. [Continue reading…]
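
The report gives no detail on how Admiral’s analysis works. As a purely illustrative sketch of the kind of lexical scoring the article alludes to, the snippet below counts “cautious” versus “overconfident” words across a set of posts; the word lists, the scoring rule, and the language_risk_score function are hypothetical assumptions, not Admiral’s actual method.

    # Hypothetical sketch only: word lists and scoring rule are illustrative
    # assumptions, not Admiral's actual method.

    CAUTIOUS_WORDS = {"maybe", "perhaps", "probably", "plan", "planned", "carefully"}
    OVERCONFIDENT_WORDS = {"always", "never", "definitely", "guaranteed", "100%"}

    def language_risk_score(posts):
        """Crude lexical score: 0.0 reads as cautious, 1.0 as overconfident."""
        cautious = overconfident = 0
        for post in posts:
            for word in post.lower().split():
                word = word.strip(".,!?'\"")
                if word in CAUTIOUS_WORDS:
                    cautious += 1
                elif word in OVERCONFIDENT_WORDS:
                    overconfident += 1
        total = cautious + overconfident
        return 0.5 if total == 0 else overconfident / total

    # Example: the first post scores as cautious (0.0), the second as overconfident (1.0).
    print(language_risk_score(["We will probably leave early and plan the route carefully."]))
    print(language_risk_score(["I always drive fast and never get caught."]))

A real system would presumably use far richer features than raw word counts, but the basic idea is the same: map language signals to a score, then feed that score into pricing.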
