Category Archives: Attention to the Unseen

Spontaneity is at the heart of science

Henry Cowles writes: There is a theory in psychology called the theory theory. It’s a theory about theories. While this might sound obvious, the theory theory leads to counterintuitive conclusions. A quarter-century ago, psychologists began to point out important links between the development of scientific theories and how everyday thinking, including children’s thinking, works. According to theory theorists, a child learns by constructing a theory of the world and testing it against experience. In this sense, children are little scientists – they hypothesise on the basis of observations, test their hypotheses experimentally, and then revise their views in light of the evidence they gather.

According to Alison Gopnik, a theory theorist at the University of California, Berkeley, the analogy works both ways. It’s not just that ‘children are little scientists’, she wrote in her paper ‘The Scientist as Child’ (1996), ‘but that scientists are big children.’ Depending on where you look, you can see the scientific method in a child, or spot the inner child in a scientist. Either way, the theory theory makes it easy to see connections between elementary learning and scientific theorising.

This should be pretty surprising. After all, scientists go through a lot of training in order to think the way they do. Their results are exact; their methods exacting. Most of us share the sense that scientific thinking is difficult, even for scientists. This perceived difficulty has bolstered (at least until recently) the collective respect for scientific expertise on which the support of cutting-edge research depends. It’s also what gives the theory theory its powerful punch. If science is so hard, how can children – and, some theory theorists argue, even infants – think like scientists in any meaningful sense? Indeed, in the age of what Erik M. Conway and Naomi Oreskes call “the merchants of doubt” (not to say in the age of Trump), isn’t it dangerous to suggest that science is a matter of child’s play?

To gain purchase on this question, let’s take a step back. Claims that children are scientists rest on a certain idea about what science is. For theory theorists – and for many of the rest of us – science is about producing theories. How we do that is often represented as a short list of steps, such as ‘observe’, ‘hypothesise’, and ‘test’, steps that have been emblazoned on posters and recited in debates for the past century. But where did this idea that science is a set of steps – a method – come from? As it turns out, we don’t need to go back to Isaac Newton or the Scientific Revolution to find the history of ‘the scientific method’ in this sense. The image of science that most of us hold, even most scientists, comes from a surprising place: modern child psychology. The scientific method as we know it today comes from psychological studies of children only a century ago. [Continue reading…]

Roads have sliced the world into 600,000 pieces

Nathaniel Scharping writes: Ever since our ancestors cut rough paths through the wilderness, humanity has been laying down trails. From footpaths to highways, a global network of roads binds communities and facilitates the exchange of goods and ideas. But there is a flip side to this creeping tangle of pathways: The roads that bring us closer also serve to divide ecosystems into smaller parcels, turning vast expanses into a jigsaw of human mobility.

In a study published in Science, an international team of researchers attempted to quantify the extent to which roads have sliced up the globe. They used data from OpenStreetMap, a crowd-sourced mapping project, to chart how much land is covered by roads. For the purposes of their project, they defined a roadway as everything within a kilometer of the physical road itself (studies have shown measurable impacts on the environment extending out at least that far).

They estimated that roughly 20 percent of land is occupied by roads, not including Greenland and Antarctica. Although that leaves 80 percent as open space, this land is far from whole. Transected by highways and streets, the road-free areas are cut up into some 600,000 individual parcels. Half of these are less than a square mile, while only 7 percent span more than 60 square miles. The true impact of roads seems to be the gradual tessellation of once-cohesive landscapes. [Continue reading…]
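The computation behind numbers like these is conceptually simple: buffer every road by a kilometer, merge the buffers, subtract them from the land, and count the pieces that remain. Here is a minimal sketch of that procedure in Python using the shapely library, with a few invented road geometries standing in for the study's OpenStreetMap data:

```python
# Toy fragmentation analysis: buffer roads by 1 km, merge the buffers,
# subtract them from a study area, and count the road-free parcels left.
# Geometries are invented for illustration; the study used OpenStreetMap.
from shapely.geometry import LineString, box
from shapely.ops import unary_union

# A 100 km x 100 km study area crossed by three hypothetical roads
# (working in kilometre units throughout).
study_area = box(0, 0, 100, 100)
roads = [
    LineString([(0, 30), (100, 35)]),
    LineString([(40, 0), (45, 100)]),
    LineString([(0, 80), (100, 70)]),
]

# "Roadway" = everything within 1 km of the physical road.
road_zone = unary_union([road.buffer(1.0) for road in roads])
road_free = study_area.difference(road_zone)

# The difference may be a single polygon or a multipolygon of parcels.
parcels = list(road_free.geoms) if hasattr(road_free, "geoms") else [road_free]

print(f"{len(parcels)} road-free parcels")
print(f"{road_zone.intersection(study_area).area / study_area.area:.1%} "
      f"of the study area lies within 1 km of a road")
```

The same buffer-and-difference logic, scaled up to a global road network, is what produces parcel counts like the study's 600,000, though the real analysis must contend with map projections, data quality, and sheer scale.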

How a guy from a Montana trailer park overturned 150 years of biology

Ed Yong writes: In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

Throughout his undergraduate and postgraduate work, Spribille became an expert on the organisms that had grabbed his attention during his time in the Montana forests — lichens.

You’ve seen lichens before, but unlike Spribille, you may have ignored them. They grow on logs, cling to bark, smother stones. At first glance, they look messy and undeserving of attention. On closer inspection, they are astonishingly beautiful. They can look like flecks of peeling paint, or coralline branches, or dustings of powder, or lettuce-like fronds, or wriggling worms, or cups that a pixie might drink from. They’re also extremely tough. They grow in the most inhospitable parts of the planet, where no plant or animal can survive.

Lichens have an important place in biology. In the 1860s, scientists thought that they were plants. But in 1868, a Swiss botanist named Simon Schwendener revealed that they’re composite organisms, consisting of fungi that live in partnership with microscopic algae. This “dual hypothesis” was met with indignation: it went against the impetus to put living things in clear and discrete buckets. The backlash only collapsed when Schwendener and others, with good microscopes and careful hands, managed to tease the two partners apart.

Schwendener wrongly thought that the fungus had “enslaved” the alga, but others showed that the two cooperate. The alga uses sunlight to make nutrients for the fungus, while the fungus provides minerals, water, and shelter. This kind of mutually beneficial relationship was unheard of, and required a new word. Two Germans, Albert Frank and Anton de Bary, provided the perfect one — symbiosis, from the Greek for ‘together’ and ‘living’. [Continue reading…]

Humans have been altering Earth for millennia, but only now are we wise to what we’re doing

David Grinspoon writes: As a planetary astrobiologist, I am focused on the major transitions in planetary evolution and the evolving relationship between planets and life. The scientific community is converging on the idea that we have entered a new epoch of Earth history, one in which the net activity of humans has become an agent of global change as powerful as the great forces of nature that shape continents and propel the evolution of species. This concept has garnered a lot of attention, and justly so. Thinking about the new epoch – often called the Anthropocene, or the age of humanity – challenges us to look at ourselves in the mirror of deep time, measured not in centuries or even in millennia, but over millions and billions of years. And yet much of the recent discussion and debate over the Anthropocene still does not come to terms with its full meaning and importance.

Various markers have been proposed for the starting date of the Anthropocene, such as the rise in CO2, isotopes from nuclear tests, the ‘Columbian exchange’ of species between hemispheres when Europeans colonised the Americas, or more ancient human modifications of the landscape or climate. The question in play here is: when did our world gain a quality that is uniquely human? Many species have had a major influence on the globe, but they don’t each get their own planetary transition in the geologic timescale. When did humans begin changing things in a way that no other species has ever changed Earth before? Making massive changes in landscapes is not unique to us. Beavers do plenty of that, for example, when they build dams, alter streams, cut down forests and create new meadows. Even changing global climate and initiating mass extinction is not a human first. Photosynthetic bacteria did that some 2.5 billion years ago.

What distinguishes humans from other world-changing organisms must be related to our great cleverness and adaptability; the power that comes from communicating, planning and working in social groups; transmitting knowledge from one generation to the next; and applying these skills toward altering our surroundings and expanding our habitable domains. However, people have been engaged in these activities for tens of thousands of years, and have produced many different environmental modifications proposed as markers of the Anthropocene’s beginning. Therefore, those definitions strike me as incomplete. Until now, the people causing the disturbances had no way of recognising or even conceiving of a global change. Yes, humans have been altering our planet for millennia, but there is something going on now that was not happening when we started doing all that world-changing. [Continue reading…]

Freeman Dyson on working with the greatest physicists of the 20th century

Steve Paulson writes: One gets the sense that Freeman Dyson has seen everything. It’s not just that at 92 he’s had a front row seat on scientific breakthroughs for the past century, or that he’s been friends and colleagues with many of the giants of 20th-century physics, from Hans Bethe and Wolfgang Pauli to Robert Oppenheimer and Richard Feynman. Dyson is one of the great sages of the science world. If you want to get a sense of where science has come from and where it might be headed, Dyson is your man.

Dyson grew up in England with a gift for numbers and calculating. During World War II, he worked with the British Royal Air Force to pinpoint bombing targets in Germany. After the war, he moved to the United States where he got to know many of the physicists who’d built the atomic bomb. Like a lot of scientists from that era, excitement over the bomb helped launch his career in physics, and later he dreamed of building a fleet of spaceships that would travel around the solar system, powered by nuclear bombs. Perhaps it’s no accident that Dyson became an outspoken critic of nuclear weapons during the Cold War.

For more than six decades, Princeton's Institute for Advanced Study has been his intellectual home. Dyson has described himself as a fox rather than a hedgehog. He says scientists who jump from one project to the next have more fun. Though no longer an active scientist, he continues to track developments in science and technology. Dyson seems to be happy living in a universe filled with unanswered questions, and he likes the fact that physics has so far failed to unify the classical world of stars and the quantum world of atoms.

When I approached Dyson about an interview on the idea of the heroic in science, he responded, “I prefer telling stories to talking philosophy.” In the end, I got both stories and big ideas. Dyson isn’t shy about making sweeping pronouncements—whether on the archaic requirements of the Ph.D. system or the pitfalls of Big Science—but his manner is understated and his dry sense of humor is always just below the surface. [Continue reading…]

Why most planets will either be lush or dead

David Grinspoon writes: Can a planet be alive? Lynn Margulis, a giant of late 20th-century biology, who had an incandescent intellect that veered toward the unorthodox, thought so. She and chemist James Lovelock together theorized that life must be a planet-altering phenomenon and the distinction between the “living” and “nonliving” parts of Earth is not as clear-cut as we think. Many members of the scientific community derided their theory, called the Gaia hypothesis, as pseudoscience, and questioned their scientific integrity. But now Margulis and Lovelock may have their revenge. Recent scientific discoveries are giving us reason to take this hypothesis more seriously. At its core is an insight about the relationship between planets and life that has changed our understanding of both, and is shaping how we look for life on other worlds.

Studying Earth's global biosphere together, Margulis and Lovelock realized that it has some of the properties of a life form. It seems to display "homeostasis," or self-regulation. Many of Earth's life-sustaining qualities exhibit remarkable stability. The temperature range of the climate; the oxygen content of the atmosphere; the pH, chemistry, and salinity of the ocean—all these are biologically mediated. All have, for hundreds of millions of years, stayed within a range where life can thrive. Lovelock and Margulis surmised that the totality of life is interacting with its environments in ways that regulate these global qualities. They recognized that Earth is, in a sense, a living organism. Lovelock named this creature Gaia.
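The canonical toy model of this kind of planetary self-regulation is Daisyworld, which Lovelock published with Andrew Watson in 1983. The sketch below is a heavily simplified Python rendering of the idea, with illustrative parameters rather than the published ones: dark daisies warm their surroundings, pale daisies cool them, each grows best near an optimum temperature, and as the model sun brightens the mix shifts so the planet stays far more stable than a dead world would:

```python
# Simplified Daisyworld (after Watson & Lovelock, 1983; parameters illustrative).
SIGMA = 5.67e-8                               # Stefan-Boltzmann constant
A_BARE, A_WHITE, A_BLACK = 0.5, 0.75, 0.25    # albedos of bare ground and daisies
T_OPT, WIDTH, DEATH = 295.5, 17.5, 0.3        # growth optimum (K), tolerance, death rate

def growth(temp):
    """Parabolic growth rate, zero outside T_OPT +/- WIDTH."""
    return max(0.0, 1 - ((temp - T_OPT) / WIDTH) ** 2)

def simulate(luminosity, flux=917.0, q=20.0, steps=4000, dt=0.01):
    """Return (temperature of the living planet, temperature of a dead one)."""
    white = black = 0.01
    for _ in range(steps):
        bare = max(0.0, 1 - white - black)
        albedo = bare * A_BARE + white * A_WHITE + black * A_BLACK
        t_planet = (luminosity * flux * (1 - albedo) / SIGMA) ** 0.25
        # Daisies run locally warmer or cooler than the planetary mean.
        t_white = t_planet + q * (albedo - A_WHITE)
        t_black = t_planet + q * (albedo - A_BLACK)
        white += dt * white * (bare * growth(t_white) - DEATH)
        black += dt * black * (bare * growth(t_black) - DEATH)
        white, black = max(white, 0.001), max(black, 0.001)  # keep a seed stock
    dead = (luminosity * flux * (1 - A_BARE) / SIGMA) ** 0.25
    return t_planet, dead

for lum in (0.8, 1.0, 1.2, 1.4):
    alive, dead = simulate(lum)
    print(f"luminosity {lum:.1f}: living planet {alive:5.1f} K, dead planet {dead:5.1f} K")
```

Over a range of luminosities the biota hold the planet near the daisies' optimum; push the model sun too hard and regulation collapses, which is itself part of the Daisyworld lesson.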

Margulis and Lovelock showed that the Darwinian picture of biological evolution is incomplete. Darwin identified the mechanism by which life adapts due to changes in the environment, and thus allowed us to see that all life on Earth is a continuum, a proliferation, a genetic diaspora from a common root. In the Darwinian view, Earth was essentially a stage with a series of changing backdrops to which life had to adjust. Yet, what or who was changing the sets? Margulis and Lovelock proposed that the drama of life does not unfold on the stage of a dead Earth, but that, rather, the stage itself is animated, part of a larger living entity, Gaia, composed of the biosphere together with the “nonliving” components that shape, respond to, and cycle through the biota of Earth. Yes, life adapts to environmental change, shaping itself through natural selection. Yet life also pushes back and changes the environment, alters the planet. This is now as obvious as the air you are breathing, which has been oxygenated by life. So evolution is not a series of adaptations to inanimate events, but a system of feedbacks, an exchange. Life has not simply molded itself to the shifting contours of a dynamic Earth. Rather, life and Earth have shaped each other as they’ve co-evolved. When you start looking at the planet in this way, then you see coral reefs, limestone cliffs, deltas, bogs, and islands of bat guano as parts of this larger animated entity. You realize that the entire skin of Earth, and its depths as well, are indeed alive. [Continue reading…]

Nobody is home

Charles Leadbeater writes: Heidegger detested René Descartes's dictum 'I think, therefore I am', which located the search for identity in our brains. There, it was secured by a rational process of thought, detached from a physical world that presented itself to the knowing subject as a puzzle to be solved. Descartes's ideas launched a great inward turn in philosophy, with the subject at the centre of the drama confronting the objective world about which he tries to gain knowledge.

Had Heidegger ever come up with a saying to sum up his philosophy it would have been: ‘I dwell, therefore I am.’ For him, identity is bound up with being in the world, which in turn means having a place in it. We don’t live in the abstract space favoured by philosophers, but in a particular place, with specific features and history. We arrive already entangled with the world, not detached from it. Our identity is not secured just in our heads but through our bodies too, how we feel and how we are moved, literally and emotionally.

Instead of presenting it as a puzzle to be solved, Heidegger’s world is one we should immerse ourselves in and care for: it is part of the larger ‘being’ where we all belong. As [Jeff] Malpas puts it, Heidegger argues that we should release ourselves to the world, to find our part in its larger ebb and flow, rather than seek to detach ourselves from it in order to dominate it. [Continue reading…]

New evidence that Lucy, our most famous ancestor, had superstrong arms

The Washington Post reports: In Ethiopia, she is known as “Dinkinesh” — Amharic for “you are marvelous.” It’s an apt name for one of the most complete ancient hominid skeletons ever found, an assemblage of fossilized bones that has given scientists unprecedented insight into the history of humanity.

You probably know her as Lucy.

Discovered in 1974, wedged into a gully in Ethiopia’s Awash Valley, the delicate, diminutive skeleton is both uncannily familiar and alluringly strange. In some ways, the 3.2-million-year-old Australopithecus was a lot like us; her hips, feet and long legs were clearly made for walking. But she also had long arms and dexterous curved fingers, much like modern apes that still swing from the trees.

So, for decades scientists have wondered: Who exactly was Lucy? Was she lumbering and land-bound, like us modern humans? Or did she retain some of the ancient climbing abilities that made her ancestors — and our own — champions of the treetops?

A new study suggests she was a little of both: Though her lower limbs were adapted for bipedalism, she had exceptionally strong arm bones that allowed her to haul herself up branches, researchers reported Wednesday in the journal PLoS One. [Continue reading…]

Dino-killing asteroid may have punctured Earth’s crust

Live Science reports: After analyzing the crater from the cosmic impact that ended the age of dinosaurs, scientists now say the object that smacked into the planet may have punched nearly all the way through Earth’s crust, according to a new study.

The finding could shed light on how impacts can reshape the faces of planets and how such collisions can generate new habitats for life, the researchers said.

Asteroids and comets occasionally pelt Earth’s surface. Still, for the most part, changes to the planet’s surface result largely from erosion due to rain and wind, “as well as plate tectonics, which generates mountains and ocean trenches,” said study co-author Sean Gulick, a marine geophysicist at the University of Texas at Austin.

In contrast, on the solar system’s other rocky planets, erosion and plate tectonics typically have little, if any, influence on the planetary surfaces. “The key driver of surface changes on those planets is constantly getting hit by stuff from space,” Gulick told Live Science. [Continue reading…]

How industrialization brought about the decline of vertebrate species

Nathan Collins writes: With climate change and deforestation threatening biodiversity around the world, it’s fair to wonder just how rapidly threatened species have been declining, and when exactly those declines began. The answer is bleak: Among threatened vertebrates, rapid losses began in the late 19th century, and numbers have since declined by about 25 percent per decade, according to a new study.

“Although preservation of biodiversity is vital to a sustainable human society, rapid population decline (RPD) continues to be widespread” across plant and animal populations, Haipeng Li and a team of Chinese and American biologists write in Proceedings of the National Academy of Sciences.

Understanding the severity and origins of these population losses could help conservationists protect endangered species and possibly help promote public awareness of the threat, the researchers argue. But there’s a problem: Good data on plant and animal population sizes only goes back about four decades, and populations surely declined prior to that.

Fortunately, modern biologists have a way to circumvent that: DNA. [Continue reading…]

The idea of political correctness is central to the culture wars of American politics

Scott Barry Kaufman writes: In a recent study, Christine Brophy and Jordan Peterson conducted a very illuminating analysis of the personality of political correctness. They created a very comprehensive 192-item PC scale measuring PC-related language, beliefs, and emotions based on their reading of news articles, books, and research papers on political correctness. Their PC battery employed a variety of question types, and tapped into the beliefs, language, and emotional sensitivity of politically correct individuals. The list was reviewed and added to by faculty and graduate students, and 332 participants completed the new PC scale, along with questionnaires on personality, IQ, and disgust sensitivity.

What did they find?

The researchers found that PC exists, can be reliably measured, and has two major dimensions. They labeled the first dimension “PC-Egalitarianism” and the second dimension “PC-Authoritarianism”. Interestingly, they found that PC is not a purely left-wing phenomenon, but is better understood as the manifestation of a general offense sensitivity, which is then employed for either liberal or conservative ends.

Nevertheless, while both dimensions of political correctness involve offense sensitivity, they found some critical differences. PC-Egalitarians tended to attribute a cultural basis to group differences, believed that differences in group power spring from societal injustices, and tended to support policies to prop up historically disadvantaged groups. Therefore, the emotional response of this group to discriminatory language appears to stem from an underlying motivation to achieve diversity through increased equality, and any deviation from equality is assumed to be caused by culture. Their beliefs lead to advocating for more democratic governance.

In contrast, PC-Authoritarians tended to attribute a biological basis to group differences, supported censorship of material that offends, and supported policies of harsher punitive justice for transgressors. Therefore, this dimension of PC seems to reflect a more indiscriminate or general sensitivity to offense, and seems to stem from an underlying motivation to achieve security and stability for those in distress. Their beliefs lead to advocating for more autocratic governance to achieve uniformity. [Continue reading…]

Kerouac’s French-Canadian roots hold the key to his literary identity

Deni Ellis Béchard writes: The real-life backstory of Jack Kerouac’s unpublished novel is classic beat generation. It was December 1952, and tensions were running high as Jack and his friend Neal Cassady — the inspiration for the character of Dean Moriarty in On the Road — drove from San Francisco to Mexico City.

Whereas Neal was looking for adventure and a chance to stock up on weed, Jack was in a difficult period. His first novel, The Town and the City, published under the name John Kerouac in 1950, had met with lukewarm reviews and poor sales. In April 1951, he had written On the Road on a (now famous) 120-foot-long scroll, but hadn’t been able to find a publisher. He was thirty and had been laid off by the railroad after a bout of phlebitis in his leg.

Kerouac decided to convalesce in Mexico City with William S. Burroughs, who would later author Naked Lunch. Three months earlier, Burroughs had performed a William Tell act with his wife, Joan, while they were drunk and accidentally shot her in the head, killing her. Shortly after Kerouac’s arrival, Burroughs skipped bail and fled the country. Neal Cassady went home. Alone, living in a rooftop apartment in Mexico City, Jack wrote a short novel over the course of five days.

The first line reads: Dans l’moi d’Octobre, 1935, (dans la nuit de nos vra vie bardasseuze) y’arriva une machine du West, de Denver, sur le chemin pour New York. Written in the language of Kerouac’s childhood — a French-Canadian patois then commonly spoken in parts of New England — the line has an epic, North American ring. Kerouac would later translate it as “In the month of October, 1935, in the night of our real restless lives, a car came from the West, from Denver, on the road for New York.”

The novel’s title is Sur le chemin — “On the Road.” But it is not the On the Road we all know (which would be translated in France as Sur la route). It was the On the Road of Kerouac’s vernacular — chemin being used in the title to mean both “path” and “road.”

Over the course of his literary career, Kerouac redefined the archetype of the American man, and he has since become so integral to American culture that his identity as an immigrant writer is often forgotten. He was born in 1922 as Jean-Louis Lebris de Kérouac to parents from Quebec. He spoke French at home and grew up in the French-Canadian community of Lowell, Massachusetts. In one of his letters, he wrote, “The English language is a tool lately found . . . so late (I never spoke English before I was six or seven). At 21, I was still somewhat awkward and illiterate sounding in my [English] speech and writings.”

In 1954, Kerouac created a list of everything he had written and included Sur le chemin among his “completed novels” — even though it would remain in his archives for more than six decades before publication was finally arranged this year. Sur le chemin and his other French writings provide a key to unlocking his more famous works, revealing a man just as obsessed with the difficulty of living between two languages as he was with his better-known spiritual quests.

In particular, they help explain the path — le chemin — he took as he developed his influential style, which changed the way many writers throughout the world have thought about prose. To this day, Kerouac remains one of the most translated authors, and one whose work is shared across generations. His unpublished French works shine a light on how the voice and ideas of an iconic American figure emerged from the experiences of French-Canadian immigrants — a group whose language and culture remain largely unknown to mainstream America. [Continue reading…]

Why we have globalization to thank for Thanksgiving

By Farok J. Contractor, Rutgers University

As Americans sit down to their Thanksgiving Day feasts, some may recall the story of the “Pilgrim Fathers” who founded one of the first English settlements in North America in 1620, at what is today the town of Plymouth, Massachusetts.

The history we know is one of English settlers seeking religious freedom in a New World but instead finding “a hideous and desolate wilderness, full of wilde beasts and wilde men.”

What many Americans don’t realize, however, is that the story of those early settlers’ struggle, which culminated in what we remember today as the first Thanksgiving feast, is also a tale of globalization, many centuries before the word was even coined.

Atlantic crossings began a century before the Pilgrims' passage to the New World aboard the Mayflower, and by the 1600s trans-Atlantic travel had become increasingly common. It was because of globalization that those first settlers were able to survive in an inhospitable and unforgiving land. And the turkey on Thanksgiving tables may not be descended from birds native to the U.S., but more likely from stock (re)imported from Europe.

Two short stories will help me explain. As a professor of international business at Rutgers University, I have long been fascinated by the history of trade going back millennia, and by how little most Americans know of the background story of Thanksgiving Day.

Continue reading

Meet the frail, small-brained people who first trekked out of Africa

Science magazine reports: On a promontory high above the sweeping grasslands of the Georgian steppe, a medieval church marks the spot where humans have come and gone along Silk Road trade routes for thousands of years. But 1.77 million years ago, this place was a crossroads for a different set of migrants. Among them were saber-toothed cats, Etruscan wolves, hyenas the size of lions—and early members of the human family.

Here, primitive hominins poked their tiny heads into animal dens to scavenge abandoned kills, filleting meat from the bones of mammoths and wolves with crude stone tools and eating it raw. They stalked deer as the animals drank from an ancient lake and gathered hackberries and nuts from chestnut and walnut trees lining nearby rivers. Sometimes the hominins themselves became the prey, as gnaw marks from big cats or hyenas on their fossilized limb bones now testify.

“Someone rang the dinner bell in gully one,” says geologist Reid Ferring of the University of North Texas in Denton, part of an international team analyzing the site. “Humans and carnivores were eating each other.”

This is the famous site of Dmanisi, Georgia, which offers an unparalleled glimpse into a harsh early chapter in human evolution, when primitive members of our genus Homo struggled to survive in a new land far north of their ancestors’ African home, braving winters without clothes or fire and competing with fierce carnivores for meat. The 4-hectare site has yielded closely packed, beautifully preserved fossils that are the oldest hominins known outside of Africa, including five skulls, about 50 skeletal bones, and an as-yet-unpublished pelvis unearthed 2 years ago. “There’s no other place like it,” says archaeologist Nick Toth of Indiana University in Bloomington. “It’s just this mother lode for one moment in time.”

Until the discovery of the first jawbone at Dmanisi 25 years ago, researchers thought that the first hominins to leave Africa were classic H. erectus (also known as H. ergaster in Africa). These tall, relatively large-brained ancestors of modern humans arose about 1.9 million years ago and soon afterward invented a sophisticated new tool, the hand ax. They were thought to be the first people to migrate out of Africa, making it all the way to Java, at the far end of Asia, as early as 1.6 million years ago. But as the bones and tools from Dmanisi accumulate, a different picture of the earliest migrants is emerging. [Continue reading…]

Huddled mice could change the way we think about evolution

By Stuart P. Wilson and James V. Stone, University of Sheffield

Adapt or die. That’s the reality for an animal species when it is faced with a harsh environment. Until now, many scientists have assumed that the more challenging an animal’s environment, the greater the pressure to adapt and the faster its genes evolve. But we have just published new research in Royal Society Open Science that shows that genes might actually evolve faster when the pressure to adapt is reduced.

We built a simple computer model of how evolution may be affected by the way animals interact with each other when they’re in groups. Specifically, we looked at what happens to animals that huddle together to keep warm.

We found that when animals huddle in larger groups, their genes for regulating temperature evolve faster, even though there is less pressure to adapt to the cold environment because of the warmth of the huddle. This shows that an organism’s evolution doesn’t just depend on its environment but also on how it behaves.

When animals such as rats and mice huddle together in groups, they can maintain a high body temperature without using as much energy as they would on their own. We wanted to understand how this kind of group behaviour would affect a species’ evolution.

To do this, we built a computer model simulating how the species’ genes changed and were passed on over multiple generations. When the effects of huddling were built into the computer model, the reduced pressure to adapt was actually found to accelerate evolution of the genes controlling heat production and heat loss.
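The published model isn't reproduced here, but its core logic fits in a few lines of Python. In this toy stand-in (every parameter is invented), a population evolves a single heat-production trait under selection toward a thermal optimum, and a huddle parameter stands in for the warmth of the group by weakening that selection. Run it, and the trait typically accumulates more generation-to-generation change when huddling relaxes the pressure to adapt:

```python
# Toy model: relaxed selection (via huddling) speeds up change in a
# thermoregulation trait. Not the authors' published model.
import random

def run(huddle, generations=400, pop_size=200, optimum=1.0, mut_sd=0.05):
    """One replicate: the population starts at the thermal optimum; return the
    cumulative generation-to-generation change in the mean trait value."""
    population = [optimum] * pop_size
    change, prev_mean = 0.0, optimum
    for _ in range(generations):
        # Huddling (0..1) buffers the fitness cost of missing the optimum,
        # i.e. it weakens selection on the heat-production trait.
        weights = [max(1e-6, 1 - (1 - huddle) * (t - optimum) ** 2)
                   for t in population]
        # Fitness-weighted reproduction, plus a small mutation per offspring.
        parents = random.choices(population, weights, k=pop_size)
        population = [p + random.gauss(0, mut_sd) for p in parents]
        mean = sum(population) / pop_size
        change += abs(mean - prev_mean)
        prev_mean = mean
    return change

random.seed(1)
for huddle in (0.0, 0.95):
    reps = [run(huddle) for _ in range(5)]
    print(f"huddle={huddle:.2f}: cumulative trait change "
          f"{sum(reps) / len(reps):.2f} (mean of 5 runs)")
```

One way to read the result is through nearly-neutral theory: when huddling flattens the fitness landscape, drift moves the trait further each generation, so the "gene" changes faster even though the environment demands less of it.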

Continue reading

China is at the forefront of manipulating DNA to create a new class of superhumans

G. Owen Schaefer writes: Would you want to alter your future children's genes to make them smarter, stronger, or better looking? As the state of science brings prospects like these closer to reality, an international debate has been raging over the ethics of enhancing human capacities with biotechnologies such as so-called smart pills, brain implants, and gene editing. This discussion has only intensified in the past year with the advent of the CRISPR-Cas9 gene editing tool, which raises the specter of tinkering with our DNA to improve traits like intelligence, athleticism, and even moral reasoning.

So are we on the brink of a brave new world of genetically enhanced humanity? Perhaps. And there’s an interesting wrinkle: It’s reasonable to believe that any seismic shift toward genetic enhancement will not be centered in Western countries like the US or the UK, where many modern technologies are pioneered. Instead, genetic enhancement is more likely to emerge out of China.

Numerous surveys among Western populations have found significant opposition to many forms of human enhancement. For example, a recent Pew study of 4,726 Americans found that most would not want to use a brain chip to improve their memory, and a plurality view such interventions as morally unacceptable. [Continue reading…]

A unified theory of evolution requires input from Darwin and Lamarck

Michael Skinner writes: The unifying theme for much of modern biology is based on Charles Darwin’s theory of evolution, the process of natural selection by which nature selects the fittest, best-adapted organisms to reproduce, multiply and survive. The process is also called adaptation, and traits most likely to help an individual survive are considered adaptive. As organisms change and new variants thrive, species emerge and evolve. In the 1850s, when Darwin described this engine of natural selection, the underlying molecular mechanisms were unknown. But over the past century, advances in genetics and molecular biology have outlined a modern, neo-Darwinian theory of how evolution works: DNA sequences randomly mutate, and organisms with the specific sequences best adapted to the environment multiply and prevail. Those are the species that dominate a niche, until the environment changes and the engine of evolution fires up again.

But this explanation for evolution turns out to be incomplete, suggesting that other molecular mechanisms also play a role in how species evolve. One problem with Darwin's theory is that, while species do evolve more adaptive traits (called phenotypes by biologists), the rate of random DNA sequence mutation turns out to be too slow to explain many of the changes observed. Scientists, well aware of the issue, have proposed a variety of genetic mechanisms to compensate: genetic drift, in which small groups of individuals undergo dramatic genetic change; or epistasis, in which one set of genes suppresses another, to name just two.

Yet even with such mechanisms in play, genetic mutation rates for complex organisms such as humans are dramatically lower than the frequency of change for a host of traits, from adjustments in metabolism to resistance to disease. The rapid emergence of trait variety is difficult to explain just through classic genetics and neo-Darwinian theory. To quote the prominent evolutionary biologist Jonathan B L Bard, who was paraphrasing T S Eliot: ‘Between the phenotype and genotype falls the shadow.’

And the problems with Darwin's theory extend out of evolutionary science into other areas of biology and biomedicine. For instance, if genetic inheritance determines our traits, then why do identical twins with the same genes generally have different types of diseases? And why does just a small percentage (often less than 1 per cent) of those with many specific diseases share a common genetic mutation? If the rate of mutation is random and steady, then why have many diseases increased more than 10-fold in frequency in only a couple of decades? How is it that hundreds of environmental contaminants can alter disease onset, but not DNA sequences? In evolution and biomedicine, the rate of phenotypic trait divergence is far more rapid than the rate of genetic variation and mutation – but why?

Part of the explanation can be found in some concepts that Jean-Baptiste Lamarck proposed 50 years before Darwin published his work. Lamarck’s theory, long relegated to the dustbin of science, held, among other things, ‘that the environment can directly alter traits, which are then inherited by generations to come’. [Continue reading…]

Digging our own graves in deep time

By David Farrier, Aeon, October 31, 2016

Late one summer night in 1949, the British archaeologist Jacquetta Hawkes went out into her small back garden in north London, and lay down. She sensed the bedrock covered by its thin layer of soil, and felt the hard ground pressing her flesh against her bones. Shimmering through the leaves and out beyond the black lines of her neighbours’ chimney pots were the stars, beacons ‘whose light left them long before there were eyes on this planet to receive it’, as she put it in A Land (1951), her classic book of imaginative nature writing.

We are accustomed to the idea of geology and astronomy speaking the secrets of 'deep time', the immense arc of non-human history that shaped the world as we perceive it. Hawkes's lyrical meditation mingles the intimate and the eternal, the biological and the inanimate, the domestic with a sense of deep time that is very much of its time. The state of the topsoil was a matter of genuine concern in a country wearied by wartime rationing, while land itself rose into focus just as Britain was rethinking its place in the world. But in lying down in her garden, Hawkes also lies on the far side of a fundamental boundary. A Land was written at the cusp of the Holocene; we, on the other hand, read it in the Anthropocene.

The Anthropocene, or era of the human, denotes how industrial civilisation has changed the Earth in ways that are comparable with deep-time processes. The planet’s carbon and nitrogen cycles, ocean chemistry and biodiversity – each one the product of millions of years of slow evolution – have been radically and permanently disrupted by human activity. The development of agriculture 10,000 years ago, and the Industrial Revolution in the middle of the 19th century, have both been proposed as start dates for the Anthropocene. But a consensus has gathered around the Great Acceleration – the sudden and dramatic jump in consumption that began around 1950, followed by a huge rise in global population, an explosion in the use of plastics, and the collapse of agricultural diversity.

Continue reading
