Brooke Borel writes: Battles fought 542 million years ago helped fuel a blast that brought humans and most other animals into existence. The great Cambrian Explosion was a period of unprecedented one-upmanship. Beastly claws crushed through thin skin, and soft-bodied creatures evolved shells shaped like scythes, sickles, and shields.
For about a billion years prior, the cells and genes that would later create animals were evolving in microscopic organisms that inhabited Earth’s oceans. These essential molecular changes can only be inferred today, because they are not preserved in fossils. The earliest traces of animals, about 580 million years old, appear soft, with no sign of claws, teeth, limbs, or brains. Then, within 54 million years (a relative blink, but still 270 times the duration of humans’ existence thus far), most of the main animal groups around today originated. Such a rapid proliferation of animal architectures has never since been repeated.
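As a quick arithmetic check of that parenthetical, using the figure of roughly 200,000 years for our species’ age cited elsewhere on this page:

$$ \frac{54 \times 10^{6}\ \text{years}}{2 \times 10^{5}\ \text{years}} = 270 $$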
A simple species count does not do justice to the power of the Cambrian Explosion. Species have continuously formed over time. A new type of moth may have antennae that are furrier than its sisters; a new species of dinosaur may be distinguished by clawed wings and vicious front fangs. But a new phylum — a major branch on the tree of life, the upper-level ranking that separates an insect from a pterodactyl — is rarely born.
Most of today’s 30 to 40 animal phyla originated in the Cambrian, and have persisted through time with hundreds of variations on a theme. [Continue reading…]
Sino-Tibetan populations shed light on human cooperation
One of the big questions in anthropology is why humans, unlike most animals, cooperate with those we are not closely related to. Exactly what has driven this behaviour is not well understood. Anthropologists suspect it could be down to the fact that women have usually left their homes after marriage to go and live with their husband’s family. This creates links between distant families, which may explain our tendency to cooperate beyond our own households.
Now our study on the Tibetan borderlands of China, published in Nature Communications, shows that it is indeed the case that cooperation is greater in populations where females disperse for marriage.
A natural experiment in social structure
There are many different theories about the link between dispersal, kinship and cooperation, and it is this link that we wanted to test. Anthropologists believe that dispersal leads to cooperation through links between families, and some evolutionary models predict that when nobody moves, residents end up competing for the same resources, producing greater conflict between kin. But there are also models that suggest the opposite is true – that if nobody moves, neighbours are more likely to be related, leading to more cooperation in the neighbourhood.
Einstein was wrong. ‘Spooky action’ is real
Delft University of Technology reports: In 1935, Einstein asked a profound question about our understanding of Nature: are objects only influenced by their nearby environment? Or could, as predicted by quantum theory, looking at one object sometimes instantaneously affect another far-away object? Einstein did not believe in quantum theory’s prediction, famously calling it “spooky action”.
Exactly 80 years later, a team of scientists led by Professor Ronald Hanson from Delft University of Technology finally performed what is seen as the ultimate test of Einstein’s worldview: the loophole-free Bell test. The scientists found that two electrons, separated by 1.3 km on the Delft University campus, can indeed share an invisible and instantaneous connection: the spooky action is real.
The experiment, published in Nature today, breaks the last standing defence of Einstein’s iconic 1935 paper: it closes all the loopholes present in earlier experiments. The Delft experiment not only closes a chapter in one of the most intriguing debates in science, it may also enable a radically new form of secure communication that is fundamentally impossible to ‘eavesdrop’ on.
“Quantum mechanics states that a particle such as an electron can be in two different states at the same time, and even in two different places, as long as it is not observed. This is called ‘superposition’ and it is a very counter-intuitive concept”, says lead scientist Professor Ronald Hanson. Hanson’s group works with trapped electrons, which have a tiny magnetic effect known as a “spin” that can be pointing up, or down, or – when in superposition – up and down at the same time. “Things get really interesting when two electrons become entangled. Both are then up and down at the same time, but when observed one will always be down and the other one up. They are perfectly correlated: when you observe one, the other one will always be opposite. That effect is instantaneous, even if the other electron is in a rocket at the other end of the galaxy”, says Hanson. [Continue reading…]
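To make the idea of "perfect correlation" concrete, here is a minimal numerical sketch of the kind of inequality a Bell test checks. It is not the Delft team’s analysis: it uses the CHSH formulation of Bell’s inequality with idealised quantum predictions and hand-picked measurement angles, all of which are illustrative assumptions rather than details from the press release.

```python
# Minimal sketch (not the Delft group's analysis): a numerical check of the
# CHSH form of Bell's inequality. For two entangled spin-1/2 particles in the
# singlet state, quantum mechanics predicts the correlation
# E(a, b) = -cos(a - b) between measurements along directions a and b.
# Any local, "non-spooky" theory must satisfy |S| <= 2; the entangled state
# reaches 2*sqrt(2) ~= 2.83.
import numpy as np

def singlet_correlation(a: float, b: float) -> float:
    """Quantum prediction for the spin correlation of a singlet pair
    measured along angles a and b (in radians)."""
    return -np.cos(a - b)

# Illustrative measurement settings that maximise the quantum violation.
# (Real experiments, including loophole-free ones, choose their settings
# randomly and independently at each station.)
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = (singlet_correlation(a, b) - singlet_correlation(a, b_prime)
     + singlet_correlation(a_prime, b) + singlet_correlation(a_prime, b_prime))

print(f"|S| = {abs(S):.3f}  (local-realist bound: 2.000, "
      f"quantum maximum: {2 * np.sqrt(2):.3f})")
```

A measured value above 2, with every loophole closed, rules out any local explanation of the correlations; that is the sense in which the Delft result vindicates the "spooky" prediction.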
Long before going to Europe, humans ventured east to Asia
LiveScience reports: Teeth from a cave in China suggest that modern humans lived in Asia much earlier than previously thought, and tens of thousands of years before they reached Europe, researchers say.
This discovery yields new information about the dispersal of modern humans from Africa to the rest of the world, and could shed light on how modern humans and Neanderthals interacted, the scientists added.
Modern humans originated in Africa about 200,000 years ago. When and how the modern human lineage dispersed from Africa has long been controversial.
Previous research suggested the exodus from Africa began between 70,000 and 40,000 years ago. However, recent research hinted that modern humans might have begun their march across the globe as early as 130,000 years ago. [Continue reading…]
Old and new: How the brain evokes a sense of familiarity
Science News reports: It’s happened to all of us at one time or another: You’re walking through a crowd, and suddenly a face seems incredibly familiar — so much so that you do a double-take. Who is that? How do you know them? You have no idea, but something about their face nags at you. You know you’ve seen it before.
The reason you know that face is in part because of your perirhinal cortex. This is an area of the brain that helps us to determine familiarity, or whether we have seen an object before. A new study of brain cells in this area finds that firing these neurons at one frequency makes the brain treat novel images as old hat. But firing these same neurons at another frequency can make the old new again.
“Novelty and familiarity are both really important,” says study coauthor Rebecca Burwell, a neuroscientist at Brown University in Providence, R.I. “They are important for learning and memory and decision making.” Finding a cache of food and knowing it is new could be useful for an animal’s future. So is recognizing a familiar place where the pickings were good in the past.
But knowing that something is familiar is not quite the same thing as knowing what that thing is. “You’re in a crowd and you see a familiar face, and there’s a feeling,” Burwell explains. “You can’t identify them, you don’t know where you met them, but there’s a sense of familiarity.” It’s different from recalling where you met the person, or even who the person is. This is a sense at the base of memory. And while scientists knew the perirhinal cortex was involved in this sense of familiarity, how that feeling of new or old was coded in the brain wasn’t fully understood. [Continue reading…]
Our moral identity makes us who we are
Nina Strohminger writes: The morning after her accident, a woman I’ll call Kate awoke in a daze. She looked at the man next to her in bed. He resembled her husband, with the same coppery beard and freckles dusted across his shoulders. But this man was definitely not her husband.
Panicked, she packed a small bag and headed to her psychiatrist’s office. On the bus, there was a man she had been encountering with increasing frequency over the past several weeks. The man was clever; he was a spy. He always appeared in a different form: one day as a little girl in a sundress, another time as a bike courier who smirked at her knowingly. She explained these bizarre developments to her doctor, who was quickly becoming one of the last voices in this world she could trust. But as he spoke, her stomach sank with a dreaded realisation: this man, too, was an impostor.
Kate has Capgras syndrome, the unshakeable belief that someone – often a loved one, sometimes oneself – has been replaced with an exact replica. She also has Fregoli syndrome, the delusion that the same person is taking on a variety of shapes, like an actor donning an expert disguise. Capgras and Fregoli delusions offer hints about an extraordinary cognitive mechanism active in the healthy mind, a mechanism so exquisitely tuned that we are hardly ever aware of it. This mechanism ascribes to each person a unique identity, and then meticulously tracks and updates it. This mechanism is crucial to virtually every human interaction, from navigating a party to navigating a marriage. Without it, we quickly fall apart. [Continue reading…]
Humans are natural polymaths, at our best when we turn our minds to many things
Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.
We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?
The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading…]
New study indicates Earth’s inner core was formed 1-1.5 billion years ago
Phys.org reports: There have been many estimates of when the Earth’s inner core was formed, but scientists from the University of Liverpool have used new data indicating that it formed 1 to 1.5 billion years ago, as it “froze” from the surrounding molten iron outer core.
The inner core is Earth’s deepest layer. It is a ball of solid iron, just larger than Pluto, surrounded by a liquid outer core. The inner core is a relatively recent addition to our planet, and establishing when it was formed is a topic of vigorous scientific debate, with estimates ranging from 0.5 billion to 2 billion years ago.
In a new study published in Nature, researchers from the University’s School of Environmental Sciences analysed magnetic records from ancient igneous rocks and found that there was a sharp increase in the strength of the Earth’s magnetic field between 1 and 1.5 billion years ago.
This increased magnetic field is a likely indication of the first occurrence of solid iron at Earth’s centre and the point in Earth’s history at which the solid inner core first started to “freeze” out from the cooling molten outer core.
Liverpool palaeomagnetism expert and the study’s lead author, Dr Andy Biggin, said: “This finding could change our understanding of the Earth’s interior and its history.” [Continue reading…]
There is no known physics theory that is true at every scale — there may never be
Lawrence M. Krauss writes: Whenever you say anything about your daily life, a scale is implied. Try it out. “I’m too busy” only works for an assumed time scale: today, for example, or this week. Not this century or this nanosecond. “Taxes are onerous” only makes sense for a certain income range. And so on.
Surely the same restriction doesn’t hold true in science, you might say. After all, for centuries after the introduction of the scientific method, conventional wisdom held that there were theories that were absolutely true for all scales, even if we could never be empirically certain of this in advance. Newton’s universal law of gravity, for example, was, after all, universal! It applied to falling apples and falling planets alike, and accounted for every significant observation made under the sun, and over it as well.
With the advent of relativity, and general relativity in particular, it became clear that Newton’s law of gravity was merely an approximation of a more fundamental theory. But the more fundamental theory, general relativity, was so mathematically beautiful that it seemed reasonable to assume that it codified perfectly and completely the behavior of space and time in the presence of mass and energy.
The advent of quantum mechanics changed everything. When quantum mechanics is combined with relativity, it turns out, rather unexpectedly in fact, that the detailed nature of the physical laws that govern matter and energy actually depends on the physical scale at which you measure them. This led to perhaps the biggest unsung scientific revolution of the 20th century: We know of no theory that both makes contact with the empirical world and is absolutely and always true. [Continue reading…]
Video: The hidden life of the cell
Human Genome Project: Twenty-five years of big biology
Eric D. Green, James D. Watson & Francis S. Collins write: Twenty-five years ago, the newly created US National Center for Human Genome Research (now the National Human Genome Research Institute; NHGRI), which the three of us have each directed, joined forces with US and international partners to launch the Human Genome Project (HGP). What happened next represents one of the most historically significant scientific endeavours: a 13-year quest to sequence all three billion base pairs of the human genome.
Even just a few years ago, discussions surrounding the HGP focused mainly on what insights the project had brought or would bring to our understanding of human disease. Only now is it clear that, as well as dramatically accelerating biomedical research, the HGP initiated a new way of doing science.
As biology’s first large-scale project, the HGP paved the way for numerous consortium-based research ventures. The NHGRI alone has been involved in launching more than 25 such projects since 2000. These have presented new challenges to biomedical research — demanding, for instance, that diverse groups from different countries and disciplines come together to share and analyse vast data sets. [Continue reading…]
The Independent reports: The most comprehensive study of the human genome to date has found that a sizeable minority of people are walking around with some of their genes missing, without any apparent ill effects.
A project to sequence and analyse the entire genetic code of more than 2,500 people drawn from 26 different ethnic populations from around the world has revealed that some genes do not seem to be as essential for health and life as previously believed.
The finding is just one to have emerged from the 1,000 Genomes Project, set up in 2008 to study genetic variation in at least 1,000 people in order to understand the variety of DNA types within the human population, the researchers said. [Continue reading…]
Technology is implicated in an assault on empathy
Sherry Turkle writes: Studies of conversation both in the laboratory and in natural settings show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other. Even a silent phone disconnects us.
In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies that were conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000.
Across generations, technology is implicated in this assault on empathy. We’ve gotten used to being connected all the time, but we have found ways around conversation — at least around conversation that is open-ended and spontaneous, in which we play with ideas and allow ourselves to be fully present and vulnerable. But it is in this type of conversation — where we learn to make eye contact, to become aware of another person’s posture and tone, to comfort one another and respectfully challenge one another — that empathy and intimacy flourish. In these conversations, we learn who we are.
Of course, we can find empathic conversations today, but the trend line is clear. It’s not only that we turn away from talking face to face to chat online. It’s that we don’t allow these conversations to happen in the first place because we keep our phones in the landscape. [Continue reading…]
As human power keeps growing, our ability to harm or benefit other animals grows with it
Yuval Noah Harari writes: This is the basic lesson of evolutionary psychology: a need shaped thousands of generations ago continues to be felt subjectively even if it is no longer necessary for survival and reproduction in the present. Tragically, the agricultural revolution gave humans the power to ensure the survival and reproduction of domesticated animals while ignoring their subjective needs. In consequence, domesticated animals are collectively the most successful animals in the world, and at the same time they are individually the most miserable animals that have ever existed.
The situation has only worsened over the last few centuries, during which time traditional agriculture gave way to industrial farming. In traditional societies such as ancient Egypt, the Roman empire or medieval China, humans had a very partial understanding of biochemistry, genetics, zoology and epidemiology. Consequently, their manipulative powers were limited. In medieval villages, chickens ran free between the houses, pecked seeds and worms from the garbage heap, and built nests in the barn. If an ambitious peasant had tried to lock 1,000 chickens inside a crowded coop, a deadly bird-flu epidemic would probably have resulted, wiping out all the chickens, as well as many villagers. No priest, shaman or witch doctor could have prevented it. But once modern science had deciphered the secrets of birds, viruses and antibiotics, humans could begin to subject animals to extreme living conditions. With the help of vaccinations, medications, hormones, pesticides, central air-conditioning systems and automatic feeders, it is now possible to cram tens of thousands of chickens into tiny coops, and produce meat and eggs with unprecedented efficiency.
The fate of animals in such industrial installations has become one of the most pressing ethical issues of our time, certainly in terms of the numbers involved. These days, most big animals live on industrial farms. We imagine that our planet is populated by lions, elephants, whales and penguins. That may be true of the National Geographic channel, Disney movies and children’s fairytales, but it is no longer true of the real world. The world contains 40,000 lions but, by way of contrast, there are around 1 billion domesticated pigs; 500,000 elephants and 1.5 billion domesticated cows; 50 million penguins and 20 billion chickens. [Continue reading…]
Imagining strange new lifeforms could help us discover our own origins
By Michael Page, University of Huddersfield
From the earliest of times, philosophers and scientists have tried to understand the relationship between animate and inanimate matter. But the origin of life remains one of the major scientific riddles to be solved.
The building blocks of life as we know it essentially consist of four groups of chemicals: proteins, nucleic acids, lipids (fats) and carbohydrates. There was much excitement about the possibility of finding amino acids (the ingredients for proteins) on comets or distant planets because some scientists believe that life on Earth, or at least its building blocks, may have originally come from outer space and been deposited by meteorites.
But there are now extensive examples of how natural processes on Earth can convert simple molecules into these building blocks. Scientists have demonstrated in the lab how to make amino acids, simple sugars, lipids and even nucleotides – the basic units of DNA – from very simple chemicals, under conditions that could have existed on the early Earth. What still eludes them is the point in the process when a chemical stew becomes an organism. How did the first lifeforms become alive?
Bible Belt atheist
Jason Cohn and Camille Servan-Schreiber write: Growing up in Los Angeles and Paris, we were both raised secular and embraced atheism early and easily. It’s not that we didn’t ponder life’s mysteries; it’s just that after we reasoned away our religious questions, we stopped worrying about them and moved on. When we learned about the former pastor Jerry DeWitt’s struggles with being an “outed” atheist in rural Louisiana, we realized for the first time just how difficult being an atheist can be in some communities, where religion is woven deeply into the social fabric. [Continue reading…]
Paleogenetics is helping to solve the great mystery of prehistory: How did humans spread out over the earth?
Jacob Mikanowski writes: Most of human history is prehistory. Of the 200,000 or more years that humans have spent on Earth, only a tiny fraction has been recorded in writing. Even in our own little sliver of geologic time, the 12,000 years of the Holocene, whose warm weather and relatively stable climate incubated the birth of agriculture, cities, states, and most of the other hallmarks of civilisation, writing has been more the exception than the rule.
Professional historians can’t help but pity their colleagues on the prehistoric side of the fence. Historians are accustomed to drawing on vast archives, but archaeologists must assemble and interpret stories from scant material remains. In the annals of prehistory, cultures are designated according to modes of burial such as ‘Single Grave’, or after styles of arrowhead, such as ‘Western Stemmed Point’. Whole peoples are reduced to styles of pottery, such as Pitted Ware, Corded Ware or Funnel Beaker, all of them spread across the map in confusing, amoeba-like blobs.
In recent years, archaeologists have become reluctant to infer too much from assemblages of ceramics, weapons and grave goods. For at least a generation, they have been drilled on the mantra that ‘pots are not people’. Material culture is not a proxy for identity. Artefacts recovered from a dig can provide a wealth of information about a people’s mode of subsistence, funeral rites and trade contacts, but they are not a reliable guide to their language or ethnicity – or their patterns of migration.
Before the Second World War, prehistory was seen as a series of invasions, with proto-Celts and Indo-Aryans swooping down on unsuspecting swaths of Europe and Asia like so many Vikings, while megalith builders wandered between continents in indecisive meanders. After the Second World War, this view was replaced by the processual school, which attributed cultural changes to internal adaptations. Ideas and technologies might travel, but people by and large stayed put. Today, however, migration is making a comeback.
Much of this shift has to do with the introduction of powerful new techniques for studying ancient DNA. The past five years have seen a revolution in the availability and scope of genetic testing that can be performed on prehistoric human and animal remains. Ancient DNA is tricky to work with. Usually it’s degraded, chemically altered and cut into millions of short fragments. But recent advances in sequencing technology have made it possible to sequence whole genomes from samples reaching back thousands, and tens of thousands, of years. Whole-genome sequencing yields orders of magnitude more data than organelle-based testing, and allows geneticists to make detailed comparisons between individuals and populations. Those comparisons are now illuminating new branches of the human family tree. [Continue reading…]
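To give a rough sense of that "orders of magnitude" claim, assuming that "organelle-based testing" refers to mitochondrial DNA (the human mitochondrial genome is roughly 16,600 base pairs, a figure not given in the excerpt), and taking the three billion base pairs of the nuclear genome cited earlier on this page:

$$ \frac{3 \times 10^{9}\ \text{bp (nuclear genome)}}{1.66 \times 10^{4}\ \text{bp (mitochondrial genome)}} \approx 1.8 \times 10^{5}, $$

that is, roughly five orders of magnitude more sequence available for comparison between individuals.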
A Flemish family care system
Mike Jay writes: Half an hour on the slow train from Antwerp, surrounded by flat, sparsely populated farmlands, Geel (pronounced, roughly, ‘Hyale’) strikes the visitor as a quiet, tidy but otherwise unremarkable Belgian market town. Yet its story is unique. For more than 700 years its inhabitants have taken the mentally ill and disabled into their homes as guests or ‘boarders’. At times, these guests have numbered in the thousands, and arrived from all over Europe. There are several hundred in residence today, sharing their lives with their host families for years, decades or even a lifetime. One boarder recently celebrated 50 years in the Flemish town, arranging a surprise party at the family home. Friends and neighbours were joined by the mayor and a full brass band.
Among the people of Geel, the term ‘mentally ill’ is never heard: even words such as ‘psychiatric’ and ‘patient’ are carefully hedged with finger-waggling and scare quotes. The family care system, as it’s known, is resolutely non-medical. When boarders meet their new families, they do so, as they always have, without a backstory or clinical diagnosis. If a word is needed to describe them, it’s often a positive one such as ‘special’, or at worst, ‘different’. This might in fact be more accurate than ‘mentally ill’, since the boarders have always included some who would today be diagnosed with learning difficulties or special needs. But the most common collective term is simply ‘boarders’, which defines them at the most pragmatic level by their social, not mental, condition. These are people who, whatever their diagnosis, have come here because they’re unable to cope on their own, and because they have no family or friends who can look after them.
The origins of the Geel story lie in the 13th century, in the martyrdom of Saint Dymphna, a legendary seventh-century Irish princess whose pagan father went mad with grief after the death of his Christian wife and demanded that Dymphna marry him. To escape the king’s incestuous passion, Dymphna fled to Europe and holed up in the marshy flatlands of Flanders. Her father finally tracked her down in Geel, and when she refused him once more, he beheaded her. Over time, she became revered as a saint with powers of intercession for the mentally afflicted, and her shrine attracted pilgrims and tales of miraculous cures. [Continue reading…]
Why futurism has a cultural blindspot
Tom Vanderbilt writes: In early 1999, during the halftime of a University of Washington basketball game, a time capsule from 1927 was opened. Among the contents of this portal to the past were some yellowing newspapers, a Mercury dime, a student handbook, and a building permit. The crowd promptly erupted into boos. One student declared the items “dumb.”
Such disappointment in time capsules seems to be endemic, suggests William E. Jarvis in his book Time Capsules: A Cultural History. A headline from The Onion, he notes, sums it up: “Newly unearthed time capsule just full of useless old crap.” Time capsules, after all, exude a kind of pathos: They show us that the future was not quite as advanced as we thought it would be, nor did it come as quickly. The past, meanwhile, turns out not to be as radically distinct as we thought.
In his book Predicting the Future, Nicholas Rescher writes that “we incline to view the future through a telescope, as it were, thereby magnifying and bringing nearer what we can manage to see.” So too do we view the past through the other end of the telescope, making things look farther away than they actually were, or losing sight of some things altogether.
These observations apply neatly to technology. We don’t have the personal flying cars we predicted we would. Coal, notes the historian David Edgerton in his book The Shock of the Old, was a bigger source of power at the dawn of the 21st century than in sooty 1900; steam was more significant in 1900 than in 1800.
But when it comes to culture we tend to believe not that the future will be very different than the present day, but that it will be roughly the same. Try to imagine yourself at some future date. Where do you imagine you will be living? What will you be wearing? What music will you love?
Chances are, that person resembles you now. As the psychologist George Loewenstein and colleagues have argued, in a phenomenon they termed “projection bias,” people “tend to exaggerate the degree to which their future tastes will resemble their current tastes.” [Continue reading…]