University of New England, Australia: We humans like to think of ourselves as unique for many reasons, not least of which is our ability to communicate with words. But ground-breaking research by an expert from the University of New England shows that our ‘misunderstood cousins,’ the Neanderthals, may well have spoken in languages not dissimilar to the ones we use today.
Pinpointing the origin and evolution of speech and human language is one of the longest-running and most hotly debated topics in the scientific world. It has long been believed that other beings, including the Neanderthals with whom our ancestors shared the Earth for thousands of years, simply lacked the necessary cognitive capacity and vocal hardware for speech.
Associate Professor Stephen Wroe, a zoologist and palaeontologist from UNE, working with an international team of scientists and using 3D x-ray imaging technology, has made a revolutionary discovery challenging this notion, based on a 60,000-year-old Neanderthal hyoid bone discovered in Israel in 1989.
“To many, the Neanderthal hyoid discovered was surprising because its shape was very different to that of our closest living relatives, the chimpanzee and the bonobo. However, it was virtually indistinguishable from that of our own species. This led some people to argue that this Neanderthal could speak,” A/Professor Wroe said.
“The obvious counterargument to this assertion was that the fact that hyoids of Neanderthals were the same shape as modern humans doesn’t necessarily mean that they were used in the same way. With the technology of the time, it was hard to verify the argument one way or the other.”
However, advances in 3D imaging and computer modelling allowed A/Professor Wroe’s team to revisit the question.
“By analysing the mechanical behaviour of the fossilised bone with micro x-ray imaging, we were able to build models of the hyoid that included the intricate internal structure of the bone. We then compared them to models of modern humans. Our comparisons showed that in terms of mechanical behaviour, the Neanderthal hyoid was basically indistinguishable from our own, strongly suggesting that this key part of the vocal tract was used in the same way.
“From this research, we can conclude that it’s likely that the origins of speech and language are far, far older than once thought.”
The ability to discern the emotions of others provides the foundation for emotional intelligence. How well developed this faculty is seems to have little to do with the strength of other markers of intelligence. Indeed, as a new study seems to imply, there may be little reason to see in emotional intelligence much that is uniquely human.
Scientific American: [A]lthough dogs have the capacity to understand more than 100 words, studies have demonstrated Fido can’t really speak human languages or comprehend them with the same complexity that we do. Yet researchers have now discovered that dog and human brains process the vocalizations and emotions of others more similarly than previously thought. The findings suggest that although dogs cannot discuss relativity theory with us, they do seem to be wired in a way that helps them to grasp what we feel by attending to the sounds we make.
To compare active human and dog brains, postdoctoral researcher Attila Andics and his team from the MTA-ELTE Comparative Ethology Research Group in Hungary trained 11 dogs to lie still in an fMRI brain scanner for several six-minute intervals so that the researchers could perform the same experiment on both human and canine participants. Both groups listened to almost two hundred dog and human sounds — from whining and crying to laughter and playful barking — while the team scanned their brain activity.
The resulting study, published in Current Biology today, reveals both that dog brains have voice-sensitive regions and that these neurological areas resemble those of humans. Sharing similar locations in both species, they process voices and emotions of other individuals similarly. Both groups respond with greater neural activity when they listen to voices reflecting positive emotions such as laughing than to negative sounds that include crying or whining. Dogs and people, however, respond more strongly to the sounds made by their own species. “Dogs and humans meet in a very similar social environment but we didn’t know before just how similar the brain mechanisms are to process this social information,” Andics says. [Continue reading...]
The Telegraph reports: Footprints left behind by what may be one of our first human ancestors to arrive in Britain have been discovered on a beach in Norfolk.
The preserved tracks, which consisted of 49 imprints in a soft sedimentary rock, are believed to be around 900,000 years old and could transform scientists’ understanding of how early humans moved around the world.
The footprints were found in what scientists have described as a “million to one” discovery last summer when heavy seas washed sand off the foreshore in Happisburgh, Norfolk.
The find has only now been made public and is thought to be the oldest evidence of early humans in northern Europe yet discovered. [Continue reading...]
To an adult, most of the connotations of home seem positive: safety, stability, familiarity, comfort, nurturing. Yet as Ian Tattersall points out, to be tied to one place in a changing environment marked a turning point in human evolution — the juncture at which we placed ourselves in opposition to nature.
Archaeologists begin to see proto-houses during the Ice Age, some 15,000 years ago. Hunter-gatherers at the Ukrainian site of Mezhirich built four oval-to-circular huts that ranged from 120 to 240 square feet in area, and were clad in tons of mammoth bones. Out there on the treeless tundra, their occupants would have cooperated in hunting reindeer and other grazers that migrated seasonally through the area. The Mezhirich people dug pits in the permafrost that acted as natural “freezers” to preserve their meat and let them spend several months at a time in the “village.” With so much labor invested in the construction of their houses, it is hard to imagine that the Mezhirich folk did not somehow feel “at home” there.
But if an archaeologist had to pick an example of the earliest structures that most resembled our modern idea of home, it would probably be the round houses built by the semi-sedentary Natufians, an ancient people who lived around the eastern end of the Mediterranean Sea (Israel, Syria, and environs) at the end of the last Ice Age, some 12,000 years ago. A typical Natufian village consisted of several circular huts each measuring about 10 to 20 feet in diameter; these villages testify to a revolutionary change in human living arrangements. Finally, people were regularly living in semi-permanent settlements, in which the houses were clearly much more than simple shelters against the elements. The Natufians were almost certainly witness to a dramatic change in society.
The end of the Ice Age was a time of transition from a hunter-gatherer mode of subsistence to an agricultural way of life. But it also involved a Faustian bargain. Adopting a fixed residence went hand-in-hand with cultivating fields and domesticating animals. It allowed families to grow, providing additional labor to till the fields. But becoming dependent on the crops they grew meant that people found themselves in opposition to the environment: The rain didn’t fall and the sun didn’t shine at the farmers’ convenience. They locked themselves into a lifestyle, and to make their fields continuously productive to feed their growing families, they had to modify their landscape.
Suzanne Moore writes: The last time I put my own atheism through the spin cycle rather than simply wiping it clean was when I wanted to make a ceremony after the birth of my third child. Would it be a blessing? From who? What does the common notion of a new baby as a gift mean? How would we make it meaningful to the people we invited who were from different faiths? And, importantly, what would it look like?
One of the problems I have with the New Atheism is that it fixates on ethics, ignoring aesthetics at its peril. It tends also towards atomisation, relying on abstracts such as “civic law” to conjure a collective experience. But I love ritual, because it is through ritual that we remake and strengthen our social bonds. As I write, down the road there is a memorial being held for Lou Reed, hosted by the local Unitarian church. Most people there will have no belief in God but will feel glad to be part of a shared appreciation of a man whose god was rock’n’roll.
When it came to making a ceremony, I really did not want the austerity of some humanist events I have attended, where I feel the sensual world is rejected. This is what I mean about aesthetics. Do we cede them to the religious and just look like a bunch of Calvinists? I found myself turning to flowers, flames and incense. Is there anything more beautiful than the offerings made all over the world, of tiny flames and blossom on leaves floating on water?
Already, I am revealing a kind of neo-paganism that hardcore rationalists will find unacceptable. But they find most human things unacceptable. For me, not believing in God does not mean one has to forgo poetry, magic, the chaos of ritual, the remaking of shared bonds. I fear ultra-orthodox atheism has come to resemble a rigid and patriarchal faith itself. [Continue reading...]
The New York Times reports: Early in the 20th century, two brothers discovered a nearly complete Neanderthal skeleton in a pit inside a cave at La Chapelle-aux-Saints, in southwestern France. The discovery raised the possibility that these evolutionary relatives of ours intentionally buried their dead — at least 50,000 years ago, before the arrival of anatomically modern humans in Europe.
These and at least 40 subsequent discoveries, a few as far from Europe as Israel and Iraq, appeared to suggest that Neanderthals, long thought of as brutish cave dwellers, actually had complex funeral practices. Yet a significant number of researchers have since objected that the burials were misinterpreted, and might not represent any advance in cognitive and symbolic behavior.
Now an international team of scientists is reporting that a 13-year re-examination of the burials at La Chapelle-aux-Saints supports the earlier claims that the burials were intentional.
The researchers — archaeologists, geologists and paleoanthropologists — not only studied the skeleton from the original excavations, but found more Neanderthal remains, from two children and an adult. They also studied the bones of other animals in the cave, mainly bison and reindeer, and the geology of the burial pits.
The findings, in this week’s issue of Proceedings of the National Academy of Sciences, “buttress claims for complex symbolic behavior among Western European Neanderthals,” the scientists reported.
William Rendu, the paper’s lead author and a researcher at the Center for International Research in the Humanities and Social Sciences in New York, said in an interview that the geology of the burial pits “cannot be explained by natural events” and that “there is no sign of weathering and scavenging by animals,” which means the bodies were covered soon after death.
“While we cannot know if this practice was part of a ritual or merely pragmatic,” Dr. Rendu said in a statement issued by New York University, “the discovery reduces the behavioral distance between them and us.” [Continue reading...]
Dominique Mosbergen writes: Researchers from the University of Adelaide in Australia argue in an upcoming book, The Dynamic Human, that humans really aren’t much smarter than other creatures — and that some animals may actually be brighter than we are.
“For millennia, all kinds of authorities — from religion to eminent scholars — have been repeating the same idea ad nauseam, that humans are exceptional by virtue that they are the smartest in the animal kingdom,” the book’s co-author Dr. Arthur Saniotis, a visiting research fellow with the university’s School of Medical Sciences, said in a written statement. “However, science tells us that animals can have cognitive faculties that are superior to human beings.”
Not to mention, ongoing research on intelligence and primate brain evolution backs the idea that humans aren’t the cleverest creatures on Earth, co-author Dr. Maciej Henneberg, a professor also at the School of Medical Sciences, told The Huffington Post in an email.
The researchers said the belief in the superiority of human intelligence can be traced back around 10,000 years to the Agricultural Revolution, when humans began domesticating animals. The idea was reinforced with the advent of organized religion, which emphasized human beings’ superiority over other creatures. [Continue reading...]
At various times in my life, I’ve crossed paths with people possessing immense wealth and power, providing me with glimpses of the mindset of those who regard themselves as the most important people on this planet.
From what I can tell, the concentration of great power does not coincide with the expression of great intelligence. What is far more evident is a great sense of entitlement, which is to say a self-validating sense that power rests where power belongs and that the inequality in its distribution is a reflection of some kind of natural order.
Since this self-serving perception of hierarchical order operates among humans and since humans as a species wield so much more power than any other, it’s perhaps not surprising that we exhibit the same kind of hubris collectively that we see individually in the most dominant among us.
Nevertheless, it is becoming increasingly clear that our sense of superiority is rooted in ignorance.
Amit Majmudar writes: There may come a time when we cease to regard animals as inferior, preliminary iterations of the human—with the human thought of as the pinnacle of evolution so far—and instead regard all forms of life as fugue-like elaborations of a single musical theme.
Animals are routinely superhuman in one way or another. They outstrip us in this or that perceptual or physical ability, and we think nothing of it. It is only our kind of superiority (in the use of tools, basically) that we select as the marker of “real” superiority. A human being with an elephant’s hippocampus would end up like Funes the Memorious in the story by Borges; a human being with a dog’s olfactory bulb would become a Vermeer of scent, but his art would be lost on the rest of us, with our visually dominated brains. The poetry of the orcas is yet to be translated; I suspect that the whale sagas will have much more interesting things in them than the tablets and inscriptions of Sumer and Akkad.
If science should ever persuade people of this biological unity, it would be of far greater benefit to the species than penicillin or cardiopulmonary bypass; of far greater benefit to the planet than the piecemeal successes of environmental activism. We will have arrived, by study and reasoning, at the intuitive, mystical insights of poets.
The pursuit of artificial intelligence has been driven by the assumption that if human intelligence can be replicated or advanced upon by machines then this accomplishment will in various ways serve the human good. At the same time, thanks to the technophobia promoted in some dystopian science fiction, there is a popular fear that if machines become smarter than people we will end up becoming their slaves.
It turns out that even if there are some irrational fears wrapped up in technophobia, there are good reasons to regard computing devices as a threat to human intelligence.
It’s not that we are creating machines that harbor evil designs to take over the world, but simply that each time we delegate a function of the brain to an external piece of circuitry, our mental faculties inevitably atrophy.
Use it or lose it applies just as much to the brain as it does to any other part of the body.
Carolyn Gregoire writes: Take a moment to think about the last time you memorized someone’s phone number. Was it way back when, perhaps circa 2001? And when was the last time you were at a dinner party or having a conversation with friends, when you whipped out your smartphone to Google the answer to someone’s question? Probably last week.
Technology changes the way we live our daily lives, the way we learn, and the way we use our faculties of attention — and a growing body of research has suggested that it may have profound effects on our memories (particularly the short-term, or working, memory), altering and in some cases impairing its function.
The implications of a poor working memory for our brain functioning and overall intelligence are difficult to overestimate.
“The depth of our intelligence hinges on our ability to transfer information from working memory, the scratch pad of consciousness, to long-term memory, the mind’s filing system,” Nicholas Carr, author of The Shallows: What The Internet Is Doing To Our Brains, wrote in Wired in 2010. “When facts and experiences enter our long-term memory, we are able to weave them into the complex ideas that give richness to our thought.”
While our long-term memory has a nearly unlimited capacity, the short-term memory has more limited storage, and that storage is very fragile. “A break in our attention can sweep its contents from our mind,” Carr explains.
Meanwhile, new research has found that taking photos — an increasingly ubiquitous practice in our smartphone-obsessed culture — actually hinders our ability to remember that which we’re capturing on camera.
Concerned about premature memory loss? You probably should be. Here are five things you should know about the way technology is affecting your memory.
1. Information overload makes it harder to retain information.
Even a single session of Internet usage can make it more difficult to file away information in your memory, says Erik Fransén, computer science professor at Sweden’s KTH Royal Institute of Technology. And according to Tony Schwartz, productivity expert and author of The Way We’re Working Isn’t Working, most of us aren’t able to effectively manage the overload of information we’re constantly bombarded with. [Continue reading...]
As I pointed out in a recent post, the externalization of intelligence long preceded the creation of smart phones and personal computers. Indeed, it goes all the way back to the beginning of civilization when we first learned how to transform language into a material form as the written word, thereby creating a substitute for memory.
Plato foresaw the consequences of writing.
In Phaedrus, he describes an exchange between the god Thamus, king and ruler of all Egypt, and the god Theuth, who has invented writing. Theuth, who is very proud of what he has created, says: “This invention, O king, will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.” But Thamus points out that while one man has the ability to invent, the ability to judge an invention’s usefulness or harmfulness belongs to another.
If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.
Bedazzled by our ingenuity and its creations, we are fast forgetting the value of this quality that can never be implanted in a machine (or a text): wisdom.
David Perlmutter, MD, writes: While gluten makes up the lion’s share of protein in wheat, research reveals that modern wheat is capable of producing more than 23,000 different proteins, any one of which could trigger a potentially damaging inflammatory response. One protein in particular is wheat germ agglutinin (WGA). WGA is classified as a lectin — a term for a protein produced by an organism to protect itself from predation.
All grains produce lectins, which selectively bind to unique proteins on the surfaces of bacteria, fungi, and insects. These proteins are found throughout the animal kingdom. One protein in particular for which WGA has an extremely high affinity is N-Acetylglucosamine. N-Acetylglucosamine richly adorns the casing of insects and plays an important role in the structure of the cellular walls of bacteria. More importantly, it is a key structural component in humans in a variety of tissues, including tendons, joint surfaces, cartilage, the lining of the entire digestive tract, and even the lining of the hundreds of miles of blood vessels found within each of us.
It is precisely the ability of WGA to bind to proteins lining the gut that raises concern amongst medical researchers. When WGA binds to these proteins, it may leave these cells less well protected against the harmful effects of the gut contents.
WGA may also have direct toxic effects on the heart, endocrine, and immune systems, and even the brain. In fact, so readily does WGA make its way into the brain that scientists are actually testing it as a possible means of delivering medicines in an attempt to treat Alzheimer’s disease.
And again, the concern here is not just for a small segment of the population who happen to have inherited a susceptibility to gluten sensitivity. This is a concern as it relates to all humans. As medical researcher Sayer Ji stated, “What is unique about WGA is that it can do direct damage to the majority of tissues in the human body without requiring a specific set of genetic susceptibilities and/or immune-mediated articulations. This may explain why chronic inflammatory and degenerative conditions are endemic to wheat-consuming populations even when overt allergies or intolerances to wheat gluten appear exceedingly rare.”
The gluten issue is indeed very real and threatening. But it now seems clear that lectin proteins found in wheat may harbor the potential for even more detrimental effects on human health. It is particularly alarming to consider the fact that there is a move to actually genetically modify wheat to enhance its WGA content.
Scientific research is now giving us yet another reason to reconsider the merits of our daily bread. The story of WGA’s potential destructive effects on human health is just beginning to be told. We should embrace the notion that low levels of exposure to any toxin over an extended period can lead to serious health issues. And this may well characterize the under-recognized threat of wheat consumption for all humans.
The New York Times reports: Scientists have found the oldest DNA evidence yet of humans’ biological history. But instead of neatly clarifying human evolution, the finding is adding new mysteries.
In a paper in the journal Nature, scientists reported Wednesday that they had retrieved ancient human DNA from a fossil dating back about 400,000 years, shattering the previous record of 100,000 years.
The fossil, a thigh bone found in Spain, had previously seemed to many experts to belong to a forerunner of Neanderthals. But its DNA tells a very different story. It most closely resembles DNA from an enigmatic lineage of humans known as Denisovans. Until now, Denisovans were known only from DNA retrieved from 80,000-year-old remains in Siberia, 4,000 miles east of where the new DNA was found.
The mismatch between the anatomical and genetic evidence surprised the scientists, who are now rethinking human evolution over the past few hundred thousand years. It is possible, for example, that there are many extinct human populations that scientists have yet to discover. They might have interbred, swapping DNA. Scientists hope that further studies of extremely ancient human DNA will clarify the mystery.
“Right now, we’ve basically generated a big question mark,” said Matthias Meyer, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and a co-author of the new study. [Continue reading...]
The New York Times reports: The genome of a young boy buried at Mal’ta near Lake Baikal in eastern Siberia some 24,000 years ago has turned out to hold two surprises for anthropologists.
The first is that the boy’s DNA matches that of Western Europeans, showing that during the last Ice Age people from Europe had reached farther east across Eurasia than previously supposed. Though none of the Mal’ta boy’s skin or hair survives, his genes suggest he would have had brown hair, brown eyes and freckled skin.
The second surprise is that his DNA also matches a large proportion — about 25 percent — of the DNA of living Native Americans. The first people to arrive in the Americas have long been assumed to have descended from Siberian populations related to East Asians. It now seems that they may be a mixture between the Western Europeans who had reached Siberia and an East Asian population.
The Mal’ta boy was 3 to 4 years old and was buried under a stone slab wearing an ivory diadem, a bead necklace and a bird-shaped pendant. Elsewhere at the same site about 30 Venus figurines were found of the kind produced by the Upper Paleolithic cultures of Europe. The remains were excavated by Russian archaeologists over a 20-year period ending in 1958 and stored in museums in St. Petersburg.
There they lay for some 50 years until they were examined by a team led by Eske Willerslev of the University of Copenhagen. Dr. Willerslev, an expert in analyzing ancient DNA, was seeking to understand the peopling of the Americas by searching for possible source populations in Siberia. He extracted DNA from bone taken from the child’s upper arm, hoping to find ancestry in the East Asian peoples from whom Native Americans are known to be descended.
But the first results were disappointing. The boy’s mitochondrial DNA belonged to the lineage known as U, which is commonly found among the modern humans who first entered Europe about 44,000 years ago. The lineages found among Native Americans are those designated A, B, C, D and X, so the U lineage pointed to contamination of the bone by the archaeologists or museum curators who had handled it, a common problem with ancient DNA projects. “The study was put on low speed for about a year because I thought it was all contamination,” Dr. Willerslev said.
His team proceeded anyway to analyze the nuclear genome, which contains the major part of human inheritance. They were amazed when the nuclear genome also turned out to have partly European ancestry. Examining the genome from a second Siberian grave site, that of an adult who died 17,000 years ago, they found the same markers of European origin. Together, the two genomes indicate that descendants of the modern humans who entered Europe had spread much farther east across Eurasia than had previously been assumed and occupied Siberia during an extremely cold period starting 20,000 years ago that is known as the Last Glacial Maximum.
The other surprise from the Mal’ta boy’s genome was that it matched to both Europeans and Native Americans but not to East Asians. Dr. Willerslev’s interpretation was that the ancestors of Native Americans had already separated from the East Asian population when they interbred with the people of the Mal’ta culture, and that this admixed population then crossed over the Beringian land bridge that then lay between Siberia and Alaska to become a founding population of Native Americans. [Continue reading...]
UCLA Newsroom: Why do the faces of some primates contain so many different colors — black, blue, red, orange and white — that are mixed in all kinds of combinations and often striking patterns while other primate faces are quite plain?
UCLA biologists reported last year on the evolution of 129 primate faces in species from Central and South America. This research team now reports on the faces of 139 Old World African and Asian primate species that have been diversifying over some 25 million years.
With these Old World monkeys and apes, the species that are more social have more complex facial patterns, the biologists found. Species that have smaller group sizes tend to have simpler faces with fewer colors, perhaps because the presence of more color patches in the face results in greater potential for facial variation across individuals within species. This variation could aid in identification, which may be a more difficult task in larger groups.
Species that live in the same habitat with other closely related species tend to have more complex facial patterns, suggesting that complex faces may also aid in species recognition, the life scientists found.
“Humans are crazy for Facebook, but our research suggests that primates have been relying on the face to tell friends from competitors for the last 50 million years and that social pressures have guided the evolution of the enormous diversity of faces we see across the group today,” said Michael Alfaro, an associate professor of ecology and evolutionary biology in the UCLA College of Letters and Science and senior author of the study.
“Faces are really important to how monkeys and apes can tell one another apart,” he said. “We think the color patterns have to do both with the importance of telling individuals of your own species apart from closely related species and for social communication among members of the same species.” [Continue reading...]