Michael Schulson writes: In the 1970s, a young American anthropologist named Michael Dove set out for Indonesia, intending to solve an ethnographic mystery. Then a graduate student at Stanford, Dove had been reading about the Kantu’, a group of subsistence farmers who live in the tropical forests of Borneo. The Kantu’ practise the kind of shifting agriculture known to anthropologists as swidden farming, and to everyone else as slash-and-burn. Swidden farmers usually grow crops in nutrient-poor soil. They use fire to clear their fields, which they abandon at the end of each growing season.
Like other swidden farmers, the Kantu’ would establish new farming sites every year in which to grow rice and other crops. Unlike most other swidden farmers, the Kantu’ choose where to place these fields through a ritualised form of birdwatching. They believe that certain species of bird – the Scarlet-rumped Trogon, the Rufous Piculet, and five others – are the sons-in-law of God. The appearances of these birds guide the affairs of human beings. So, in order to select a site for cultivation, a Kantu’ farmer would walk through the forest until he spotted the right combination of omen birds. And there he would clear a field and plant his crops.
Dove figured that the birds must be serving as some kind of ecological indicator. Perhaps they gravitated toward good soil, or smaller trees, or some other useful characteristic of a swidden site. After all, the Kantu’ had been using bird augury for generations, and they hadn’t starved yet. The birds, Dove assumed, had to be telling the Kantu’ something about the land. But neither he, nor any other anthropologist, had any notion of what that something was.
He followed Kantu’ augurers. He watched omen birds. He measured the size of each household’s harvest. And he became more and more confused. Kantu’ augury is so intricate, so dependent on slight alterations and is-the-bird-to-my-left-or-my-right contingencies that Dove soon found there was no discernible correlation at all between Piculets and Trogons and the success of a Kantu’ crop. The augurers he was shadowing, Dove told me, ‘looked more and more like people who were rolling dice’. [Continue reading...]
Nautilus: Neil Shubin has been going backward his whole life. “I teach anatomy but I want to understand why things look the way they do,” says the paleontologist and professor of organismal biology and anatomy at the University of Chicago. “And to understand the fundamental questions you have to go ever deeper into history. So I have gone backward from humans to fish to planets.”
Shubin, 53, is referring to his two books, Your Inner Fish and The Universe Within, which detail the atoms and molecules, genes and cells, sculpted by evolution into the common bonds of life. In 2004, on Ellesmere Island in the Arctic, Shubin discovered one of the key links in animal evolution, the fish known as Tiktaalik, that, he writes, “was specialized for a rather extraordinary function: it was capable of doing push-ups.”
Shubin and his team learned from Tiktaalik fossils that the big fish with the flat head had a shoulder, elbow, and wrist composed of the same bones as in a human’s upper arm, forearm, and wrist. Tiktaalik used those bones to navigate shallow streams and ponds “and even to flop around on the mudflats along the banks.” Here was the creature from the lagoon that revealed how animals evolved from fish to us. [Continue reading...]
The New York Times reports: Most geneticists agree that Native Americans are descended from Siberians who crossed into America 26,000 to 18,000 years ago via a land bridge over the Bering Strait. But while genetic analysis of modern Native Americans lends support to this idea, strong fossil evidence has been lacking.
Now a nearly complete skeleton of a prehistoric teenage girl, newly discovered in an underwater cave in the Yucatán Peninsula, establishes a clear link between the ancient and modern peoples, scientists say.
Writing in the journal Science, the researchers report that they analyzed mitochondrial DNA — genetic material passed down through the mother — that was extracted from the skeleton’s wisdom tooth by divers. The analysis reveals that the girl, who lived at least 12,000 years ago, belonged to an Asian-derived genetic lineage seen only in Native Americans. [Continue reading...]
The Guardian reports: Scientists have concluded that Neanderthals were not the primitive dimwits they are commonly portrayed to have been.
The view of Neanderthals as club-wielding brutes is one of the most enduring stereotypes in science, but researchers who trawled the archaeological evidence say the image has no basis whatsoever.
They said scientists had fuelled the impression of Neanderthals being less than gifted through scores of theories that purport to explain why they died out while supposedly superior modern humans survived.
Wil Roebroeks at Leiden University in the Netherlands said: “The connotation is generally negative. For instance, after incidents with the Dutch Ajax football hooligans about a week ago, one Dutch newspaper piece pleaded to make football stadiums off-limits for such ‘Neanderthals’.”
The Neanderthals are believed to have lived between roughly 350,000 and 40,000 years ago, their populations spreading from Portugal in the west to the Altai mountains in central Asia in the east. They vanished from the fossil record when modern humans arrived in Europe.
The reasons for the demise of the Neanderthals have long been debated in the scientific community, but many explanations assume that modern humans had a cognitive edge that manifested itself in more cooperative hunting, better weaponry and innovation, a broader diet, or other major advantages.
Roebroeks and his colleague, Dr Paola Villa at the University of Colorado Museum in Boulder, trawled through the archaeological records to look for evidence of modern human superiority that underpinned nearly a dozen theories about the Neanderthals’ demise and found that none of them stood up.
“The explanations make good stories, but the only problem is that there is no archaeology to back them up,” said Roebroeks.
Villa said part of the misunderstanding had arisen because researchers compared Neanderthals with their successors, the modern humans who lived in the Upper Palaeolithic, rather than the humans who lived at the same time. That is like saying people in the 19th century were less intelligent than those in the 21st because they didn’t have laptops and space travel.
“The evidence for cognitive inferiority is simply not there,” said Villa. “What we are saying is that the conventional view of Neanderthals is not true.” The study is published in the journal PLOS ONE. [Continue reading...]
It’s always worth remembering that modernity as it is lived (rather than as it is written about) is nothing more than a name for the present — that point which stands right on the edge of an unknown future. In this sense all humans and other hominids have lived in a modern condition and their innovations have been defined by what was contemporary.
If comparisons can usefully be made between humans and their closest kin at different points in history, rather than judging them on the basis of the artifacts they have created, a more interesting question is how well each has been attuned to the environment that supports them.
That attunement probably cannot be scientifically quantified since in part it would have to be measured through attributes that might leave no physical traces — such as knowledge about the medicinal properties of plants.
Since the arc of human progress has largely been defined by our increasing ability to cut ourselves off from the world in which we live, in terms of environmental attunement, the human of today is less advanced than a Neanderthal.
Phys.org reports: A team of European researchers is suggesting that humans dispersed out of Africa in multiple waves, rather than in just one, and that it occurred much earlier than has been previously thought. In their paper published in Proceedings of the National Academy of Sciences, the group describes how they built migration models based on gene flow and skull characteristics to predict human migration out of Africa.
Scientists have generally agreed that humans first migrated out of Africa 40,000 to 70,000 years ago, culminating in settlements that span the globe. That estimate has been rocked in recent years, however, by discoveries of stone artifacts in the Arabian Desert that date back at least 100,000 years (close to the time that modern humans were thought to have arisen). In this new effort, the researchers have expanded on the idea that humans may have left Africa sooner than most had thought, and that it likely happened via multiple routes, rather than just one.
The models the team built took into account genetic dispersal and human skull shape—they created four possible model scenarios of migration—two that showed a single path out of Africa and two that showed multiple paths. The first of the single migration paths involved people traveling north along the Nile valley then turning right when they hit the Mediterranean Sea. The second involved people meandering along the Arabian Peninsula until making their way to Asia. The multi-path migration models involved people marching out of Africa along several paths, both north and south of the Arabian Peninsula. [Continue reading...]
Nature Communications reports: The gut microbiota is responsible for many aspects of human health and nutrition, but most studies have focused on “western” populations. An international collaboration of researchers, including researchers of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, has for the first time analysed the gut microbiota of a modern hunter-gatherer community, the Hadza of Tanzania. The results of this work show that Hadza harbour a unique microbial profile with features yet unseen in any other human group, supporting the notion that Hadza gut bacteria play an essential role in adaptation to a foraging subsistence pattern. The study further shows how the intestinal flora may have helped our ancestors adapt and survive during the Paleolithic.
Bacterial populations have co-evolved with humans over millions of years, and have the potential to help us adapt to new environments and foods. Studies of the Hadza offer an especially rare opportunity for scientists to learn how humans survive by hunting and gathering, in the same environment and using similar foods as our ancestors did.
The research team, composed of anthropologists, microbial ecologists, molecular biologists, and analytical chemists, and led in part by Stephanie Schnorr and Amanda Henry of the Max Planck Institute for Evolutionary Anthropology, compared the Hadza gut microbiota to that of urban-living Italians, representative of a “westernized” population. Their results, published recently in Nature Communications, show that the Hadza have a more diverse gut microbe ecosystem, i.e. more bacterial species compared to the Italians. “This is extremely relevant for human health”, says Stephanie Schnorr. “Several diseases emerging in industrialized countries, like IBS, colorectal cancer, obesity, type II diabetes, Crohn’s disease and others, are significantly associated with a reduction in gut microbial diversity.” [Continue reading...]
Jeff Leach recently accompanied some Hadza hunters and observed the way they handled a recently killed adult Impala: Before the two Hadza men I was with jumped in to help skin and gut the Impala, I quickly took swabs of each of their hands (and 1 hour after, 3 hours after, and so on) to assess how the skin (palm) microbiota change throughout the day/week of a typical Hadza (We’ve sampled the hands [and stools] of 150+ Hadza men, women, and children so far). As they slowly and methodically dismembered the animal, they carefully placed the stomach and its still steaming contents on the fleshy side of the recently removed hide. In a separate area, they piled the fatty internal organs (which, by the way, only men are allowed to eat). Once the animal had been more or less processed, I was amazed to see all three men take a handful of the partially digested plant material from the recently removed stomach to scrub off the copious amounts of blood that now covered their hands and forearms. This was followed by a final “cleaning” with dry grass for good measure.
While I was fascinated by the microbe-laden stomach contents being used as hand scrubber – presumably transferring an extraordinary diversity of microbes from the Impala gut to the hands of the Hadza – I was not prepared for what they did next. Once they had cleaned out – by hand – the contents of the stomach (“cleaned” is a generous word), they carved pieces of the stomach into bite-sized chunks and consumed it sushi-style. By which I mean they didn’t cook it or attempt to kill or eliminate the microbes from the gut of the Impala in any way. And if this unprecedented transfer of microbes from the skin, blood, and stomach of another mammal wasn’t enough, they then turned their attention to the colon of the Impala.
After removing the poo pellets (which we collect samples of as well), they tossed the tubular colon onto a hastily built fire. However, it only sat on the fire for a minute at best and clearly not long enough to terminate the menagerie of invisible microbes clinging to the inside wall of the colon. They proceeded to cut the colon into chunks and eat it more or less raw. For myself, I politely turned down offers to taste either the raw stomach or the partially cooked colon – but did eat some tasty Impala ribs I thoroughly turned on a stick over the fire to a microbial-free state of well done.
The Hadza explained that this is what they always do, and have always done (though I suspect sushi-style eating of innards is not an every-kill ritual. But….). Whether it’s an Impala, Dik Dik, Zebra, bush pig, Kudu or any other of the myriad of mammals they hunt and eat, becoming one with the deceased’s microbes in any number of ways is commonplace – same goes for the 700-plus species of birds they hunt (minus abundant amounts of stomach contents for hand sanitizer!). While less obvious than at the “kill site,” the transfer of microbes continued back in camp when women, children and other men handled the newly arrived raw meat, internal organs, and skin. The transfer continued as the hunters engaged (touching) other members of the camp.
The breathtaking exchange (horizontal transfer) of microbes between the Hadza and their environment is more or less how it’s been for eons until humans started walling ourselves off from the microbial world through the many facets of globalization. Rather than think of ourselves as isolated islands of microbes, the Hadza teach us that we are better thought of as an archipelago of islands, once seamlessly connected to one another and to a larger metacommunity of microbes via a microbial super highway that runs through the gut and skin/feathers of every animal and water source on the landscape (for those of you keeping up with your homework, this is Macroecology 101). The same can be said for plants and their extraordinary diversity of microbes above (phyllosphere) and below ground (rhizosphere) that the Hadza, and once all humans, interacted with on a nearly continuous basis.
Christian Science Monitor: The first experiment in “melting pot” politics in North America appears to have emerged nearly 1,000 years ago in the bottom lands of the Mississippi River near today’s St. Louis, according to archaeologists piecing together the story of the rise and fall of the native American urban complex known as Cahokia.
During its heyday, Cahokia’s population reached an estimated 20,000 people – a level the continent north of the Rio Grande wouldn’t see again until the eve of the American Revolution and the growth of New York and Philadelphia.
Cahokia’s ceremonial center, seven miles northeast of St. Louis’s Gateway Arch, boasted 120 earthen mounds, including a broad, tiered mound some 10 stories high. In East St. Louis, one of two major satellites hosted another 50 earthen mounds, as well as residences. St. Louis hosted another 26 mounds and associated dwellings.
These are three of the four largest native-American mound centers known, “all within spitting distance of one another,” says Thomas Emerson, Illinois State Archaeologist and a member of a team testing the melting-pot idea. “That’s some kind of large, integrated complex to some degree.”
Where did all those people come from? Archaeologists have been debating that question for years, Dr. Emerson says. Unfortunately, the locals left no written record of the complex’s history. Artifacts such as pottery, tools, or body ornaments give an ambiguous answer.
Artifacts from Cahokia have been found in other native-American centers from Arkansas and northern Louisiana to Oklahoma, Iowa, and Wisconsin, just as artifacts from these areas appear in digs at Cahokia.
“Archaeologists are always struggling with this: Are artifacts moving, or are people moving?” Emerson says.
Emerson and two colleagues at the University of Illinois at Urbana-Champaign tried to tackle the question using two isotopes of the element strontium found in human teeth. They discovered that throughout the 300 years that native Americans occupied Cahokia, the complex appeared to receive a steady stream of immigrants who stayed. [Continue reading...]
For those of us who see industrial civilization as the guarantor of humanity’s destruction, it’s easy to picture an idyllic era earlier in our evolution, located perhaps during the cultural flowering of the Great Leap Forward.
Communities then remained relatively egalitarian without workers enslaved in back-breaking labor, while subsistence on few material resources meant that time was neither controlled by the dictates of a stratified social hierarchy nor by the demands of survival.
When people could accord as much value to storytelling, ritual, and music-making, as they did to hunting and gathering food, we might like to think that human beings were living in balance with nature.
As George Monbiot reveals, the emerging evidence about our early ancestors paints a much grimmer picture — one in which human nature appears to have always been profoundly destructive.
You want to know who we are? Really? You think you do, but you will regret it. This article, if you have any love for the world, will inject you with a venom – a soul-scraping sadness – without an obvious antidote.
The Anthropocene, now a popular term among scientists, is the epoch in which we live: one dominated by human impacts on the living world. Most date it from the beginning of the industrial revolution. But it might have begun much earlier, with a killing spree that commenced two million years ago. What rose onto its hind legs on the African savannahs was, from the outset, death: the destroyer of worlds.
Before Homo erectus, perhaps our first recognisably human ancestor, emerged in Africa, the continent abounded with monsters. There were several species of elephants. There were sabretooths and false sabretooths, giant hyenas and creatures like those released in The Hunger Games: amphicyonids, or bear dogs, vast predators with an enormous bite.
Prof Blaire van Valkenburgh has developed a means by which we could roughly determine how many of these animals there were. When there are few predators and plenty of prey, the predators eat only the best parts of the carcass. When competition is intense, they eat everything, including the bones. The more bones a carnivore eats, the more likely its teeth are to be worn or broken. The breakages in carnivores’ teeth were massively greater in the pre-human era.
Not only were there more species of predators, including species much larger than any found on Earth today, but they appear to have been much more abundant – and desperate. We evolved in a terrible, wonderful world – that was no match for us. [Continue reading...]
University of New England, Australia: We humans like to think of ourselves as unique for many reasons, not least of which being our ability to communicate with words. But ground-breaking research by an expert from the University of New England shows that our ‘misunderstood cousins,’ the Neanderthals, may well have spoken in languages not dissimilar to the ones we use today.
Pinpointing the origin and evolution of speech and human language is one of the longest running and most hotly debated topics in the scientific world. It has long been believed that other beings, including the Neanderthals with whom our ancestors shared the Earth for thousands of years, simply lacked the necessary cognitive capacity and vocal hardware for speech.
Associate Professor Stephen Wroe, a zoologist and palaeontologist from UNE, working with an international team of scientists and using 3D x-ray imaging technology, made the revolutionary discovery that challenges this notion, based on a 60,000-year-old Neanderthal hyoid bone discovered in Israel in 1989.
“To many, the Neanderthal hyoid discovered was surprising because its shape was very different to that of our closest living relatives, the chimpanzee and the bonobo. However, it was virtually indistinguishable from that of our own species. This led to some people arguing that this Neanderthal could speak,” A/Professor Wroe said.
“The obvious counterargument to this assertion was that the fact that hyoids of Neanderthals were the same shape as modern humans doesn’t necessarily mean that they were used in the same way. With the technology of the time, it was hard to verify the argument one way or the other.”
However advances in 3D imaging and computer modelling allowed A/Professor Wroe’s team to revisit the question.
“By analysing the mechanical behaviour of the fossilised bone with micro x-ray imaging, we were able to build models of the hyoid that included the intricate internal structure of the bone. We then compared them to models of modern humans. Our comparisons showed that in terms of mechanical behaviour, the Neanderthal hyoid was basically indistinguishable from our own, strongly suggesting that this key part of the vocal tract was used in the same way.
“From this research, we can conclude that it’s likely that the origins of speech and language are far, far older than once thought.”
The ability to discern the emotions of others provides the foundation for emotional intelligence. How well-developed this faculty is seems to have little to do with the strength of other markers of intelligence; indeed, as a new study seems to imply, there may be little reason to see in emotional intelligence much that is uniquely human.
Scientific American: [A]lthough dogs have the capacity to understand more than 100 words, studies have demonstrated Fido can’t really speak human languages or comprehend them with the same complexity that we do. Yet researchers have now discovered that dog and human brains process the vocalizations and emotions of others more similarly than previously thought. The findings suggest that although dogs cannot discuss relativity theory with us, they do seem to be wired in a way that helps them to grasp what we feel by attending to the sounds we make.
To compare active human and dog brains, postdoctoral researcher Attila Andics and his team from the MTA-ELTE Comparative Ethology Research Group in Hungary trained 11 dogs to lie still in an fMRI brain scanner for several six-minute intervals so that the researchers could perform the same experiment on both human and canine participants. Both groups listened to almost two hundred dog and human sounds — from whining and crying to laughter and playful barking — while the team scanned their brain activity.
The resulting study, published in Current Biology today, reveals both that dog brains have voice-sensitive regions and that these neurological areas resemble those of humans. Sharing similar locations in both species, they process voices and emotions of other individuals similarly. Both groups respond with greater neural activity when they listen to voices reflecting positive emotions such as laughing than to negative sounds that include crying or whining. Dogs and people, however, respond more strongly to the sounds made by their own species. “Dogs and humans meet in a very similar social environment but we didn’t know before just how similar the brain mechanisms are to process this social information,” Andics says. [Continue reading...]
The Telegraph reports: Footprints left behind by what may be one of our first human ancestors to arrive in Britain have been discovered on a beach in Norfolk.
The preserved tracks, which consisted of 49 imprints in a soft sedimentary rock, are believed to be around 900,000 years old and could transform scientists’ understanding of how early humans moved around the world.
The footprints were found in what scientists have described as a “million to one” discovery last summer when heavy seas washed sand off the foreshore in Happisburgh, Norfolk.
The find has only now been made public and is thought to be the oldest evidence of early humans in northern Europe yet discovered. [Continue reading...]
To an adult, most of the connotations of home seem positive: safety, stability, familiarity, comfort, nurturing. Yet as Ian Tattersall points out, to be tied to one place in a changing environment marked a turning point in human evolution — the juncture at which we placed ourselves in opposition to nature.
Archaeologists begin to see proto-houses during the Ice Age, some 15,000 years ago. Hunter-gatherers at the Ukrainian site of Mezhirich built four oval-to-circular huts that ranged from 120 to 240 square feet in area, and were clad in tons of mammoth bones. Out there on the treeless tundra, their occupants would have cooperated in hunting reindeer and other grazers that migrated seasonally through the area. The Mezhirich people dug pits in the permafrost that acted as natural “freezers” to preserve their meat and let them spend several months at a time in the “village.” With so much labor invested in the construction of their houses, it is hard to imagine that the Mezhirich folk did not somehow feel “at home” there.
But if an archaeologist had to pick an example of the earliest structures that most resembled our modern idea of home, it would probably be the round houses built by the semi-sedentary Natufians, an ancient people who lived around the eastern end of the Mediterranean Sea (Israel, Syria, and environs) at the end of the last Ice Age, some 12,000 years ago. A typical Natufian village consisted of several circular huts each measuring about 10 to 20 feet in diameter; these villages testify to a revolutionary change in human living arrangements. Finally, people were regularly living in semi-permanent settlements, in which the houses were clearly much more than simple shelters against the elements. The Natufians were almost certainly witness to a dramatic change in society.
The end of the Ice Age was a time of transition from a hunter-gatherer mode of subsistence to an agricultural way of life. But it also involved a Faustian bargain. Adopting a fixed residence went hand-in-hand with cultivating fields and domesticating animals. It allowed families to grow, providing additional labor to till the fields. But becoming dependent on the crops they grew meant that people found themselves in opposition to the environment: The rain didn’t fall and the sun didn’t shine at the farmers’ convenience. They locked themselves into a lifestyle, and to make the field continuously productive to feed their growing families, they had to modify their landscape.
Suzanne Moore writes: The last time I put my own atheism through the spin cycle rather than simply wiping it clean was when I wanted to make a ceremony after the birth of my third child. Would it be a blessing? From who? What does the common notion of a new baby as a gift mean? How would we make it meaningful to the people we invited who were from different faiths? And, importantly, what would it look like?
One of the problems I have with the New Atheism is that it fixates on ethics, ignoring aesthetics at its peril. It tends also towards atomisation, relying on abstracts such as “civic law” to conjure a collective experience. But I love ritual, because it is through ritual that we remake and strengthen our social bonds. As I write, down the road there is a memorial being held for Lou Reed, hosted by the local Unitarian church. Most people there will have no belief in God but will feel glad to be part of a shared appreciation of a man whose god was rock’n’roll.
When it came to making a ceremony, I really did not want the austerity of some humanist events I have attended, where I feel the sensual world is rejected. This is what I mean about aesthetics. Do we cede them to the religious and just look like a bunch of Calvinists? I found myself turning to flowers, flames and incense. Is there anything more beautiful than the offerings made all over the world, of tiny flames and blossom on leaves floating on water?
Already, I am revealing a kind of neo-paganism that hardcore rationalists will find unacceptable. But they find most human things unacceptable. For me, not believing in God does not mean one has to forgo poetry, magic, the chaos of ritual, the remaking of shared bonds. I fear ultra-orthodox atheism has come to resemble a rigid and patriarchal faith itself. [Continue reading...]
The New York Times reports: Early in the 20th century, two brothers discovered a nearly complete Neanderthal skeleton in a pit inside a cave at La Chapelle-aux-Saints, in southwestern France. The discovery raised the possibility that these evolutionary relatives of ours intentionally buried their dead — at least 50,000 years ago, before the arrival of anatomically modern humans in Europe.
These and at least 40 subsequent discoveries, a few as far from Europe as Israel and Iraq, appeared to suggest that Neanderthals, long thought of as brutish cave dwellers, actually had complex funeral practices. Yet a significant number of researchers have since objected that the burials were misinterpreted, and might not represent any advance in cognitive and symbolic behavior.
Now an international team of scientists is reporting that a 13-year re-examination of the burials at La Chapelle-aux-Saints supports the earlier claims that the burials were intentional.
The researchers — archaeologists, geologists and paleoanthropologists — not only studied the skeleton from the original excavations, but found more Neanderthal remains, from two children and an adult. They also studied the bones of other animals in the cave, mainly bison and reindeer, and the geology of the burial pits.
The findings, in this week’s issue of Proceedings of the National Academy of Sciences, “buttress claims for complex symbolic behavior among Western European Neanderthals,” the scientists reported.
William Rendu, the paper’s lead author and a researcher at the Center for International Research in the Humanities and Social Sciences in New York, said in an interview that the geology of the burial pits “cannot be explained by natural events” and that “there is no sign of weathering and scavenging by animals,” which means the bodies were covered soon after death.
“While we cannot know if this practice was part of a ritual or merely pragmatic,” Dr. Rendu said in a statement issued by New York University, “the discovery reduces the behavioral distance between them and us.” [Continue reading...]
Dominique Mosbergen writes: Researchers from the University of Adelaide in Australia argue in an upcoming book, The Dynamic Human, that humans really aren’t much smarter than other creatures — and that some animals may actually be brighter than we are.
“For millennia, all kinds of authorities — from religion to eminent scholars — have been repeating the same idea ad nauseam, that humans are exceptional by virtue that they are the smartest in the animal kingdom,” the book’s co-author Dr. Arthur Saniotis, a visiting research fellow with the university’s School of Medical Sciences, said in a written statement. “However, science tells us that animals can have cognitive faculties that are superior to human beings.”
Not to mention, ongoing research on intelligence and primate brain evolution backs the idea that humans aren’t the cleverest creatures on Earth, co-author Dr. Maciej Henneberg, a professor also at the School of Medical Sciences, told The Huffington Post in an email.
The researchers said the belief in the superiority of human intelligence can be traced back around 10,000 years to the Agricultural Revolution, when humans began domesticating animals. The idea was reinforced with the advent of organized religion, which emphasized human beings’ superiority over other creatures. [Continue reading...]
At various times in my life, I’ve crossed paths with people possessing immense wealth and power, providing me with glimpses of the mindset of those who regard themselves as the most important people on this planet.
From what I can tell, the concentration of great power does not coincide with the expression of great intelligence. What is far more evident is a great sense of entitlement, which is to say a self-validating sense that power rests where power belongs and that the inequality in its distribution is a reflection of some kind of natural order.
Since this self-serving perception of hierarchical order operates among humans and since humans as a species wield so much more power than any other, it’s perhaps not surprising that we exhibit the same kind of hubris collectively that we see individually in the most dominant among us.
Nevertheless, it is becoming increasingly clear that our sense of superiority is rooted in ignorance.
Amit Majmudar writes: There may come a time when we cease to regard animals as inferior, preliminary iterations of the human—with the human thought of as the pinnacle of evolution so far—and instead regard all forms of life as fugue-like elaborations of a single musical theme.
Animals are routinely superhuman in one way or another. They outstrip us in this or that perceptual or physical ability, and we think nothing of it. It is only our kind of superiority (in the use of tools, basically) that we select as the marker of “real” superiority. A human being with an elephant’s hippocampus would end up like Funes the Memorious in the story by Borges; a human being with a dog’s olfactory bulb would become a Vermeer of scent, but his art would be lost on the rest of us, with our visually dominated brains. The poetry of the orcas is yet to be translated; I suspect that the whale sagas will have much more interesting things in them than the tablets and inscriptions of Sumer and Akkad.
If science should ever persuade people of this biological unity, it would be of far greater benefit to the species than penicillin or cardiopulmonary bypass; of far greater benefit to the planet than the piecemeal successes of environmental activism. We will have arrived, by study and reasoning, at the intuitive, mystical insights of poets.
The pursuit of artificial intelligence has been driven by the assumption that if human intelligence can be replicated or surpassed by machines, then this accomplishment will in various ways serve the human good. At the same time, thanks to the technophobia promoted in some dystopian science fiction, there is a popular fear that if machines become smarter than people we will end up becoming their slaves.
It turns out that even if there are some irrational fears wrapped up in technophobia, there are good reasons to regard computing devices as a threat to human intelligence.
It’s not that we are creating machines that harbor evil designs to take over the world, but simply that each time we delegate a function of the brain to an external piece of circuitry, our mental faculties inevitably atrophy.
Use it or lose it applies just as much to the brain as it does to any other part of the body.
Carolyn Gregoire writes: Take a moment to think about the last time you memorized someone’s phone number. Was it way back when, perhaps circa 2001? And when was the last time you were at a dinner party or having a conversation with friends, when you whipped out your smartphone to Google the answer to someone’s question? Probably last week.
Technology changes the way we live our daily lives, the way we learn, and the way we use our faculties of attention — and a growing body of research has suggested that it may have profound effects on our memory (particularly short-term, or working, memory), altering and in some cases impairing its function.
The implications of a poor working memory for our brain functioning and overall intelligence are difficult to overstate.
“The depth of our intelligence hinges on our ability to transfer information from working memory, the scratch pad of consciousness, to long-term memory, the mind’s filing system,” Nicholas Carr, author of The Shallows: What The Internet Is Doing To Our Brains, wrote in Wired in 2010. “When facts and experiences enter our long-term memory, we are able to weave them into the complex ideas that give richness to our thought.”
While our long-term memory has a nearly unlimited capacity, the short-term memory has more limited storage, and that storage is very fragile. “A break in our attention can sweep its contents from our mind,” Carr explains.
Meanwhile, new research has found that taking photos — an increasingly ubiquitous practice in our smartphone-obsessed culture — actually hinders our ability to remember that which we’re capturing on camera.
Concerned about premature memory loss? You probably should be. Here are five things you should know about the way technology is affecting your memory.
1. Information overload makes it harder to retain information.
Even a single session of Internet usage can make it more difficult to file away information in your memory, says Erik Fransén, computer science professor at Sweden’s KTH Royal Institute of Technology. And according to Tony Schwartz, productivity expert and author of The Way We’re Working Isn’t Working, most of us aren’t able to effectively manage the overload of information we’re constantly bombarded with. [Continue reading...]
As I pointed out in a recent post, the externalization of intelligence long preceded the creation of smart phones and personal computers. Indeed, it goes all the way back to the beginning of civilization when we first learned how to transform language into a material form as the written word, thereby creating a substitute for memory.
Plato foresaw the consequences of writing.
In Phaedrus, he describes an exchange between the god Thamus, king and ruler of all Egypt, and the god Theuth, who has invented writing. Theuth, who is very proud of what he has created, says: “This invention, O king, will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.” But Thamus points out that while one man has the ability to invent, the ability to judge an invention’s usefulness or harmfulness belongs to another.
If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.
Bedazzled by our ingenuity and its creations, we are fast forgetting the value of this quality that can never be implanted in a machine (or a text): wisdom.
David Perlmutter, MD writes: While gluten makes up the lion’s share of protein in wheat, research reveals that modern wheat is capable of producing more than 23,000 different proteins, any one of which could trigger a potentially damaging inflammatory response. One protein in particular is wheat germ agglutinin (WGA). WGA is classified as a lectin — a term for a protein produced by an organism to protect itself from predation.
All grains produce lectins, which selectively bind to unique proteins on the surfaces of bacteria, fungi, and insects. These proteins are found throughout the animal kingdom. One protein in particular for which WGA has an extremely high affinity is N-Acetylglucosamine. N-Acetylglucosamine richly adorns the casing of insects and plays an important role in the structure of the cellular walls of bacteria. More importantly, it is a key structural component in humans in a variety of tissues, including tendons, joint surfaces, cartilage, the lining of the entire digestive tract, and even the lining of the hundreds of miles of blood vessels found within each of us.
It is precisely the ability of WGA to bind to proteins lining the gut that raises concern amongst medical researchers. When WGA binds to these proteins, it may leave these cells less well protected against the harmful effects of the gut contents.
WGA may also have direct toxic effects on the heart, endocrine, and immune systems, and even the brain. In fact, so readily does WGA make its way into the brain that scientists are actually testing it as a possible means of delivering medicines in an attempt to treat Alzheimer’s disease.
And again, the concern here is not just for the small segment of the population who happen to inherit a susceptibility to gluten sensitivity. This is a concern as it relates to all humans. As medical researcher Sayer Ji stated, “What is unique about WGA is that it can do direct damage to the majority of tissues in the human body without requiring a specific set of genetic susceptibilities and/or immune-mediated articulations. This may explain why chronic inflammatory and degenerative conditions are endemic to wheat-consuming populations even when overt allergies or intolerances to wheat gluten appear exceedingly rare.”
The gluten issue is indeed very real and threatening. But it now seems clear that lectin proteins found in wheat may harbor the potential for even more detrimental effects on human health. It is particularly alarming to consider the fact that there is a move to actually genetically modify wheat to enhance its WGA content.
Scientific research is now giving us yet another reason to reconsider the merits of our daily bread. The story of WGA’s potential destructive effects on human health is just beginning to be told. We should embrace the notion that low levels of exposure to any toxin over an extended period can lead to serious health issues. And this may well characterize the under-recognized threat of wheat consumption for all humans.
The New York Times reports: Scientists have found the oldest DNA evidence yet of humans’ biological history. But instead of neatly clarifying human evolution, the finding is adding new mysteries.
In a paper in the journal Nature, scientists reported Wednesday that they had retrieved ancient human DNA from a fossil dating back about 400,000 years, shattering the previous record of 100,000 years.
The fossil, a thigh bone found in Spain, had previously seemed to many experts to belong to a forerunner of Neanderthals. But its DNA tells a very different story. It most closely resembles DNA from an enigmatic lineage of humans known as Denisovans. Until now, Denisovans were known only from DNA retrieved from 80,000-year-old remains in Siberia, 4,000 miles east of where the new DNA was found.
The mismatch between the anatomical and genetic evidence surprised the scientists, who are now rethinking human evolution over the past few hundred thousand years. It is possible, for example, that there are many extinct human populations that scientists have yet to discover. They might have interbred, swapping DNA. Scientists hope that further studies of extremely ancient human DNA will clarify the mystery.
“Right now, we’ve basically generated a big question mark,” said Matthias Meyer, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and a co-author of the new study. [Continue reading...]