Hunter-gatherers have healthier guts than urban dwellers

Nature Communications reports: The gut microbiota is responsible for many aspects of human health and nutrition, but most studies have focused on “western” populations. An international collaboration of researchers, including researchers of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, has for the first time analysed the gut microbiota of a modern hunter-gatherer community, the Hadza of Tanzania. The results of this work show that Hadza harbour a unique microbial profile with features yet unseen in any other human group, supporting the notion that Hadza gut bacteria play an essential role in adaptation to a foraging subsistence pattern. The study further shows how the intestinal flora may have helped our ancestors adapt and survive during the Paleolithic.

Bacterial populations have co-evolved with humans over millions of years, and have the potential to help us adapt to new environments and foods. Studies of the Hadza offer an especially rare opportunity for scientists to learn how humans survive by hunting and gathering, in the same environment and using similar foods as our ancestors did.

The research team, composed of anthropologists, microbial ecologists, molecular biologists, and analytical chemists, and led in part by Stephanie Schnorr and Amanda Henry of the Max Planck Institute for Evolutionary Anthropology, compared the Hadza gut microbiota to that of urban-living Italians, representative of a “westernized” population. Their results, published recently in Nature Communications, show that the Hadza have a more diverse gut microbial ecosystem, i.e. more bacterial species, compared to the Italians. “This is extremely relevant for human health”, says Stephanie Schnorr. “Several diseases emerging in industrialized countries, like IBS, colorectal cancer, obesity, type II diabetes, Crohn’s disease and others, are significantly associated with a reduction in gut microbial diversity.” [Continue reading...]
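
To make the diversity comparison concrete, here is a minimal sketch, in Python, of the Shannon index, one common way microbiologists quantify how many bacterial species a gut sample contains and how evenly they are represented. The counts below are invented for illustration and are not data from the study.

```python
# Illustrative only: compare alpha diversity of two hypothetical stool samples
# using the Shannon index, one common measure of gut microbial diversity.
import math

def shannon_index(otu_counts):
    """Shannon diversity H = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    total = sum(otu_counts)
    return -sum((c / total) * math.log(c / total) for c in otu_counts if c > 0)

# Made-up operational taxonomic unit (OTU) counts, NOT data from the study.
forager_sample = [120, 95, 80, 60, 55, 40, 33, 25, 18, 12, 9, 6, 4, 3]
urban_sample   = [400, 250, 90, 30, 10, 5]

print(f"forager H = {shannon_index(forager_sample):.2f}")  # higher: more taxa, more evenly spread
print(f"urban   H = {shannon_index(urban_sample):.2f}")    # lower: fewer, more skewed taxa
```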

Jeff Leach recently accompanied some Hadza hunters and observed the way they handled a recently killed adult Impala: Before the two Hadza men I was with jumped in to help skin and gut the Impala, I quickly took swabs of each of their hands (and again 1 hour after, 3 hours after, and so on) to assess how the skin (palm) microbiota change throughout the day/week of a typical Hadza (we’ve sampled the hands [and stools] of 150+ Hadza men, women, and children so far). As they slowly and methodically dismembered the animal, they carefully placed the stomach and its still-steaming contents on the fleshy side of the recently removed hide. In a separate area, they piled the fatty internal organs (which, by the way, only men are allowed to eat). Once the animal had been more or less processed, I was amazed to see all three men take a handful of the partially digested plant material from the recently removed stomach to scrub off the copious amounts of blood that now covered their hands and forearms. This was followed by a final “cleaning” with dry grass for good measure.

While I was fascinated by the microbe-laden stomach contents being used as a hand scrubber – presumably transferring an extraordinary diversity of microbes from the Impala gut to the hands of the Hadza – I was not prepared for what they did next. Once they had cleaned out – by hand – the contents of the stomach (“cleaned” is a generous word), they carved pieces of the stomach into bite-sized chunks and consumed them sushi-style. By which I mean they didn’t cook it or attempt to kill or eliminate the microbes from the gut of the Impala in any way. And if this unprecedented transfer of microbes from the skin, blood, and stomach of another mammal wasn’t enough, they then turned their attention to the colon of the Impala.

After removing the poo pellets (which we collect samples of as well), they tossed the tubular colon onto a hastily built fire. However, it only sat on the fire for a minute at best, clearly not long enough to terminate the menagerie of invisible microbes clinging to the inside wall of the colon. They proceeded to cut the colon into chunks and eat them more or less raw. For my part, I politely turned down offers to taste either the raw stomach or the partially cooked colon – but did eat some tasty Impala ribs that I turned thoroughly on a stick over the fire to a microbe-free state of well done.

The Hadza explained that this is what they always do, and have always done (though I suspect sushi-style eating of innards is not an every-kill ritual. But….). Whether it’s an Impala, Dik Dik, Zebra, bush pig, Kudu or any other of the myriad mammals they hunt and eat, becoming one with the deceased’s microbes in any number of ways is commonplace – the same goes for the 700-plus species of birds they hunt (minus abundant amounts of stomach contents for hand sanitizer!). While less obvious than at the “kill site,” the transfer of microbes continued back in camp when women, children and other men handled the newly arrived raw meat, internal organs, and skin. The transfer continued as the hunters engaged (touched) other members of the camp.

The breathtaking exchange (horizontal transfer) of microbes between the Hadza and their environment is more or less how it’s been for eons until humans started walling ourselves off from the microbial world through the many facets of globalization. Rather than think of ourselves as isolated islands of microbes, the Hadza teach us that we are better thought of as an archipelago of islands, once seamlessly connected to one another and to a larger metacommunity of microbes via a microbial super highway that runs through the gut and skin/feathers of every animal and water source on the landscape (for those of you keeping up with your homework, this is Macroecology 101). The same can be said for plants and their extraordinary diversity of microbes above (phyllosphere) and below ground (rhizosphere) that the Hadza, and once all humans, interacted with on a nearly continuous basis.


Cahokia: North America’s first melting pot?

Christian Science Monitor: The first experiment in “melting pot” politics in North America appears to have emerged nearly 1,000 years ago in the bottom lands of the Mississippi River near today’s St. Louis, according to archaeologists piecing together the story of the rise and fall of the native American urban complex known as Cahokia.

During its heyday, Cahokia’s population reached an estimated 20,000 people – a level the continent north of the Rio Grande wouldn’t see again until the eve of the American Revolution and the growth of New York and Philadelphia.

Cahokia’s ceremonial center, seven miles northeast of St. Louis’s Gateway Arch, boasted 120 earthen mounds, including a broad, tiered mound some 10 stories high. In East St. Louis, one of two major satellites hosts another 50 earthen mounds, as well as residences. St. Louis hosted another 26 mounds and associated dwellings.

These are three of the four largest native-American mound centers known, “all within spitting distance of one another,” says Thomas Emerson, Illinois State Archaeologist and a member of a team testing the melting-pot idea. “That’s some kind of large, integrated complex to some degree.”

Where did all those people come from? Archaeologists have been debating that question for years, Dr. Emerson says. Unfortunately, the locals left no written record of the complex’s history. Artifacts such as pottery, tools, or body ornaments give an ambiguous answer.

Artifacts from Cahokia have been found in other native-American centers from Arkansas and northern Louisiana to Oklahoma, Iowa, and Wisconsin, just as artifacts from these areas appear in digs at Cahokia.

“Archaeologists are always struggling with this: Are artifacts moving, or are people moving?” Emerson says.

Emerson and two colleagues at the University of Illinois at Urbana-Champaign tried to tackle the question using two isotopes of the element strontium found in human teeth. They discovered that throughout the 300 years that native Americans occupied Cahokia, the complex appeared to receive a steady stream of immigrants who stayed. [Continue reading...]
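
The underlying logic of that strontium test is simple enough to sketch. Tooth enamel records the strontium isotope ratio of the food and water a person consumed in childhood, so individuals whose enamel falls outside the local baseline range likely grew up somewhere else. The ratios and baseline below are invented for illustration, not values from the Cahokia work.

```python
# Sketch of the general logic behind strontium isotope provenance studies:
# tooth enamel locks in the 87Sr/86Sr ratio of childhood food and water, so
# values outside the local baseline range suggest an immigrant.
# All numbers below are hypothetical, not data from the Cahokia study.

LOCAL_BASELINE = (0.7090, 0.7105)  # hypothetical local 87Sr/86Sr range

burials = {
    "burial_01": 0.7094,
    "burial_02": 0.7131,   # outside the local range -> likely grew up elsewhere
    "burial_03": 0.7101,
    "burial_04": 0.7082,   # also outside the range
}

for name, ratio in burials.items():
    local = LOCAL_BASELINE[0] <= ratio <= LOCAL_BASELINE[1]
    print(f"{name}: 87Sr/86Sr = {ratio:.4f} -> {'local' if local else 'non-local'}")

n_nonlocal = sum(not (LOCAL_BASELINE[0] <= r <= LOCAL_BASELINE[1]) for r in burials.values())
print(f"{n_nonlocal}/{len(burials)} individuals flagged as immigrants")
```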


Throughout our existence humans have always been the most destructive creatures to roam this planet


For those of us who see industrial civilization as the guarantor of humanity’s destruction, it’s easy to picture an idyllic era earlier in our evolution, located perhaps during the cultural flowering of the Great Leap Forward.

Communities then remained relatively egalitarian without workers enslaved in back-breaking labor, while subsistence on few material resources meant that time was neither controlled by the dictates of a stratified social hierarchy nor by the demands of survival.

When people could accord as much value to storytelling, ritual, and music-making as they did to hunting and gathering food, we might like to think that human beings were living in balance with nature.

As George Monbiot reveals, the emerging evidence about our early ancestors paints a much grimmer picture — one in which human nature appears to have always been profoundly destructive.

You want to know who we are? Really? You think you do, but you will regret it. This article, if you have any love for the world, will inject you with a venom – a soul-scraping sadness – without an obvious antidote.

The Anthropocene, now a popular term among scientists, is the epoch in which we live: one dominated by human impacts on the living world. Most date it from the beginning of the industrial revolution. But it might have begun much earlier, with a killing spree that commenced two million years ago. What rose onto its hind legs on the African savannahs was, from the outset, death: the destroyer of worlds.

Before Homo erectus, perhaps our first recognisably human ancestor, emerged in Africa, the continent abounded with monsters. There were several species of elephants. There were sabretooths and false sabretooths, giant hyenas and creatures like those released in The Hunger Games: amphicyonids, or bear dogs, vast predators with an enormous bite.

Prof Blaire van Valkenburgh has developed a means by which we could roughly determine how many of these animals there were. When there are few predators and plenty of prey, the predators eat only the best parts of the carcass. When competition is intense, they eat everything, including the bones. The more bones a carnivore eats, the more likely its teeth are to be worn or broken. The breakages in carnivores’ teeth were massively greater in the pre-human era.
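
A toy version of that proxy makes the logic concrete: compare the fraction of broken teeth across fossil assemblages, since heavier bone-chewing under intense competition leaves more breakage. The counts below are invented, not van Valkenburgh's published data.

```python
# Toy illustration of the tooth-breakage proxy: the harder carnivores compete,
# the more bone they chew, and the larger the fraction of broken teeth in a
# fossil assemblage. Counts below are invented, not published data.

assemblages = {
    "pre-human Pleistocene assemblage": {"broken": 64, "intact": 180},
    "modern African carnivores":        {"broken": 19, "intact": 230},
}

for name, teeth in assemblages.items():
    total = teeth["broken"] + teeth["intact"]
    rate = teeth["broken"] / total
    print(f"{name}: {teeth['broken']}/{total} teeth broken ({rate:.1%})")
```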

Not only were there more species of predators, including species much larger than any found on Earth today, but they appear to have been much more abundant – and desperate. We evolved in a terrible, wonderful world – that was no match for us. [Continue reading...]


Talking Neanderthals challenge assumptions about the origins of speech

University of New England, Australia: We humans like to think of ourselves as unique for many reasons, not least of which being our ability to communicate with words. But ground-breaking research by an expert from the University of New England shows that our ‘misunderstood cousins,’ the Neanderthals, may well have spoken in languages not dissimilar to the ones we use today.

Pinpointing the origin and evolution of speech and human language is one of the longest running and most hotly debated topics in the scientific world. It has long been believed that other beings, including the Neanderthals with whom our ancestors shared the Earth for thousands of years, simply lacked the necessary cognitive capacity and vocal hardware for speech.

Associate Professor Stephen Wroe, a zoologist and palaeontologist from UNE, working with an international team of scientists and using 3D x-ray imaging technology, made the revolutionary discovery challenging this notion, based on a 60,000-year-old Neanderthal hyoid bone discovered in Israel in 1989.

“To many, the Neanderthal hyoid discovered was surprising because its shape was very different to that of our closest living relatives, the chimpanzee and the bonobo. However, it was virtually indistinguishable from that of our own species. This led to some people arguing that this Neanderthal could speak,” A/Professor Wroe said.

“The obvious counterargument to this assertion was that the fact that hyoids of Neanderthals were the same shape as modern humans doesn’t necessarily mean that they were used in the same way. With the technology of the time, it was hard to verify the argument one way or the other.”

However advances in 3D imaging and computer modelling allowed A/Professor Wroe’s team to revisit the question.

“By analysing the mechanical behaviour of the fossilised bone with micro x-ray imaging, we were able to build models of the hyoid that included the intricate internal structure of the bone. We then compared them to models of modern humans. Our comparisons showed that in terms of mechanical behaviour, the Neanderthal hyoid was basically indistinguishable from our own, strongly suggesting that this key part of the vocal tract was used in the same way.

“From this research, we can conclude that it’s likely that the origins of speech and language are far, far older than once thought.”


The emotional intelligence of dogs

The ability to discern the emotions of others provides the foundation for emotional intelligence. How well developed this faculty is seems to have little to do with the strength of other markers of intelligence; indeed, as a new study seems to imply, there may be little reason to see in emotional intelligence much that is uniquely human.

Scientific American: [A]lthough dogs have the capacity to understand more than 100 words, studies have demonstrated Fido can’t really speak human languages or comprehend them with the same complexity that we do. Yet researchers have now discovered that dog and human brains process the vocalizations and emotions of others more similarly than previously thought. The findings suggest that although dogs cannot discuss relativity theory with us, they do seem to be wired in a way that helps them to grasp what we feel by attending to the sounds we make.

To compare active human and dog brains, postdoctoral researcher Attila Andics and his team from the MTA-ELTE Comparative Ethology Research Group in Hungary trained 11 dogs to lie still in an fMRI brain scanner for several six-minute intervals so that the researchers could perform the same experiment on both human and canine participants. Both groups listened to almost two hundred dog and human sounds — from whining and crying to laughter and playful barking — while the team scanned their brain activity.

The resulting study, published in Current Biology today, reveals both that dog brains have voice-sensitive regions and that these neurological areas resemble those of humans. Sharing similar locations in both species, they process voices and emotions of other individuals similarly. Both groups respond with greater neural activity when they listen to voices reflecting positive emotions such as laughing than to negative sounds that include crying or whining. Dogs and people, however, respond more strongly to the sounds made by their own species. “Dogs and humans meet in a very similar social environment but we didn’t know before just how similar the brain mechanisms are to process this social information,” Andics says. [Continue reading...]
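
As a purely illustrative sketch of the kind of contrast the study reports (synthetic numbers, not the paper's data or its analysis pipeline), one can compare mean responses in a voice-sensitive region of interest to positive versus negative vocalizations in each species:

```python
# Toy version of the reported contrast: mean activity in a voice-sensitive
# region of interest (ROI) for emotionally positive versus negative
# vocalizations, in dogs and humans. All numbers are synthetic.
import statistics

# Hypothetical per-trial ROI responses (arbitrary units), NOT data from the paper.
roi_response = {
    "dog":   {"positive": [1.9, 2.1, 2.4, 2.0, 2.2], "negative": [1.2, 1.4, 1.1, 1.3, 1.5]},
    "human": {"positive": [2.6, 2.8, 2.5, 2.7, 2.9], "negative": [1.8, 1.7, 1.9, 2.0, 1.6]},
}

for species, conditions in roi_response.items():
    pos = statistics.mean(conditions["positive"])
    neg = statistics.mean(conditions["negative"])
    print(f"{species}: positive {pos:.2f} vs negative {neg:.2f} (difference {pos - neg:+.2f})")
```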


900,000 year old footprints of earliest northern Europeans discovered

The Telegraph reports: Footprints left behind by what may be some of the first human ancestors to arrive in Britain have been discovered on a beach in Norfolk.

The preserved tracks, which consisted of 49 imprints in a soft sedimentary rock, are believed to be around 900,000 years old and could transform scientists’ understanding of how early humans moved around the world.

The footprints were found in what scientists have described as a “million to one” discovery last summer when heavy seas washed sand off the foreshore in Happisburgh, Norfolk.

The find has only now been made public and is thought to be the oldest evidence of early humans in northern Europe yet discovered. [Continue reading...]


At home in opposition to the land

To an adult, most of the connotations of home seem positive: safety, stability, familiarity, comfort, nurturing. Yet as Ian Tattersall points out, to be tied to one place in a changing environment marked a turning point in human evolution — the juncture at which we placed ourselves in opposition to nature.

Archaeologists begin to see proto-houses during the Ice Age, some 15,000 years ago. Hunter-gatherers at the Ukrainian site of Mezhirich built four oval-to-circular huts that ranged from 120 to 240 square feet in area, and were clad in tons of mammoth bones. Out there on the treeless tundra, their occupants would have cooperated in hunting reindeer and other grazers that migrated seasonally through the area. The Mezhirich people dug pits in the permafrost that acted as natural “freezers” to preserve their meat and let them spend several months at a time in the “village.” With so much labor invested in the construction of their houses, it is hard to imagine that the Mezhirich folk did not somehow feel “at home” there.

But if an archaeologist had to pick an example of the earliest structures that most resembled our modern idea of home, it would probably be the round houses built by the semi-sedentary Natufians, an ancient people who lived around the eastern end of the Mediterranean Sea (Israel, Syria, and environs) at the end of the last Ice Age, some 12,000 years ago. A typical Natufian village consisted of several circular huts each measuring about 10 to 20 feet in diameter; these villages testify to a revolutionary change in human living arrangements. Finally, people were regularly living in semi-permanent settlements, in which the houses were clearly much more than simple shelters against the elements. The Natufians were almost certainly witness to a dramatic change in society.

The end of the Ice Age was a time of transition from a hunter-gatherer mode of subsistence to an agricultural way of life. But it also involved a Faustian bargain. Adopting a fixed residence went hand-in-hand with cultivating fields and domesticating animals. It allowed families to grow, providing additional labor to till the fields. But becoming dependent on the crops they grew meant that people found themselves in opposition to the environment: The rain didn’t fall and the sun didn’t shine at the farmers’ convenience. They locked themselves into a lifestyle, and to make the field continuously productive to feed their growing families, they had to modify their landscape.


Why non-believers need rituals too

Suzanne Moore writes: The last time I put my own atheism through the spin cycle rather than simply wiping it clean was when I wanted to make a ceremony after the birth of my third child. Would it be a blessing? From who? What does the common notion of a new baby as a gift mean? How would we make it meaningful to the people we invited who were from different faiths? And, importantly, what would it look like?

One of the problems I have with the New Atheism is that it fixates on ethics, ignoring aesthetics at its peril. It tends also towards atomisation, relying on abstracts such as “civic law” to conjure a collective experience. But I love ritual, because it is through ritual that we remake and strengthen our social bonds. As I write, down the road there is a memorial being held for Lou Reed, hosted by the local Unitarian church. Most people there will have no belief in God but will feel glad to be part of a shared appreciation of a man whose god was rock’n’roll.

When it came to making a ceremony, I really did not want the austerity of some humanist events I have attended, where I feel the sensual world is rejected. This is what I mean about aesthetics. Do we cede them to the religious and just look like a bunch of Calvinists? I found myself turning to flowers, flames and incense. Is there anything more beautiful than the offerings made all over the world, of tiny flames and blossom on leaves floating on water?

Already, I am revealing a kind of neo-paganism that hardcore rationalists will find unacceptable. But they find most human things unacceptable. For me, not believing in God does not mean one has to forgo poetry, magic, the chaos of ritual, the remaking of shared bonds. I fear ultra-orthodox atheism has come to resemble a rigid and patriarchal faith itself. [Continue reading...]


Neanderthals and the dead

The New York Times reports: Early in the 20th century, two brothers discovered a nearly complete Neanderthal skeleton in a pit inside a cave at La Chapelle-aux-Saints, in southwestern France. The discovery raised the possibility that these evolutionary relatives of ours intentionally buried their dead — at least 50,000 years ago, before the arrival of anatomically modern humans in Europe.

These and at least 40 subsequent discoveries, a few as far from Europe as Israel and Iraq, appeared to suggest that Neanderthals, long thought of as brutish cave dwellers, actually had complex funeral practices. Yet a significant number of researchers have since objected that the burials were misinterpreted, and might not represent any advance in cognitive and symbolic behavior.

Now an international team of scientists is reporting that a 13-year re-examination of the burials at La Chapelle-aux-Saints supports the earlier claims that the burials were intentional.

The researchers — archaeologists, geologists and paleoanthropologists — not only studied the skeleton from the original excavations, but found more Neanderthal remains, from two children and an adult. They also studied the bones of other animals in the cave, mainly bison and reindeer, and the geology of the burial pits.

The findings, in this week’s issue of Proceedings of the National Academy of Sciences, “buttress claims for complex symbolic behavior among Western European Neanderthals,” the scientists reported.

William Rendu, the paper’s lead author and a researcher at the Center for International Research in the Humanities and Social Sciences in New York, said in an interview that the geology of the burial pits “cannot be explained by natural events” and that “there is no sign of weathering and scavenging by animals,” which means the bodies were covered soon after death.

“While we cannot know if this practice was part of a ritual or merely pragmatic,” Dr. Rendu said in a statement issued by New York University, “the discovery reduces the behavioral distance between them and us.” [Continue reading...]


The most arrogant creatures on Earth

Dominique Mosbergen writes: Researchers from the University of Adelaide in Australia argue in an upcoming book, The Dynamic Human, that humans really aren’t much smarter than other creatures — and that some animals may actually be brighter than we are.

“For millennia, all kinds of authorities — from religion to eminent scholars — have been repeating the same idea ad nauseam, that humans are exceptional by virtue that they are the smartest in the animal kingdom,” the book’s co-author Dr. Arthur Saniotis, a visiting research fellow with the university’s School of Medical Sciences, said in a written statement. “However, science tells us that animals can have cognitive faculties that are superior to human beings.”

Not to mention, ongoing research on intelligence and primate brain evolution backs the idea that humans aren’t the cleverest creatures on Earth, co-author Dr. Maciej Henneberg, a professor also at the School of Medical Sciences, told The Huffington Post in an email.

The researchers said the belief in the superiority of human intelligence can be traced back around 10,000 years to the Agricultural Revolution, when humans began domesticating animals. The idea was reinforced with the advent of organized religion, which emphasized human beings’ superiority over other creatures. [Continue reading...]

At various times in my life, I’ve crossed paths with people possessing immense wealth and power, providing me with glimpses of the mindset of those who regard themselves as the most important people on this planet.

From what I can tell, the concentration of great power does not coincide with the expression of great intelligence. What is far more evident is a great sense of entitlement, which is to say a self-validating sense that power rests where power belongs and that the inequality in its distribution is a reflection of some kind of natural order.

Since this self-serving perception of hierarchical order operates among humans and since humans as a species wield so much more power than any other, it’s perhaps not surprising that we exhibit the same kind of hubris collectively that we see individually in the most dominant among us.

Nevertheless, it is becoming increasingly clear that our sense of superiority is rooted in ignorance.

Amit Majmudar writes: There may come a time when we cease to regard animals as inferior, preliminary iterations of the human—with the human thought of as the pinnacle of evolution so far—and instead regard all forms of life as fugue-like elaborations of a single musical theme.

Animals are routinely superhuman in one way or another. They outstrip us in this or that perceptual or physical ability, and we think nothing of it. It is only our kind of superiority (in the use of tools, basically) that we select as the marker of “real” superiority. A human being with an elephant’s hippocampus would end up like Funes the Memorious in the story by Borges; a human being with a dog’s olfactory bulb would become a Vermeer of scent, but his art would be lost on the rest of us, with our visually dominated brains. The poetry of the orcas is yet to be translated; I suspect that the whale sagas will have much more interesting things in them than the tablets and inscriptions of Sumer and Akkad.

If science should ever persuade people of this biological unity, it would be of far greater benefit to the species than penicillin or cardiopulmonary bypass; of far greater benefit to the planet than the piecemeal successes of environmental activism. We will have arrived, by study and reasoning, at the intuitive, mystical insights of poets.


How computers are making people stupid

The pursuit of artificial intelligence has been driven by the assumption that if human intelligence can be replicated or advanced upon by machines then this accomplishment will in various ways serve the human good. At the same time, thanks to the technophobia promoted in some dystopian science fiction, there is a popular fear that if machines become smarter than people we will end up becoming their slaves.

It turns out that even if there are some irrational fears wrapped up in technophobia, there are good reasons to regard computing devices as a threat to human intelligence.

It’s not that we are creating machines that harbor evil designs to take over the world, but simply that each time we delegate a function of the brain to an external piece of circuitry, our mental faculties inevitably atrophy.

Use it or lose it applies just as much to the brain as it does to any other part of the body.

Carolyn Gregoire writes: Take a moment to think about the last time you memorized someone’s phone number. Was it way back when, perhaps circa 2001? And when was the last time you were at a dinner party or having a conversation with friends, when you whipped out your smartphone to Google the answer to someone’s question? Probably last week.

Technology changes the way we live our daily lives, the way we learn, and the way we use our faculties of attention — and a growing body of research has suggested that it may have profound effects on our memories (particularly the short-term, or working, memory), altering and in some cases impairing its function.

The implications of a poor working memory for our brain functioning and overall intelligence are difficult to overestimate.

“The depth of our intelligence hinges on our ability to transfer information from working memory, the scratch pad of consciousness, to long-term memory, the mind’s filing system,” Nicholas Carr, author of The Shallows: What The Internet Is Doing To Our Brains, wrote in Wired in 2010. “When facts and experiences enter our long-term memory, we are able to weave them into the complex ideas that give richness to our thought.”

While our long-term memory has a nearly unlimited capacity, the short-term memory has more limited storage, and that storage is very fragile. “A break in our attention can sweep its contents from our mind,” Carr explains.

Meanwhile, new research has found that taking photos — an increasingly ubiquitous practice in our smartphone-obsessed culture — actually hinders our ability to remember that which we’re capturing on camera.

Concerned about premature memory loss? You probably should be. Here are five things you should know about the way technology is affecting your memory.

1. Information overload makes it harder to retain information.

Even a single session of Internet usage can make it more difficult to file away information in your memory, says Erik Fransén, computer science professor at Sweden’s KTH Royal Institute of Technology. And according to Tony Schwartz, productivity expert and author of The Way We’re Working Isn’t Working, most of us aren’t able to effectively manage the overload of information we’re constantly bombarded with. [Continue reading...]

As I pointed out in a recent post, the externalization of intelligence long preceded the creation of smartphones and personal computers. Indeed, it goes all the way back to the beginning of civilization, when we first learned how to transform language into a material form as the written word, thereby creating a substitute for memory.

Plato foresaw the consequences of writing.

In Phaedrus, he describes an exchange between the god Thamus, king and ruler of all Egypt, and the god Theuth, who has invented writing. Theuth, who is very proud of what he has created, says: “This invention, O king, will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.” But Thamus points out that while one man has the ability to invent, the ability to judge an invention’s usefulness or harmfulness belongs to another.

If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.

Bedazzled by our ingenuity and its creations, we are fast forgetting the value of this quality that can never be implanted in a machine (or a text): wisdom.


Worried about terrorism? You should be more afraid of bread!

David Perlmutter, MD writes: While gluten makes up the lion’s share of protein in wheat, research reveals that modern wheat is capable of producing more than 23,000 different proteins, any one of which could trigger a potentially damaging inflammatory response. One protein in particular is wheat germ agglutinin (WGA). WGA is classified as a lectin — a term for a protein produced by an organism to protect itself from predation.

All grains produce lectins, which selectively bind to unique proteins on the surfaces of bacteria, fungi, and insects. These proteins are found throughout the animal kingdom. One protein in particular for which WGA has an extremely high affinity is N-Acetylglucosamine. N-Acetylglucosamine richly adorns the casing of insects and plays an important role in the structure of the cellular walls of bacteria. More importantly, it is a key structural component in humans in a variety of tissues, including tendons, joint surfaces, cartilage, the lining of the entire digestive tract, and even the lining of the hundreds of miles of blood vessels found within each of us.

It is precisely the ability of WGA to bind to proteins lining the gut that raises concern amongst medical researchers. When WGA binds to these proteins, it may leave these cells less well protected against the harmful effects of the gut contents.

WGA may also have direct toxic effects on the heart, endocrine, and immune systems, and even the brain. In fact, so readily does WGA make its way into the brain that scientists are actually testing it as a possible means of delivering medicines in an attempt to treat Alzheimer’s disease.

And again, the concern here is not just for a small segment of the population who happened to inherit susceptibility for sensitivity to gluten. This is a concern as it relates to all humans. As medical researcher Sayer Ji stated, “What is unique about WGA is that it can do direct damage to the majority of tissues in the human body without requiring a specific set of genetic susceptibilities and/or immune-mediated articulations. This may explain why chronic inflammatory and degenerative conditions are endemic to wheat-consuming populations even when overt allergies or intolerances to wheat gluten appear exceedingly rare.”

The gluten issue is indeed very real and threatening. But it now seems clear that lectin proteins found in wheat may harbor the potential for even more detrimental effects on human health. It is particularly alarming to consider the fact that there is a move to actually genetically modify wheat to enhance its WGA content.

Scientific research is now giving us yet another reason to reconsider the merits of our daily bread. The story of WGA’s potential destructive effects on human health is just beginning to be told. We should embrace the notion that low levels of exposure to any toxin over an extended period can lead to serious health issues. And this may well characterize the under-recognized threat of wheat consumption for all humans.


Baffling 400,000-year-old clue to human origins

The New York Times reports: Scientists have found the oldest DNA evidence yet of humans’ biological history. But instead of neatly clarifying human evolution, the finding is adding new mysteries.

In a paper in the journal Nature, scientists reported Wednesday that they had retrieved ancient human DNA from a fossil dating back about 400,000 years, shattering the previous record of 100,000 years.

The fossil, a thigh bone found in Spain, had previously seemed to many experts to belong to a forerunner of Neanderthals. But its DNA tells a very different story. It most closely resembles DNA from an enigmatic lineage of humans known as Denisovans. Until now, Denisovans were known only from DNA retrieved from 80,000-year-old remains in Siberia, 4,000 miles east of where the new DNA was found.

The mismatch between the anatomical and genetic evidence surprised the scientists, who are now rethinking human evolution over the past few hundred thousand years. It is possible, for example, that there are many extinct human populations that scientists have yet to discover. They might have interbred, swapping DNA. Scientists hope that further studies of extremely ancient human DNA will clarify the mystery.

“Right now, we’ve basically generated a big question mark,” said Matthias Meyer, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and a co-author of the new study. [Continue reading...]


The Western European roots of Native Americans

The New York Times reports: The genome of a young boy buried at Mal’ta near Lake Baikal in eastern Siberia some 24,000 years ago has turned out to hold two surprises for anthropologists.

The first is that the boy’s DNA matches that of Western Europeans, showing that during the last Ice Age people from Europe had reached farther east across Eurasia than previously supposed. Though none of the Mal’ta boy’s skin or hair survives, his genes suggest he would have had brown hair, brown eyes and freckled skin.

The second surprise is that his DNA also matches a large proportion — about 25 percent — of the DNA of living Native Americans. The first people to arrive in the Americas have long been assumed to have descended from Siberian populations related to East Asians. It now seems that they may be a mixture between the Western Europeans who had reached Siberia and an East Asian population.

The Mal’ta boy was 3 to 4 years old and was buried under a stone slab wearing an ivory diadem, a bead necklace and a bird-shaped pendant. Elsewhere at the same site about 30 Venus figurines were found of the kind produced by the Upper Paleolithic cultures of Europe. The remains were excavated by Russian archaeologists over a 20-year period ending in 1958 and stored in museums in St. Petersburg.

There they lay for some 50 years until they were examined by a team led by Eske Willerslev of the University of Copenhagen. Dr. Willerslev, an expert in analyzing ancient DNA, was seeking to understand the peopling of the Americas by searching for possible source populations in Siberia. He extracted DNA from bone taken from the child’s upper arm, hoping to find ancestry in the East Asian peoples from whom Native Americans are known to be descended.

But the first results were disappointing. The boy’s mitochondrial DNA belonged to the lineage known as U, which is commonly found among the modern humans who first entered Europe about 44,000 years ago. The lineages found among Native Americans are those designated A, B, C, D and X, so the U lineage pointed to contamination of the bone by the archaeologists or museum curators who had handled it, a common problem with ancient DNA projects. “The study was put on low speed for about a year because I thought it was all contamination,” Dr. Willerslev said.

His team proceeded anyway to analyze the nuclear genome, which contains the major part of human inheritance. They were amazed when the nuclear genome also turned out to have partly European ancestry. Examining the genome from a second Siberian grave site, that of an adult who died 17,000 years ago, they found the same markers of European origin. Together, the two genomes indicate that descendants of the modern humans who entered Europe had spread much farther east across Eurasia than had previously been assumed and occupied Siberia during an extremely cold period starting 20,000 years ago that is known as the Last Glacial Maximum.

The other surprise from the Mal’ta boy’s genome was that it matched to both Europeans and Native Americans but not to East Asians. Dr. Willerslev’s interpretation was that the ancestors of Native Americans had already separated from the East Asian population when they interbred with the people of the Mal’ta culture, and that this admixed population then crossed over the Beringian land bridge that then lay between Siberia and Alaska to become a founding population of Native Americans. [Continue reading...]
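
A heavily simplified sketch of the kind of inference involved (invented allele frequencies, not the study's genome-wide statistics or methods): model Native American allele frequencies as a blend of a Mal’ta-like source and an East Asian source, and find the mixing weight that fits best.

```python
# Heavily simplified illustration of the admixture idea: if Native American
# allele frequencies are modeled as a blend of a Mal'ta-like ("ancient North
# Eurasian") source and an East Asian source, the best-fitting mixing weight m
# can be found by least squares. Frequencies are invented; the real analysis
# used genome-wide statistics, not this toy fit.

ancient_north_eurasian = [0.10, 0.80, 0.35, 0.60, 0.20]
east_asian             = [0.70, 0.20, 0.55, 0.10, 0.65]
native_american        = [0.55, 0.35, 0.50, 0.23, 0.54]  # pretend observations

def sq_error(m):
    """Squared error of the mixture m*ANE + (1-m)*EastAsian against the target."""
    return sum(
        (m * a + (1 - m) * e - t) ** 2
        for a, e, t in zip(ancient_north_eurasian, east_asian, native_american)
    )

# Grid search over candidate mixing proportions.
best_m = min((i / 100 for i in range(101)), key=sq_error)
print(f"best-fitting Mal'ta-like ancestry fraction: {best_m:.2f}")  # ~0.25 for these toy numbers
```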


Social complexity and facial diversity among primates

UCLA Newsroom: Why do the faces of some primates contain so many different colors — black, blue, red, orange and white — that are mixed in all kinds of combinations and often striking patterns while other primate faces are quite plain?

UCLA biologists reported last year on the evolution of 129 primate faces in species from Central and South America. This research team now reports on the faces of 139 Old World African and Asian primate species that have been diversifying over some 25 million years.

With these Old World monkeys and apes, the species that are more social have more complex facial patterns, the biologists found. Species that have smaller group sizes tend to have simpler faces with fewer colors, perhaps because the presence of more color patches in the face results in greater potential for facial variation across individuals within species. This variation could aid in identification, which may be a more difficult task in larger groups.
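
As an illustrative sketch of the comparative logic (invented species values, not the UCLA dataset, and without the phylogenetic corrections a real analysis would require), one can ask whether a crude measure of facial complexity rises with group size:

```python
# Toy illustration of the comparative test: does facial color-pattern complexity
# (here, a simple count of distinct color patches) increase with group size?
# Species values are invented, not the UCLA dataset.
import statistics

group_size     = [4, 8, 12, 20, 35, 60]   # mean social group size per species
facial_patches = [2, 2, 3, 5, 6, 8]       # distinct facial color patches per species

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"Pearson r between group size and facial complexity: {pearson_r(group_size, facial_patches):.2f}")
```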

Species that live in the same habitat with other closely related species tend to have more complex facial patterns, suggesting that complex faces may also aid in species recognition, the life scientists found.

“Humans are crazy for Facebook, but our research suggests that primates have been relying on the face to tell friends from competitors for the last 50 million years and that social pressures have guided the evolution of the enormous diversity of faces we see across the group today,” said Michael Alfaro, an associate professor of ecology and evolutionary biology in the UCLA College of Letters and Science and senior author of the study.

“Faces are really important to how monkeys and apes can tell one another apart,” he said. “We think the color patterns have to do both with the importance of telling individuals of your own species apart from closely related species and for social communication among members of the same species.” [Continue reading...]


Allergies and the ‘farm effect’

Moises Velasquez-Manoff writes: Will the cure for allergies come from the cowshed?

Allergies are often seen as an accident. Your immune system misinterprets a harmless protein like dust or peanuts as a threat, and when you encounter it, you pay the price with sneezing, wheezing, and in the worst cases, death.

What prompts some immune systems to err like this, while others never do? Some of the vulnerability is surely genetic. But comparative studies highlight the importance of environment, beginning, it seems, in the womb. Microbes are one intriguing protective factor. Certain ones seem to stimulate a mother’s immune system during pregnancy, preventing allergic disease in children.

By emulating this naturally occurring phenomenon, scientists may one day devise a way to prevent allergies.

This task, though still in its infancy, has some urgency. Depending on the study and population, the prevalence of allergic disease and asthma increased between two- and threefold in the late 20th century, a mysterious trend often called the “allergy epidemic.”

These days, one in five American children have a respiratory allergy like hay fever, and nearly one in 10 have asthma.

Nine people die daily from asthma attacks. While the increase in respiratory allergies shows some signs of leveling off, the prevalence of food and skin allergies continues to rise. Five percent of children are allergic to peanuts, milk and other foods, half again as many as 15 years ago. And each new generation seems to have more severe, potentially life-threatening allergic reactions than the last.

Some time ago, I visited a place where seemingly protective microbes occurred spontaneously. It wasn’t a spotless laboratory in some university somewhere. It was a manure-spattered cowshed in Indiana’s Amish country.

My guide was Mark Holbreich, an allergist in Indianapolis. He’d recently discovered that the Amish people who lived in the northern part of the state were remarkably free of allergies and asthma.

About half of Americans have evidence of allergic sensitization, which increases the risk of allergic disease. But judging from skin-prick tests, just 7.2 percent of the 138 Amish children who Dr. Holbreich tested were sensitized to tree pollens and other allergens. That yawning difference positions the Indiana Amish among the least allergic populations ever described in the developed world.
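
A back-of-the-envelope calculation shows why that difference is described as yawning. Treating the rough 50 percent U.S. sensitization figure as a fixed reference rate (an assumption made here purely for illustration), the 7.2 percent rate among the 138 tested Amish children sits roughly ten standard errors away from it:

```python
# Back-of-the-envelope check on how unlikely the Amish figure would be if their
# underlying sensitization rate matched the general U.S. rate. The U.S. "about
# half" figure is treated as a fixed reference proportion for simplicity; the
# 138-child sample and 7.2% rate come from the article.
import math

n_amish = 138
p_amish = 0.072   # sensitized fraction among tested Amish children
p_us = 0.50       # rough U.S. sensitization rate ("about half of Americans")

# One-sample z-test of the observed proportion against the reference rate.
se = math.sqrt(p_us * (1 - p_us) / n_amish)
z = (p_amish - p_us) / se
print(f"observed {p_amish:.1%} vs reference {p_us:.0%}, z = {z:.1f}")
# A z-score around -10 means the gap is far too large to be sampling noise.
```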

This invulnerability isn’t likely to be genetic. The Amish originally came to the United States from the German-speaking part of Switzerland, and these days Swiss children, a genetically similar population, are about as allergic as Americans.

Ninety-two percent of the Amish children Dr. Holbreich tested either lived on farms or visited one frequently. Farming, Dr. Holbreich thinks, is the Amish secret. This idea has some history. Since the late 1990s, European scientists have investigated what they call the “farm effect.” [Continue reading...]
