Category Archives: Attention to the Unseen

How humans became meat eaters

Marta Zaraska writes: Just as modern chimps occasionally hunt colobus monkeys, our ancestors may have occasionally dined on the raw meat of small monkeys, too. Yet the guts of early hominins wouldn’t have allowed them to have a meat-heavy diet, like the one Americans eat today. Their guts were characteristic of fruit-and-leaf eaters, with a big caecum, a bacteria-brimming pouch at the beginning of the large intestine. If an australopith gorged himself on meat — say, ate a few zebra steaks tartare in one sitting — he likely would have suffered twisting of the colon, with piercing stomach pains, nausea, and bloating, possibly resulting in death. And yet in spite of these dangers, by 2.5 million years ago, our ancestors had become meat eaters.

It seems that our bodies had to adjust gradually, first getting hooked on seeds and nuts, which are rich in fats but poor in fiber. If our ancestors ate a lot of them, such a diet would have encouraged the growth of the small intestine (where the digestion of lipids takes place) and the shrinking of the caecum (where fibers are digested). This would have made our guts better for processing meat. A seed-and-nut diet could have prepared our ancestors for a carnivorous lifestyle in another way, too: It could have given them the tools for carving carcasses. Some researchers suggest that the simple stone tools used for pounding seeds and nuts could have easily been reassigned to cracking animal bones and cutting off chunks of flesh. And so, by 2.5 million years ago, our ancestors were ready for meat: They had the tools to get it and the bodies to digest it.

But being capable is one thing; having the will and skill to go out and get meat is quite another. So what inspired our ancestors to look at antelopes and hippos as potential dinners? The answer, or at least a part of it, may lie in a change of climate approximately 2.5 million years ago. [Continue reading…]


Do the Hadza give their honeyguides a fair wage?

Cara Giaimo writes: In the tree-strewn savannah of northern Tanzania, near the salty shores of Lake Eyasi, live some of the planet’s few remaining hunter-gatherers. Known as the Hadza, they live in Hadzaland, which stretches for about 4,000 square kilometers around the lake. No one is sure how long they’ve been there, but it could be since humans became human. As one anthropologist put it in a recent book, “their oral history contains no stories suggesting they came from some other place.”

Anthropologists have been scrutinizing the Hadza for centuries, seeking in their stories and behavior windows to the past. The Hadza themselves, at least at times, subscribe to a food-based method of self-understanding: they describe their predecessors based on what, and how, they ate. The first Hadza, the Akakaanebe, or “ancestors,” ate raw game, plentiful and easily slain–as one ethnographer relays, “they simply had to stare at an animal and it fell dead.” The second, the Tlaatlaanebe, ate fire-roasted meat, hunted with dogs. The third, the Hamakwabe, invented bows and arrows and cooking pots, and thus expanded the menu.

The Hamaishonebe, or “modern people” — the people of today — have a variety of meal strategies. Hadza hunting and gathering grounds are shrinking, under pressure from maize farms, herding grounds, and private game reserves, and some work jobs and buy food from their neighbors. But between two and three hundred of the 1300 Hadza remaining still survive almost entirely on wild foods: tubers, meat, fruit, and honey.

Of these staples, honey is the Hadza’s overwhelming favorite. But beehives, located high up in thick-trunked baobabs and guarded fiercely by their stinging occupants, are hard to get at, and even harder to find. Enter the greater honeyguide, an unassuming black and white bird about the size of a robin. Greater honeyguides, a distinct species within the honeyguide family, love grubs and beeswax, and are great at locating hives. This is a boon for the Hadza, who, according to some estimates, get about 15 percent of their calories from honey.

When Hadza want to find honey, they shout and whistle a special tune. If a honeyguide is around, it’ll fly into the camp, chattering and fanning out its feathers. The Hadza, now on the hunt, chase it, grabbing their axes and torches and shouting “Wait!” They follow the honeyguide until it lands near its payload spot, pinpoint the correct tree, smoke out the bees, hack it open, and free the sweet combs from the nest. The honeyguide stays and watches.

It’s one of those stories that sounds like a fable — until you get to the end, where the lesson normally goes. Then it becomes a bit more confusing. [Continue reading…]

The way this story plays out has commonly been depicted as a model of cooperation between species, but it turns out that this relationship between humans and birds might not be quite as mutually beneficial as first thought.


Evidence mounts for interbreeding bonanza in ancient human species

Nature reports: The discovery of yet another period of interbreeding between early humans and Neanderthals is adding to the growing sense that sexual encounters among different ancient human species were commonplace throughout their history.

“As more early modern humans and archaic humans are found and sequenced, we’re going to see many more instances of interbreeding,” says Sergi Castellano, a population geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. His team discovered the latest example, which they believe occurred around 100,000 years ago, by analysing traces of Homo sapiens DNA in a Neanderthal genome extracted from a toe bone found in a cave in Siberia.

“There is this joke in the population genetics community — there’s always one more interbreeding event,” Castellano says. So before researchers discover the next one, here’s a rundown of the interbreeding episodes that they have already deduced from studies of ancient DNA. [Continue reading…]


Meet the world’s most notorious taxonomist


Susie Neilson writes: In 2005, the taxonomist Quentin Wheeler named a trio of newly discovered slime-mold beetles after George W. Bush, Donald Rumsfeld, and Dick Cheney. He believed the names could increase public interest in the discovery and classification of new species, and help combat the quickening pace of extinction. (Species go extinct three times faster than we can name them.)

He knew he was onto something when he received a call from the White House: it was Bush on the other end, thanking him for the honor. Wheeler, now the president of SUNY’s College of Environmental Science and Forestry, went on to give his bugs all sorts of provocative names, drawing on Darth Vader, Stephen Colbert, Roy and Barbara Orbison, Pocahontas, Hernan Cortez, and the Aztecs — he has even named six species after himself. You can call his strategy “shameless self-promotion” — Wheeler already has.

Nautilus spoke with Wheeler about his work.

What’s exciting about taxonomy?

It is the one field with the audacity to create a living inventory of every living thing on the entire planet and reconstruct the history of the diversity of life. Who else would tackle 12 million species in 3.8 billion years on the entire surface of the planet? If that isn’t real science, I don’t know what is. It infuriates me that taxonomy is marginalized as a bookkeeping activity, when in fact it has the most audacious research agenda of any biological science. [Continue reading…]


Scientists glimpse Einstein’s gravitational waves

Phys.org reports: In a landmark discovery for physics and astronomy, scientists said Thursday they have glimpsed the first direct evidence of gravitational waves, ripples in the fabric of space-time that Albert Einstein predicted a century ago.

When two black holes collided some 1.3 billion years ago, the joining of those two great masses sent forth a wobble that hurtled through space and reached Earth on September 14, 2015, when it was picked up by sophisticated instruments, researchers announced.

“Up until now we have been deaf to gravitational waves, but today, we are able to hear them,” said David Reitze, executive director of the LIGO Laboratory, at a packed press conference in the US capital.

Reitze and colleagues compared the magnitude of the discovery to Galileo’s use of the telescope four centuries ago to open the era of modern astronomy.

“I think we are doing something equally important here today. I think we are opening a window on the universe,” Reitze said. [Continue reading…]


Gravity waves can’t be understood without understanding Einstein’s idea of gravity

By David Blair, University of Western Australia

I have spent almost 40 years trying to detect gravity waves.

When I started there were just a few of us working away in university labs. Today 1,000 physicists working with billion-dollar observatories are quietly confident the waves are within our grasp.

If we are right, the gravity wave search will have taken 100 years from the date of Einstein’s prediction.

In 100 years’ time the discovery of Einstein’s gravity waves will be one of the landmarks in the history of science. It will stand out like the discovery of electromagnetic waves in 1886, a quarter of a century after these waves were predicted by physicist James Clerk Maxwell.

The problem of talking about gravity waves is that you can’t explain them without explaining Einstein’s idea of gravity. Recently I began to ask why it is so difficult to explain gravity, why the concept is met with glazed eyes and baffled looks. Eventually I came up with a theory I call the Tragedy of the Euclidean Time Warp.

Continue reading


Invasion of the body snatchers

Jacob Weisberg writes: “As smoking gives us something to do with our hands when we aren’t using them, Time gives us something to do with our minds when we aren’t thinking,” Dwight Macdonald wrote in 1957. With smartphones, the issue never arises. Hands and mind are continuously occupied texting, e-mailing, liking, tweeting, watching YouTube videos, and playing Candy Crush.

Americans spend an average of five and a half hours a day with digital media, more than half of that time on mobile devices, according to the research firm eMarketer. Among some groups, the numbers range much higher. In one recent survey, female students at Baylor University reported using their cell phones an average of ten hours a day. Three quarters of eighteen-to-twenty-four-year-olds say that they reach for their phones immediately upon waking up in the morning. Once out of bed, we check our phones 221 times a day — an average of every 4.3 minutes — according to a UK study. This number actually may be too low, since people tend to underestimate their own mobile usage. In a 2015 Gallup survey, 61 percent of people said they checked their phones less frequently than others they knew.
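
For what it is worth, the “every 4.3 minutes” gloss only works against waking hours; a quick back-of-the-envelope check, assuming a roughly 16-hour waking day (the waking-hours figure is my assumption, not the study's):

$$\frac{16\ \text{h} \times 60\ \text{min/h}}{221\ \text{checks}} \approx 4.3\ \text{min between checks}$$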

Our transformation into device people has happened with unprecedented suddenness. The first touchscreen-operated iPhones went on sale in June 2007; the first Android-powered phones arrived the following year. Smartphones went from 10 percent to 40 percent market penetration faster than any other consumer technology in history. In the United States, adoption hit 50 percent only three years ago. Yet today, not carrying a smartphone indicates eccentricity, social marginalization, or old age.

What does it mean to shift overnight from a society in which people walk down the street looking around to one in which people walk down the street looking at machines? [Continue reading…]

As one of those eccentric, socially marginalized but not quite old-aged people without a smartphone, I now live in a world where it seems the mass of humanity has become myopic.

A driver remains stationary in front of a green light.

A couple sit next to each other in an airport, wrapped in silence with attention directed elsewhere down their mutually exclusive wormholes.

A jogger in the woods hears no birdsong because his ears are stuffed with plastic buds delivering private tunes.

Amidst all this divided attention, one thing seems abundantly clear: devices tap into and amplify the desire to be someplace else.

To be confined to the present place and the present time is to be trapped in a prison cell from which the smartphone offers escape — though of course it doesn’t.

What it does is produce an itch in time: a restless sense that we don’t have enough — that an elusive missing something might soon appear on that mesmerizing little touchscreen.

The effect of this refusal to be where we are is to impoverish life: our effort to make it larger ends up doing the reverse.


Revealed: Honeybees are being killed off by a manmade pandemic

By Stephen John Martin, University of Salford

We live in a world where large numbers of people are connected by just a few degrees of separation. But while having friends of friends all over the globe can be great for holidays, trade and networking, travel also allows viruses to move like never before.

Zika is the latest “explosive pandemic” to be declared a global emergency by the World Health Organisation. But viruses don’t just target humans – they can infect all forms of life from bacteria to bananas, horses to honeybees.

A lethal combination of the Varroa mite and the deformed wing virus has resulted in the death of billions of bees over the past half century. In a study published in the journal Science, colleagues from the Universities of Exeter and Sheffield and I report how the virus has spread across the globe.

Continue reading


Race is a social construct, scientists argue

Scientific American: More than 100 years ago, American sociologist W.E.B. Du Bois was concerned that race was being used as a biological explanation for what he understood to be social and cultural differences between different populations of people. He spoke out against the idea of “white” and “black” as discrete groups, claiming that these distinctions ignored the scope of human diversity.

Science would favor Du Bois. Today, the mainstream belief among scientists is that race is a social construct without biological meaning. And yet, you might still open a study on genetics in a major scientific journal and find categories like “white” and “black” being used as biological variables.

In an article published today (Feb. 4) in the journal Science, four scholars say racial categories are weak proxies for genetic diversity and need to be phased out. [Continue reading…]


Imaginary conversations with imaginary atheists can reduce mistrust of real atheists

Pacific Standard reports: They’re feared and often loathed, viewed as non-conformists who pose a threat to our nation’s moral compass. But if more were open about their inclinations, and engaged in congenial conversation with members of the mistrusting majority, that prejudice might start melting away.

It happened with gays and lesbians. Perhaps it’s time for atheists to give it a try.

That’s one implication of newly published research, which reports that simply imagining a positive interaction with an atheist is enough to increase willingness to engage and cooperate with them. [Continue reading…]


The significance of ‘untranslatable’ words


Tim Lomas writes: [‘untranslatable’] words exert great fascination, not only in specialised fields like linguistics or anthropology (Wierzbicka, 1999), but also in popular culture. Part of the fascination seems to derive from the notion that such words offer ‘windows’ into other cultures, and thus potentially into new ways of being in the world. As Wierzbicka (1997, p. 5) puts it, ‘words with special, culture-specific meanings reflect and pass on not only ways of living characteristic of a given society, but also ways of thinking’. Thus, ‘untranslatable’ words are not only of interest to translators; after all, many such professionals argue that it can be difficult to find exact translations for most words, and that nearly all terms lose some specificity or nuance when rendered in another tongue (Hatim & Munday, 2004). Rather, ‘untranslatability’ reflects the notion that such words identify phenomena that have only been recognised by specific cultures. Perhaps the most famous example is Schadenfreude, a German term describing pleasure at the misfortunes of others. Such words are not literally untranslatable, of course, since their meaning can be conveyed in a sentence. Rather, they are deemed ‘untranslatable’ to the extent that other languages lack a single word/phrase for the phenomenon.

The significance of such words is much debated. A dominant theoretical notion here is ‘linguistic relativity’ (Hussein, 2012). First formulated by the German philosophers Herder (1744–1803) and Humboldt (1767–1835), it came to prominence with the linguist Sapir (1929) and his student Whorf (1940). Their so-called ‘Sapir-Whorf hypothesis’ holds that language plays a constitutive role in the way that people experience, understand and even perceive the world. As Whorf (1956, pp. 213–214) put it, ‘We dissect nature along lines laid out by our native languages … The world is presented as a kaleidoscopic flux of impressions which has to be organized … largely by the linguistic systems in our minds’. This hypothesis comes in various strengths. Its stronger form is linguistic determinism, where language inextricably constitutes and constrains thought. For instance, Whorf argued that the Hopi people had a different experience of time due to particularities in their grammar, such that they lacked a linear sense of past, present and future. This strong determinism has been criticised, e.g. by Pinker (1995), who argued that the Hopi experience of time was not particularly different to that of Western cultures. However, the milder form of the hypothesis, linguistic relativism, simply holds that language shapes thought and experience. This milder hypothesis is generally accepted by most anthropologists and other such scholars (Perlovsky, 2009).

Continue reading


Did the Vikings use crystal ‘sunstones’ to discover America?

By Stephen Harding, University of Nottingham

Ancient records tell us that the intrepid Viking seafarers who discovered Iceland, Greenland and eventually North America navigated using landmarks, birds and whales, and little else. There’s little doubt that Viking sailors would also have used the positions of stars at night and the sun during the daytime, and archaeologists have discovered what appears to be a kind of Viking navigational sundial. But without magnetic compasses, like all ancient sailors they would have struggled to find their way once the clouds came over.

However, there are also several reports in Nordic sagas and other sources of a sólarsteinn, or “sunstone”. The literature doesn’t say what this was used for, but it has sparked decades of research examining whether this might be a reference to a more intriguing form of navigational tool.

The idea is that the Vikings may have used the interaction of sunlight with particular types of crystal to create a navigational aid that may even have worked in overcast conditions. This would mean the Vikings had discovered the basic principles of measuring polarised light centuries before they were explained scientifically, principles that are today used to identify and measure different chemicals. Scientists are now getting closer to establishing whether this form of navigation would have been possible, or whether it is just a fanciful theory.

Continue reading


Do chins have a purpose?


Ed Yong writes: “Little pig, little pig, let me come in,” says the big, bad wolf. “No, no, not by the hair on my chinny chin chin,” say the three little pigs. This scene is deeply unrealistic, and not just because of the pigs’ architectural competence, the wolf’s implausible lung capacity, and everyone’s ability to talk.

The thing is: Pigs don’t have chins. Nor do any animals, except for us.

The lower jaw of a chimpanzee or gorilla slopes backwards from the front teeth. So did the jaw of other hominids like Homo erectus. Even Neanderthal jaws ended in a flat vertical plane. Only in modern humans does the lower jaw end in a protruding strut of bone. A sticky-outy bit. A chin.

“It’s really strange that only humans have chins,” says James Pampush from Duke University. “When we’re looking at things that are uniquely human, we can’t look to big brains or bipedalism because our extinct relatives had those. But they didn’t have chins. That makes this immediately relevant to everyone.” Indeed, except in rare cases involving birth defects, everyone has chins. Sure, some people have less pronounced ones than others, perhaps because their lower jaws are small or they have more flesh around the area. But if you peeled back that flesh and exposed their jawbones — and maybe don’t do that — you’d still see a chin.

So, why do chins exist?

There are no firm answers, which isn’t for lack of effort. Evolutionary biologists have been proposing hypotheses for more than a century, and Pampush has recently reviewed all the major ideas, together with David Daegling. “We kept showing, for one reason or another, that these hypotheses are not very good,” he says.

The most heavily promoted explanation is that chins are adaptations for chewing — that they help to reduce the physical stresses acting upon a masticating jaw. But Pampush found that, if anything, the chin makes things worse. The lower jaw consists of two halves that are joined in the middle; when we chew, we compress the bone on the outer face of this join (near the lips) and pull on the bone on the inner face (near the tongue). Since bone is much stronger when compressed than pulled, you’d ideally want to reinforce the inner face of the join and not the outer one. In other words, you’d want the opposite of a chin. [Continue reading…]


Keen to be healthier in old age? Tend your inner garden

By Claire Steves, King’s College London and Tim Spector, King’s College London

The world’s oldest man, Yasutaro Koide, recently died at the age of 112. Commentators, as usual, focused on his reported “secret to longevity”: not smoking, drinking or overdoing it. No surprises there. But speculation on the basis of one individual is not necessarily the most helpful way of addressing this human quest for the Philosopher’s Stone.

The “very old” do spark our interest – but is our search for a secret to longevity actually misguided? Wouldn’t you rather live healthier than live longer in poor health? Surely, what we really want to know is how to live well in old age.

Clearly, as scientists, we try to illuminate these questions using populations of people, not just odd individuals. Many previous attempts have approached this question by looking for differences between young and old people, but this approach is often biased by the many social and cultural developments that happen between generations, including diet changes. Time itself should not be the focus – at least in part because time is one thing we are unlikely to be able to stop.

[Image: Yasutaro Koide made it to 112. Credit: Kyodo/Reuters]

The real question behind our interest in people who survive into old age is how some manage to stay robust and fit while others become debilitated and dependent. To this end, recent scientific interest has turned to investigating the predictors of frailty within populations of roughly the same age. Frailty is a measure of how physically and mentally healthy an individual is. Studies show frailer older adults have increased levels of low-grade inflammation – so-called “inflammaging”.

Continue reading


Ancient societies were far more advanced than we commonly assume

Pacific Standard reports: Trapezoids are, oddly enough, fundamental to modern science. When European scientists used them to simplify certain astronomical calculations in the 14th century, it was an important first step toward calculus—the mathematics Isaac Newton and Gottfried Leibniz developed to understand the physics of astronomical objects like planets. In other words, trapezoids are important, and we’ve known this for nearly 700 years.

Well, the Babylonians knew all of that 14 centuries earlier, according to new research published in Science, proving once again that ancient societies were way more advanced than we’d like to think. [Continue reading…]
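
For readers wondering what a trapezoid has to do with astronomy: in modern terms the technique is the trapezoidal rule, which approximates the area under a velocity-time curve in order to recover distance travelled. A minimal sketch in modern notation (my phrasing, not a transcription of the tablets or of the Science paper):

$$\Delta x \;\approx\; \frac{v(t_1) + v(t_2)}{2}\,(t_2 - t_1)$$

The displacement over the interval from $t_1$ to $t_2$ is approximated by the average of the starting and ending velocities multiplied by the elapsed time, which is geometrically the area of a trapezoid whose parallel sides are the two velocities.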


‘Half the confusion in the world comes from not knowing how little we need’


Pico Iyer writes: The idea of going nowhere is as universal as the law of gravity; that’s why wise souls from every tradition have spoken of it. “All the unhappiness of men,” the seventeenth-century French mathematician and philosopher Blaise Pascal famously noted, “arises from one simple fact: that they cannot sit quietly in their chamber.” After Admiral Richard E. Byrd spent nearly five months alone in a shack in the Antarctic, in temperatures that sank to 70 degrees below zero, he emerged convinced that “Half the confusion in the world comes from not knowing how little we need.” Or, as they sometimes say around Kyoto, “Don’t just do something. Sit there.”

Yet the days of Pascal and even Admiral Byrd seem positively tranquil by today’s standards. The amount of data humanity will collect while you’re reading The Art of Stillness is five times greater than the amount that exists in the entire Library of Congress. Anyone reading it will take in as much information today as Shakespeare took in over a lifetime. Researchers in the new field of interruption science have found that it takes an average of twenty-five minutes to recover from a phone call. Yet such interruptions come every eleven minutes — which means we’re never caught up with our lives.
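
The arithmetic behind “never caught up” is stark: if recovery takes about twenty-five minutes and the next interruption arrives after about eleven, each cycle adds to the backlog (a rough reading of the two figures as quoted, not a claim about the underlying studies):

$$25\ \text{min recovery} \;-\; 11\ \text{min between interruptions} \;=\; 14\ \text{min of unrecovered attention per cycle}$$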

And the more facts come streaming in on us, the less time we have to process any one of them. The one thing technology doesn’t provide us with is a sense of how to make the best use of technology. Put another way, the ability to gather information, which used to be so crucial, is now far less important than the ability to sift through it. [Continue reading…]


How plants rely on friendly fungal bodyguards

By Alan Gange, Royal Holloway

Two plants of the same species grow side by side. One is attacked by insects, one not. On an individual plant, some leaves get eaten, some not. This doesn’t happen at random, but is caused by the fungi that live within the leaves and roots of the plant.

Imagine you are holding a shoot of the dahlia plant, pictured below. How many species do you have in your hand? The answer is most certainly not one, but probably somewhere between 20 and 30. This is because every plant has fungi and bacteria that live on its surface (called epiphytes) and within its tissues (called endophytes).

If the stem is still attached to its roots then the number of species would easily double. The roots contain lots of endophytes and a separate group of fungi, called mycorrhizas. These fungi grow into plant roots and form a symbiotic relationship in which the fungus donates nutrients (principally phosphate and nitrate) to the plant, in return for a supply of carbon.

[Image: Dahlia is full of fungi. Credit: Alan Gange, Author provided]

There has been a recent surge of interest in these fungi, as their presence can affect the growth of insects that attack plants. Research at Royal Holloway has shown that mycorrhizal fungi reduce the growth of many insects, by increasing the plant’s chemical defences. Our most recent work shows that endophyte fungi, the ones that live within plant tissue, can also cause plants to produce novel chemicals.

Continue reading
