Author Archives: Attention to the Unseen

How noise pollution is changing animal behaviour

By Graeme Shannon, Bangor University

Noise pollution – generally an unintended byproduct of urbanisation, transport and industry – is a key characteristic of human development and population growth. In some cases it is produced intentionally, for example in seismic surveys, which use powerful airgun arrays to explore and map the seafloor, or in active sonar, which uses sound waves to detect objects in the ocean.

All of this noise – whether intentional or not – has the ability to alter the acoustic environment of aquatic and terrestrial habitats. This can have a dramatic effect on the animals that live in them, perhaps even driving evolutionary change as species adapt to or avoid noisy environments.

Rising noise levels

The dramatic and comparatively recent rise in noise levels is marked in both magnitude and extent: an estimated 30% of the European population is exposed to road traffic noise levels greater than 55dB (decibels) at night, well above the 40dB target recommended by the World Health Organisation (WHO). Even remote natural areas do not escape the reach of anthropogenic, or manmade, noise. One study across 22 US national parks demonstrated that this kind of noise was, on average, audible more than 28% of the time.

Noise is not just irritating; we have known for some time that it can have direct human health impacts. Indeed, chronic exposure to noise levels above 55dB dramatically increases the risks of heart disease and stroke, while aircraft noise has been shown to impact the development of reading skills in children attending schools close to busy airports. The WHO estimates that in Europe at least a million healthy life years are lost every year due to traffic noise.

[Continue reading…]

6,000 years ago humans upturned 300 million years of evolution

Smithsonian.com reports: It’s hard to imagine a global force strong enough to change natural patterns that have persisted on Earth for more than 300 million years, but a new study shows that human beings have been doing exactly that for about 6,000 years.

The increase in human activity, perhaps tied to population growth and the spread of agriculture, seems to have upended the way plants and animals distribute themselves across the land, so that species today are far more segregated than they’ve been at any other time.

That’s the conclusion of a study appearing this week in the journal Nature, and the ramifications could be huge, heralding a new stage in global evolution as dramatic as the shift from single-celled microbes to complex organisms.

A team of researchers led by S. Kathleen Lyons, a paleobiologist at the Evolution of Terrestrial Ecosystems (ETE) program in the Smithsonian’s National Museum of Natural History, examined the distribution of plants and animals across landscapes in the present and back through the fossil record in search of patterns.

Mostly they found randomness, but throughout time there was always a small subset of plants and animals that showed up in relationship to one another more often than can be attributed to chance. That relationship meant either that pairs of species occur together – when you find one, you usually find the other – or the opposite: when you find one, the other is usually absent, in which case they’re considered segregated. [Continue reading…]

Are humans reaching the limits of our ability to probe the laws of nature?

Natalie Wolchover writes: Physicists typically think they “need philosophers and historians of science like birds need ornithologists,” the Nobel laureate David Gross told a roomful of philosophers, historians and physicists last week in Munich, Germany, paraphrasing Richard Feynman.

But desperate times call for desperate measures.

Fundamental physics faces a problem, Gross explained — one dire enough to call for outsiders’ perspectives. “I’m not sure that we don’t need each other at this point in time,” he said.

It was the opening session of a three-day workshop, held in a Romanesque-style lecture hall at Ludwig Maximilian University (LMU Munich) one year after George Ellis and Joe Silk, two white-haired physicists now sitting in the front row, called for such a conference in an incendiary opinion piece in Nature. One hundred attendees had descended on a land with a celebrated tradition in both physics and the philosophy of science to wage what Ellis and Silk declared a “battle for the heart and soul of physics.”

The crisis, as Ellis and Silk tell it, is the wildly speculative nature of modern physics theories, which they say reflects a dangerous departure from the scientific method. Many of today’s theorists — chief among them the proponents of string theory and the multiverse hypothesis — appear convinced of their ideas on the grounds that they are beautiful or logically compelling, despite the impossibility of testing them. Ellis and Silk accused these theorists of “moving the goalposts” of science and blurring the line between physics and pseudoscience. “The imprimatur of science should be awarded only to a theory that is testable,” Ellis and Silk wrote, thereby disqualifying most of the leading theories of the past 40 years. “Only then can we defend science from attack.”

They were reacting, in part, to the controversial ideas of Richard Dawid, an Austrian philosopher whose 2013 book String Theory and the Scientific Method identified three kinds of “non-empirical” evidence that Dawid says can help build trust in scientific theories absent empirical data. Dawid, a researcher at LMU Munich, answered Ellis and Silk’s battle cry and assembled far-flung scholars anchoring all sides of the argument for the high-profile event last week.

Gross, a supporter of string theory who won the 2004 Nobel Prize in physics for his work on the force that glues atoms together, kicked off the workshop by asserting that the problem lies not with physicists but with a “fact of nature” — one that we have been approaching inevitably for four centuries.

The dogged pursuit of a fundamental theory governing all forces of nature requires physicists to inspect the universe more and more closely — to examine, for instance, the atoms within matter, the protons and neutrons within those atoms, and the quarks within those protons and neutrons. But this zooming in demands evermore energy, and the difficulty and cost of building new machines increases exponentially relative to the energy requirement, Gross said. “It hasn’t been a problem so much for the last 400 years, where we’ve gone from centimeters to millionths of a millionth of a millionth of a centimeter” — the current resolving power of the Large Hadron Collider (LHC) in Switzerland, he said. “We’ve gone very far, but this energy-squared is killing us.”

As we approach the practical limits of our ability to probe nature’s underlying principles, the minds of theorists have wandered far beyond the tiniest observable distances and highest possible energies. Strong clues indicate that the truly fundamental constituents of the universe lie at a distance scale 10 million billion times smaller than the resolving power of the LHC. This is the domain of nature that string theory, a candidate “theory of everything,” attempts to describe. But it’s a domain that no one has the faintest idea how to access. [Continue reading…]

How a joke can help us unlock the mystery of meaning in language

By Vyvyan Evans, Bangor University

What do you get if you cross a kangaroo with an elephant?

You’ll have to wait for the punchline, but you should already have shards of meaning tumbling about your mind. Now, jokes don’t have to be all that funny, of course, but if they are to work at all then they must construct something beyond the simple words deployed.

Language is the tissue that connects us in our daily social lives. We use it to gossip, to get a job and to give someone the sack. We use it to seduce, quarrel, propose marriage, get divorced and, yes, tell the odd gag. In the absence of telepathy, it lets us interact with our nearest and dearest, and in our virtual web of digital communication, with hundreds of people we may never have met.

But while we now know an awful lot about the detail of the grammatical systems of the world’s 7,000 or so languages, scientific progress on the mysterious elixir of communication – meaning – has been a much tougher nut to crack.

[Continue reading…]

Possessed by a mask

Sandra Newman writes: It is an acknowledged fact of modern life that the internet brings out the worst in people. Otherwise law-abiding citizens pilfer films and music. Eminent authors create ‘sock puppets’ to anonymously praise their own work and denigrate that of rivals. Teenagers use the internet for bullying; even more disturbingly, grown-ups bully strangers with obsessive zeal, sometimes even driving them from their homes with repeated murder threats. Porn thrives, and takes on increasingly bizarre and often disturbing forms.

Commentators seem at a loss to satisfactorily account for this surge in antisocial tendencies. Sometimes it’s blamed on a few sociopathic individuals – but the offenders include people who are impeccably decent in their offline lives. The anonymity of online life is another explanation commonly given – but these behaviours persist even when the identities of users are easily discovered, and when their real names appear directly above offensive statements. It almost seems to be a contagion issuing from the technology itself, or at least strong evidence that computers are alienating us from our humanity. But we might have a better chance of understanding internet hooliganism if we looked at another form of concealment that isn’t true concealment, but that nonetheless has historically lured people into behaving in ways that are alien to their normal selves: the mask.

There doesn’t seem to be any culture in which masks have not been used. From the Australian outback to the Arctic, from Mesolithic Africa to the United States of the 21st century, people have always made and employed masks in ways that are seemingly various and yet have an underlying commonality. Their earliest appearance is in religious ritual. [Continue reading…]

Local ecological disasters are too easily obscured by the lofty discourse of climate change

Brandon Keim writes: In the Great Basin desert of the western United States, not far from the Great Salt Lake, is a kind of time machine. Homestead Cave has been inhabited for the past 13,000 years by successive generations of owls, beneath whose roosts accumulated millennia-deep piles of undigested fur and bone. By examining these piles, researchers have been able to reconstruct the region’s ecological history. It contains a very timely lesson.

Those 13,000 years spanned some profound environmental upheavals. Indeed, the cave opened when Lake Bonneville, a vast prehistoric water body that covered much of the region, receded at the last ice age’s end, and the Great Basin shifted from rainfall-rich coolness to its present hot, dry state. Yet despite these changes, life was pretty stable. Different species flourished at different times, but the total amount of biological energy – a metric used by ecologists to describe all the metabolic activity in an ecosystem – remained steady.

About a century ago, though, all that changed. There’s now about 20 per cent less biological energy flowing through the Great Basin than at the 20th century’s beginning. To put it another way: life’s richness contracted by one-fifth in an eyeblink of geological time. The culprit? Not climate change, as one might expect, but human activity, in particular the spread of invasive non-native grasses that flourish in disturbed areas and have little nutritional value, sustaining less life than would the native plants they’ve displaced.

I find myself thinking often of the parable of Homestead Cave, as I’ve come to call it. It underscores how resilient nature can be, and also the enormity of human impacts, which in this case dwarfed the transition to an entirely new climate state. The latter point, I fear, is too often overlooked these days, obscured by a fixation on climate change as Earth’s great ecological problem.

Make no mistake: climate change is a huge, desperately important issue. And it feels strange, if not downright traitorous, to raise concerns about the attention it receives. The parable of Homestead Cave is no licence to shirk climate duties on the assumption that nature will adapt, or to imagine that a rapidly warming, weather-extremed Earth won’t be calamitous for non‑human life. It will be. But so is a great deal else that we do. Paying attention to climate change and to other human impacts shouldn’t be a zero-sum game, but it too often seems that way. [Continue reading…]

When languages die, we lose a part of who we are

By Anouschka Foltz, Bangor University

The 2015 Paris Climate Conference (COP21) is in full gear and climate change is again on everyone’s mind. It conjures up images of melting glaciers, rising sea levels, droughts, flooding, threatened habitats, endangered species, and displaced people. We know it threatens biodiversity, but what about linguistic diversity?

Humans are the only species on the planet whose communication system exhibits enormous diversity. And linguistic diversity is crucial for understanding our capacity for language. An increase in climate-change-related natural disasters may affect linguistic diversity. A good example is Vanuatu, an island state in the Pacific that has seen a dramatic recent rise in sea levels.

There are over 7,000 languages spoken in the world today. These languages exhibit enormous diversity, from the number of distinctive sounds (there are languages with as few as 11 different sounds and as many as 118) to the vast range of possible word orders, structures and concepts that languages use to convey meaning. Every absolute that linguists have posited has been challenged, and linguists are busy debating if there is anything at all that is common to all languages in the world or anything at all that does not exist in the languages of the world. Sign languages show us that languages do not even need to be spoken. This diversity is evidence of the enormous flexibility and plasticity of the human brain and its capacity for communication.

Studying diverse languages gives us invaluable insights into human cognition. But language diversity is at risk. Languages are dying every year. Often a language’s death is recorded when the last known speaker dies, and about 35% of languages in the world are currently losing speakers or are more seriously endangered. Most of these have never been recorded and so will be lost forever. Linguists estimate that about 50% of the languages spoken today will disappear in the next 100 years. Some even argue that up to 90% of today’s languages will have disappeared by 2115.

[Continue reading…]

Millet: The missing piece in the puzzle of prehistoric humans’ transition from hunter-gatherers to farmers

New research shows that a cereal familiar today as birdseed was carried across Eurasia by ancient shepherds and herders, laying the foundation – in combination with the new crops they encountered – of ‘multi-crop’ agriculture and the rise of settled societies. Archaeologists say ‘forgotten’ millet has a role to play in modern crop diversity and today’s food security debate.

The domestication of the small-seeded cereal millet in North China around 10,000 years ago created the perfect crop to bridge the gap between nomadic hunter-gathering and organised agriculture in Neolithic Eurasia, and may offer solutions to modern food security, according to new research.

Now a forgotten crop in the West, this hardy grain was ideal for ancient shepherds and herders, who carried it right across Eurasia, where it was mixed with crops such as wheat and barley. This gave rise to ‘multi-cropping’, which in turn sowed the seeds of complex urban societies, say archaeologists.

A team from the UK, USA and China has traced the spread of the domesticated grain from North China and Inner Mongolia into Europe through a “hilly corridor” along the foothills of Eurasia. Millet favours uphill locations, doesn’t require much water, and has a short growing season: it can be harvested 45 days after planting, compared with 100 days for rice, allowing a very mobile form of cultivation.

Nomadic tribes were able to combine growing crops of millet with hunting and foraging as they travelled across the continent between 2500 and 1600 BC. Millet was eventually mixed with other crops in emerging populations to create ‘multi-crop’ diversity, which extended growing seasons and provided our ancient ancestors with food security.

Managing different crops in different locations, and the water resources they required, demanded elaborate social contracts and drove the rise of more settled, stratified communities and, eventually, complex ‘urban’ human societies.

Researchers say we need to learn from the earliest farmers when thinking about feeding today’s populations, and millet may have a role to play in protecting against modern crop failure and famine.

[Continue reading…]

A scientific approach designed to precisely calibrate the metrics needed for quantifying bullshit

Science News reports: Dutch social psychologist Diederik Stapel was known for his meteoric rise, until he was known for his fall. His research on social interactions, which spanned topics from infidelity to selfishness to discrimination, frequently appeared in top-tier journals. But then in 2011, three junior researchers raised concerns that Stapel was fabricating data. Stapel’s institution, Tilburg University, suspended him and launched a formal investigation. A commission ultimately determined that of his more than 125 research papers, at least 55 were based on fraudulent data. Stapel now has 57 retractions to his name.

The case provided an unusual opportunity for exploring the language of deception: one set of Stapel’s papers discussed faked data, while another was based on legitimate results. Linguists David Markowitz and Jeffrey Hancock ran an analysis of articles in each set that listed Stapel as the first author. The researchers discovered particular tells in the language that allowed them to peg the fraudulent work with roughly 70 percent accuracy. While Stapel was careful to concoct data that appeared reasonable, he oversold his false goods, using, for example, more science-related terms and more amplifying terms, like extreme and exceptionally, in the now-retracted papers.

Markowitz and Hancock, now at Stanford, are still probing the language of lies, and they recently ran a similar analysis on a larger sample of papers with fudged data.

The bottom line: Fraudulent papers were full of jargon, harder to read, and bloated with references. This parsing-of-language approach, which the team describes in the Journal of Language and Social Psychology, might be used to flag papers that deserve extra scrutiny. But tricks for detecting counterfeit data are unlikely to thwart the murkier problem of questionable research practices or the general lack of clarity in the scientific literature.
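As an editorial illustration of the kind of feature this parsing-of-language approach relies on, the toy sketch below counts amplifying terms per 100 words. The word list and scoring are invented for illustration; this is not Markowitz and Hancock’s actual model.

```python
# Illustrative sketch only: a minimal text-feature extractor in the spirit
# of the linguistic analysis described above. The amplifier word list is an
# assumption made for this example, not the researchers' real feature set.

AMPLIFIERS = {"extreme", "extremely", "exceptionally", "vastly", "profoundly"}

def amplifier_rate(text: str) -> float:
    """Return the number of amplifying terms per 100 words of text."""
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in AMPLIFIERS)
    return 100.0 * hits / len(words)

overstated = "The results were extremely robust and exceptionally clear."
plain = "The results were consistent with the first experiment."

print(amplifier_rate(overstated) > amplifier_rate(plain))  # True
```

A real classifier would combine many such features (jargon density, readability scores, reference counts) and fit them to labelled examples; this sketch only shows the counting step.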

“This is an important contribution to the discussion of quality control in research,” Nick Steneck, a science historian at the University of Michigan and an expert in research integrity practices, told me. “But there’s a whole lot of other reasons why clarity and readability of scientific writing matters, including making things understandable to the public.” [Continue reading…]

Naturalists are becoming an endangered species

By David Norman, University of Cambridge

The phrase “Natural History” is linked in most people’s minds today with places that use the phrase: the various Natural History Museums, or television programmes narrated so evocatively by renowned naturalist Sir David Attenborough.

Used in its traditional sense, the phrase now has an almost archaic ring to it, perhaps recalling the Victorian obsession with collecting butterflies or beetles, rocks or fossils, or stuffed birds and animals, or perhaps the 18th-century best-seller, Gilbert White’s The Natural History of Selborne.

Once natural history was part of what was equally archaically called natural philosophy, encompassing the enquiry into all aspects of the natural world that we inhabit, from the tiniest creature to the largest, to molecules and materials, to planets and stars in outer space. These days, we call it science. Natural history specifically strives to study and understand organisms within their environment, which would these days equate to the disciplines of ecology or conservation.

In a recent article in the journal BioScience, a group of 17 scientists decry what they see as a shift away from this traditional learning (once a typical part of biology degrees) that taught students about organisms: where they live, what they eat, how they behave, and their variety and relationships to the ecosystems in which they live.

Partly because of the promise of a career tied to a specific course, and perhaps partly because of poorly taught courses that can emphasise rote learning, students are enticed into more exciting fields such as biotechnology or evolutionary developmental biology (“evo-devo”), where understanding an organism is less important than understanding the function of a particular organ or limb.

[Continue reading…]

What if historians started taking the ‘what if’ seriously?

Rebecca Onion writes: What if Adolf Hitler’s paintings had been acclaimed, rather than met with faint praise, and he had gone into art instead of politics? Have you ever wondered whether John F Kennedy would have such a shining reputation if he had survived his assassination and been elected to a second term? Or how the United States might have fared under Japanese occupation? Or what the world would be like if nobody had invented the airplane?

If you enjoy speculating about history in these counterfactual terms, there are many books and movies to satisfy you. The counterfactual is a friend to science-fiction writers and chatting partygoers alike. Yet ‘What if?’ is not a mode of discussion you’ll commonly hear in a university history seminar. At some point in my own graduate-school career, I became well-acculturated to the idea that counterfactualism was (as the British historian E P Thompson wrote in 1978) ‘Geschichtwissenschlopff, unhistorical shit.’

‘“What if?” is a waste of time’ went the headline to the Cambridge historian Richard Evans’ piece in The Guardian last year. Surveying the many instances of public counterfactual discourse in the anniversary commemorations of the First World War, Evans wrote: ‘This kind of fantasising is now all the rage, and threatens to overwhelm our perceptions of what really happened in the past, pushing aside our attempts to explain it in favour of a futile and misguided attempt to decide whether the decisions taken in August 1914 were right or wrong.’ It’s hard enough to do the reading and research required to understand the complexity of actual events, Evans argues. Let’s stay away from alternative universes.

But hold on a minute. In October 2015, when asked if, given the chance, he would kill the infant Hitler, the US presidential candidate Jeb Bush retorted with an enthusiastic: ‘Hell yeah, I would!’ Laughter was a first response: what a ridiculous question! And didn’t Bush sound a lot like his brash ‘Mission Accomplished’ brother George W just then? When The New York Times Magazine had asked its readers to make the same choice, only 42 per cent responded with an equally unequivocal ‘Yes’. And as The Atlantic’s thoughtful piece on the question by Matt Ford illustrated, in order to truly answer this apparently silly hypothetical, you have to define your own beliefs about the nature of progress, the inherent contingency of events, and the influence of individuals – even very charismatic ones – on the flow of historical change. These are big, important questions. If well-done counterfactuals can help us think them through, shouldn’t we allow what-ifs some space at the history table? [Continue reading…]

Some of our most cherished traits are shared by other animals — and even plants

Amos Zeeberg, Jonathon Keats and Brandon Keim write: The Venus flytrap, like most people in the Internet age, has about a 30-second attention span. But that’s a blessing for the carnivorous plant, which relies on memory to survive. The lobes of the plant are laced with three or four “trigger” hairs. When an insect enters the plant and rubs the trigger hairs, the lobes snap shut and the plant consumes its prey. Each stimulation generates an electrical charge, but it generally takes two charges to spark the electrochemical signal that triggers the closure, so the plant must “remember” the first charge as it waits for the second. It has only enough energy to remember for about 30 seconds, so its survival depends on short-term memory and the ability to forget. Similarly, in a human brain, a neuron builds up an electrical charge when stimulated by other nerves, approaching a threshold above which it will fire an electrical signal — the basis of everything from recognizing a plant, like a Venus flytrap, to contemplating the meaning of life. [Continue reading…]
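The two-touch trigger described above can be sketched as a simple timing rule: the trap fires only if a second stimulation arrives before the memory of the first has faded. The sketch below is an editorial toy model under that assumption; the 30-second window is taken from the passage, and everything else is simplified away from real plant physiology.

```python
# Toy model of the flytrap's two-touch trigger: each hair stimulation is
# "remembered" for about 30 seconds, and the trap snaps shut only if two
# stimulations fall within that window. Illustrative assumption, not a
# physiological simulation.

MEMORY_SECONDS = 30.0  # assumed decay window for the first charge

def trap_fires(stimulus_times):
    """Return True if any two stimulations occur within the memory window."""
    times = sorted(stimulus_times)
    # Check consecutive stimulations for a pair close enough in time.
    for earlier, later in zip(times, times[1:]):
        if later - earlier <= MEMORY_SECONDS:
            return True
    return False

print(trap_fires([0.0, 12.0]))  # True: second touch arrives within 30 s
print(trap_fires([0.0, 45.0]))  # False: the first charge is already forgotten
```

The same structure – accumulate, decay, fire at a threshold – is what makes the comparison to a neuron in the passage apt.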
