Rebecca Kessler writes: ‘It’s hard to believe that a 40-ton animal can get hidden. They’re sneaky.’ Charles ‘Stormy’ Mayo was scanning the sea from the deck of the R/V Shearwater for signs: a cloud of vapour, a patch of white water, a fluke. A few minutes earlier someone had spotted the first North Atlantic right whales of the day. But now they were down below and out of sight in 80 feet of murky seawater. Feeding, most likely.
Finally, a whale’s head emerged briefly on the sea surface. Then a slab of black back followed by the silhouette of flukes, signaling another deep dive. The appearance lasted maybe a second and a half. Groans from the crew, who did not quite manage to snap a photo that could help identify the whale, one of an early March influx that foretold another strong season in Cape Cod Bay. ‘There’s probably a bunch of whales here but it’s going to drive us crazy,’ Mayo said. ‘I’m going to say there are probably three. It’s hard as hell to tell.’
The world’s rarest whales – Eubalaena glacialis – have been visiting the bay in late winter and early spring for as long as anyone can remember. But Mayo and his team at the Center for Coastal Studies (CCS) in Provincetown have documented a puzzling uptick in recent years. Not just a few dozen animals, as was typical, but hundreds were showing up and, in one year, nearly two-thirds of the world’s entire living population of around 500 North Atlantic right whales. ‘Right Whale Kingdom,’ Mayo has called the bay. Simultaneously, the whales went AWOL from their usual summer feeding grounds 300 miles to the northeast in Canada’s Bay of Fundy and elsewhere, further mystifying researchers.
The Shearwater idled, waiting. The whales remained deep down, scooping up patches of zooplankton and straining out the seawater through the long strips of baleen in their mouths. Scooping and straining, scooping and straining. A change in the location of their preferred food is the most likely explanation for the whales’ wandering itinerary, Mayo said. Something is shifting out there in the ocean. As with so much else about their lives, only the whales know what it is. [Continue reading…]
Watching evolution happen in two lifetimes
Emily Singer writes: When Rosemary and Peter Grant first set foot on Daphne Major, a tiny island in the Galápagos archipelago, in 1973, they had no idea it would become a second home. The husband and wife team, now emeritus biology professors at Princeton University, were looking for a pristine environment in which to study evolution. They hoped that the various species of finches on the island would provide the perfect means for uncovering the factors that drive the formation of new species.
The diminutive island wasn’t a particularly hospitable place for the Grants to spend their winters. At less than one-hundredth the size of Manhattan, Daphne resembles the tip of a volcano rising from the sea. Visitors must leap off the boat onto the edge of a steep ring of land that surrounds a central crater. The island’s vegetation is sparse. Herbs, cactus bushes and low trees provide food for finches — small, medium and large ground finches, as well as cactus finches — and other birds. The Grants brought with them all the food and water they would need and cooked meals in a shallow cave sheltered by a tarp from the baking sun. They camped on Daphne’s one tiny flat spot, barely larger than a picnic table.
Though lacking in creature comforts, Daphne proved to be a fruitful choice. The Galápagos’ extreme climate — swinging between periods of severe drought and bountiful rain — furnished ample natural selection. Rainfall varied from a meter of rain in 1983 to none in 1985. A severe drought in 1977 killed off many of Daphne’s finches, setting the stage for the Grants’ first major discovery. During the dry spell, large seeds became more plentiful than small ones. Birds with bigger beaks were more successful at cracking the large seeds. As a result, large finches and their offspring triumphed during the drought, triggering a lasting increase in the birds’ average size. The Grants had observed evolution in action.
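The selective logic described above (deeper-beaked birds surviving a seed-limited drought and pulling the population average upward) can be illustrated with a minimal toy simulation in Python. This is only a sketch: the beak measurements and the survival rule below are invented for illustration and are not the Grants' data.

```python
# Toy model of directional selection on beak size during a drought.
# All numbers are invented for illustration; they are not the Grants' data.
import random

random.seed(1)

def simulate_drought(pop_size=1000, mean_beak=9.0, sd_beak=1.0):
    """Return mean beak depth (mm) before and after a selective die-off."""
    population = [random.gauss(mean_beak, sd_beak) for _ in range(pop_size)]

    # Hypothetical rule: survival becomes more likely as beak depth grows,
    # because only deep-beaked birds can crack the large, hard seeds that remain.
    def survives(beak_depth):
        return random.random() < min(1.0, 0.1 + 0.08 * (beak_depth - 6.0))

    survivors = [b for b in population if survives(b)]
    before = sum(population) / len(population)
    after = sum(survivors) / len(survivors)
    return before, after

if __name__ == "__main__":
    before, after = simulate_drought()
    print(f"mean beak depth before drought: {before:.2f} mm")
    print(f"mean beak depth after drought:  {after:.2f} mm")
```

Run it with different seeds and the survivors' mean comes out consistently larger than the starting mean, which is the pattern the Grants measured in the field.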
That striking finding launched a prolific career for the pair. They visited Daphne for several months each year from 1973 to 2012, sometimes bringing their daughters. Over the course of their four-decade tenure, the couple tagged roughly 20,000 birds spanning at least eight generations. (The longest-lived bird on the Grants’ watch survived a whopping 17 years.) They tracked almost every mating and its offspring, creating large, multigenerational pedigrees for different finch species. They took blood samples and recorded the finches’ songs, which allowed them to track genetics and other factors long after the birds themselves died. They have confirmed some of Darwin’s most basic predictions and have earned a variety of prestigious science awards, including the Kyoto Prize in 2009.
Now nearly 80, the couple have slowed their visits to the Galápagos. These days, they are most excited about applying genomic tools to the data they collected. They are collaborating with other scientists to find the genetic variants that drove the changes in beak size and shape that they tracked over the past 40 years. Quanta Magazine spoke with the Grants about their time on Daphne; an edited and condensed version of the conversation follows. [Continue reading…]
Rewriting Earth’s creation story
Rebecca Boyle writes: Humanity’s trips to the moon revolutionized our view of this planet. As seen from another celestial body, Earth seemed more fragile and more precious; the iconic Apollo 8 image of Earth rising above the lunar surface helped launch the modern environmental movement. The moon landings made people want to take charge of Earth’s future. They also changed our view of its past.
Earth is constantly remaking itself, and over the eons it has systematically erased its origin story, subsuming and cannibalizing its earliest rocks. Much of what we think we know about the earliest days of Earth therefore comes from the geologically inactive moon, which scientists use like a time capsule.
Ever since Apollo astronauts toted chunks of the moon back home, the story has sounded something like this: After coalescing from grains of dust that swirled around the newly ignited sun, the still-cooling Earth would have been covered in seas of magma, punctuated by inky volcanoes spewing sulfur and liquid rock. The young planet was showered in asteroids and larger structures called planetesimals, one of which sheared off a portion of Earth and formed the moon. Just as things were finally settling down, about a half-billion years after the solar system formed, the Earth and moon were again bombarded by asteroids whose onslaught might have liquefied the young planet — and sterilized it.
Geologists named this epoch the Hadean, after the Greek version of the underworld. Only after the so-called Late Heavy Bombardment quieted some 3.9 billion years ago did Earth finally start to morph into the Edenic, cloud-covered, watery world we know.
But as it turns out, the Hadean may not have been so hellish. New analysis of Earth and moon rocks suggests that instead of a roiling ball of lava, baby Earth was a world with continents, oceans of water, and maybe even an atmosphere. It might not have been bombarded by asteroids at all, or at least not in the large quantities scientists originally thought. The Hadean might have been downright hospitable, raising questions about how long ago life could have arisen on this planet. [Continue reading…]
Tardigrades: The most fascinating animals known to science
Brian Resnick writes: Paul Bartels gets a rush every time he discovers a new species of tardigrade, the phylum of microscopic animals best known for being both strangely cute and able to survive the vacuum of space.
“The first paper I wrote describing a new species, there was a maternal-paternal feeling — like I just gave birth to this new thing,” he tells me on a phone call.
The rush comes, in part, because tardigrades are the most fascinating animals known to science, able to survive in just about every environment imaginable. “There are some ecosystems in the Antarctic called nunataks where the wind blows away snow and ice, exposing outcroppings of rocks, and the only things that live on them are lichens and tardigrades,” says Bartels, an invertebrate zoologist at Warren Wilson College in North Carolina.
Pick up a piece of moss, and you’ll find tardigrades. In the soil: tardigrades. The ocean: You get it. They live on every continent, in every climate, and in every latitude. Their extreme resilience has allowed them to conquer the entire planet.
And though biologists have known about tardigrades since the dawn of the microscope, they’re only just beginning to understand how these remarkable organisms are able to survive anywhere. [Continue reading…]
Why neuroscientists need to study the crow
Grigori Guitchounts writes: The animals of neuroscience research are an eclectic bunch, and for good reason. Different model organisms—like zebra fish larvae, C. elegans worms, fruit flies, and mice — give researchers the opportunity to answer specific questions. The first two, for example, have transparent bodies, which let scientists easily peer into their brains; the last two have eminently tweakable genomes, which allow scientists to isolate the effects of specific genes. For cognition studies, researchers have relied largely on primates and, more recently, rats, which I use in my own work. But the time is ripe for this exclusive club of research animals to accept a new, avian member: the corvid family.
Corvids, such as crows, ravens, and magpies, are among the most intelligent birds on the planet — the list of their cognitive achievements goes on and on — yet neuroscientists have not scrutinized their brains for one simple reason: They don’t have a neocortex. The obsession with the neocortex in neuroscience research is not unwarranted; what’s unwarranted is the notion that the neocortex alone is responsible for sophisticated cognition. Because birds lack this structure—the most recently evolved portion of the mammalian brain, crucial to human intelligence—neuroscientists have largely and unfortunately neglected the neural basis of corvid intelligence.
This neglect costs neuroscientists an opportunity for an important insight. Because birds diverged from mammals more than 300 million years ago, their brains have had plenty of time to develop along remarkably different lines (instead of a cortex with its six layers of neatly arranged neurons, birds evolved groups of neurons densely packed into clusters called nuclei). So, any computational similarities between corvid and primate brains — which are so different neurally — would indicate the development of common solutions to shared evolutionary problems, like creating and storing memories, or learning from experience. If neuroscientists want to know how brains produce intelligence, looking solely at the neocortex won’t cut it; they must study how corvid brains achieve the same clever behaviors that we see in ourselves and other mammals. [Continue reading…]
Walking improves creativity
Olivia Goldhill writes: For centuries, great thinkers have instinctively stepped out the door and begun walking, or at the very least pacing, when they needed to boost creativity. Charles Dickens routinely walked for 30 miles a day, while the philosopher Friedrich Nietzsche declared, “All truly great thoughts are conceived while walking.”
But in recent years, as lives have become increasingly sedentary, the idea has been put to the test. The precise physiology is unknown, but professors and therapists are turning what was once an unquestioned instinct into a certainty: Walking influences our thinking, and somehow improves creativity.
Last year, researchers at Stanford found that people perform better on creative divergent-thinking tests during and immediately after walking. The effect was similar regardless of whether participants took a stroll outside or stayed inside, walking on a treadmill and staring at a wall. The act of walking itself, rather than the sights encountered on a saunter, was key to improving creativity, they found. [Continue reading…]
The social practice of self-betrayal in career-driven America
Talbot Brewer writes: I don’t know how careers are seen in other countries, but in the United States we are exhorted to view them as the primary locus of self-realization. The question before you when you are trying to choose a career is to figure out “What Color is Your Parachute?” (the title of a guide to job searches that has been a perennial best seller for most of my lifetime). The aim, to quote the title of another top-selling guide to career choices, is to “Do What You Are.”
These titles tell us something about what Americans expect to find in a career: themselves, in the unlikely form of a marketable commodity. But why should we expect that the inner self waiting to be born corresponds to some paid job or profession? Are we really all in possession of an inner lawyer, an inner beauty products placement specialist, or an inner advertising executive, just waiting for the right job opening? Mightn’t this script for our biographies serve as easily to promote self-limitation or self-betrayal as to further self-actualization?
We spend a great deal of our youth shaping ourselves into the sort of finished product that potential employers will be willing to pay dearly to use. Beginning at a very early age, schooling practices and parental guidance and approval are adjusted, sometimes only semi-consciously, so as to inculcate the personal capacities and temperament demanded by the corporate world. The effort to sculpt oneself for this destiny takes a more concerted form in high school and college. We choose courses of study, and understand the importance of success in these studies, largely with this end in view.
Even those who rebel against these forces of acculturation are deeply shaped by them. What we call “self-destructive” behavior in high school might perhaps be an understandable result of being dispirited by the career prospects that are recommended to us as sufficient motivation for our studies. As a culture we have a curious double-mindedness about such reactions. It is hard to get through high school in the United States without being asked to read J.D. Salinger’s Catcher in the Rye — the story of one Holden Caulfield’s angst-ridden flight from high school, fueled by a pervasive sense that the adult world is irredeemably phony. The ideal high school student is supposed to find a soul-mate in Holden and write an insightful paper about his telling cultural insights, submitted on time in twelve-point type with double spacing and proper margins and footnotes, so as to ensure the sort of grade that will keep the student on the express train to the adult world whose irredeemable phoniness he has just skillfully diagnosed. [Continue reading…]
Atheists in America
Emma Green writes: In general, Americans do not like atheists. In studies, they say they feel coldly toward nonbelievers; it’s estimated that more than half of the population say they’d be less likely to vote for a presidential candidate who didn’t believe in God.
This kind of deep-seated suspicion is a long-standing tradition in the U.S. In his new book, Village Atheists, the Washington University in St. Louis professor Leigh Eric Schmidt writes about the country’s early “infidels” — one of many fraught terms nonbelievers have used to describe themselves in history — and the conflicts they went through. While the history of atheists is often told as a grand tale of battling ideas, Schmidt set out to tell stories of “mundane materiality,” chronicling the lived experiences of atheists and freethinkers in 19th- and 20th-century America.
His findings both confirm and challenge stereotypes around atheists today. While it’s true that the number of nonbelievers in the United States is growing, it’s still small — roughly 3 percent of U.S. adults self-identify as atheists. And while more and more Americans say they’re not part of any particular religion, they’ve historically been in good company: At the end of the 19th century, Schmidt estimated, around a tenth of Americans may have been unaffiliated with any church or religious institution.
As the visibility and number of American atheists has changed over time, the group has gone through its own struggles over identity. Even today, atheists are significantly more likely to be white, male, and highly educated than the rest of the population, a demographic fact perhaps tied to the long legacy of misogyny and marginalization of women within the movement. At times, nonbelievers have advocated on behalf of minority religious rights and defended immigrants. But they’ve also been among the most vocal American nativists, rallying against Mormons, Catholics, and evangelical Protestants alike.
Schmidt and I discussed the history of atheists in the United States, from the suspicion directed toward them to the suspicions they have cast on others. Our conversation has been edited and condensed for clarity. [Continue reading…]
How U.S. history makes people and places disappear
Aileen McGraw writes: When Lauret Edith Savoy first heard the word “colored” at five years old, she saw herself as exactly that — full of veins as blue as the sky. Not long after, she learned another definition, steeped in racism. “Words full of spit showed that I could be hated for being ‘colored,’” she writes. “By the age of eight I wondered if I should hate in return.” Out of this painful history, Savoy has created something rich and productive — a body of work that examines the complex relationships between land, identity, and history.
Today, Savoy, who is of African American, Euro-American, and Native American descent, works as a geologist, a writer, and a professor of environmental studies at Mount Holyoke College. Her writing — described by New York Magazine’s “Vulture” as “John McPhee meets James Baldwin” — straddles science and the humanities.
Her most recent book Trace: Memory, History, Race, and the American Landscape explores the tendency of U.S. history to erase or rewrite — both literally and in memory — the stories of marginalized or dispossessed people and places that have been deemed unworthy, unsavory, or shameful. In eight densely researched, ruminative essays, Savoy uses her own family histories to trace moments in American history that have been largely forgotten: for example, the history of segregated Army nurses, like her mother, during World War II, or that of Charles Drew, the African-American physician who developed the first blood bank and was fired for trying to end the federally sanctioned policy of segregating blood. Savoy approaches the “environment” in the broadest sense: “Not just as surroundings; not just as the air, water, and land on which we depend, or that we pollute; not just as global warming — but as sets of circumstances, conditions, and contexts in which we live and die — in which each of us is intimately part.”
Nautilus recently spoke to Savoy over email about this relationship between landscape and identity, the meaning of biodiversity, and the power of the stories we tell. [Continue reading…]
England’s forgotten Muslim history
Jerry Brotton writes: Britain is divided as never before. The country has turned its back on Europe, and its female ruler has her sights set on trade with the East. As much as this sounds like Britain today, it also describes the country in the 16th century, during the golden age of its most famous monarch, Queen Elizabeth I.
One of the more surprising aspects of Elizabethan England is that its foreign and economic policy was driven by a close alliance with the Islamic world, a fact conveniently ignored today by those pushing the populist rhetoric of national sovereignty.
From the moment of her accession to the throne in 1558, Elizabeth began seeking diplomatic, commercial and military ties with Muslim rulers in Iran, Turkey and Morocco — and with good reasons. In 1570, when it became clear that Protestant England would not return to the Catholic faith, the pope excommunicated Elizabeth and called for her to be stripped of her crown. Soon, the might of Catholic Spain was arrayed against her, with an invasion imminent. English merchants were prohibited from trading with the rich markets of the Spanish Netherlands. Economic and political isolation threatened to destroy the newly Protestant country.
Elizabeth responded by reaching out to the Islamic world. Spain’s only rival was the Ottoman Empire, ruled by Sultan Murad III, which stretched from North Africa through Eastern Europe to the Indian Ocean. The Ottomans had been fighting the Hapsburgs for decades, conquering parts of Hungary. Elizabeth hoped that an alliance with the sultan would provide much needed relief from Spanish military aggression, and enable her merchants to tap into the lucrative markets of the East. For good measure she also reached out to the Ottomans’ rivals, the shah of Persia and the ruler of Morocco. [Continue reading…]
Ethical shifts come with thinking in a different language
Julie Sedivy writes: What defines who we are? Our habits? Our aesthetic tastes? Our memories? If pressed, I would answer that if there is any part of me that sits at my core, that is an essential part of who I am, then surely it must be my moral center, my deep-seated sense of right and wrong.
And yet, like many other people who speak more than one language, I often have the sense that I’m a slightly different person in each of my languages — more assertive in English, more relaxed in French, more sentimental in Czech. Is it possible that, along with these differences, my moral compass also points in somewhat different directions depending on the language I’m using at the time?
Psychologists who study moral judgments have become very interested in this question. Several recent studies have focused on how people think about ethics in a non-native language — as might take place, for example, among a group of delegates at the United Nations using a lingua franca to hash out a resolution. The findings suggest that when people are confronted with moral dilemmas, they do indeed respond differently when considering them in a foreign language than when using their native tongue.
In a 2014 paper led by Albert Costa, volunteers were presented with a moral dilemma known as the “trolley problem”: imagine that a runaway trolley is careening toward a group of five people standing on the tracks, unable to move. You are next to a switch that can shift the trolley to a different set of tracks, thereby sparing the five people, but resulting in the death of one who is standing on the side tracks. Do you pull the switch?
Most people agree that they would. But what if the only way to stop the trolley is by pushing a large stranger off a footbridge into its path? People tend to be very reluctant to say they would do this, even though in both scenarios, one person is sacrificed to save five. But Costa and his colleagues found that posing the dilemma in a language that volunteers had learned as a foreign tongue dramatically increased their stated willingness to shove the sacrificial person off the footbridge, from fewer than 20% of respondents working in their native language to about 50% of those using the foreign one. [Continue reading…]
What to do about Liberia’s island colony of abandoned lab chimps?
By Ben Garrod, Anglia Ruskin University
The story of Liberia’s former research chimpanzees is both well-known and contentious. A non-profit blood bank, the New York Blood Centre (NYBC), set up a virus-testing laboratory in the country in 1974, and wild chimpanzees were trapped from their forests and housed within the “Vilab II” facility. They were subjected to medical experiments and were intentionally infected with hepatitis and other pathogens to help develop a range of vaccines.
By 2005, the director of Vilab II, Alfred M Prince, announced that all research had been terminated and that the NYBC had started to make “lifetime care” arrangements for the chimpanzees through an endowment. Over the next ten years, the chimps were “retired” to a series of small islands in a river estuary, receiving food, water and necessary captive care (at a cost of around US$20,000 a month).
Then, in March 2015, the NYBC withdrew its help and financial support and disowned Prince’s commitments. The move left about 85 chimps to fend for themselves. Escape is impossible, as chimpanzees cannot swim well, and many of the animals are thought to have died from a lack of food and water.
Although the Liberian government owns the chimps as a legal technicality, the day-to-day management of the animals and the experiments was carried out by the NYBC, and that technicality in no way absolves it of ultimate responsibility. Yet the NYBC has used it to distance itself from calls to continue funding their care. In a statement last year it said it had had “unproductive discussions” with the Liberian government and that it “never had any obligation for care for the chimps, contractual or otherwise”. It has also said that it can “no longer sustain diverting millions of dollars away from our lifesaving mission”.
Understandably, animal rights groups are vocally opposing the blood bank’s actions.
Nature is being renamed ‘natural capital’ – but is it really the planet that will profit?
By Sian Sullivan, Bath Spa University
The four-yearly World Conservation Congress of the International Union for Conservation of Nature (IUCN) has just taken place in Hawai’i. The congress is the largest global meeting on nature’s conservation. This year a controversial motion was debated regarding the incorporation of the language and mechanisms of “natural capital” into IUCN policy.
But what is “natural capital”? And why use it to refer to “nature”?
Motion 63 on “Natural Capital”, adopted at the congress, proposes the development of a “natural capital charter” as a framework “for the application of natural capital approaches and mechanisms”. In “noting that concepts and language of natural capital are becoming widespread within conservation circles and IUCN”, the motion reflects IUCN’s adoption of “a substantial policy position” on natural capital. Eleven programmed sessions scheduled for the congress included “natural capital” in the title. Many are associated with the recent launch of the global Natural Capital Protocol, which brings together business leaders to create a world where business both enhances and conserves nature.
At least one congress session discussed possible “unforeseen impacts of natural capital on broader issues of equitability, ethics, values, rights and social justice”. This draws on widespread concerns around the metaphor that nature-is-as-capital-is. Critics worry about the emphasis on economic, as opposed to ecological, language and models, and a corresponding marginalisation of non-economic values that elicit care for the natural world.
Sugar industry funded research as early as 1960s to cover up health hazards, report says
The Associated Press reports: The sugar industry began funding research that cast doubt on sugar’s role in heart disease — in part by pointing the finger at fat — as early as the 1960s, according to an analysis of newly uncovered documents.
The analysis published Monday in the journal JAMA Internal Medicine is based on correspondence between a sugar trade group and researchers at Harvard University, and is the latest example showing how food and beverage makers attempt to shape public understanding of nutrition.
In 1964, the group now known as the Sugar Assn. internally discussed a campaign to address “negative attitudes toward sugar” after studies began emerging linking sugar with heart disease, according to documents dug up from public archives. The following year the group approved “Project 226,” which entailed paying Harvard researchers today’s equivalent of $48,900 for an article reviewing the scientific literature, supplying materials they wanted reviewed, and receiving drafts of the article.
The resulting article published in 1967 concluded there was “no doubt” that reducing cholesterol and saturated fat was the only dietary intervention needed to prevent heart disease. The researchers overstated the consistency of the literature on fat and cholesterol while downplaying studies on sugar, according to the analysis. [Continue reading…]
Colliding black holes tell new story of stars
Natalie Wolchover writes: At a talk last month in Santa Barbara, California, addressing some of the world’s leading astrophysicists, Selma de Mink cut to the chase. “How did they form?” she began.
“They,” as everybody knew, were the two massive black holes that, more than 1 billion years ago and in a remote corner of the cosmos, spiraled together and merged, making waves in the fabric of space and time. These “gravitational waves” rippled outward and, on Sept. 14, 2015, swept past Earth, strumming the ultrasensitive detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO). LIGO’s discovery, announced in February, triumphantly vindicated Albert Einstein’s 1916 prediction that gravitational waves exist. By tuning in to these tiny tremors in space-time and revealing for the first time the invisible activity of black holes — objects so dense that not even light can escape their gravitational pull — LIGO promised to open a new window on the universe, akin, some said, to when Galileo first pointed a telescope at the sky.
Already, the new gravitational-wave data has shaken up the field of astrophysics. In response, three dozen experts spent two weeks in August sorting through the implications at the Kavli Institute for Theoretical Physics (KITP) in Santa Barbara. [Continue reading…]
Evidence rebuts Chomsky’s theory of language learning
Paul Ibbotson and Michael Tomasello write: The idea that we have brains hardwired with a mental template for learning grammar — famously espoused by Noam Chomsky of the Massachusetts Institute of Technology — has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages—and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.
The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all — such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.
This conclusion is important because the study of language plays a central role in diverse disciplines — from poetry to artificial intelligence to linguistics itself; misguided methods lead to questionable results. Further, language is used by humans in ways no animal can match; if you understand what language is, you comprehend a little bit more about human nature. [Continue reading…]
Beware the bad big wolf: why you need to put your adjectives in the right order
By Simon Horobin, University of Oxford
Unlikely as it sounds, the topic of adjective use has gone “viral”. The furore centres on the claim, taken from Mark Forsyth’s book The Elements of Eloquence, that adjectives appearing before a noun must appear in the following strict sequence: opinion, size, age, shape, colour, origin, material, purpose, Noun. Even the slightest attempt to disrupt this sequence, according to Forsyth, will result in the speaker sounding like a maniac. To illustrate this point, Forsyth offers the following example: “a lovely little old rectangular green French silver whittling knife”.
But is the “rule” worthy of an internet storm – or is it more of a ripple in a teacup? Well, certainly the example is a rather unlikely sentence, and not simply because whittling knives are not in much demand these days – ignoring the question of whether they can be both green and silver. This is because it is unusual to have a string of attributive adjectives (ones that appear before the noun they describe) like this.
More usually, speakers of English break up the sequence by placing some of the adjectives in predicative position – after the noun. Not all adjectives, however, can be placed in either position. I can refer to “that man who is asleep” but it would sound odd to refer to him as “that asleep man”; we can talk about the “Eastern counties” but not the “counties that are Eastern”. Indeed, our distribution of adjectives both before and after the noun reveals another constraint on adjective use in English – a preference for no more than three before a noun. An “old brown dog” sounds fine, a “little old brown dog” sounds acceptable, but a “mischievous little old brown dog” sounds plain wrong.
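To make the claimed sequence concrete, here is a minimal Python sketch that reorders a scrambled set of Forsyth's own adjectives according to his opinion, size, age, shape, colour, origin, material, purpose list. The small lexicon mapping each adjective to a category is an assumption made for this illustration, not something taken from Forsyth's book.

```python
# Toy illustration of the adjective-ordering rule attributed to Mark Forsyth.
# The ORDER list comes from the article; the CATEGORY lexicon is an assumption
# made for this example only.

ORDER = ["opinion", "size", "age", "shape", "colour",
         "origin", "material", "purpose"]

# Hypothetical lexicon assigning each adjective in Forsyth's example a category.
CATEGORY = {
    "lovely": "opinion",
    "little": "size",
    "old": "age",
    "rectangular": "shape",
    "green": "colour",
    "French": "origin",
    "silver": "material",
    "whittling": "purpose",
}

def order_adjectives(adjectives):
    """Sort attributive adjectives by their position in Forsyth's sequence."""
    return sorted(adjectives, key=lambda adj: ORDER.index(CATEGORY[adj]))

if __name__ == "__main__":
    scrambled = ["green", "old", "whittling", "lovely", "French",
                 "rectangular", "silver", "little"]
    print(" ".join(order_adjectives(scrambled)) + " knife")
    # -> lovely little old rectangular green French silver whittling knife
```

Sorting the scrambled words reproduces Forsyth's example phrase exactly; any other ordering of the same adjectives is what he claims makes the speaker sound like a maniac.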
Torturing animals injures humanity
John P. Gluck writes: Five years ago, the National Institutes of Health all but ended biomedical and behavioral research on chimpanzees, concluding that, as the closest human relative, they deserved “special consideration and respect.”
But chimpanzees were far from the only nonhuman primates used in research then, or now. About 70,000 other primates are still living their lives as research subjects in labs across the United States.
On Wednesday, the N.I.H. will hold a workshop on “continued responsible research” with these animals. This sounds like a positive development. But as someone who spent decades working almost daily with macaque monkeys in primate research laboratories, I know firsthand that “responsible” research is not enough. What we really need to examine is the very moral ground of animal research itself.
Like many researchers, I once believed that intermittent scientific gains justified methods that almost always did harm. As a graduate student in the late 1960s, I came to see that my natural recoil from intentionally harming animals was a hindrance to how I understood scientific progress. I told myself that we were being responsible by providing good nutrition, safe cages, skilled and caring caretakers and veterinarians for the animals — and, crucially, that what we stood to learn outweighed any momentary or prolonged anguish these animals might experience. The potential for a medical breakthrough, the excitement of research and discovering whether my hypotheses were correct — and let’s not leave out smoldering ambition — made my transition to a more “rigorous” stance easier than I could have imagined.
One of my areas of study focused on the effects of early social deprivation on the intellectual abilities of rhesus monkeys. We kept young, intelligent monkeys separated from their families and others of their kind for many months in soundproof cages that remained lit 24 hours a day, then measured how their potential for complex social and intellectual lives unraveled. All the while, I comforted myself with the idea that these monkeys were my research partners, and that by creating developmental disorders in monkeys born in a lab, we could better understand these disorders in humans.
But it was impossible to fully quell my repugnance at all that I continued to witness and to inflict. At the same time, in the classroom, I began to face questions from students who had become increasingly concerned about the predicament of lab animals. [Continue reading…]