Author Archives: Attention to the Unseen

The social practice of self-betrayal in career-driven America

Talbot Brewer writes: I don’t know how careers are seen in other countries, but in the United States we are exhorted to view them as the primary locus of self-realization. The question before you when you are trying to choose a career is to figure out “What Color Is Your Parachute?” (the title of a guide to job searches that has been a perennial best seller for most of my lifetime). The aim, to quote the title of another top-selling guide to career choices, is to “Do What You Are.”

These titles tell us something about what Americans expect to find in a career: themselves, in the unlikely form of a marketable commodity. But why should we expect that the inner self waiting to be born corresponds to some paid job or profession? Are we really all in possession of an inner lawyer, an inner beauty products placement specialist, or an inner advertising executive, just waiting for the right job opening? Mightn’t this script for our biographies serve as easily to promote self-limitation or self-betrayal as to further self-actualization?

We spend a great deal of our youth shaping ourselves into the sort of finished product that potential employers will be willing to pay dearly to use. Beginning at a very early age, schooling practices and parental guidance and approval are adjusted, sometimes only semi-consciously, so as to inculcate the personal capacities and temperament demanded by the corporate world. The effort to sculpt oneself for this destiny takes a more concerted form in high school and college. We choose courses of study, and understand the importance of success in these studies, largely with this end in view.

Even those who rebel against these forces of acculturation are deeply shaped by them. What we call “self-destructive” behavior in high school might perhaps be an understandable result of being dispirited by the career prospects that are recommended to us as sufficient motivation for our studies. As a culture we have a curious double-mindedness about such reactions. It is hard to get through high school in the United States without being asked to read J.D. Salinger’s The Catcher in the Rye — the story of one Holden Caulfield’s angst-ridden flight from high school, fueled by a pervasive sense that the adult world is irredeemably phony. The ideal high school student is supposed to find a soul-mate in Holden and write an insightful paper about his telling cultural observations, submitted on time in twelve-point type with double spacing and proper margins and footnotes, so as to ensure the sort of grade that will keep the student on the express train to the adult world whose irredeemable phoniness he has just skillfully diagnosed. [Continue reading…]

Atheists in America

Emma Green writes: In general, Americans do not like atheists. In studies, they say they feel coldly toward nonbelievers; it’s estimated that more than half of the population say they’d be less likely to vote for a presidential candidate who didn’t believe in God.

This kind of deep-seated suspicion is a long-standing tradition in the U.S. In his new book, Village Atheists, the Washington University in St. Louis professor Leigh Eric Schmidt writes about the country’s early “infidels” — one of many fraught terms nonbelievers have used to describe themselves over the course of their history — and the conflicts they went through. While the history of atheists is often told as a grand tale of battling ideas, Schmidt set out to tell stories of “mundane materiality,” chronicling the lived experiences of atheists and freethinkers in 19th- and 20th-century America.

His findings both confirm and challenge stereotypes around atheists today. While it’s true that the number of nonbelievers in the United States is growing, it’s still small — roughly 3 percent of U.S. adults self-identify as atheists. And while more and more Americans say they’re not part of any particular religion, they’ve historically been in good company: At the end of the 19th century, Schmidt estimated, around a tenth of Americans may have been unaffiliated with any church or religious institution.

As the visibility and number of American atheists have changed over time, the group has gone through its own struggles over identity. Even today, atheists are significantly more likely to be white, male, and highly educated than the rest of the population, a demographic fact perhaps tied to the long legacy of misogyny and marginalization of women within the movement. At times, nonbelievers have advocated on behalf of minority religious rights and defended immigrants. But they’ve also been among the most vocal American nativists, rallying against Mormons, Catholics, and evangelical Protestants alike.

Schmidt and I discussed the history of atheists in the United States, from the suspicion directed toward them to the suspicions they have cast on others. Our conversation has been edited and condensed for clarity. [Continue reading…]

How U.S. history makes people and places disappear

Aileen McGraw writes: When Lauret Edith Savoy first heard the word “colored” at five years old, she saw herself as exactly that — full of veins as blue as the sky. Not long after, she learned another definition, steeped in racism. “Words full of spit showed that I could be hated for being ‘colored,’” she writes. “By the age of eight I wondered if I should hate in return.” Out of this painful history, Savoy has created something rich and productive — a body of work that examines the complex relationships between land, identity, and history.

Today, Savoy, who is of African American, Euro-American, and Native American descent, works as a geologist, a writer, and a professor of environmental studies at Mount Holyoke College. Her writing — described by New York Magazine’s “Vulture” as “John McPhee meets James Baldwin” — straddles science and the humanities.

Her most recent book, Trace: Memory, History, Race, and the American Landscape, explores the tendency of U.S. history to erase or rewrite — both literally and in memory — the stories of marginalized or dispossessed people and places that have been deemed unworthy, unsavory, or shameful. In eight densely researched, ruminative essays, Savoy uses her own family histories to trace moments in American history that have been largely forgotten: for example, the history of segregated Army nurses, like her mother, during World War II, or that of Charles Drew, the African-American physician who developed the first blood bank and was fired for trying to end the federally sanctioned policy of segregating blood. Savoy approaches the “environment” in the broadest sense: “Not just as surroundings; not just as the air, water, and land on which we depend, or that we pollute; not just as global warming — but as sets of circumstances, conditions, and contexts in which we live and die — in which each of us is intimately part.”

Nautilus recently spoke to Savoy over email about this relationship between landscape and identity, the meaning of biodiversity, and the power of the stories we tell. [Continue reading…]

England’s forgotten Muslim history

Jerry Brotton writes: Britain is divided as never before. The country has turned its back on Europe, and its female ruler has her sights set on trade with the East. As much as this sounds like Britain today, it also describes the country in the 16th century, during the golden age of its most famous monarch, Queen Elizabeth I.

One of the more surprising aspects of Elizabethan England is that its foreign and economic policy was driven by a close alliance with the Islamic world, a fact conveniently ignored today by those pushing the populist rhetoric of national sovereignty.

From the moment of her accession to the throne in 1558, Elizabeth began seeking diplomatic, commercial and military ties with Muslim rulers in Iran, Turkey and Morocco — and with good reason. In 1570, when it became clear that Protestant England would not return to the Catholic faith, the pope excommunicated Elizabeth and called for her to be stripped of her crown. Soon, the might of Catholic Spain was arrayed against her, an invasion imminent. English merchants were prohibited from trading with the rich markets of the Spanish Netherlands. Economic and political isolation threatened to destroy the newly Protestant country.

Elizabeth responded by reaching out to the Islamic world. Spain’s only rival was the Ottoman Empire, ruled by Sultan Murad III, which stretched from North Africa through Eastern Europe to the Indian Ocean. The Ottomans had been fighting the Hapsburgs for decades, conquering parts of Hungary. Elizabeth hoped that an alliance with the sultan would provide much needed relief from Spanish military aggression, and enable her merchants to tap into the lucrative markets of the East. For good measure she also reached out to the Ottomans’ rivals, the shah of Persia and the ruler of Morocco. [Continue reading…]

Ethical shifts come with thinking in a different language

Julie Sedivy writes: What defines who we are? Our habits? Our aesthetic tastes? Our memories? If pressed, I would answer that if there is any part of me that sits at my core, that is an essential part of who I am, then surely it must be my moral center, my deep-seated sense of right and wrong.

And yet, like many other people who speak more than one language, I often have the sense that I’m a slightly different person in each of my languages — more assertive in English, more relaxed in French, more sentimental in Czech. Is it possible that, along with these differences, my moral compass also points in somewhat different directions depending on the language I’m using at the time?

Psychologists who study moral judgments have become very interested in this question. Several recent studies have focused on how people think about ethics in a non-native language — as might take place, for example, among a group of delegates at the United Nations using a lingua franca to hash out a resolution. The findings suggest that when people are confronted with moral dilemmas, they do indeed respond differently when considering them in a foreign language than when using their native tongue.

In a 2014 paper led by Albert Costa, volunteers were presented with a moral dilemma known as the “trolley problem”: imagine that a runaway trolley is careening toward a group of five people standing on the tracks, unable to move. You are next to a switch that can shift the trolley to a different set of tracks, thereby sparing the five people, but resulting in the death of one who is standing on the side tracks. Do you pull the switch?

Most people agree that they would. But what if the only way to stop the trolley is by pushing a large stranger off a footbridge into its path? People tend to be very reluctant to say they would do this, even though in both scenarios, one person is sacrificed to save five. But Costa and his colleagues found that posing the dilemma in a language that volunteers had learned as a foreign tongue dramatically increased their stated willingness to shove the sacrificial person off the footbridge, from fewer than 20% of respondents working in their native language to about 50% of those using the foreign one. [Continue reading…]

What to do about Liberia’s island colony of abandoned lab chimps?

By Ben Garrod, Anglia Ruskin University

The story of Liberia’s former research chimpanzees is both well-known and contentious. A non-profit blood bank, the New York Blood Center (NYBC), set up a virus-testing laboratory in the country in 1974, and wild chimpanzees were captured in their forests and housed within the “Vilab II” facility. They were subjected to medical experiments and were intentionally infected with hepatitis and other pathogens to help develop a range of vaccines.

In 2005, the director of Vilab II, Alfred M Prince, announced that all research had been terminated and that the NYBC had started to make “lifetime care” arrangements for the chimpanzees through an endowment. Over the next ten years, the chimps were “retired” to a series of small islands in a river estuary, receiving food, water and necessary captive care (at a cost of around US$20,000 a month).

Then, in March 2015, the NYBC withdrew its help and financial support and disowned Prince’s commitments. The move left about 85 chimps to fend for themselves. Escape is impossible, as chimpanzees cannot swim well, and many are suspected to have died from a lack of food and water.

The Liberian government owns the chimps, but only as a legal technicality: the day-to-day management of the animals and the experiments was carried out by the NYBC, and the technicality in no way absolves it of ultimate responsibility. Yet the NYBC has used it to distance itself from calls to continue funding the chimps’ care. In a statement last year it said it had had “unproductive discussions” with the Liberian government and that it “never had any obligation for care for the chimps, contractual or otherwise”. It has also said that it can “no longer sustain diverting millions of dollars away from our lifesaving mission”.

Understandably, animal rights groups are vocally opposing the blood bank’s actions.

Continue reading

Nature is being renamed ‘natural capital’ – but is it really the planet that will profit?

By Sian Sullivan, Bath Spa University

The four-yearly World Conservation Congress of the International Union for Conservation of Nature (IUCN) has just taken place in Hawai’i. The congress is the largest global meeting on nature’s conservation. This year a controversial motion was debated regarding the incorporation of the language and mechanisms of “natural capital” into IUCN policy.

But what is “natural capital”? And why use it to refer to “nature”?

Motion 63 on “Natural Capital”, adopted at the congress, proposes the development of a “natural capital charter” as a framework “for the application of natural capital approaches and mechanisms”. In “noting that concepts and language of natural capital are becoming widespread within conservation circles and IUCN”, the motion reflects IUCN’s adoption of “a substantial policy position” on natural capital. Eleven programmed sessions scheduled for the congress included “natural capital” in their titles. Many are associated with the recent launch of the global Natural Capital Protocol, which brings together business leaders to create a world where business both enhances and conserves nature.

At least one congress session discussed possible “unforeseen impacts of natural capital on broader issues of equitability, ethics, values, rights and social justice”. This draws on widespread concerns around the metaphor that nature-is-as-capital-is. Critics worry about the emphasis on economic, as opposed to ecological, language and models, and a corresponding marginalisation of non-economic values that elicit care for the natural world.

Continue reading

Sugar industry funded research as early as 1960s to cover up health hazards, report says

The Associated Press reports: The sugar industry began funding research that cast doubt on sugar’s role in heart disease — in part by pointing the finger at fat — as early as the 1960s, according to an analysis of newly uncovered documents.

The analysis published Monday in the journal JAMA Internal Medicine is based on correspondence between a sugar trade group and researchers at Harvard University, and is the latest example showing how food and beverage makers attempt to shape public understanding of nutrition.

In 1964, the group now known as the Sugar Assn. internally discussed a campaign to address “negative attitudes toward sugar” after studies began emerging linking sugar with heart disease, according to documents dug up from public archives. The following year the group approved “Project 226,” which entailed paying Harvard researchers today’s equivalent of $48,900 for an article reviewing the scientific literature, supplying materials they wanted reviewed, and receiving drafts of the article.

The resulting article published in 1967 concluded there was “no doubt” that reducing cholesterol and saturated fat was the only dietary intervention needed to prevent heart disease. The researchers overstated the consistency of the literature on fat and cholesterol while downplaying studies on sugar, according to the analysis. [Continue reading…]
