The idea of political correctness is central to the culture wars of American politics

Scott Barry Kaufman writes: In a recent study, Christine Brophy and Jordan Peterson conducted an illuminating analysis of the personality of political correctness. They created a comprehensive 192-item PC scale measuring PC-related language, beliefs, and emotions, based on their reading of news articles, books, and research papers on political correctness. Their PC battery employed a variety of question types and tapped into the beliefs, language, and emotional sensitivity of politically correct individuals. The list was reviewed and added to by faculty and graduate students, and 332 participants completed the new PC scale, along with questionnaires on personality, IQ, and disgust sensitivity.
What did they find?

The researchers found that PC exists, can be reliably measured, and has two major dimensions. They labeled the first dimension “PC-Egalitarianism” and the second dimension “PC-Authoritarianism”. Interestingly, they found that PC is not a purely left-wing phenomenon, but is better understood as the manifestation of a general offense sensitivity, which is then employed for either liberal or conservative ends.
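
The psychometric step behind a claim like this is standard: given a participants-by-items matrix of responses, exploratory factor analysis can test whether the items reduce to a small number of latent dimensions. The sketch below is a minimal, hypothetical illustration in Python, not Brophy and Peterson's actual code or data; the response matrix is randomly generated and every name and number is a placeholder.

```python
# Minimal sketch only: NOT the study's analysis. Illustrates extracting two
# latent factors from a hypothetical 332-participant x 192-item response matrix.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(332, 192)).astype(float)  # fake 7-point Likert data

fa = FactorAnalysis(n_components=2, rotation="varimax")
scores = fa.fit_transform(responses)   # per-participant scores on the two factors
loadings = fa.components_              # 2 x 192 matrix of item loadings

# In a real analysis, the highest-loading items on each factor would be
# inspected and the factors labeled (e.g. "PC-Egalitarianism" vs.
# "PC-Authoritarianism" in the study described above).
top_factor1 = np.argsort(-np.abs(loadings[0]))[:10]
top_factor2 = np.argsort(-np.abs(loadings[1]))[:10]
print("Highest-loading items, factor 1:", top_factor1)
print("Highest-loading items, factor 2:", top_factor2)
```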

Nevertheless, while both dimensions of political correctness involve offense sensitivity, they found some critical differences. PC-Egalitarians tended to attribute group differences to cultural causes, believed that differences in group power spring from societal injustices, and tended to support policies to prop up historically disadvantaged groups. Therefore, the emotional response of this group to discriminatory language appears to stem from an underlying motivation to achieve diversity through increased equality, with any deviation from equality assumed to be caused by culture. Their beliefs lead them to advocate for more democratic governance.

In contrast, PC-Authoritarians tended to attribute group differences to biological causes, supported censorship of material that offends, and supported policies of harsher punitive justice for transgressors. Therefore, this dimension of PC seems to reflect a more indiscriminate or general sensitivity to offense, and seems to stem from an underlying motivation to achieve security and stability for those in distress. Their beliefs lead them to advocate for more autocratic governance to achieve uniformity. [Continue reading…]

Kerouac’s French-Canadian roots hold the key to his literary identity

Deni Ellis Béchard writes: The real-life backstory of Jack Kerouac’s unpublished novel is classic beat generation. It was December 1952, and tensions were running high as Jack and his friend Neal Cassady — the inspiration for the character of Dean Moriarty in On the Road — drove from San Francisco to Mexico City.

Whereas Neal was looking for adventure and a chance to stock up on weed, Jack was in a difficult period. His first novel, The Town and the City, published under the name John Kerouac in 1950, had met with lukewarm reviews and poor sales. In April 1951, he had written On the Road on a (now famous) 120-foot-long scroll, but hadn’t been able to find a publisher. He was thirty and had been laid off by the railroad after a bout of phlebitis in his leg.

Kerouac decided to convalesce in Mexico City with William S. Burroughs, who would later author Naked Lunch. Three months earlier, Burroughs had performed a William Tell act with his wife, Joan, while they were drunk and accidentally shot her in the head, killing her. Shortly after Kerouac’s arrival, Burroughs skipped bail and fled the country. Neal Cassady went home. Alone, living in a rooftop apartment in Mexico City, Jack wrote a short novel over the course of five days.

The first line reads: Dans l’moi d’Octobre, 1935, (dans la nuit de nos vra vie bardasseuze) y’arriva une machine du West, de Denver, sur le chemin pour New York. Written in the language of Kerouac’s childhood — a French-Canadian patois then commonly spoken in parts of New England — the line has an epic, North American ring. Kerouac would later translate it as “In the month of October, 1935, in the night of our real restless lives, a car came from the West, from Denver, on the road for New York.”

The novel’s title is Sur le chemin — “On the Road.” But it is not the On the Road we all know (which would be translated in France as Sur la route). It was the On the Road of Kerouac’s vernacular — chemin being used in the title to mean both “path” and “road.”

Over the course of his literary career, Kerouac redefined the archetype of the American man, and he has since become so integral to American culture that his identity as an immigrant writer is often forgotten. He was born in 1922 as Jean-Louis Lebris de Kérouac to parents from Quebec. He spoke French at home and grew up in the French-Canadian community of Lowell, Massachusetts. In one of his letters, he wrote, “The English language is a tool lately found . . . so late (I never spoke English before I was six or seven). At 21, I was still somewhat awkward and illiterate sounding in my [English] speech and writings.”

In 1954, Kerouac created a list of everything he had written and included Sur le chemin among his “completed novels” — even though it would remain in his archives for more than six decades before publication was finally arranged this year. Sur le chemin and his other French writings provide a key to unlocking his more famous works, revealing a man just as obsessed with the difficulty of living between two languages as he was with his better-known spiritual quests.

In particular, they help explain the path — le chemin — he took as he developed his influential style, which changed the way many writers throughout the world have thought about prose. To this day, Kerouac remains one of the most translated authors, and one whose work is shared across generations. His unpublished French works shine a light on how the voice and ideas of an iconic American figure emerged from the experiences of French-Canadian immigrants — a group whose language and culture remain largely unknown to mainstream America. [Continue reading…]

Why we have globalization to thank for Thanksgiving

By Farok J. Contractor, Rutgers University

As Americans sit down to their Thanksgiving Day feasts, some may recall the story of the “Pilgrim Fathers” who founded one of the first English settlements in North America in 1620, at what is today the town of Plymouth, Massachusetts.

The history we know is one of English settlers seeking religious freedom in a New World but instead finding “a hideous and desolate wilderness, full of wilde beasts and wilde men.”

What many Americans don’t realize, however, is that the story of those early settlers’ struggle, which culminated in what we remember today as the first Thanksgiving feast, is also a tale of globalization, many centuries before the word was even coined.

Atlantic crossings began a century before the Pilgrims’ passage to the New World aboard the Mayflower. By the 1600s, trans-Atlantic travel had become increasingly common. It was because of globalization that those first settlers were able to survive in an inhospitable and unforgiving land. And the turkey on Thanksgiving tables may not be a bird native to the U.S. at all, but more likely a (re)import from Europe.

Two short stories will help me explain. As a professor of international business at Rutgers University, I have long been fascinated by the history of trade going back millennia, and by how little most Americans know about the backstory of Thanksgiving Day.

[Read more…]

How to bridge the political divide at the holiday dinner table

By Andrew J. Hoffman, University of Michigan

We are a divided nation; that is an understatement. What’s more, we increasingly hear we are living in our own “bubble” or echo chamber that differing views cannot penetrate. To correct the problem, many are calling for people to reach out, to talk and above all, to listen. That is all well and good, but what are we supposed to talk about? We can’t hope to listen without a topic for finding common ground.

In my view, there are (at least) two prominent issues in this election that can serve as a bridge across our political divides. The first is that the political and economic system needs fixing because it favors those with special status or access. The second is that income inequality is reaching an intolerable level.

Might these two topics help mend the unpleasant Thanksgiving or Christmas dinners that many Americans are dreading? Instead of avoiding that unpleasantness, it may be a time to embrace it.

[Read more…]

You should talk about politics this Thanksgiving – here’s why, and how

By Stacy Branham, University of Maryland, Baltimore County

After one of the most divisive presidential elections in American history, many of us may be anxious about dinner-table dialogue with family and friends this Thanksgiving. There is no denying that the way we communicate about politics has fundamentally changed with the proliferation of technology and social media. Twitter bots, fake news and echo chambers are just a few of the highlights from this election season. Much of how we’re conversing online can’t – and shouldn’t – be replicated around the family table. We are getting out of practice at conducting meaningful, respectful conversation.

There’s not a quick fix. We need more empathic communication – the slow, deep (inter)personal discourse that can nurture identity and build and strengthen relationships. Yet contemporary communication platforms can make it harder to build empathy with conversational partners. Even the phrase “conversational partners” seems unfitting in the world of 140-character limits, followers, likes and shares. In many ways, our devices help us talk at (@?) instead of with one another.

Literally meaning “in-feeling,” empathy is a process of internalizing another person’s perspective. Empathy-building is unselfish; you suspend your own sensibilities and try to fully imagine and embrace those of someone else. You can gain empathy by learning about other cultures from different media, by experiencing what others have gone through personally, or by having deep conversations with others.

My research into cross-cultural communications has taught me that empathy is not only the key to feeling connected – “I understand you” – but also the foundation for changing our narratives about one another – “now I see we are not so different.” That’s an important point to remember after such a difficult political experience. Building empathy requires communication, specifically talking to one another. But, not just any talking will suffice – especially not the type of talking promoted by today’s highly popular communication technologies.

[Read more…]

The moment when science went modern

Lorraine Daston writes: The history of science is punctuated by not one, not two, but three modernities: the first, in the seventeenth century, known as “the Scientific Revolution”; the second, circa 1800, often referred to as “the second Scientific Revolution”; and the third, in the first quarter of the twentieth century, when relativity theory and quantum mechanics not only overturned the achievements of Galileo and Newton but also challenged our deepest intuitions about space, time, and causation.

Each of these moments transformed science, both as a body of knowledge and as a social and political force. The first modernity of the seventeenth century displaced the Earth from the center of the cosmos, showered Europeans with new discoveries, from new continents to new planets, created new forms of inquiry such as field observation and the laboratory experiment, added prediction to explanation as an ideal toward which science should strive, and unified the physics of heaven and earth in Newton’s magisterial synthesis that served as the inspiration for the political reformers and revolutionaries of the Enlightenment. The second modernity of the early nineteenth century unified light, heat, electricity, magnetism, and gravitation into the single, fungible currency of energy, put that energy to work by creating the first science-based technologies to become gigantic industries (e.g., the manufacture of dyestuffs from coal tar derivatives), turned science into a salaried profession and allied it with state power in every realm, from combating epidemics to waging wars. The third modernity, of the early twentieth century, toppled the certainties of Newton and Kant, inspired the avant-garde in the arts, and paved the way for what were probably the two most politically consequential inventions of the last hundred years: the mass media and the atomic bomb.

The aftershocks of all three of these earthquakes of modernity are still reverberating today: in heated debates, from Saudi Arabia to Sri Lanka to Senegal, about the significance of the Enlightenment for human rights and intellectual freedom; in the assessment of how science-driven technology and industrialization may have altered the climate of the entire planet; in anxious negotiations about nuclear disarmament and utopian visions of a global polity linked by the worldwide Net. No one denies the world-shaking and world-making significance of any of these three moments of scientific modernity.

Yet from the perspective of the scientists themselves, the experience of modernity coincides with none of these seismic episodes. The most unsettling shift in scientific self-understanding — about what science was and where it was going — began in the middle decades of the nineteenth century, reaching its climax circa 1900. It was around that time that scientists began to wonder uneasily about whether scientific progress was compatible with scientific truth. If advances in knowledge were never-ending, could any scientific theory or empirical result count as real knowledge — true forever and always? Or was science, like the monarchies of Europe’s anciens régimes and the boundaries of its states and principalities, doomed to perpetual revision and revolution? [Continue reading…]

The vulnerability of monolingual Americans in an English-speaking world

Ivan Krastev writes: In our increasingly Anglophone world, Americans have become nakedly transparent to English speakers everywhere, yet the world remains bafflingly and often frighteningly opaque to monolingual Americans. While the world devours America’s movies and follows its politics closely, Americans know precious little about how non-Americans think and live. Americans have never heard of other countries’ movie stars and have only the vaguest notions of what their political conflicts are about.

This gross epistemic asymmetry is a real weakness. When WikiLeaks revealed the secret cables of the American State Department or leaked the emails of the Clinton campaign, it became a global news sensation and a major embarrassment for American diplomacy. Leaking Chinese diplomatic cables or Russian officials’ emails could never become a worldwide human-interest story, simply because only a relative handful of non-Chinese or non-Russians could read them, let alone make sense of them. [Continue reading…]

Although I’m pessimistic about the prospects of the meek inheriting the earth, the bilingual are in a very promising position. And Anglo-Americans should never forget that this is, after all, a country with a Spanish name. As for where I stand personally, I’m with the bilingual camp in spirit even if my own claim to be bilingual is a bit tenuous — an English speaker who understands American English but speaks British English; does that count?

Where did the first farmers live? Looking for answers in DNA

Carl Zimmer writes: Beneath a rocky slope in central Jordan lie the remains of a 10,000-year-old village called Ain Ghazal, whose inhabitants lived in stone houses with timber roof beams, the walls and floors gleaming with white plaster.

Hundreds of people living there worshiped in circular shrines and made haunting, wide-eyed sculptures that stood three feet high. They buried their cherished dead under the floors of their houses, decapitating the bodies in order to decorate the skulls.

But as fascinating as this culture was, something else about Ain Ghazal intrigues archaeologists more: It was one of the first farming villages to have emerged after the dawn of agriculture.

Around the settlement, Ain Ghazal farmers raised barley, wheat, chickpeas and lentils. Other villagers would leave for months at a time to herd sheep and goats in the surrounding hills.

Sites like Ain Ghazal provide a glimpse of one of the most important transitions in human history: the moment that people domesticated plants and animals, settled down, and began to produce the kind of society in which most of us live today.

But for all that sites like Ain Ghazal have taught archaeologists, they are still grappling with enormous questions. Who exactly were the first farmers? How did agriculture, a cornerstone of civilization itself, spread to other parts of the world?

Some answers are now emerging from a surprising source: DNA extracted from skeletons at Ain Ghazal and other early settlements in the Near East. These findings have already challenged long-held ideas about how agriculture and domestication arose. [Continue reading…]

Guns, empires and Indians

David J Silverman writes: It has become commonplace to attribute the European conquest of the Americas to Jared Diamond’s triumvirate of guns, germs and steel. Germs refer to plague, measles, flu, whooping cough and, especially, the smallpox that whipsawed through indigenous populations, sometimes with a mortality rate of 90 per cent. The epidemics left survivors ill-equipped to fend off predatory encroachments, either from indigenous or from European peoples, who seized captives, land and plunder in the wake of these diseases.

Guns and steel, of course, represent Europeans’ technological prowess. Metal swords, pikes, armour and firearms, along with ships, livestock and even wheeled carts, gave European colonists significant military advantages over Native American people wielding bows and arrows, clubs, hatchets and spears. The attractiveness of such goods also meant that Indians desired trade with Europeans, despite the danger the newcomers represented. The lure of trade enabled Europeans to secure beachheads on the East Coast of North America, and make inroads to the interior of the continent. Intertribal competition for European trade also enabled colonists to employ ‘divide and conquer’ strategies against much larger indigenous populations.

Diamond’s explanation has grown immensely popular and influential. It appears to be a simple and sweeping teleology providing order and meaning to the complexity of the European conquest of the Western hemisphere. The guns, germs and steel perspective has helped further understanding of some of the major forces behind globalisation. But it also involves a level of abstraction that risks obscuring the history of individuals and groups whose experiences cannot be so aptly and neatly summarised. [Continue reading…]

Hints of tool use, culture seen in bumble bees

Science magazine reports: For years, cognitive scientist Lars Chittka felt a bit eclipsed by his colleagues at Queen Mary University of London. Their studies of apes, crows, and parrots were constantly revealing how smart these animals were. He worked on bees, and at the time, almost everyone assumed that the insects acted on instinct, not intelligence. “So there was a challenge for me: Could we get our small-brained bees to solve tasks that would impress a bird cognition researcher?” he recalls. Now, it seems he has succeeded at last.

Chittka’s team has shown that bumble bees can not only learn to pull a string to retrieve a reward, but they can also learn this trick from other bees, even though they have no experience with such a task in nature. The study “successfully challenges the notion that ‘big brains’ are necessary” for new skills to spread, says Christian Rutz, an evolutionary ecologist who studies bird cognition at the University of St. Andrews in the United Kingdom.

Many researchers have used string pulling to assess the smarts of animals, particularly birds and apes. So Chittka and his colleagues set up a low clear plastic table barely tall enough to lay three flat artificial blue flowers underneath. Each flower contained a well of sugar water in the center and had a string attached that extended beyond the table’s boundaries. The only way the bumble bee could get the sugar water was to pull the flower out from under the table by tugging on the string. [Continue reading…]

Study suggests human proclivity for violence gets modulated but is not necessarily diminished by culture

A new study (by José Maria Gómez et al.) challenges Steven Pinker’s rosy picture of the state of the world. Science magazine reports: Though group-living primates are relatively violent, the rates vary. Nearly 4.5% of chimpanzee deaths are caused by another chimp, for example, whereas bonobos are responsible for only 0.68% of their compatriots’ deaths. Based on the rates of lethal violence seen in our close relatives, Gómez and his team predicted that 2% of human deaths would be caused by another human.

To see whether that was true, the researchers dove into the scientific literature documenting lethal violence among humans, from prehistory to today. They combined data from archaeological excavations, historical records, modern national statistics, and ethnographies to tally up the number of humans killed by other humans in different time periods and societies. From 50,000 years ago to 10,000 years ago, when humans lived in small groups of hunter-gatherers, the rate of killing was “statistically indistinguishable” from the predicted rate of 2%, based on archaeological evidence, Gómez and his colleagues report today in Nature.
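
Stripped of the archaeological detail, the comparison reported here amounts to a proportion test: given a tally of deaths and the number attributable to other humans, is the observed rate statistically distinguishable from the predicted 2%? The sketch below is a hypothetical illustration with made-up counts, not the study’s actual method or data.

```python
# Hypothetical sketch only: NOT the study's analysis. Compares an observed
# count of conspecific killings against the phylogenetically predicted 2% rate.
from scipy.stats import binomtest

predicted_rate = 0.02   # ~2% of deaths predicted to be caused by other humans
total_deaths = 600      # made-up tally of deaths in a sample of skeletal remains
violent_deaths = 14     # made-up count showing evidence of lethal violence

result = binomtest(violent_deaths, total_deaths, predicted_rate)
print(f"Observed rate: {violent_deaths / total_deaths:.2%}")
print(f"p-value against the 2% prediction: {result.pvalue:.3f}")
# A large p-value means the observed rate is statistically indistinguishable
# from the predicted 2%; a small one means it departs from the prediction.
```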

Later, as human groups consolidated into chiefdoms and states, rates of lethal violence shot up — as high as 12% in medieval Eurasia, for example. But in the contemporary era, when industrialized states exert the rule of law, violence is lower than our evolutionary heritage would predict, hovering around 1.3% when combining statistics from across the world. That means evolution “is not a straitjacket,” Gómez says. Culture modulates our bloodthirsty tendencies.

The study is “innovative and meticulously conducted,” says Douglas Fry, an anthropologist at the University of Alabama, Birmingham. The 2% figure is significantly lower than Harvard University psychologist Steven Pinker’s much publicized estimate that 15% of deaths are due to lethal violence among hunter-gatherers. The lower figure resonates with Fry’s extensive studies of nomadic hunter-gatherers, whom he has observed to be less violent than Pinker’s work suggests. “Along with archaeology and nomadic forager research, this [study] shoots holes in the view that the human past and human nature are shockingly violent,” Fry says. [Continue reading…]

The social practice of self-betrayal in career-driven America

Talbot Brewer writes: I don’t know how careers are seen in other countries, but in the United States we are exhorted to view them as the primary locus of self-realization. The question before you when you are trying to choose a career is to figure out “What Color Is Your Parachute?” (the title of a guide to job searches that has been a perennial best seller for most of my lifetime). The aim, to quote the title of another top-selling guide to career choices, is to “Do What You Are.”

These titles tell us something about what Americans expect to find in a career: themselves, in the unlikely form of a marketable commodity. But why should we expect that the inner self waiting to be born corresponds to some paid job or profession? Are we really all in possession of an inner lawyer, an inner beauty products placement specialist, or an inner advertising executive, just waiting for the right job opening? Mightn’t this script for our biographies serve as easily to promote self-limitation or self-betrayal as to further self-actualization?

We spend a great deal of our youth shaping ourselves into the sort of finished product that potential employers will be willing to pay dearly to use. Beginning at a very early age, schooling practices and parental guidance and approval are adjusted, sometimes only semi-consciously, so as to inculcate the personal capacities and temperament demanded by the corporate world. The effort to sculpt oneself for this destiny takes a more concerted form in high school and college. We choose courses of study, and understand the importance of success in these studies, largely with this end in view.

Even those who rebel against these forces of acculturation are deeply shaped by them. What we call “self-destructive” behavior in high school might perhaps be an understandable result of being dispirited by the career prospects that are recommended to us as sufficient motivation for our studies. As a culture we have a curious double-mindedness about such reactions. It is hard to get through high school in the United States without being asked to read J.D. Salinger’s The Catcher in the Rye — the story of one Holden Caulfield’s angst-ridden flight from high school, fueled by a pervasive sense that the adult world is irredeemably phony. The ideal high school student is supposed to find a soul-mate in Holden and write an insightful paper about his telling cultural insights, submitted on time in twelve-point type with double spacing and proper margins and footnotes, so as to ensure the sort of grade that will keep the student on the express train to the adult world whose irredeemable phoniness he has just skillfully diagnosed. [Continue reading…]
