Katherine W. Phillips writes: The first thing to acknowledge about diversity is that it can be difficult. In the U.S., where the dialogue of inclusion is relatively advanced, even the mention of the word “diversity” can lead to anxiety and conflict. Supreme Court justices disagree on the virtues of diversity and the means for achieving it. Corporations spend billions of dollars to attract and manage diversity both internally and externally, yet they still face discrimination lawsuits, and the leadership ranks of the business world remain predominantly white and male.
It is reasonable to ask what good diversity does us. Diversity of expertise confers benefits that are obvious — you would not think of building a new car without engineers, designers and quality-control experts — but what about social diversity? What good comes from diversity of race, ethnicity, gender and sexual orientation? Research has shown that social diversity in a group can cause discomfort, rougher interactions, a lack of trust, greater perceived interpersonal conflict, lower communication, less cohesion, more concern about disrespect, and other problems. So what is the upside?
The fact is that if you want to build teams or organizations capable of innovating, you need diversity. Diversity enhances creativity. It encourages the search for novel information and perspectives, leading to better decision making and problem solving. Diversity can improve the bottom line of companies and lead to unfettered discoveries and breakthrough innovations. Even simply being exposed to diversity can change the way you think. This is not just wishful thinking: it is the conclusion I draw from decades of research from organizational scientists, psychologists, sociologists, economists and demographers.
By Peter Adamson, Aeon, November 4, 2016
In European antiquity, philosophers largely wrote in Greek. Even after the Roman conquest of the Mediterranean and the demise of paganism, philosophy was strongly associated with Hellenic culture. The leading thinkers of the Roman world, such as Cicero and Seneca, were steeped in Greek literature; Cicero even went to Athens to pay homage to the home of his philosophical heroes. Tellingly, the emperor Marcus Aurelius went so far as to write his Meditations in Greek. Cicero, and later Boethius, did attempt to initiate a philosophical tradition in Latin. But during the early Middle Ages, most of Greek thought was accessible in Latin only partially and indirectly.
Elsewhere, the situation was better. In the eastern part of the Roman Empire, the Greek-speaking Byzantines could continue to read Plato and Aristotle in the original. And philosophers in the Islamic world enjoyed an extraordinary degree of access to the Hellenic intellectual heritage. In 10th-century Baghdad, readers of Arabic had about the same degree of access to Aristotle that readers of English do today.
This was thanks to a well-funded translation movement that unfolded during the Abbasid caliphate, beginning in the second half of the eighth century. Sponsored at the highest levels, even by the caliph and his family, this movement sought to import Greek philosophy and science into Islamic culture. Their empire had the resources to do so, not just financially but also culturally. From late antiquity to the rise of Islam, Greek had survived as a language of intellectual activity among Christians, especially in Syria. So when Muslim aristocrats decided to have Greek science and philosophy translated into Arabic, it was to Christians that they turned. Sometimes, a Greek work might even be translated first into Syriac, and only then into Arabic. It was an immense challenge. Greek is not a semitic language, so they were moving from one language group to another: more like translating Finnish into English than Latin into English. And there was, at first, no established terminology for expressing philosophical ideas in Arabic.
By Thomas Nail, Aeon, December 14, 2016
Today there are more than 1 billion regional and international migrants, and the number continues to rise: within 40 years, it might double due to climate change. While many of these migrants might not cross a regional or international border, people change residences and jobs more often, while commuting longer and farther to work. This increase in human mobility and expulsion affects us all. It should be recognised as a defining feature of our epoch: the 21st century will be the century of the migrant.
In order to manage and control this mobility, the world is becoming ever more bordered. In just the past 20 years, but particularly since the terrorist attacks of 11 September 2001 on the US, hundreds of new borders have emerged around the world: miles of new razor-wire fences and concrete security walls, numerous offshore detention centres, biometric passport databases, and security checkpoints in schools, airports and along various roadways across the world. All attest to the present preoccupation with controlling social motion through borders.
This preoccupation, however, runs through the history of Western civilisation. In fact, civilisation’s very expansion required the continual expulsion of migrant populations. These include the territorial techniques of dispossessing people from their land through miles of new fencing (invented during the Neolithic period); political techniques of stripping people of their right to free movement and inclusion with new walls to keep out foreigners (invented during the Ancient period and put to use in Egypt, Greece and Rome); juridical techniques of criminalisation and cellular confinement (invented during the European Middle Ages); and economic techniques of unemployment and expropriation surveyed by a continuous series of checkpoints (an innovation of the Modern era). The return and mixture of all these historical techniques, thought to have been excised by modern liberalism, now define a growing portion of everyday social life.
Scott Barry Kaufman writes: In a recent study, Christine Brophy and Jordan Peterson conducted an illuminating analysis of the personality of political correctness. They created a comprehensive 192-item PC scale measuring PC-related language, beliefs, and emotions based on their reading of news articles, books, and research papers on political correctness. Their PC battery employed a variety of question types, and tapped into the beliefs, language, and emotional sensitivity of politically correct individuals. The list was reviewed and added to by faculty and graduate students, and 332 participants completed the new PC scale, along with questionnaires on personality, IQ, and disgust sensitivity.
What did they find?
The researchers found that PC exists, can be reliably measured, and has two major dimensions. They labeled the first dimension “PC-Egalitarianism” and the second dimension “PC-Authoritarianism”. Interestingly, they found that PC is not a purely left-wing phenomenon, but is better understood as the manifestation of a general offense sensitivity, which is then employed for either liberal or conservative ends.
Nevertheless, while both dimensions of political correctness involve offense sensitivity, they found some critical differences. PC-Egalitarians tended to attribute group differences to cultural causes, believed that differences in group power spring from societal injustices, and tended to support policies to prop up historically disadvantaged groups. Therefore, the emotional response of this group to discriminating language appears to stem from an underlying motivation to achieve diversity through increased equality, and any deviation from equality is assumed to be caused by culture. Their beliefs lead to advocating for a more democratic governance.
In contrast, PC-Authoritarians tended to attribute group differences to biological causes, supported censorship of material that offends, and supported policies of harsher punitive justice for transgressors. Therefore, this dimension of PC seems to reflect a more indiscriminate or general sensitivity to offense, and seems to stem from an underlying motivation to achieve security and stability for those in distress. Their beliefs lead to advocating for a more autocratic governance to achieve uniformity. [Continue reading…]
Deni Ellis Béchard writes: The real-life backstory of Jack Kerouac’s unpublished novel is classic beat generation. It was December 1952, and tensions were running high as Jack and his friend Neal Cassady — the inspiration for the character of Dean Moriarty in On the Road — drove from San Francisco to Mexico City.
Whereas Neal was looking for adventure and a chance to stock up on weed, Jack was in a difficult period. His first novel, The Town and the City, published under the name John Kerouac in 1950, had met with lukewarm reviews and poor sales. In April 1951, he had written On the Road on a (now famous) 120-foot-long scroll, but hadn’t been able to find a publisher. He was thirty and had been laid off by the railroad after a bout of phlebitis in his leg.
Kerouac decided to convalesce in Mexico City with William S. Burroughs, who would later author Naked Lunch. Three months earlier, Burroughs had performed a William Tell act with his wife, Joan, while they were drunk and accidentally shot her in the head, killing her. Shortly after Kerouac’s arrival, Burroughs skipped bail and fled the country. Neal Cassady went home. Alone, living in a rooftop apartment in Mexico City, Jack wrote a short novel over the course of five days.
The first line reads: Dans l’moi d’Octobre, 1935, (dans la nuit de nos vra vie bardasseuze) y’arriva une machine du West, de Denver, sur le chemin pour New York. Written in the language of Kerouac’s childhood — a French-Canadian patois then commonly spoken in parts of New England — the line has an epic, North American ring. Kerouac would later translate it as “In the month of October, 1935, in the night of our real restless lives, a car came from the West, from Denver, on the road for New York.”
The novel’s title is Sur le chemin — “On the Road.” But it is not the On the Road we all know (which would be translated in France as Sur la route). It was the On the Road of Kerouac’s vernacular — chemin being used in the title to mean both “path” and “road.”
Over the course of his literary career, Kerouac redefined the archetype of the American man, and he has since become so integral to American culture that his identity as an immigrant writer is often forgotten. He was born in 1922 as Jean-Louis Lebris de Kérouac to parents from Quebec. He spoke French at home and grew up in the French-Canadian community of Lowell, Massachusetts. In one of his letters, he wrote, “The English language is a tool lately found . . . so late (I never spoke English before I was six or seven). At 21, I was still somewhat awkward and illiterate sounding in my [English] speech and writings.”
In 1954, Kerouac created a list of everything he had written and included Sur le chemin among his “completed novels” — even though it would remain in his archives for more than six decades before publication was finally arranged this year. Sur le chemin and his other French writings provide a key to unlocking his more famous works, revealing a man just as obsessed with the difficulty of living between two languages as he was with his better-known spiritual quests.
In particular, they help explain the path — le chemin — he took as he developed his influential style, which changed the way many writers throughout the world have thought about prose. To this day, Kerouac remains one of the most translated authors, and one whose work is shared across generations. His unpublished French works shine a light on how the voice and ideas of an iconic American figure emerged from the experiences of French-Canadian immigrants — a group whose language and culture remain largely unknown to mainstream America. [Continue reading…]
As Americans sit down to their Thanksgiving Day feasts, some may recall the story of the “Pilgrim Fathers” who founded one of the first English settlements in North America in 1620, at what is today the town of Plymouth, Massachusetts.
What many Americans don’t realize, however, is that the story of those early settlers’ struggle, which culminated in what we remember today as the first Thanksgiving feast, is also a tale of globalization, many centuries before the word was even coined.
Crossing the Atlantic began a century before the Pilgrims’ passage to the New World aboard the Mayflower. By the 1600s, trans-Atlantic travel had become increasingly common. It was because of globalization that those first settlers were able to survive in an inhospitable and unforgiving land. And the turkey on Thanksgiving tables may not be a bird native to the U.S. but is more likely a (re)import from Europe.
Two short stories will help me explain. As a professor of international business at Rutgers University, I have been fascinated by the history of trade going back millennia, and how most Americans do not know the background story of Thanksgiving Day.
We are a divided nation; that is an understatement. What’s more, we increasingly hear we are living in our own “bubble” or echo chamber that differing views cannot penetrate. To correct the problem, many are calling for people to reach out, to talk and above all, to listen. That is all well and good, but what are we supposed to talk about? We can’t hope to listen without a topic for finding common ground.
In my view, there are (at least) two prominent issues in this election that can serve as a bridge across our political divides. The first is that the political and economic system needs fixing because it favors those with special status or access. The second is that income inequality is reaching an intolerable level.
Might these two topics help mend the unpleasant Thanksgiving or Christmas dinners that many Americans are dreading? Instead of avoiding that unpleasantness, it may be a time to embrace it.
After one of the most divisive presidential elections in American history, many of us may be anxious about dinner-table dialogue with family and friends this Thanksgiving. There is no denying that the way we communicate about politics has fundamentally changed with the proliferation of technology and social media. Twitter bots, fake news and echo chambers are just a few of the highlights from this election season. Much of how we’re conversing online can’t – and shouldn’t – be replicated around the family table. We are getting out of practice at conducting meaningful, respectful conversation.
There’s no quick fix. We need more empathic communication – the slow, deep (inter)personal discourse that can nurture identity and build and strengthen relationships. Yet contemporary communication platforms can make it harder to build empathy with conversational partners. Even the phrase “conversational partners” seems unfitting in the world of 140-character limits, followers, likes and shares. In many ways, our devices help us talk at (@?) instead of with one another.
Literally meaning “in-feeling,” empathy is a process of internalizing another person’s perspective. Empathy-building is unselfish; you suspend your own sensibilities and try to fully imagine and embrace those of someone else. You can gain empathy by learning about other cultures from different media, by experiencing what others have gone through personally, or by having deep conversations with others.
My research into cross-cultural communications has taught me that empathy is not only the key to feeling connected – “I understand you” – but also the foundation for changing our narratives about one another – “now I see we are not so different.” That’s an important point to remember after such a difficult political experience. Building empathy requires communication, specifically talking to one another. But, not just any talking will suffice – especially not the type of talking promoted by today’s highly popular communication technologies.
Lorraine Daston writes: The history of science is punctuated by not one, not two, but three modernities: the first, in the seventeenth century, known as “the Scientific Revolution”; the second, circa 1800, often referred to as “the second Scientific Revolution”; and the third, in the first quarter of the twentieth century, when relativity theory and quantum mechanics not only overturned the achievements of Galileo and Newton but also challenged our deepest intuitions about space, time, and causation.
Each of these moments transformed science, both as a body of knowledge and as a social and political force. The first modernity of the seventeenth century displaced the Earth from the center of the cosmos, showered Europeans with new discoveries, from new continents to new planets, created new forms of inquiry such as field observation and the laboratory experiment, added prediction to explanation as an ideal toward which science should strive, and unified the physics of heaven and earth in Newton’s magisterial synthesis that served as the inspiration for the political reformers and revolutionaries of the Enlightenment. The second modernity of the early nineteenth century unified light, heat, electricity, magnetism, and gravitation into the single, fungible currency of energy, put that energy to work by creating the first science-based technologies to become gigantic industries (e.g., the manufacture of dyestuffs from coal tar derivatives), turned science into a salaried profession and allied it with state power in every realm, from combating epidemics to waging wars. The third modernity, of the early twentieth century, toppled the certainties of Newton and Kant, inspired the avant-garde in the arts, and paved the way for what were probably the two most politically consequential inventions of the last hundred years: the mass media and the atomic bomb.
The aftershocks of all three of these earthquakes of modernity are still reverberating today: in heated debates, from Saudi Arabia to Sri Lanka to Senegal, about the significance of the Enlightenment for human rights and intellectual freedom; in the assessment of how science-driven technology and industrialization may have altered the climate of the entire planet; in anxious negotiations about nuclear disarmament and utopian visions of a global polity linked by the worldwide Net. No one denies the world-shaking and world-making significance of any of these three moments of scientific modernity.
Yet from the perspective of the scientists themselves, the experience of modernity coincides with none of these seismic episodes. The most unsettling shift in scientific self-understanding — about what science was and where it was going — began in the middle decades of the nineteenth century, reaching its climax circa 1900. It was around that time that scientists began to wonder uneasily about whether scientific progress was compatible with scientific truth. If advances in knowledge were never-ending, could any scientific theory or empirical result count as real knowledge — true forever and always? Or was science, like the monarchies of Europe’s anciens régimes and the boundaries of its states and principalities, doomed to perpetual revision and revolution? [Continue reading…]
Ivan Krastev writes: In our increasingly Anglophone world, Americans have become nakedly transparent to English speakers everywhere, yet the world remains bafflingly and often frighteningly opaque to monolingual Americans. While the world devours America’s movies and follows its politics closely, Americans know precious little about how non-Americans think and live. Americans have never heard of other countries’ movie stars and have only the vaguest notions of what their political conflicts are about.
This gross epistemic asymmetry is a real weakness. When WikiLeaks revealed the secret cables of the American State Department or leaked the emails of the Clinton campaign, it became a global news sensation and a major embarrassment for American diplomacy. Leaking Chinese diplomatic cables or Russian officials’ emails could never become a worldwide human-interest story, simply because only a relative handful of non-Chinese or non-Russians could read them, let alone make sense of them. [Continue reading…]
Although I’m pessimistic about the prospects of the meek inheriting the earth, the bilingual are in a very promising position. And Anglo-Americans should never forget that this is, after all, a country with a Spanish name. As for where I stand personally, I’m with the bilingual camp in spirit even if my own claim to be bilingual is a bit tenuous — an English-speaker who understands American English but speaks British English; does that count?
Carl Zimmer writes: Beneath a rocky slope in central Jordan lie the remains of a 10,000-year-old village called Ain Ghazal, whose inhabitants lived in stone houses with timber roof beams, the walls and floors gleaming with white plaster.
Hundreds of people living there worshiped in circular shrines and made haunting, wide-eyed sculptures that stood three feet high. They buried their cherished dead under the floors of their houses, decapitating the bodies in order to decorate the skulls.
But as fascinating as this culture was, something else about Ain Ghazal intrigues archaeologists more: It was one of the first farming villages to have emerged after the dawn of agriculture.
Around the settlement, Ain Ghazal farmers raised barley, wheat, chickpeas and lentils. Other villagers would leave for months at a time to herd sheep and goats in the surrounding hills.
Sites like Ain Ghazal provide a glimpse of one of the most important transitions in human history: the moment that people domesticated plants and animals, settled down, and began to produce the kind of society in which most of us live today.
But for all that sites like Ain Ghazal have taught archaeologists, they are still grappling with enormous questions. Who exactly were the first farmers? How did agriculture, a cornerstone of civilization itself, spread to other parts of the world?
Some answers are now emerging from a surprising source: DNA extracted from skeletons at Ain Ghazal and other early settlements in the Near East. These findings have already challenged long-held ideas about how agriculture and domestication arose. [Continue reading…]
David J Silverman writes: It has become commonplace to attribute the European conquest of the Americas to Jared Diamond’s triumvirate of guns, germs and steel. Germs refer to plague, measles, flu, whooping cough and, especially, the smallpox that whipsawed through indigenous populations, sometimes with a mortality rate of 90 per cent. The epidemics left survivors ill-equipped to fend off predatory encroachments, either from indigenous or from European peoples, who seized captives, land and plunder in the wake of these diseases.
Guns and steel, of course, represent Europeans’ technological prowess. Metal swords, pikes, armour and firearms, along with ships, livestock and even wheeled carts, gave European colonists significant military advantages over Native American people wielding bows and arrows, clubs, hatchets and spears. The attractiveness of such goods also meant that Indians desired trade with Europeans, despite the danger the newcomers represented. The lure of trade enabled Europeans to secure beachheads on the East Coast of North America, and make inroads to the interior of the continent. Intertribal competition for European trade also enabled colonists to employ ‘divide and conquer’ strategies against much larger indigenous populations.
Diamond’s explanation has grown immensely popular and influential. It appears to be a simple and sweeping teleology providing order and meaning to the complexity of the European conquest of the Western hemisphere. The guns, germs and steel perspective has helped further understanding of some of the major forces behind globalisation. But it also involves a level of abstraction that risks obscuring the history of individuals and groups whose experiences cannot be so aptly and neatly summarised. [Continue reading…]