Kenan Malik writes: Debates about immigration are… rarely about numbers as such. They are much more about who the migrants are, and about underlying anxieties of nation, community, identity and values. ‘We should not forget’, claimed Hungarian prime minister Viktor Orbán, as Hungary put up new border fences, and introduced draconian new anti-immigration laws, ‘that the people who are coming here grew up in a different religion and represent a completely different culture. Most are not Christian, but Muslim.’ ‘Is it not worrying’, he asked, ‘that Europe’s Christian culture is already barely able to maintain its own set of Christian values?’
Many thinkers, Christian and non-Christian, religious and non-religious, echo this fear of Muslim immigration undermining the cultural and moral foundation of Western civilization. The late Oriana Fallaci, the Italian writer who perhaps more than most promoted the notion of Eurabia – the belief that Europe is being Islamicised – described herself as a ‘Christian atheist’, insisting that only Christianity provided Europe with a cultural and intellectual bulwark against Islam. The British historian Niall Ferguson calls himself ‘an incurable atheist’ and yet is alarmed by the decline of Christianity, which undermines ‘any religious resistance’ to radical Islam. Melanie Phillips, a non-believing Jew, argues in her book The World Turned Upside Down that ‘Christianity is under direct and unremitting cultural assault from those who want to destroy the bedrock values of Western civilization.’
To look upon migration in this fashion is, I want to suggest, a misunderstanding of both Europe’s past and Europe’s present. To understand why, I want first to explore two fundamental questions, the answers to which must frame any discussion on inclusion and morality. What do we mean by a diverse society? And why should we value it, or indeed, fear it?
When we think about diversity today in Europe, the picture we see is that of societies that in the past were homogenous, but have now become plural because of immigration. But in what way were European societies homogenous in the past? And in what ways are they diverse today?
Certainly, if you had asked a Frenchman or an Englishman or a Spaniard in the nineteenth or the fifteenth or the twelfth century, they would not have described their societies as homogenous. And were they to be transported to contemporary Europe, it is likely that they would see it as far less diverse than we do.
Our view of the Europe of the past is distorted by historical amnesia; and our view of the Europe of the present is distorted by a highly restricted notion of diversity. When we talk of European societies as historically homogenous, what we mean is that they used to be ethnically, or perhaps culturally, homogenous. But the world is diverse in many ways. Societies are cut through by differences, not only of ethnicity, but also of class, gender, faith, politics, and much else. [Continue reading…]
Adam Piore writes: It started with one man quietly sipping a Tom Collins in the lounge car of the Cleveland-bound train.
“God bless America,” he sang, “land that I love …”
It didn’t take long. Others joined in. “Stand beside her … and guide her …” Soon the entire train car had taken up the melody, belting out the patriotic song at the top of their lungs.
It was 1940 and such spontaneous outpourings, this one described in a letter to the song’s creator Irving Berlin, were not unusual. That was the year the simple, 32-bar arrangement was somehow absorbed into the fabric of American culture, finding its way into American Legion halls, churches and synagogues, schools, and even a Louisville, Kentucky, insurance office, where the song reportedly sprang to the lips of the entire sales staff one day. The song has reemerged in times of national crisis or pride over and over, to be sung in ballparks, school assemblies, and on the steps of the United States Capitol after 9/11.
Berlin immigrated to the U.S. at age 5. His family fled Russia to escape a wave of murderous pogroms directed at Jews. His mother often murmured “God Bless America” as he was growing up. “And not casually, but with emotion which was almost exaltation,” Berlin later recalled.
“He always talked about it like a love song,” says Sheryl Kaskowitz, the author of God Bless America: The Surprising History of an Iconic Song. “It came from this really genuine love and a sense of gratitude to the U.S.”
It might seem ironic that someone born in a foreign land would compose a song that so powerfully expressed a sense of national belonging—that this song embraced by an entire nation was the expression of love from an outsider for his adopted land. In the U.S., a nation of immigrants built on the prospect of renewal, it’s not the least bit surprising. It is somehow appropriate.
Patriotism is an innate human sentiment. It is part of a deeper subconscious drive toward group formation and allegiance. It operates as much in one nation under God as it does in a football stadium. Group bonding is in our evolutionary history, our nature. According to some recent studies, the factors that make us patriotic are in our very genes.
But this allegiance—this blurring of the lines between individual and group—has a closely related flipside; it’s not always a warm feeling of connection in the Cleveland-bound lounge car. Sometimes our instinct for group identification serves as a powerful wedge to single out those among us who are different. Sometimes what makes us feel connected is not a love of home and country but a common enemy. [Continue reading…]
Nature reports: Two Middle Eastern populations independently developed farming and then spread the technology to Europe, Africa and Asia, according to the genomes of 44 people who lived thousands of years ago in present-day Armenia, Turkey, Israel, Jordan and Iran.
Posted on 17 June on the bioRxiv preprint server, the research supports archaeological evidence about the multiple origins of farming, and represents the first detailed look at the ancestry of the individuals behind one of the most important periods in human history — the Neolithic revolution.
Some 11,000 years ago, humans living in the ancient Middle East region called the Fertile Crescent shifted from a nomadic existence, based on hunting game and gathering wild plants, to a more sedentary lifestyle that would later give rise to permanent settlements. Over thousands of years, these early farmers domesticated the first crops and transformed sheep, wild boars and other creatures into domestic animals.
Dozens of studies have examined the genetics of the first European farmers, who emigrated from the Middle East beginning some 8,000 years ago, but the hot climes of the Fertile Crescent had made it difficult to obtain ancient DNA from remains found there. Advances in extracting DNA from a tiny ear bone called the petrous allowed a team led by Iosif Lazaridis and David Reich, population geneticists at Harvard Medical School in Boston, Massachusetts, to analyse the genomes of the 44 Middle Eastern individuals, who lived between 14,000 and 3,500 years ago. [Continue reading…]
Science News reports: Prehumans living around 800,000 years ago in what’s now southeastern Spain were, literally, trailblazers. They lit small, controlled blazes in a cave, a new study finds.
Discoveries in the cave provide the oldest evidence of fire making in Europe and support proposals that members of the human genus, Homo, regularly ignited fires starting at least 1 million years ago, say paleontologist Michael Walker of the University of Murcia in Spain and his colleagues. Fire making started in Africa (SN: 5/5/12, p. 18) and then moved north to the Middle East (SN: 5/1/04, p. 276) and Europe, the researchers conclude in the June Antiquity.
If the age estimate for the Spain find holds up, the new report adds to a “surprising number” of sites from deep in the Stone Age that retain evidence of small, intentionally lit fires, says archaeologist John Gowlett of the University of Liverpool in England.
Excavations conducted since 2011 at the Spanish cave, Cueva Negra del Estrecho del Río Quípar, have uncovered more than 165 stones and stone artifacts that had been heated, as well as about 2,300 animal-bone fragments displaying signs of heating and charring. Microscopic and chemical analyses indicate that these finds had been heated to between 400° and 600° Celsius, consistent with having been burned in a fire. [Continue reading…]
Justin E H Smith writes: A poet, somewhere in Siberia, or the Balkans, or West Africa, some time in the past 60,000 years, recites thousands of memorised lines in the course of an evening. The lines are packed with fixed epithets and clichés. The bard is not concerned with originality, but with intonation and delivery: he or she is perfectly attuned to the circumstances of the day, and to the mood and expectations of his or her listeners.
If this were happening 6,000-plus years ago, the poet’s words would in no way have been anchored in visible signs, in text. For the vast majority of the time that human beings have been on Earth, words have had no worldly reality other than the sound made when they are spoken.
As the theorist Walter J Ong pointed out in Orality and Literacy: Technologizing the Word (1982), it is difficult, perhaps even impossible, now to imagine how differently language would have been experienced in a culture of ‘primary orality’. There would be nowhere to ‘look up a word’, no authoritative source telling us the shape the word ‘actually’ takes. There would be no way to affirm the word’s existence at all except by speaking it – and this necessary condition of survival is important for understanding the relatively repetitive nature of epic poetry. Say it over and over again, or it will slip away. In the absence of fixed, textual anchors for words, there would be a sharp sense that language is charged with power, almost magic: the idea that words, when spoken, can bring about new states of affairs in the world. They do not so much describe, as invoke.
As a consequence of the development of writing, first in the ancient Near East and soon after in Greece, old habits of thought began to die out, and certain other, previously latent, mental faculties began to express themselves. Words were now anchored and, though spellings could change from one generation to another, or one region to another, there were now physical traces that endured, which could be transmitted, consulted and pointed to in settling questions about the use or authority of spoken language.
Writing rapidly turned customs into laws, agreements into contracts, genealogical lore into history. In each case, what had once been fundamentally temporal and singular was transformed into something eternal (as in, ‘outside of time’) and general. Even the simple act of making everyday lists of common objects – an act impossible in a primary oral culture – was already a triumph of abstraction and systematisation. From here it was just one small step to what we now call ‘philosophy’. [Continue reading…]
Amy Ellis Nutt writes: For starters, he is the father of Western science and Western philosophy. He invented formal logic and the scientific method and wrote the first books about biology, physics, astronomy and psychology. Freedom and democracy, justice and equality, the importance of a middle class and the dangers of credit — they’re just a sampling of Aristotle’s political and economic principles. And, yes, Christianity, Islam and our Founding Fathers also owe him a lot.
Nearly 2-1/2 millennia after Aristotle’s birth, we now know where his ashes most likely were laid to rest: in the city of his birth, Stagira, on a small, picturesque peninsula in northern Greece.
“We have no [concrete] evidence, but very strong indications reaching almost to certainty,” archaeologist Kostas Sismanidis said through a translator at this week’s World Congress celebrating “Aristotle 2400 Years.” [Continue reading…]
The Guardian reports: Four helipads will cluster around one of the largest domes in the world, like sideplates awaiting the unveiling of a momentous main course, which will be jacked up 45 storeys into the sky above the deserts of Mecca. It is the crowning feature of the holy city’s crowning glory, the superlative summit of what will be the world’s largest hotel when it opens in 2017.
With 10,000 bedrooms and 70 restaurants, plus five floors for the sole use of the Saudi royal family, the £2.3bn Abraj Kudai is an entire city of five-star luxury, catering to the increasingly high expectations of well-heeled pilgrims from the Gulf.
Modelled on a “traditional desert fortress”, seemingly filtered through the eyes of a Disneyland imagineer with classical pretensions, the steroidal scheme comprises 12 towers teetering on top of a 10-storey podium, which houses a bus station, shopping mall, food courts, conference centre and a lavishly appointed ballroom.
Located in the Manafia district, just over a mile south of the Grand Mosque, the complex is funded by the Saudi Ministry of Finance and designed by the Dar Al-Handasah group, a 7,000-strong global construction conglomerate that turns its hand to everything from designing cities in Kazakhstan to airports in Dubai. For the Abraj Kudai, it has followed the wedding-cake pastiche style of the city’s recent hotel boom: cornice is piled upon cornice, with fluted pink pilasters framing blue-mirrored windows, some arched with a vaguely Ottoman air. The towers seem to be packed so closely together that guests will be able to enjoy views into each other’s rooms.
“The city is turning into Mecca-hattan,” says Irfan Al-Alawi, director of the UK-based Islamic Heritage Research Foundation, which campaigns to try to save what little heritage is left in Saudi Arabia’s holy cities. “Everything has been swept away to make way for the incessant march of luxury hotels, which are destroying the sanctity of the place and pricing normal pilgrims out.”
The Grand Mosque is now loomed over by the second tallest building in the world, the Abraj al-Bait clocktower, home to thousands more luxury hotel rooms, where rates can reach £4,000 a night for suites with the best views of the Kaaba – the black cube at the centre of the mosque around which Muslims must walk. The hotel rises 600m (2,000ft) into the air, projecting a dazzling green laser-show by night, on a site where an Ottoman fortress once stood – razed for development, along with the hill on which it sat.
The list of heritage crimes goes on, driven by state-endorsed Wahhabism, the hardline interpretation of Islam that perceives historical sites as encouraging sinful idolatry – which spawned the ideology that is now driving Isis’s reign of destruction in Syria and Iraq. [Continue reading…]
The construction of towering luxury hotels in Mecca seems to conflict with the leveling effect traditionally experienced by pilgrims performing the annual Hajj.
A 2008 Harvard study, which compared the attitudes of 800 successful Hajj lottery applicants from Pakistan with those of an equal number of unsuccessful ones, found:
Hajjis have more positive views about people from other Muslim countries and are more likely to believe that different Pakistani ethnic and Islamic sectarian groups are equal and that they can live in harmony. Despite non-Muslims not being part of the hajj experience, these views also extend to adherents of other religions: Pilgrims are 22 percent more likely to declare that people of different religions are equal and 11 percent more likely to state that different religions can live in harmony by compromising over their disagreements.
Paralleling the findings on tolerance, hajjis report more positive views on women’s abilities, greater concern for their quality of life, and are also more likely to favor educating girls and women participating in the workforce.
Hajjis are also less likely to support the use of violence and show no evidence of any increased hostility toward the West. They are more than twice as likely to declare that the goals of Osama bin Laden are incorrect, more likely to express a preference for peace between Pakistan and India, and more likely to declare that it is incorrect to physically punish someone if they have dishonored the family. Hajjis also become more sensitive to crimes against women.
It thus seems that in many respects, the value of Hajj has less to do with the quality of accommodation available to pilgrims than it does with the avenues of access.
“These are the last days of Mecca,” Alawi tells The Guardian. “The pilgrimage is supposed to be a spartan, simple rite of passage, but it has turned into an experience closer to Las Vegas, which most pilgrims simply can’t afford.”
Ryan Ruby writes: For a word that literally means definition, the aphorism is a rather indefinite genre. It bears a family resemblance to the fragment, the proverb, the maxim, the hypomnema, the epigram, the mantra, the parable, and the prose poem. Coined sometime between the fifth and third centuries BC as the title for one of the books of the Corpus Hippocraticum, the Aphorismi were originally a compendium of the latest medical knowledge. The penultimate aphorism, “In chronic disease an excessive flux from the bowels is bad,” is more representative of the collection’s contents than the first — “Life is short, art is long” — for which it is best known.
But in those six words lies a clue to the particular space aphorisms were supposed to define. Thanks to a semantic slippage between the Greek word techne and its English translation (via the Latin ars), the saying is often taken to mean that the works of human beings outlast their days. But in its original context, Hippocrates or his editors probably intended something more pragmatic: the craft of medicine takes a long time to learn, and physicians have a short time in which to learn it. Although what aphorisms have in common with the forms listed above is their brevity, what is delimited by the aphorism is not the number of words in which ideas are expressed but the scope of their inquiry. Unlike Hebrew proverbs, in which the beginning of wisdom is the fear of God, the classical aphorism is a secular genre concerned with the short span of time we are allotted on earth. Books of aphorisms are also therapeutic in nature, collections of practical wisdom through which we can rid ourselves of unnecessary suffering and achieve what Hippocrates’ contemporary Socrates called eudaimonia, the good life.
This is certainly what the Stoic philosopher Arrian had in mind when he whittled down the discourses of his master, Epictetus, into a handbook of aphorisms. The Enchiridion is composed of that mixture of propositional assertion and assertive imperative that is now a hallmark of the form. In it, Epictetus, a former slave, outlines the Stoic view that, while “some things are in our control,” most things are ruled by fate. The way to the good life is to bring what is up to us — our attitudes, judgments, and desires — into harmony with what is not up to us: what happens to our bodies, possessions, and reputations. If we accept that what does happen must happen, we will never be disappointed by vain hopes or sudden misfortunes. Our dispositions, not our destinies, are the real source of our unhappiness. [Continue reading…]
Tania Lombrozo writes: Researchers have studied how people think about humans in relation to the natural world, and how the way we reason about humans and other animals changes over the course of development and as a function of education and culture.
The findings from this body of work suggest that by age 5, Western children growing up in urban environments are anomalous in the extent to which they regard humans as central to the biological world. Much of the rest of the world — including 3-year-olds, 5-year-olds in rural environments and adults from indigenous populations in South America — is more inclined to think about humans as one animal species among others, at least when it comes to reasoning about the properties that human and non-human animals are likely to possess.
To illustrate, consider a study by Patricia Herrmann, Sandra Waxman and Douglas Medin published in the Proceedings of the National Academy of Sciences in 2010. In one experiment, 64 urban children, aged 3 or 5, were asked a series of questions that assessed their willingness to generalize an unknown property from one object to another. For instance, they might be told that people “have andro inside,” and would then have to guess whether it’s right or wrong to say that dogs “have andro inside.”
The findings with 5-year-olds replicated classic work in developmental psychology and suggested a strong “anthropocentric” bias: The children were more likely to generalize from humans to non-humans than the other way around, consistent with a privileged place for humans in the biological world. The 3-year-olds, by contrast, showed no signs of this bias: They generalized from humans to non-humans and from non-humans to humans in just the same way. These findings suggest that an anthropocentric perspective isn’t a necessary starting point for human reasoning about the biological world, but rather a perspective we acquire through experience.
So what happens between the ages of 3 and 5 to induce an anthropocentric bias?
Perhaps surprisingly, one influence seems to be anthropomorphism in storybooks. [Continue reading…]
Specifically, something is undermining young people’s mental health, especially girls.
In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely to see a professional for mental health issues as those in the 1980s; more teens struggled to remember things in 2010-2012 than in the earlier period; and 73% more reported trouble sleeping compared to their peers in the 1980s. These so-called “somatic” or “of-the-body” symptoms strongly predict depression.
“It indicates a lot of suffering,” Twenge told Quartz.
It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. While the number of kids who reported feeling depressed spiked in the 1980s and 1990s, it started to fall after 2008. It has since started rising again.
Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and children aged 6 to 18 are seeking more mental health services, and more medication.
The trend is not a uniquely American phenomenon: In the UK, the number of teenagers (15-16) with depression nearly doubled between the 1980s and the 2000s and a recent survey found British 15-year-olds were among the least happy teenagers in the world (those in Poland and Macedonia were the only ones who were more unhappy).
“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.
Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.
Twenge has observed a notable shift away from internal, or intrinsic, goals, which one can control, toward extrinsic ones, which are set by the world, and which are increasingly unforgiving.
Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.
“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.
“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]
Kacem El Ghazzali writes: When we say that nowadays to call for sexual freedom in Arab and Muslim societies is more dangerous than the demand to topple monarchies or dictatorial regimes, we are not playing with metaphor or attempting to gain sympathy. We are stating a bitter and painful fact of the reality in which we are living.
In Arab and Muslim milieus, sex is considered a means and not an end, hedged by many prickly restrictions that make it an objectionable matter and synonymous with sin. Its function within marriage is confined to procreation and nothing else, and all sexual activity outside the institution of marriage is banned legally and rejected socially. Innocent children born out of wedlock are socially rejected and considered foundlings.
This situation is not characteristic of Arab societies alone, but we experience these miseries in far darker and more intense ways than people in other countries do. This is especially so because of the dominance of machismo, which considers a man’s sexual adventures as heroics worthy of pride, while a woman who dares to give in to her sexual desires is destined to be killed — or at best beaten and expelled from home — because she has brought dishonor upon her family. [Continue reading…]
Antonia Malchik writes: The ranch my mother was born on was not built solely by her family’s labour. It relied on water aquifers deep beneath the surface, the health of soil on plains and hills beyond their borders, on hundreds – perhaps thousands – of years of care by the Blackfoot tribe whose land it should have remained, the weather over which they had no control, the sun, seeds, and a community who knew in their bones that nobody could do this alone. These things comprised an ecosystem that was vital to their survival, and the same holds true today. These are our shared natural resources, or what was once known as ‘the commons’.
We live on and in the commons, even if we don’t recognise it as such. Every time we take a breath, we’re drawing from the commons. Every time we walk down a road we’re using the commons. Every time we sit in the sunshine or shelter from the rain, listen to birdsong or shut our windows against the stench from a nearby oil refinery, we are engaging with the commons. But we have forgotten the critical role that the commons play in our existence. The commons make life possible. Beyond that, they make private property possible. When the commons become degraded or destroyed, enjoyment and use of private property become untenable. A Montana rancher could own ten thousand acres and still be dependent on the health of the commons. Neither a gated community nor high-rise penthouse apartments can close a human being from the wider world that we all rely on. [Continue reading…]
Robert Kaplan writes: Orientalism, through which one culture appropriated and dominated another, is slowly evaporating in a world of cosmopolitan interactions and comparative studies, as [Edward] Said intuited it might. Europe has responded by artificially reconstructing national-cultural identities on the extreme right and left, to counter the threat from the civilization it once dominated.
Although the idea of an end to history — with all its ethnic and territorial disputes — turns out to have been a fantasy, this realization is no excuse for a retreat into nationalism. The cultural purity that Europe craves in the face of the Muslim-refugee influx is simply impossible in a world of increasing human interactions.
“The West,” if it does have a meaning beyond geography, manifests a spirit of ever more inclusive liberalism. Just as in the 19th century there was no going back to feudalism, there is no going back now to nationalism, not without courting disaster. [Continue reading…]
Jenny Anderson writes: Many of us worry what technology is doing to our kids. A cascade of reports show that their addiction to iAnything is diminishing empathy, increasing bullying, robbing them of time to play and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.
Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.
A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”
Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation, or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.
Unfortunately, it seems we parents are the solution. (Newsflash: kids aren’t going to give up their devices because they are worried about how it may influence their future ability to empathize.)
That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]