Farming invented twice in Middle East, genomes study reveals


Nature reports: Two Middle Eastern populations independently developed farming and then spread the technology to Europe, Africa and Asia, according to the genomes of 44 people who lived thousands of years ago in present-day Armenia, Turkey, Israel, Jordan and Iran.

Posted on 17 June on the bioRxiv preprint server, the research supports archaeological evidence about the multiple origins of farming, and represents the first detailed look at the ancestry of the individuals behind one of the most important periods in human history — the Neolithic revolution.

Some 11,000 years ago, humans living in the ancient Middle East region called the Fertile Crescent shifted from a nomadic existence, based on hunting game and gathering wild plants, to a more sedentary lifestyle that would later give rise to permanent settlements. Over thousands of years, these early farmers domesticated the first crops and transformed sheep, wild boars and other creatures into domestic animals.

Dozens of studies have examined the genetics of the first European farmers, who emigrated from the Middle East beginning some 8,000 years ago, but the hot climes of the Fertile Crescent had made it difficult to obtain ancient DNA from remains found there. Advances in extracting DNA from a tiny ear bone called the petrous allowed a team led by Iosif Lazaridis and David Reich, population geneticists at Harvard Medical School in Boston, Massachusetts, to analyse the genomes of the 44 Middle Eastern individuals, who lived between 14,000 and 3,500 years ago. [Continue reading…]


In the depths of the digital age

Edward Mendelson writes: Every technological revolution coincides with changes in what it means to be a human being, in the kinds of psychological borders that divide the inner life from the world outside. Those changes in sensibility and consciousness never correspond exactly with changes in technology, and many aspects of today’s digital world were already taking shape before the age of the personal computer and the smartphone. But the digital revolution suddenly increased the rate and scale of change in almost everyone’s lives. Elizabeth Eisenstein’s exhilaratingly ambitious historical study The Printing Press as an Agent of Change (1979) may overstate its argument that the press was the initiating cause of the great changes in culture in the early sixteenth century, but her book pointed to the many ways in which new means of communication can amplify slow, preexisting changes into an overwhelming, transforming wave.

In The Changing Nature of Man (1956), the Dutch psychiatrist J.H. van den Berg described four centuries of Western life, from Montaigne to Freud, as a long inward journey. The inner meanings of thought and actions became increasingly significant, while many outward acts became understood as symptoms of inner neuroses rooted in everyone’s distant childhood past; a cigar was no longer merely a cigar. A half-century later, at the start of the digital era in the late twentieth century, these changes reversed direction, and life became increasingly public, open, external, immediate, and exposed.

Virginia Woolf’s serious joke that “on or about December 1910 human character changed” was a hundred years premature. Human character changed on or about December 2010, when everyone, it seemed, started carrying a smartphone. For the first time, practically anyone could be found and intruded upon, not only at some fixed address at home or at work, but everywhere and at all times. Before this, everyone could expect, in the ordinary course of the day, some time at least in which to be left alone, unobserved, unsustained and unburdened by public or familial roles. That era now came to an end.

Many probing and intelligent books have recently helped to make sense of psychological life in the digital age. Some of these analyze the unprecedented levels of surveillance of ordinary citizens, others the unprecedented collective choice of those citizens, especially younger ones, to expose their lives on social media; some explore the moods and emotions performed and observed on social networks, or celebrate the Internet as a vast aesthetic and commercial spectacle, even as a focus of spiritual awe, or decry the sudden expansion and acceleration of bureaucratic control.

The explicit common theme of these books is the newly public world in which practically everyone’s lives are newly accessible and offered for display. The less explicit theme is a newly pervasive, permeable, and transient sense of self, in which much of the experience, feeling, and emotion that used to exist within the confines of the self, in intimate relations, and in tangible unchanging objects — what William James called the “material self” — has migrated to the phone, to the digital “cloud,” and to the shape-shifting judgments of the crowd. [Continue reading…]


Earliest evidence of fire making by prehumans in Europe found

Science News reports: Prehumans living around 800,000 years ago in what’s now southeastern Spain were, literally, trailblazers. They lit small, controlled blazes in a cave, a new study finds.

Discoveries in the cave provide the oldest evidence of fire making in Europe and support proposals that members of the human genus, Homo, regularly ignited fires starting at least 1 million years ago, say paleontologist Michael Walker of the University of Murcia in Spain and his colleagues. Fire making started in Africa (SN: 5/5/12, p. 18) and then moved north to the Middle East (SN: 5/1/04, p. 276) and Europe, the researchers conclude in the June Antiquity.

If the age estimate for the Spain find holds up, the new report adds to a “surprising number” of sites from deep in the Stone Age that retain evidence of small, intentionally lit fires, says archaeologist John Gowlett of the University of Liverpool in England.

Excavations conducted since 2011 at the Spanish cave, Cueva Negra del Estrecho del Río Quípar, have uncovered more than 165 stones and stone artifacts that had been heated, as well as about 2,300 animal-bone fragments displaying signs of heating and charring. Microscopic and chemical analyses indicate that these finds had been heated to between 400° and 600° Celsius, consistent with having been burned in a fire. [Continue reading…]


How philosophy came to disdain the wisdom of oral cultures

Justin E H Smith writes: A poet, somewhere in Siberia, or the Balkans, or West Africa, some time in the past 60,000 years, recites thousands of memorised lines in the course of an evening. The lines are packed with fixed epithets and clichés. The bard is not concerned with originality, but with intonation and delivery: he or she is perfectly attuned to the circumstances of the day, and to the mood and expectations of his or her listeners.

If this were happening 6,000-plus years ago, the poet’s words would in no way have been anchored in visible signs, in text. For the vast majority of the time that human beings have been on Earth, words have had no worldly reality other than the sound made when they are spoken.

As the theorist Walter J Ong pointed out in Orality and Literacy: The Technologizing of the Word (1982), it is difficult, perhaps even impossible, now to imagine how differently language would have been experienced in a culture of ‘primary orality’. There would be nowhere to ‘look up a word’, no authoritative source telling us the shape the word ‘actually’ takes. There would be no way to affirm the word’s existence at all except by speaking it – and this necessary condition of survival is important for understanding the relatively repetitive nature of epic poetry. Say it over and over again, or it will slip away. In the absence of fixed, textual anchors for words, there would be a sharp sense that language is charged with power, almost magic: the idea that words, when spoken, can bring about new states of affairs in the world. They do not so much describe, as invoke.

As a consequence of the development of writing, first in the ancient Near East and soon after in Greece, old habits of thought began to die out, and certain other, previously latent, mental faculties began to express themselves. Words were now anchored and, though spellings could change from one generation to another, or one region to another, there were now physical traces that endured, which could be transmitted, consulted and pointed to in settling questions about the use or authority of spoken language.

Writing rapidly turned customs into laws, agreements into contracts, genealogical lore into history. In each case, what had once been fundamentally temporal and singular was transformed into something eternal (as in, ‘outside of time’) and general. Even the simple act of making everyday lists of common objects – an act impossible in a primary oral culture – was already a triumph of abstraction and systematisation. From here it was just one small step to what we now call ‘philosophy’. [Continue reading…]


Here ‘lies’ Aristotle, archaeologist declares — and this is why we should care

Amy Ellis Nutt writes: For starters, he is the father of Western science and Western philosophy. He invented formal logic and the scientific method and wrote the first books about biology, physics, astronomy and psychology. Freedom and democracy, justice and equality, the importance of a middle class and the dangers of credit — they’re just a sampling of Aristotle’s political and economic principles. And, yes, Christianity, Islam and our Founding Fathers also owe him a lot.

Nearly two and a half millennia after Aristotle’s birth, we now know where his ashes most likely were laid to rest: in the city of his birth, Stagira, on a small, picturesque peninsula in northern Greece.

“We have no [concrete] evidence, but very strong indications reaching almost to certainty,” archaeologist Kostas Sismanidis said through a translator at this week’s World Congress celebrating “Aristotle 2400 Years.” [Continue reading…]


Mecca fast becoming a Las Vegas for pilgrims as world’s largest hotel prepares to open

The Guardian reports: Four helipads will cluster around one of the largest domes in the world, like sideplates awaiting the unveiling of a momentous main course, which will be jacked up 45 storeys into the sky above the deserts of Mecca. It is the crowning feature of the holy city’s crowning glory, the superlative summit of what will be the world’s largest hotel when it opens in 2017.

With 10,000 bedrooms and 70 restaurants, plus five floors for the sole use of the Saudi royal family, the £2.3bn Abraj Kudai is an entire city of five-star luxury, catering to the increasingly high expectations of well-heeled pilgrims from the Gulf.

Modelled on a “traditional desert fortress”, seemingly filtered through the eyes of a Disneyland imagineer with classical pretensions, the steroidal scheme comprises 12 towers teetering on top of a 10-storey podium, which houses a bus station, shopping mall, food courts, conference centre and a lavishly appointed ballroom.

Located in the Manafia district, just over a mile south of the Grand Mosque, the complex is funded by the Saudi Ministry of Finance and designed by the Dar Al-Handasah group, a 7,000-strong global construction conglomerate that turns its hand to everything from designing cities in Kazakhstan to airports in Dubai. For the Abraj Kudai, it has followed the wedding-cake pastiche style of the city’s recent hotel boom: cornice is piled upon cornice, with fluted pink pilasters framing blue-mirrored windows, some arched with a vaguely Ottoman air. The towers seem to be packed so closely together that guests will be able to enjoy views into each other’s rooms.

“The city is turning into Mecca-hattan,” says Irfan Al-Alawi, director of the UK-based Islamic Heritage Research Foundation, which campaigns to try to save what little heritage is left in Saudi Arabia’s holy cities. “Everything has been swept away to make way for the incessant march of luxury hotels, which are destroying the sanctity of the place and pricing normal pilgrims out.”

The Grand Mosque now stands in the shadow of the second-tallest building in the world, the Abraj al-Bait clocktower, home to thousands more luxury hotel rooms, where rates can reach £4,000 a night for suites with the best views of the Kaaba – the black cube at the centre of the mosque around which Muslims must walk. The hotel rises 600m (2,000ft) into the air, projecting a dazzling green laser show by night, on a site where an Ottoman fortress once stood – razed for development, along with the hill on which it sat.

The list of heritage crimes goes on, driven by state-endorsed Wahhabism, the hardline interpretation of Islam that perceives historical sites as encouraging sinful idolatry – which spawned the ideology that is now driving Isis’s reign of destruction in Syria and Iraq. [Continue reading…]

The construction of towering luxury hotels in Mecca seems to conflict with the leveling effect that the annual Hajj is meant to have on pilgrims.

A 2008 Harvard study, which compared the attitudes of 800 successful Hajj lottery applicants from Pakistan with those of an equal number of unsuccessful ones, found:

Hajjis have more positive views about people from other Muslim countries and are more likely to believe that different Pakistani ethnic and Islamic sectarian groups are equal and that they can live in harmony. Despite non-Muslims not being part of the hajj experience, these views also extend to adherents of other religions: Pilgrims are 22 percent more likely to declare that people of different religions are equal and 11 percent more likely to state that different religions can live in harmony by compromising over their disagreements.

Paralleling the findings on tolerance, hajjis report more positive views on women’s abilities, greater concern for their quality of life, and are also more likely to favor educating girls and women participating in the workforce.

Hajjis are also less likely to support the use of violence and show no evidence of any increased hostility toward the West. They are more than twice as likely to declare that the goals of Osama bin Laden are incorrect, more likely to express a preference for peace between Pakistan and India, and more likely to declare that it is incorrect to physically punish someone if they have dishonored the family. Hajjis also become more sensitive to crimes against women.

It thus seems that in many respects, the value of the Hajj has less to do with the quality of accommodation available to pilgrims than with keeping the pilgrimage accessible to all.

“These are the last days of Mecca,” Alawi tells The Guardian. “The pilgrimage is supposed to be a spartan, simple rite of passage, but it has turned into an experience closer to Las Vegas, which most pilgrims simply can’t afford.”


The long history of a short form


Ryan Ruby writes: For a word that literally means definition, the aphorism is a rather indefinite genre. It bears a family resemblance to the fragment, the proverb, the maxim, the hypomnema, the epigram, the mantra, the parable, and the prose poem. Coined sometime between the fifth and third centuries BC as the title for one of the books of the Corpus Hippocraticum, the Aphorismi were originally a compendium of the latest medical knowledge. The penultimate aphorism, “In chronic disease an excessive flux from the bowels is bad,” is more representative of the collection’s contents than the first — “Life is short, art is long” — for which it is best known.

But in those six words lies a clue to the particular space aphorisms were supposed to define. Thanks to a semantic slippage between the Greek word techne and its English translation (via the Latin ars), the saying is often taken to mean that the works of human beings outlast their days. But in its original context, Hippocrates or his editors probably intended something more pragmatic: the craft of medicine takes a long time to learn, and physicians have a short time in which to learn it. Although what aphorisms have in common with the forms listed above is their brevity, what is delimited by the aphorism is not the number of words in which ideas are expressed but the scope of their inquiry. Unlike Hebrew proverbs, in which the beginning of wisdom is the fear of God, the classical aphorism is a secular genre concerned with the short span of time we are allotted on earth. Books of aphorisms are also therapeutic in nature, collections of practical wisdom through which we can rid ourselves of unnecessary suffering and achieve what Hippocrates’ contemporary Socrates called eudaimonia, the good life.

This is certainly what the Stoic philosopher Arrian had in mind when he whittled down the discourses of his master, Epictetus, into a handbook of aphorisms. The Enchiridion is composed of that mixture of propositional assertion and assertive imperative that is now a hallmark of the form. In it, Epictetus, a former slave, outlines the Stoic view that, while “some things are in our control,” most things are ruled by fate. The way to the good life is to bring what is up to us — our attitudes, judgments, and desires — into harmony with what is not up to us: what happens to our bodies, possessions, and reputations. If we accept that what does happen must happen, we will never be disappointed by vain hopes or sudden misfortunes. Our dispositions, not our destinies, are the real source of our unhappiness. [Continue reading…]


Animals are us


Tania Lombrozo writes: Researchers have studied how people think about humans in relation to the natural world, and how the way we reason about humans and other animals changes over the course of development and as a function of education and culture.

The findings from this body of work suggest that by age 5, Western children growing up in urban environments are anomalous in the extent to which they regard humans as central to the biological world. Much of the rest of the world — including 3-year-olds, 5-year-olds in rural environments and adults from indigenous populations in South America — is more inclined to think about humans as one animal species among others, at least when it comes to reasoning about the properties that human and non-human animals are likely to possess.

To illustrate, consider a study by Patricia Herrmann, Sandra Waxman and Douglas Medin published in the Proceedings of the National Academy of Sciences in 2010. In one experiment, 64 urban children, aged 3 or 5, were asked a series of questions that assessed their willingness to generalize an unknown property from one object to another. For instance, they might be told that people “have andro inside,” and would then have to guess whether it’s right or wrong to say that dogs “have andro inside.”

The findings with 5-year-olds replicated classic work in developmental psychology and suggested a strong “anthropocentric” bias: The children were more likely to generalize from humans to non-humans than the other way around, consistent with a privileged place for humans in the biological world. The 3-year-olds, by contrast, showed no signs of this bias: They generalized from humans to non-humans and from non-humans to humans in just the same way. These findings suggest that an anthropocentric perspective isn’t a necessary starting point for human reasoning about the biological world, but rather a perspective we acquire through experience.

So what happens between the ages of 3 and 5 to induce an anthropocentric bias?

Perhaps surprisingly, one influence seems to be anthropomorphism in storybooks. [Continue reading…]


‘Children today are less free than they have ever been’

Jenny Anderson writes: “Something in modern life is undermining mental health,” Jean Twenge, a professor of psychology at San Diego State University, wrote in a recent paper.

Specifically, something is undermining young people’s mental health, especially girls.

In her paper, Twenge looks at four studies covering 7 million people, ranging from teens to adults in the US. Among her findings: high school students in the 2010s were twice as likely as those in the 1980s to see a professional for mental health issues; more teens struggled to remember things in 2010-2012 than in the 1980s; and 73% more reported trouble sleeping than their peers in the 1980s. These so-called “somatic,” or “of-the-body,” symptoms strongly predict depression.

“It indicates a lot of suffering,” Twenge told Quartz.

It’s not just high school students. College students also feel more overwhelmed; student health centers are in higher demand for bad breakups or mediocre grades, issues that previously did not drive college kids to seek professional help. While the number of kids who reported feeling depressed spiked in the 1980s and 1990s, it started to fall after 2008. It has started rising again.

Kids are being diagnosed with attention-deficit hyperactivity disorder (ADHD) at higher rates, and everyone aged 6-18 is seeking more mental health services and more medication.

The trend is not a uniquely American phenomenon: in the UK, the number of teenagers (aged 15-16) with depression nearly doubled between the 1980s and the 2000s, and a recent survey found British 15-year-olds were among the least happy teenagers in the world (only those in Poland and Macedonia were more unhappy).

“We would like to think of history as progress, but if progress is measured in the mental health and happiness of young people, then we have been going backward at least since the early 1950s,” Peter Gray, a psychologist and professor at Boston College, wrote in Psychology Today.

Researchers have a raft of explanations for why kids are so stressed out, from a breakdown in family and community relationships, to the rise of technology and increased academic stakes and competition. Inequality is rising and poverty is debilitating.

Twenge has observed a notable shift away from internal, or intrinsic, goals, which one can control, toward extrinsic ones, which are set by the world and are increasingly unforgiving.

Gray has another theory: kids aren’t learning critical life-coping skills because they never get to play anymore.

“Children today are less free than they have ever been,” he told Quartz. And that lack of freedom has exacted a dramatic toll, he says.

“My hypothesis is that the generational increases in externality, extrinsic goals, anxiety, and depression are all caused largely by the decline, over that same period, in opportunities for free play and the increased time and weight given to schooling,” he wrote. [Continue reading…]


The time has come for a ‘Sexual Spring’ in the Arab world

Kacem El Ghazzali writes: When we say that nowadays to call for sexual freedom in Arab and Muslim societies is more dangerous than the demand to topple monarchies or dictatorial regimes, we are not playing with metaphor or attempting to gain sympathy. We are stating a bitter and painful fact of the reality in which we are living.

In Arab and Muslim milieus, sex is considered a means and not an end, hedged by many prickly restrictions that make it an objectionable matter and synonymous with sin. Its function within marriage is confined to procreation and nothing else, and all sexual activity outside the institution of marriage is banned legally and rejected socially. Innocent children born out of wedlock are socially rejected and considered foundlings.

This situation cannot be said to be characteristic of Arab societies only, but we experience these miseries in far darker and more intense ways than in other countries. This is especially so because of the dominance of machismo, which considers a man’s sexual adventures as heroics worthy of pride, while a woman who dares to give in to her sexual desires is destined to be killed — or at best beaten and expelled from home — because she has brought dishonor upon her family. [Continue reading…]


It’s time to reinstate the forgotten ideal of the commons

Antonia Malchik writes: The ranch my mother was born on was not built solely by her family’s labour. It relied on aquifers deep beneath the surface, on the health of soil on plains and hills beyond their borders, on hundreds – perhaps thousands – of years of care by the Blackfoot tribe whose land it should have remained, on the weather over which they had no control, and on the sun, seeds, and a community who knew in their bones that nobody could do this alone. These things comprised an ecosystem that was vital to their survival, and the same holds true today. These are our shared natural resources, or what was once known as ‘the commons’.

We live on and in the commons, even if we don’t recognise it as such. Every time we take a breath, we’re drawing from the commons. Every time we walk down a road we’re using the commons. Every time we sit in the sunshine or shelter from the rain, listen to birdsong or shut our windows against the stench from a nearby oil refinery, we are engaging with the commons. But we have forgotten the critical role that the commons play in our existence. The commons make life possible. Beyond that, they make private property possible. When the commons become degraded or destroyed, enjoyment and use of private property become untenable. A Montana rancher could own ten thousand acres and still be dependent on the health of the commons. Neither a gated community nor a high-rise penthouse apartment can close a human being off from the wider world that we all rely on. [Continue reading…]


Islam is reshaping Europe


Robert Kaplan writes: Orientalism, through which one culture appropriated and dominated another, is slowly evaporating in a world of cosmopolitan interactions and comparative studies, as [Edward] Said intuited it might. Europe has responded by artificially reconstructing national-cultural identities on the extreme right and left, to counter the threat from the civilization it once dominated.

Although the idea of an end to history — with all its ethnic and territorial disputes — turns out to have been a fantasy, this realization is no excuse for a retreat into nationalism. The cultural purity that Europe craves in the face of the Muslim-refugee influx is simply impossible in a world of increasing human interactions.

“The West,” if it does have a meaning beyond geography, manifests a spirit of ever more inclusive liberalism. Just as in the 19th century there was no going back to feudalism, there is no going back now to nationalism, not without courting disaster. [Continue reading…]


Technology is not ruining our kids. Parents (and their technology) are ruining them

Jenny Anderson writes: Many of us worry about what technology is doing to our kids. A cascade of reports shows that their addiction to iAnything is diminishing empathy, increasing bullying, and robbing them of time to play, and just be. So we parents set timers, lock away devices and drone on about the importance of actual real-live human interaction. And then we check our phones.

Sherry Turkle, a professor in the program in Science, Technology and Society at M.I.T. and the author, most recently, of Reclaiming Conversation: The Power of Talk in a Digital Age, turned the tables by imploring parents to take control and model better behavior.

A 15-year-old boy told her that “someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation.”

Turkle explains the cost of too much technology in stark terms: Our children can’t engage in conversation, or experience solitude, making it very hard for them to be empathetic. “In one experiment, many student subjects opted to give themselves mild electric shocks rather than sit alone with their thoughts,” she noted.

Unfortunately, it seems we parents are the solution. (Newsflash: kids aren’t going to give up their devices because they are worried about how those devices may influence their future ability to empathize.)

That means exercising some self-control. Many of us aren’t exactly paragons of virtue in this arena. [Continue reading…]


Culture without borders: The history of culture is the history of cultural appropriation

Kenan Malik writes: Cultural appropriation is, in the words of Susan Scafidi, professor of law at Fordham University, and author of Who Owns Culture? Appropriation and Authenticity in American Law, “Taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission”. This can include the “unauthorised use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.”

But what is it for knowledge or an object to “belong” to a culture? And who gives permission for someone from another culture to use such knowledge or forms?

The idea that the world could be divided into distinct cultures, and that every culture belonged to a particular people, has its roots in late 18th-century Europe.

The Romantic movement, which developed in part in opposition to the rationalism of the Enlightenment, celebrated cultural differences and insisted on the importance of “authentic” ways of being.

For Johann Gottfried Herder, the German philosopher who best articulated the Romantic notion of culture, what made each people – or “volk” – unique was its particular language, history and modes of living. The unique nature of each volk was expressed through its “volksgeist” – the unchanging spirit of a people refined through history.

Herder was no reactionary – he was an important champion of equality – but his ideas about culture were adopted by reactionary thinkers. Those ideas became central to racial thinking – the notion of the volksgeist was transformed into the concept of racial make-up – and fuelled the belief that non-Western societies were “backward” because of their “backward” cultures.

Radicals challenging racism and colonialism rejected the Romantic view of culture, adopting instead a universalist perspective. From the struggle against slavery to the anti-colonial movements, the aim was not to protect one’s own special culture but to create a more universal culture in which all could participate on equal terms.

In recent decades, however, the universalist viewpoint has eroded, largely as many of the social movements that embodied that viewpoint have disintegrated. The social space vacated by that disintegration became filled by identity politics.

As the broader struggles for social transformation have faded, people have tended to retreat into their particular faiths or cultures, and to embrace more parochial forms of identity. In this process, the old cultural arguments of the racists have returned, but now rebranded as “antiracist”.

But how does creating gated cultures, and preventing others from trespassing upon one’s culture without permission, challenge racism or promote social justice? [Continue reading…]


The strange history of secularism twists debate about British Muslim attitudes

By Humeira Iqtidar, King’s College London

Governments in Britain have tended to treat Muslim citizens much like colonial administrations treated their subjects. Intermediaries – tribal leaders or religious figures – are found to establish communication between the empire and its people. One positive thing about a recent ICM poll of British Muslims is that it offers an alternative. The survey, carried out for a Channel 4 documentary, was never going to be able to reflect the complexity of British Muslim life accurately, but it does signal a shift by engaging directly with Muslim citizens.

How poll data is used is one way to test how colonialism’s legacy might linger on. The Daily Mail chose for its headline the quote: “Muslims are not like us and we should just accept that they will not integrate …” while Sky News highlighted that: “Half of British Muslims want homosexuality banned.”

Few media outlets rushed to use the headline that “86% of Muslims feel strong affiliation with UK, higher than the national average”, although this too is one of the findings from the survey. The prevailing coverage instead adopts an “us and them” framework that fails to spark debate about who “we” might be and why “they”, with all their differences, might need greater integration with us, as the report has suggested.

We don’t have space here to discuss how the category Muslim may be broken up across class, regional or ethnic background. Nor will we get into comparisons with others: whether, for instance, British Catholics, or for that matter, members of the Conservative Party, might have similar sentiments towards homosexuality.

[Read more…]
