Category Archives: Culture

UN cites humanity’s immeasurable loss in Syria’s war

AFP reports: Nearly 300 sites of incalculable value for Syria and human history have been destroyed, damaged or looted in almost four years of war, the U.N. said Tuesday, citing “alarming” satellite evidence.

From prehistoric settlements and ancient markets to world-famous mosques and Crusader castles, Syria is home to countless treasures.

However, since the country’s brutal war erupted in 2011, heritage sites have been plundered by all sides – regime loyalists, anti-government rebels, jihadi fighters and even desperate residents.

After a major survey, the United Nations said that detailed analysis of satellite images from several hundred sites had revealed the full scale of the damage. [Continue reading…]

Co-operation

Patrick Bateson writes: I am disturbed by the way we have created a social environment in which so much emphasis is laid on competition – on forging ahead while trampling on others. The ideal of social cooperation has come to be treated as high-sounding flabbiness, while individual selfishness is regarded as the natural and sole basis for a realistic approach to life. The image of the struggle for existence lies at the back of it, seriously distorting the view we have of ourselves and wrecking mutual trust.

The fashionable philosophy of individualism draws its respectability in part from an appeal to biology and specifically to the Darwinian theory of evolution by natural selection. Now, Darwin’s theory remains the most powerful explanation for the way that each plant and animal evolved so that it is exquisitely adapted to its environment. The theory works just as well for behaviour as it does for anatomy. Individual animals differ in the way they behave. Those that behave in a manner that is better suited to the conditions in which they live are more likely to survive. Finally, if their descendants resemble them in terms of behaviour, then in the course of evolution, the better adapted forms of behaviour will replace those that are not so effective in keeping the individual alive.

It is the Darwinian concept of differential survival that has been picked up and used so insistently in political rhetoric. Biology is thought to be all about competition – and that supposedly means constant struggle. This emphasis has had an insidious effect on the public mind and has encouraged the belief in individual selfishness and in confrontation. Competition is now widely seen as the mainspring of human activity, at least in Western countries. Excellence in the universities and in the arts is thought to be driven by the same ruthless process that supposedly works so well on the sports field or in the marketplace, and they all have a lot in common with what supposedly happens in the jungle. The image of selfish genes, competing with each other in the course of evolution, has fused imperceptibly with the notion of selfish individuals competing with each other in the course of their lifetimes. Individuals only thrive by winning. The argument has become so much a part of conventional belief that it is hard at first to see what is wrong with it.

To put it bluntly, thought has been led seriously astray by the rhetoric. [Continue reading…]

How broken sleep can unleash creativity

Karen Emslie writes: It is 4.18am. In the fireplace, where logs burned, there are now orange lumps that will soon be ash. Orion the Hunter is above the hill. Taurus, a sparkling V, is directly overhead, pointing to the Seven Sisters. Sirius, one of Orion’s heel dogs, is pumping red-blue-violet, like a galactic disco ball. As the night moves on, the old dog will set into the hill.

It is 4.18am and I am awake. Such early waking is often viewed as a disorder, a glitch in the body’s natural rhythm – a sign of depression or anxiety. It is true that when I wake at 4am I have a whirring mind. And, even though I am a happy person, if I lie in the dark my thoughts veer towards worry. I have found it better to get up than to lie in bed teetering on the edge of nocturnal lunacy.

If I write in these small hours, black thoughts become clear and colourful. They form themselves into words and sentences, hook one to the next – like elephants walking trunk to tail. My brain works differently at this time of night; I can only write, I cannot edit. I can only add, I cannot take away. I need my day-brain for finesse. I will work for several hours and then go back to bed.

All humans, animals, insects and birds have clocks inside, biological devices controlled by genes, proteins and molecular cascades. These inner clocks are connected to the ceaseless yet varying cycle of light and dark caused by the rotation and tilt of our planet. They drive primal physiological, neural and behavioural systems according to a roughly 24-hour cycle, otherwise known as our circadian rhythm, affecting our moods, desires, appetites, sleep patterns, and sense of the passage of time.

The Romans, Greeks and Incas woke up without iPhone alarms or digital radio clocks. Nature was their timekeeper: the rise of the sun, the dawn chorus, the needs of the field or livestock. Sundials and hourglasses recorded the passage of time until the 14th century when the first mechanical clocks were erected on churches and monasteries. By the 1800s, mechanical timepieces were widely worn on neck chains, wrists or lapels; appointments could be made and meal- or bed-times set.

Societies built around industrialisation and clock-time brought with them urgency and the concept of being ‘on time’ or having ‘wasted time’. Clock-time became increasingly out of synch with natural time, yet light and dark still dictated our working day and social structures.

Then, in the late 19th century, everything changed. [Continue reading…]

Societies in harsh environments more likely to believe in moralizing high gods

EurekAlert!: Just as physical adaptations help populations prosper in inhospitable habitats, belief in moralizing high gods might be similarly advantageous for human cultures in poorer environments. A new study from the National Evolutionary Synthesis Center (NESCent) suggests that societies with less access to food and water are more likely to believe in these types of deities.

“When life is tough or when it’s uncertain, people believe in big gods,” says Russell Gray, a professor at the University of Auckland and a founding director of the Max Planck Institute for the Science of Human History in Jena, Germany. “Prosocial behavior maybe helps people do well in harsh or unpredictable environments.”

Gray and his coauthors found a strong correlation between belief in high gods who enforce a moral code and other societal characteristics. Political complexity (namely, a social hierarchy beyond the local community) and the practice of animal husbandry were both strongly associated with a belief in moralizing gods.

The emergence of religion has long been explained as a result of either cultural or environmental factors, but not both. The new findings imply that complex practices and characteristics thought to be exclusive to humans arise from a medley of ecological, historical, and cultural variables.

“When researchers discuss the forces that shaped human history, there is considerable disagreement as to whether our behavior is primarily determined by culture or by the environment,” says primary author Carlos Botero, a researcher at the Initiative for Biological Complexity at North Carolina State University. “We wanted to throw away all preconceived notions regarding these processes and look at all the potential drivers together to see how different aspects of the human experience may have contributed to the behavioral patterns we see today.” [Continue reading…]

Gossip makes human society possible

Julie Beck writes: While gossiping is a behavior that has long been frowned upon, perhaps no one has frowned quite so intensely as the 16th- and 17th-century British. Back then, gossips, or “scolds,” were sometimes forced to wear a menacing iron cage on their heads, called the “branks” or “scold’s bridle.” These masks purportedly had iron spikes or bits that went in the mouth and prevented the wearer from speaking. (And of course, of course, this ghastly punishment seems to have been mostly for women who were talking too much.)

Today, people who gossip are still not very well-liked, though we tend to resist the urge to cage their heads. Progress. And yet the reflexive distaste people feel for gossip and those who gossip is often nowhere to be found when people find themselves actually faced with a juicy morsel about someone they know. Social topics—personal relationships, likes and dislikes, anecdotes about social activities—made up about two-thirds of all conversations in analyses done by evolutionary psychologist Robin Dunbar. The remaining one-third was devoted to everything else: sports, music, politics, etc.

“Language in freely forming natural conversations is principally used for the exchange of social information,” Dunbar writes. “That such topics are so overwhelmingly important to us suggests that this is a primary function of language.” He even goes so far as to say: “Gossip is what makes human society as we know it possible.”

In recent years, research on the positive effects of gossip has proliferated. Rather than just a means to humiliate people and make them cry in the bathroom, gossip is now being considered by scientists as a way to learn about cultural norms, bond with others, promote cooperation, and even, as one recent study found, allow individuals to gauge their own success and social standing. [Continue reading…]

Slaves of productivity

Quinn Norton writes: We dream now of making Every Moment Count, of achieving flow and never leaving, creating one project that must be better than the last, of working harder and smarter. We multitask, we update, and we conflate status with long hours worked in no paid overtime systems for the nebulous and fantastic status of being Too Important to have Time to Ourselves, time to waste. But this incarnation of the American dream is all about doing, and nothing about doing anything good, or even thinking about what one was doing beyond how to do more of it more efficiently. It was not even the surrenders to hedonism and debauchery or greed our literary dreams have recorded before. It is a surrender to nothing, to a nothingness of lived accounting.

This moment’s goal of productivity, with its all-consuming practice and unattainable horizon, is perfect for our current corporate world. Productivity never asks what it builds, just how much of it can be piled up before we leave or die. It is irrelevant to pleasure. It’s agnostic about the fate of humanity. It’s not even selfish, because production negates the self. Self can only be a denominator, holding up a dividing bar like a caryatid trying to hold up a stone roof.

I am sure this started with the Industrial Revolution, but what has swept through this generation is more recent. This idea of productivity started in the 1980s, with the lionizing of the hardworking greedy. There’s a critique of late capitalism to be had for sure, but what really devastated my generation was the spiritual malaise inherent in Taylorism’s perfectly mechanized human labor. But Taylor had never seen a robot or a computer perfect his methods of being human. By the 1980s, we had. In the age of robots we reinvented the idea of being robots ourselves. We wanted to program our minds and bodies and have them obey clocks and routines. In this age of the human robot, of the materialist mind, being efficient took the pre-eminent spot, beyond goodness or power or wisdom or even cruel greed. [Continue reading…]

Denying problems when we don’t like the political solutions

Phys.org: A new study from Duke University finds that people will evaluate scientific evidence based on whether they view its policy implications as politically desirable. If they don’t, then they tend to deny the problem even exists.

“Logically, the proposed solution to a problem, such as an increase in government regulation or an extension of the free market, should not influence one’s belief in the problem. However, we find it does,” said co-author Troy Campbell, a Ph.D. candidate at Duke’s Fuqua School of Business. “The cure can be more immediately threatening than the problem.”

The study, “Solution Aversion: On the Relation Between Ideology and Motivated Disbelief,” appears in the November issue of the Journal of Personality and Social Psychology.

The researchers conducted three experiments (with samples ranging from 120 to 188 participants) on three different issues—climate change, air pollution that harms lungs, and crime.

“The goal was to test, in a scientifically controlled manner, the question: Does the desirability of a solution affect beliefs in the existence of the associated problem? In other words, does what we call ‘solution aversion’ exist?” Campbell said.

“We found the answer is yes. And we found it occurs in response to some of the most common solutions for popularly discussed problems.”

For climate change, the researchers conducted an experiment to examine why more Republicans than Democrats seem to deny its existence, despite strong scientific evidence that supports it.

One explanation, they found, may have more to do with conservatives’ general opposition to the most popular solution—increasing government regulation—than with any difference in fear of the climate change problem itself, as some have proposed. [Continue reading…]

We are all confident idiots

David Dunning writes: Last March, during the enormous South by Southwest music festival in Austin, Texas, the late-night talk show Jimmy Kimmel Live! sent a camera crew out into the streets to catch hipsters bluffing. “People who go to music festivals pride themselves on knowing who the next acts are,” Kimmel said to his studio audience, “even if they don’t actually know who the new acts are.” So the host had his crew ask festival-goers for their thoughts about bands that don’t exist.

“The big buzz on the street,” said one of Kimmel’s interviewers to a man wearing thick-framed glasses and a whimsical T-shirt, “is Contact Dermatitis. Do you think he has what it takes to really make it to the big time?”

“Absolutely,” came the dazed fan’s reply.

The prank was an installment of Kimmel’s recurring “Lie Witness News” feature, which involves asking pedestrians a variety of questions with false premises. In another episode, Kimmel’s crew asked people on Hollywood Boulevard whether they thought the 2014 film Godzilla was insensitive to survivors of the 1954 giant lizard attack on Tokyo; in a third, they asked whether Bill Clinton gets enough credit for ending the Korean War, and whether his appearance as a judge on America’s Got Talent would damage his legacy. “No,” said one woman to this last question. “It will make him even more popular.”

One can’t help but feel for the people who fall into Kimmel’s trap. Some appear willing to say just about anything on camera to hide their cluelessness about the subject at hand (which, of course, has the opposite effect). Others seem eager to please, not wanting to let the interviewer down by giving the most boringly appropriate response: I don’t know. But for some of these interviewees, the trap may be an even deeper one. The most confident-sounding respondents often seem to think they do have some clue—as if there is some fact, some memory, or some intuition that assures them their answer is reasonable. [Continue reading…]

Cooperation is what makes us human

Kat McGowan writes: Tales about the origins of our species always start off like this: A small band of hunter-gatherers roams the savannah, loving, warring, and struggling for survival under the African sun. They do not start like this: A fat guy falls off a New York City subway platform onto the tracks.

But what happens next is a quintessential story of who we are as human beings.

On Feb. 17, 2013, around 2:30 a.m., Garrett O’Hanlon, a U.S. Air Force Academy cadet third class, was out celebrating his 22nd birthday in New York City. He and his sister were in the subway waiting for a train when a sudden silence came over the platform, followed by a shriek. People pointed down to the tracks.

O’Hanlon turned and saw a man sprawled facedown on the tracks. “The next thing that happened, I was on the tracks, running toward him,” he says. “I honestly didn’t have a thought process.”

O’Hanlon grabbed the unconscious man by the shoulders, lifting his upper body off the tracks, but struggled to move him. He was deadweight. According to the station clock, the train would arrive in less than two minutes. From the platform, O’Hanlon’s sister was screaming at him to save himself.

Suddenly other arms were there: Personal trainer Dennis Codrington Jr. and his friend Matt Foley had also jumped down to help. “We grabbed him, one by the legs, one by the shoulders, one by the chest,” O’Hanlon says. They got the man to the edge of the platform, where a dozen or more people muscled him up and over. More hands seized the rescuers’ arms and shoulders, helping them up to safety as well.

In the aftermath of the rescue, O’Hanlon says he has been surprised that so many people have asked him why he did it. “I get stunned by the question,” he says. In his view, anybody else would’ve done the same thing. “I feel like it’s a normal reaction,” he says. “To me that’s just what people do.”

More precisely, it is something only people do, according to developmental psychologist Michael Tomasello, codirector of the Max Planck Institute for Evolutionary Anthropology.

For decades Tomasello has explored what makes humans distinctive. His conclusion? We cooperate. Many species, from ants to orcas to our primate cousins, cooperate in the wild. But Tomasello has identified a special form of cooperation. In his view, humans alone are capable of shared intentionality—they intuitively grasp what another person is thinking and act toward a common goal, as the subway rescuers did. This supremely human cognitive ability, Tomasello says, launched our species on its extraordinary trajectory. It forged language, tools, and cultures—stepping-stones to our colonization of every corner of the planet. [Continue reading…]

The faster we go, the more time we lose

Mark C. Taylor writes: “Sleeker. Faster. More Intuitive” (The New York Times); “Welcome to a world where speed is everything” (Verizon FiOS); “Speed is God, and time is the devil” (chief of Hitachi’s portable-computer division). In “real” time, life speeds up until time itself seems to disappear—fast is never fast enough, everything has to be done now, instantly. To pause, delay, stop, slow down is to miss an opportunity and to give an edge to a competitor. Speed has become the measure of success—faster chips, faster computers, faster networks, faster connectivity, faster news, faster communications, faster transactions, faster deals, faster delivery, faster product cycles, faster brains, faster kids. Why are we so obsessed with speed, and why can’t we break its spell?

The cult of speed is a modern phenomenon. In “The Futurist Manifesto” in 1909, Filippo Tommaso Marinetti declared, “We say that the splendor of the world has been enriched by a new beauty: the beauty of speed.” The worship of speed reflected and promoted a profound shift in cultural values that occurred with the advent of modernity and modernization. With the emergence of industrial capitalism, the primary values governing life became work, efficiency, utility, productivity, and competition. When Frederick Winslow Taylor took his stopwatch to the factory floor in the early 20th century to increase workers’ efficiency, he began a high-speed culture of surveillance so memorably depicted in Charlie Chaplin’s Modern Times. Then, as now, efficiency was measured by the maximization of rapid production through the programming of human behavior.

With the transition from mechanical to electronic technologies, speed increased significantly. The invention of the telegraph, telephone, and stock ticker liberated communication from the strictures imposed by the physical means of conveyance. Previously, messages could be sent no faster than people, horses, trains, or ships could move. By contrast, immaterial words, sounds, information, and images could be transmitted across great distances at very high speed. During the latter half of the 19th century, railway and shipping companies established transportation networks that became the backbone of national and international information networks. When the trans-Atlantic cable (1858) and transcontinental railroad (1869) were completed, the foundation for the physical infrastructure of today’s digital networks was in place.

Fast-forward 100 years. During the latter half of the 20th century, information, communications, and networking technologies expanded rapidly, and transmission speed increased exponentially. But more than data and information were moving faster. Moore’s Law, according to which the number of transistors on a computer chip doubles roughly every two years, now seems to apply to life itself. Plugged in 24/7/365, we are constantly struggling to keep up but are always falling further behind. The faster we go, the less time we seem to have. As our lives speed up, stress increases, and anxiety trickles down from managers to workers, and parents to children. [Continue reading…]

Richard Dawkins has lost it: ignorant sexism gives atheists a bad name

Adam Lee writes: I became an atheist on my own, but it was Richard Dawkins who strengthened and confirmed my decision. For a long time, I admired his insightful science writing, his fierce polemics, his uncompromising passion for the truth. When something I’d written got a (brief) mention in The God Delusion, it was one of the high points of my life.

So, I’m not saying this is easy, but I have to say it: Richard Dawkins, I’m just not that into you anymore.

The atheist movement – a loosely knit community of conference-goers, advocacy organizations, writers and activists – has been wracked by infighting in recent years over its persistent gender imbalance and its causes. Many female atheists have explained that they don’t get more involved because of the casual sexism endemic to the movement: parts of it see nothing problematic about hosting conferences with all-male speakers or having all-male leadership – and that’s before you get to the vitriolic and dangerous sexual harassment, online and off, that’s designed to intimidate women into silence.

Richard Dawkins has involved himself in some of these controversies, and rarely for the better – as with his infamous “Dear Muslima” letter in 2011, in which he essentially argued that, because women in Muslim countries suffer more from sexist mistreatment, women in the west shouldn’t speak up about sexual harassment or physical intimidation. There was also his sneer at women who advocate anti-sexual harassment policies.

But over the last few months, Dawkins showed signs of détente with his feminist critics – even progress. He signed a joint letter with the writer Ophelia Benson, denouncing and rejecting harassment; he even apologized for the “Dear Muslima” letter. On stage at a conference in Oxford in August, Dawkins claimed to be a feminist and said that everyone else should be, too.

Then another prominent male atheist, Sam Harris, crammed his foot in his mouth and said that atheist activism lacks an “estrogen vibe” and was “to some degree intrinsically male”. And, just like that, the brief Dawkins Spring was over. [Continue reading…]

Ants are cool but teach us nothing

E.O. Wilson writes: For nearly seven decades, starting in boyhood, I’ve studied hundreds of kinds of ants around the world, and this qualifies me, I believe, to offer some advice on ways their lives can be applied to ours. I’ll start with the question I’m most often asked: “What can I do about the ants in my kitchen?” My response comes from the heart: Watch your step, be careful of little lives. Ants especially like honey, tuna and cookie crumbs. So put down bits of those on the floor, and watch as the first scout finds the bait and reports back to her colony by laying an odor trail. Then, as a little column follows her out to the food, you will see social behavior so strange it might be on another planet. Think of kitchen ants not as pests or bugs, but as your personal guest superorganism.

Another question I hear a lot is, “What can we learn of moral value from the ants?” Here again I will answer definitively: nothing. Nothing at all can be learned from ants that our species should even consider imitating. For one thing, all working ants are female. Males are bred and appear in the nest only once a year, and then only briefly. They are pitiful creatures with wings, huge eyes, small brains and genitalia that make up a large portion of their rear body segment. They have only one function in life: to inseminate the virgin queens during the nuptial season. They are built to be robot flying sexual missiles. Upon mating or doing their best to mate, they are programmed to die within hours, usually as victims of predators.

Many kinds of ants eat their dead — and their injured, too. You may have seen ant workers retrieve nestmates that you have mangled or killed underfoot (accidentally, I hope), thinking it battlefield heroism. The purpose, alas, is more sinister. [Continue reading…]

Will misogyny bring down the atheist movement?

Mark Oppenheimer writes: Several women told me that women new to the movement were often warned about the intentions of certain older men, especially [Michael] Shermer [the founder of Skeptic magazine]. Two more women agreed to go on the record, by name, with their Shermer stories… These stories help flesh out a man who, whatever his progressive views on science and reason, is decidedly less evolved when it comes to women.

Yet Shermer remains a leader in freethought — arguably the leader. And in his attitudes, he is hardly an exception. Hitchens, the best-selling author of God Is Not Great, who died in 2011, wrote a notorious Vanity Fair article called “Why Women Aren’t Funny.” Richard Dawkins, another author whose books have brought atheism to the masses, has alienated many women — and men — by belittling accusations of sexism in the movement; he seems to go out of his way to antagonize feminists generally, and just this past July 29 he tweeted, “Date rape is bad. Stranger rape at knifepoint is worse. If you think that’s an endorsement of date rape, go away and learn how to think.” And Penn Jillette, the talking half of the Penn and Teller duo, famously revels in using words like “cunt.”

The reality of sexism in freethought is not limited to a few famous leaders; it has implications throughout the small but quickly growing movement. Thanks to the internet, and to popular authors like Dawkins, Hitchens, and Sam Harris, atheism has greater visibility than at any time since the 18th-century Enlightenment. Yet it is now cannibalizing itself. For the past several years, Twitter, Facebook, Reddit, and online forums have become hostile places for women who identify as feminists or express concern about widely circulated tales of sexism in the movement. Some women say they are now harassed or mocked at conventions, and the online attacks — which include Jew-baiting, threats of anal rape, and other pleasantries — are so vicious that two activists I spoke with have been diagnosed with post-traumatic stress disorder. One of these women has been bedridden for two years.

To those outside the community, freethought would seem an unlikely candidate for this sort of internal strife. Aren’t atheists and agnostics supposed to be liberal, forward-thinking types? But from the beginning, there has been a division in freethought between the humanists, who see atheism as one part of a larger progressive vision for society, and the libertarians, for whom the banishment of God sits comfortably with capitalism, gun rights, and free-speech absolutism. One group sees men like Michael Shermer as freethought’s big problem, while the other sees defending them as crucial to freethought’s mission. [Continue reading…]

ISIS is about to destroy Biblical history in Iraq

Christopher Dickey reports that soon after ISIS took control of Mosul: the minions of the self-appointed caliph of the freshly self-declared Islamic State, Abu Bakr al-Baghdadi, paid a visit to the Mosul Museum. It has been closed for years for restoration, ever since it was looted along with many of Iraq’s other institutions in the wake of the culturally oblivious American-led invasion of 2003. But the Mosul Museum was on the verge of reopening, at last, and the full collection had been stored there.

“These groups of terrorists—their arrival was a brutal shock, with no warning,” Iraqi National Museum Director Qais Hussein Rashid told me when he visited Paris last week with a mission pleading for international help. “We were not able to take preventive measures.”

Indeed, museum curators and staff were no better prepared than any other part of the Iraqi government. They could have learned from al-Baghdadi’s operations in neighboring Syria that a major source of revenue for his insurgency has been the sale of looted antiquities on the black market. As reported in The Guardian, a windfall of intelligence just before Mosul fell revealed that al-Baghdadi had accumulated a $2 billion war chest, in part by selling off ancient artifacts from captured Syrian sites. But the Iraqi officials concerned with antiquities said the Iraqi intelligence officers privy to that information have not shared it with them.

So the risk now — the virtual certainty, in fact — is that irreplaceable history will be annihilated or sold into the netherworld of corrupt and cynical collectors. And it was plain when I met with Rashid and his colleagues that they are desperate to stop it, but have neither the strategy nor the resources to do so. [Continue reading…]

Maya Angelou: American titan who lived as though there were no tomorrow

Following the death of Maya Angelou, Gary Younge writes: By the time she reached 40 she had been a professional dancer, prostitute, madam, lecturer, activist, singer and editor. She had worked with Martin Luther King and Malcolm X, lived in Ghana and Egypt, toured Europe with a dance troupe and settled in pretty much every region of the United States. And then she wrote about it, the whole time crafting a path as a poet, epigrammist and performer. “My life has been long,” she wrote in one of her last books. “And believing that life loves the liver of it, I have dared to try many things, sometimes trembling, but daring still.”

In a subsequent interview I described her as the “Desiderata in human form” and “a professional hopemonger”. She lived as though there were no tomorrow. And now that there really is no tomorrow, for her, we are left to contemplate – for us as well as her – where daring can get you.

But with her passing, America has not just lost a talented Renaissance woman and gifted raconteur. It has lost a connection to its recent past that had helped it make sense of its present. At a time when so many Americans seek to travel ‘color blind’, and free from the baggage of the nation’s racial history, here she stood, tall, straight and true: a black woman from the south intimately connected to the transformative people and politics who helped shape much of America’s racial landscape.

A woman determined to give voice to both frustration and a militancy without being so consumed by either that she could not connect with those who did not instinctively relate to it. A woman who, in her own words, was determined to go through life with “passion, compassion, humor and some style”, and would use all those attributes and more to remind America of where this frustration and militancy was coming from.

She described the 9/11 attacks as a “hate crime”, and said: “Living in a state of terror was new to many white people in America, but black people have been living in a state of terror in this country for more than 400 years.” [Continue reading…]

The cloud of unknowing

Karl Taro Greenfeld writes: I can’t help it. Every few weeks, my wife mentions the latest book her book club is reading, and no matter what it is, whether I’ve read it or not, I offer an opinion of the work, based entirely on … what, exactly? Often, these are books I’ve not even read a review or essay about, yet I freely hold forth on the grandiosity of Cheryl Strayed or the restrained sentimentality of Edwidge Danticat. These data motes are gleaned, apparently, from the ether — or, more realistically, from various social media feeds.

What was Solange Knowles’s elevator attack on Jay-Z about? I didn’t watch the security-camera video on TMZ — it would have taken too long — but I scrolled through enough chatter to know that Solange had scrubbed her Instagram feed of photos of her sister, Beyoncé. How about this season of “Game of Thrones” and that nonconsensual intercourse in the crypt? I don’t watch the show, but I’ve scanned the recaps on Vulture.com, and I am prepared to argue that this was deeply offensive. Is Pope Francis a postmodern pontiff? I’ve never listened to one of his homilies nor watched his recent “60 Minutes” appearance, but I’ve seen plenty of his @Pontifex tweets retweeted, so I’m ready to say his position on inequality and social justice is remarkably progressive.

It’s never been so easy to pretend to know so much without actually knowing anything. We pick topical, relevant bits from Facebook, Twitter or emailed news alerts, and then regurgitate them. Instead of watching “Mad Men” or the Super Bowl or the Oscars or a presidential debate, you can simply scroll through someone else’s live-tweeting of it, or read the recaps the next day. Our cultural canon is becoming determined by whatever gets the most clicks.

In his 1987 book “Cultural Literacy: What Every American Needs to Know,” E. D. Hirsch Jr. listed 5,000 essential concepts and names — 1066, Babbitt, Pickwickian — that educated people should be familiar with. (Or at least that’s what I believe he wrote, not having actually read the book.) Mr. Hirsch’s book, along with its contemporary “The Closing of the American Mind” by Allan Bloom, made the point that cultural literacy — Mr. Bloom’s canon — was the bedrock of our agreed-upon values.

What we all feel now is the constant pressure to know enough, at all times, lest we be revealed as culturally illiterate. So that we can survive an elevator pitch, a business meeting, a visit to the office kitchenette, a cocktail party, so that we can post, tweet, chat, comment, text as if we have seen, read, watched, listened. What matters to us, awash in petabytes of data, is not necessarily having actually consumed this content firsthand but simply knowing that it exists — and having a position on it, being able to engage in the chatter about it. We come perilously close to performing a pastiche of knowledgeability that is really a new model of know-nothingness. [Continue reading…]

The mounting casualties in the war of the Anthropocene

Justin E.H. Smith writes: There is a great die-off under way, one that may justly be compared to the disappearance of dinosaurs at the end of the Cretaceous, or the sudden downfall of so many great mammals at the beginning of the Holocene. But how far can such a comparison really take us in assessing the present moment?

The hard data tell us that what is happening to animals right now is part of the same broad historical process that has swept up humans: We are all being homogenized, subjected to uniform standards, domesticated. A curiosity that might help to drive this home: At present, the total biomass of mammals raised for food vastly exceeds the biomass of all mammalian wildlife on the planet (it also exceeds that of the human species itself). This was certainly not the case 10,000 or so years ago, at the dawn of the age of pastoralism.

It is hard to know where exactly, or even inexactly, to place the boundary between prehistory and history. Indeed, some authors argue that the very idea of prehistory is a sort of artificial buffer zone set up to protect properly human society from the vast expanse of mere nature that preceded us. But if we must set up a boundary, I suggest the moment when human beings began to dominate and control other large mammals for their own, human ends.

We tend to think about history as human history. Yet a suitably wide-focused perspective reveals that nothing in the course of human affairs makes complete sense without some account of animal actors. History has, in fact, been a question of human-animal interaction all along. Cherchez la vache is how the anthropologist E.E. Evans-Pritchard argued that the social life of the cattle-herding Nuer of southern Sudan might best be summed up — “look for the cow” — but one could probably, without much stretching, extend that principle to human society in general. The cattle that now outweigh us are a mirror of our political and economic crisis, just as cattle were once a mirror of the sociocosmic harmony that characterized Nuer life. [Continue reading…]
