Jason Lynch writes: Thanksgiving is a day when more than 100 million Americans will observe the most honored of traditions: gathering with family and friends to watch as many as 15 hours straight of TV.
More than any other major American holiday, Thanksgiving has become a TV-centric day, one on which people seem to spend far more time in front of the television than at the dinner table. And the broadcast networks are taking advantage of that rapt audience through marquee programs that last year attracted more than 114 million viewers.
The TV turkey day festivities kick off at 9am with the Macy’s Thanksgiving Day Parade on NBC, which averaged 22.4 million viewers last year, its largest audience since 2001. NBC Research estimates that 43.2 million people watched at least a portion of the parade. An additional 7.5 million CBS viewers watched that network’s unofficial coverage of the New York City event, billed as The Thanksgiving Day Parade on CBS. The parade concludes at 12pm and segues into NBC’s coverage of The National Dog Show, which drew 9.2 million viewers in 2012. NBC Research estimates that 19.3 million viewers took in at least part of the Dog Show. [Continue reading...]
“Before They Pass Away,” by British photographer Jimmy Nelson, is described by an Amazon reviewer as “an essential item on everyone’s coffee table.”
It’s ironically fitting that this description comes from a “place” whose name — at least in the U.S. — now more frequently refers to the online mega-store than to the South American region. An indication, perhaps, that we care more about what we buy than what we breathe.
Leaving aside the question of whether anything on a coffee table can be called essential, the fact that a record of vanishing peoples would be trivialized by being ascribed this value says a lot about why they are vanishing.
Are we to superficially mourn the loss of cultures yet simultaneously be glad that something was preserved in the form of exquisite photographs? Content, perhaps, that before their demise we were able to snatch images of their exotic dress and thereby from the comfort of a couch somehow enhance our own appreciation of a world gradually being lost?
One could view cultural loss as a representation of cultural failure — that those under threat are those who proved least capable of adaptation. Or, one can see the failure as ours — that this represents yet another frontier in the destructive impact of those who have claimed global cultural domination and in so doing are busy destroying the atmosphere, the biosphere, and the ethnosphere.
Automation takes many forms, and as members of a culture that reveres technology, we generally perceive automation in terms of its output: what it accomplishes, be it in manufacturing, financial transactions, flying aircraft, and so forth.
But automation doesn’t merely accomplish things for human beings; it simultaneously changes us by externalizing intelligence. The intelligence required by a person is transferred to a machine with its embedded commands, allowing the person to turn his intelligence elsewhere — or nowhere.
Automation is invariably sold on the twin claims that it offers greater efficiency and that it frees people from tedious tasks so that — at least in theory — they can give their attention to something more fulfilling.
There’s no disputing the efficiency argument — there could never have been such a thing as mass production without automation — but the promise of freedom has always been oversold. Automation has resulted in the creation of many of the most tedious, soul-destroying forms of labor in human history.
Automated systems are, however, never perfect, and when they break, they reveal the corrupting effect they have had on human intelligence — intelligence whose skilful application has atrophied through lack of use.
Nicholas Carr writes: On the evening of February 12, 2009, a Continental Connection commuter flight made its way through blustery weather between Newark, New Jersey, and Buffalo, New York. As is typical of commercial flights today, the pilots didn’t have all that much to do during the hour-long trip. The captain, Marvin Renslow, manned the controls briefly during takeoff, guiding the Bombardier Q400 turboprop into the air, then switched on the autopilot and let the software do the flying. He and his co-pilot, Rebecca Shaw, chatted — about their families, their careers, the personalities of air-traffic controllers — as the plane cruised uneventfully along its northwesterly route at 16,000 feet. The Q400 was well into its approach to the Buffalo airport, its landing gear down, its wing flaps out, when the pilot’s control yoke began to shudder noisily, a signal that the plane was losing lift and risked going into an aerodynamic stall. The autopilot disconnected, and the captain took over the controls. He reacted quickly, but he did precisely the wrong thing: he jerked back on the yoke, lifting the plane’s nose and reducing its airspeed, instead of pushing the yoke forward to gain velocity. Rather than preventing a stall, Renslow’s action caused one. The plane spun out of control, then plummeted. “We’re down,” the captain said, just before the Q400 slammed into a house in a Buffalo suburb.
The crash, which killed all 49 people on board as well as one person on the ground, should never have happened. A National Transportation Safety Board investigation concluded that the cause of the accident was pilot error. The captain’s response to the stall warning, the investigators reported, “should have been automatic, but his improper flight control inputs were inconsistent with his training” and instead revealed “startle and confusion.” An executive from the company that operated the flight, the regional carrier Colgan Air, admitted that the pilots seemed to lack “situational awareness” as the emergency unfolded.
The Buffalo crash was not an isolated incident. An eerily similar disaster, with far more casualties, occurred a few months later. On the night of May 31, an Air France Airbus A330 took off from Rio de Janeiro, bound for Paris. The jumbo jet ran into a storm over the Atlantic about three hours after takeoff. Its air-speed sensors, coated with ice, began giving faulty readings, causing the autopilot to disengage. Bewildered, the pilot flying the plane, Pierre-Cédric Bonin, yanked back on the stick. The plane rose and a stall warning sounded, but he continued to pull back heedlessly. As the plane climbed sharply, it lost velocity. The airspeed sensors began working again, providing the crew with accurate numbers. Yet Bonin continued to slow the plane. The jet stalled and began to fall. If he had simply let go of the control, the A330 would likely have righted itself. But he didn’t. The plane dropped 35,000 feet in three minutes before hitting the ocean. All 228 passengers and crew members died.
The first automatic pilot, dubbed a “metal airman” in a 1930 Popular Science article, consisted of two gyroscopes, one mounted horizontally, the other vertically, that were connected to a plane’s controls and powered by a wind-driven generator behind the propeller. The horizontal gyroscope kept the wings level, while the vertical one did the steering. Modern autopilot systems bear little resemblance to that rudimentary device. Controlled by onboard computers running immensely complex software, they gather information from electronic sensors and continuously adjust a plane’s attitude, speed, and bearings. Pilots today work inside what they call “glass cockpits.” The old analog dials and gauges are mostly gone. They’ve been replaced by banks of digital displays. Automation has become so sophisticated that on a typical passenger flight, a human pilot holds the controls for a grand total of just three minutes. What pilots spend a lot of time doing is monitoring screens and keying in data. They’ve become, it’s not much of an exaggeration to say, computer operators.
And that, many aviation and automation experts have concluded, is a problem. Overuse of automation erodes pilots’ expertise and dulls their reflexes, leading to what Jan Noyes, an ergonomics expert at Britain’s University of Bristol, terms “a de-skilling of the crew.” No one doubts that autopilot has contributed to improvements in flight safety over the years. It reduces pilot fatigue and provides advance warnings of problems, and it can keep a plane airborne should the crew become disabled. But the steady overall decline in plane crashes masks the recent arrival of “a spectacularly new type of accident,” says Raja Parasuraman, a psychology professor at George Mason University and a leading authority on automation. When an autopilot system fails, too many pilots, thrust abruptly into what has become a rare role, make mistakes. Rory Kay, a veteran United captain who has served as the top safety official of the Air Line Pilots Association, put the problem bluntly in a 2011 interview with the Associated Press: “We’re forgetting how to fly.” The Federal Aviation Administration has become so concerned that in January it issued a “safety alert” to airlines, urging them to get their pilots to do more manual flying. An overreliance on automation, the agency warned, could put planes and passengers at risk.
The experience of airlines should give us pause. It reveals that automation, for all its benefits, can take a toll on the performance and talents of those who rely on it. The implications go well beyond safety. Because automation alters how we act, how we learn, and what we know, it has an ethical dimension. The choices we make, or fail to make, about which tasks we hand off to machines shape our lives and the place we make for ourselves in the world. That has always been true, but in recent years, as the locus of labor-saving technology has shifted from machinery to software, automation has become ever more pervasive, even as its workings have become more hidden from us. Seeking convenience, speed, and efficiency, we rush to off-load work to computers without reflecting on what we might be sacrificing as a result. [Continue reading...]
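The wing-leveling trick at the heart of that first “metal airman” can be caricatured in a few lines of code. What follows is a toy sketch only, assuming an invented gain and a one-line “aircraft model”; it illustrates the feedback principle (sense the roll, deflect against it), not how any real autopilot is implemented:

```python
def aileron_command(roll_deg: float, gain: float = 0.5) -> float:
    """Proportional correction: deflect the ailerons opposite to the sensed roll."""
    return -gain * roll_deg

# Simulate an aircraft disturbed into a 10-degree right bank.
roll = 10.0
for step in range(8):
    roll += aileron_command(roll)  # crude model: deflection directly reduces roll
    print(f"step {step}: roll = {roll:.2f} degrees")
```

Each pass through the loop halves the bank angle. The point is simply that the machine, not the pilot, does the continuous correcting; everything beyond that in a modern system is layered complexity.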
Now if we think of automation as a form of forgetfulness, we will see that it extends much more deeply into civilization than just its modern manifestations through mechanization and digitization.
In the beginning was the Word and later came the Fall: the point at which language — the primary tool for shaping, expressing and sharing human intelligence — was cut adrift from the human mind and given autonomy in the form of writing.
Through the written word, thought can be immortalized and made universal. No other mechanism could ever have had such a dramatic effect on the exchange of ideas. Without writing, there would have been no such thing as humanity. But we also incurred a loss, and because we have so little awareness of this loss, we might find it hard to imagine that preliterate people possessed forms of intelligence we now lack.
Plato described what writing would do — and by extension, what would happen to pilots.
In Phaedrus, he describes an exchange between the god Thamus, king and ruler of all Egypt, and the god Theuth, who has invented writing. Theuth, who is very proud of what he has created, says: “This invention, O king, will make the Egyptians wiser and will improve their memories; for it is an elixir of memory and wisdom that I have discovered.” But Thamus points out that while one man has the ability to invent, the ability to judge an invention’s usefulness or harmfulness belongs to another.
If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance, for by telling them of many things without teaching them you will make them seem to know much, while for the most part they know nothing, and as men filled, not with wisdom, but with the conceit of wisdom, they will be a burden to their fellows.
Bedazzled by our ingenuity and its creations, we are fast forgetting the value of this quality that can never be implanted in a machine (or a text): wisdom.
Even the word itself is beginning to sound arcane — as though it should be reserved for philosophers and storytellers and is no longer something we should all strive to possess.
Robert Twigger writes: I travelled with Bedouin in the Western Desert of Egypt. When we got a puncture, they used tape and an old inner tube to suck air from three tyres to inflate a fourth. It was the cook who suggested the idea; maybe he was used to making food designed for a few go further. Far from expressing shame at having no pump, they told me that carrying too many tools is the sign of a weak man; it makes him lazy. The real master has no tools at all, only a limitless capacity to improvise with what is to hand. The more fields of knowledge you cover, the greater your resources for improvisation.
We hear the descriptive words psychopath and sociopath all the time, but here’s a new one: monopath. It means a person with a narrow mind, a one-track brain, a bore, a super-specialist, an expert with no other interests — in other words, the role-model of choice in the Western world. You think I jest? In June, I was invited on the Today programme on BBC Radio 4 to say a few words on the river Nile, because I had a new book about it. The producer called me ‘Dr Twigger’ several times. I was flattered, but I also felt a sense of panic. I have never sought or held a PhD. After the third ‘Dr’, I gently put the producer right. And of course, it was fine — he didn’t especially want me to be a doctor. The culture did. My Nile book was necessarily the work of a generalist. But the radio needs credible guests. It needs an expert — otherwise why would anyone listen?
The monopathic model derives some of its credibility from its success in business. In the late 18th century, Adam Smith (himself an early polymath who wrote not only on economics but also philosophy, astronomy, literature and law) noted that the division of labour was the engine of capitalism. His famous example was the way in which pin-making could be broken down into its component parts, greatly increasing the overall efficiency of the production process. But Smith also observed that ‘mental mutilation’ followed the too-strict division of labour. Or as Alexis de Tocqueville wrote: ‘Nothing tends to materialise man, and to deprive his work of the faintest trace of mind, more than extreme division of labour.’ [Continue reading...]
Henry Grabar writes: The host collects phones at the door of the dinner party. At a law firm, partners maintain a no-device policy at meetings. Each day, a fleet of vans assembles outside New York’s high schools, offering, for a small price, to store students’ contraband during the day. In situations where politeness and concentration are expected, backlash is mounting against our smartphones.
In public, of course, it’s a free country. It’s hard to think of a place beyond the sublime darkness of the movie theater where phone use is shunned, let alone regulated. (Even the cinematic exception is up for debate.) At restaurants, phones occupy that choice tablecloth real estate once reserved for a pack of cigarettes. In truly public space — on sidewalks, in parks, on buses and on trains — we move face down, our phones cradled like amulets.
No observer can fail to notice how deeply this development has changed urban life. A deft user can digitally enhance her experience of the city. She can study a map; discover an out-of-the-way restaurant; identify the trees that line the block and the architect who designed the building at the corner. She can photograph that building, share it with friends, and in doing so contribute her observations to a digital community. On her way to the bus (knowing just when it will arrive) she can report the existence of a pothole and check a local news blog.
It would be unfair to say this person isn’t engaged in the city; on the contrary, she may be more finely attuned to neighborhood history and happenings than her companions. But her awareness is secondhand: She misses the quirks and cues of the sidewalk ballet, fails to make eye contact, and limits her perception to a claustrophobic one-fifth of normal. Engrossed in the virtual, she really isn’t here with the rest of us.
Consider the case of a recent murder on a San Francisco train. On Sept. 23, in a crowded car, a man pulls a pistol from his jacket. In Vivian Ho’s words: “He raises the gun, pointing it across the aisle, before tucking it back against his side. He draws it out several more times, once using the hand holding the gun to wipe his nose. Dozens of passengers stand and sit just feet away — but none reacts. Their eyes, focused on smartphones and tablets, don’t lift until the gunman fires a bullet into the back of a San Francisco State student getting off the train.” [Continue reading...]
Luke Massey writes: Slavoj Žižek is brimming with thought. Each idea sprays out of the controversial Slovenian philosopher and cultural theorist in a jet of words. He is like a water balloon, perforated in so many areas that its content gushes out in all directions.
The result is that, as an interviewer, trying to give direction to the tide is a joyfully hopeless enterprise. Perhaps more significantly, the same seems to be true for Žižek himself.
We meet in a room with one glass wall – an apt setting for a discussion of freedom, ideology, surveillance and ‘80s dystopias on film. Picturehouse HQ is playing host to our discussion, on the launch of Žižek’s new film The Pervert’s Guide to Ideology.
Before I even ask my first question, Slavoj is off: he tells me that I’m better than some interviewers he’s met. The fact that I’ve barely spoken yet doesn’t seem a barrier to that. [Continue reading...]
Here’s a story in English:
A sheep that had no wool saw horses, one of them pulling a heavy wagon, one carrying a big load, and one carrying a man quickly. The sheep said to the horses: “My heart pains me, seeing a man driving horses.” The horses said: “Listen, sheep, our hearts pain us when we see this: a man, the master, makes the wool of the sheep into a warm garment for himself. And the sheep has no wool.” Having heard this, the sheep fled into the plain.
And here is “a very educated approximation” of how that story might have sounded if spoken in Proto-Indo-European about 6,500 years ago:
(Read more at Archeology.)
Ethan Watters writes: In the summer of 1995, a young graduate student in anthropology at UCLA named Joe Henrich traveled to Peru to carry out some fieldwork among the Machiguenga, an indigenous people who live north of Machu Picchu in the Amazon basin. The Machiguenga had traditionally been horticulturalists who lived in single-family, thatch-roofed houses in small hamlets composed of clusters of extended families. For sustenance, they relied on local game and produce from small-scale farming. They shared with their kin but rarely traded with outside groups.
While the setting was fairly typical for an anthropologist, Henrich’s research was not. Rather than practice traditional ethnography, he decided to run a behavioral experiment that had been developed by economists. Henrich used a “game”—along the lines of the famous prisoner’s dilemma—to see whether isolated cultures shared with the West the same basic instinct for fairness. In doing so, Henrich expected to confirm one of the foundational assumptions underlying such experiments, and indeed underpinning the entire fields of economics and psychology: that humans all share the same cognitive machinery—the same evolved rational and psychological hardwiring.
The test that Henrich introduced to the Machiguenga was called the ultimatum game. The rules are simple: in each game there are two players who remain anonymous to each other. The first player is given an amount of money, say $100, and told that he has to offer some of the cash, in an amount of his choosing, to the other subject. The second player can accept or refuse the split. But there’s a hitch: players know that if the recipient refuses the offer, both leave empty-handed. North Americans, who are the most common subjects for such experiments, usually offer a 50-50 split when on the giving end. When on the receiving end, they show an eagerness to punish the other player for uneven splits at their own expense. In short, Americans show the tendency to be equitable with strangers—and to punish those who are not.
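The rules are compact enough to capture in a few lines of code. Here is a minimal sketch of the game as just described; the offers and acceptance thresholds below are invented for illustration and are not Henrich’s protocol or data:

```python
POT = 100  # the stake handed to the first player (the proposer)

def play(offer: int, min_acceptable: int) -> tuple[int, int]:
    """One anonymous, one-shot round. The proposer offers `offer` out of
    POT; the responder accepts anything at or above `min_acceptable`,
    otherwise both players leave empty-handed.
    Returns (proposer_payoff, responder_payoff)."""
    if offer < min_acceptable:
        return (0, 0)               # refusal: costly punishment
    return (POT - offer, offer)     # the split stands

# A stylized North American responder, who punishes lowball offers:
print(play(offer=20, min_acceptable=30))  # (0, 0)

# A stylized Machiguenga responder, who accepts any free money:
print(play(offer=20, min_acceptable=1))   # (80, 20)
```

The two calls mirror the finding: a responder bent on punishing unfairness walks away with nothing, while one who treats any positive offer as free money always comes out ahead.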
Among the Machiguenga, word quickly spread of the young, square-jawed visitor from America giving away money. The stakes Henrich used in the game with the Machiguenga were not insubstantial—roughly equivalent to the few days’ wages they sometimes earned from episodic work with logging or oil companies. So Henrich had no problem finding volunteers. What he had great difficulty with, however, was explaining the rules, as the game struck the Machiguenga as deeply odd.
When he began to run the game it became immediately clear that Machiguengan behavior was dramatically different from that of the average North American. To begin with, the offers from the first player were much lower. In addition, when on the receiving end of the game, the Machiguenga rarely refused even the lowest possible amount. “It just seemed ridiculous to the Machiguenga that you would reject an offer of free money,” says Henrich. “They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game.”
The potential implications of the unexpected results were quickly apparent to Henrich. He knew that a vast amount of scholarly literature in the social sciences—particularly in economics and psychology—relied on the ultimatum game and similar experiments. At the heart of most of that research was the implicit assumption that the results revealed evolved psychological traits common to all humans, never mind that the test subjects were nearly always from the industrialized West. Henrich realized that if the Machiguenga results stood up, and if similar differences could be measured across other populations, this assumption of universality would have to be challenged.
Henrich had thought he would be adding a small branch to an established tree of knowledge. It turned out he was sawing at the very trunk. He began to wonder: What other certainties about “human nature” in social science research would need to be reconsidered when tested across diverse populations? [Continue reading...]
Laura Helmuth writes: The fundamental structure of human populations has changed exactly twice in evolutionary history. The second time was in the past 150 years, when the average lifespan doubled in most parts of the world. The first time was in the Paleolithic, probably around 30,000 years ago. That’s when old people were basically invented.
Throughout hominid history, it was exceedingly rare for individuals to live more than 30 years. Paleoanthropologists can examine teeth to estimate how old a hominid was when it died, based on which teeth are erupted, how worn down they are, and the amount of a tissue called dentin. Anthropologist Rachel Caspari of Central Michigan University used teeth to identify the ratio of old to young people in Australopithecines from 3 million to 1.5 million years ago, early Homo species from 2 million to 500,000 years ago, and Neanderthals from 130,000 years ago. Old people — old here means older than 30 (sorry) — were a vanishingly small part of the population. When she looked at modern humans from the Upper Paleolithic, about 30,000 years ago, though, she found the ratio reversed — there were twice as many adults who died after age 30 as those who died young.
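The arithmetic behind that comparison is just a ratio of old to young adult deaths in a sample of remains. A minimal sketch, with invented counts rather than Caspari’s data:

```python
def oy_ratio(died_after_30: int, died_young_adult: int) -> float:
    """Old-to-young ratio: individuals who died past roughly 30 (judged
    from tooth eruption and wear) relative to those who died as young
    adults."""
    return died_after_30 / died_young_adult

# Hypothetical earlier-hominid sample: old individuals vanishingly rare.
print(oy_ratio(died_after_30=4, died_young_adult=10))   # 0.4

# Hypothetical Upper Paleolithic sample: twice as many old as young.
print(oy_ratio(died_after_30=20, died_young_adult=10))  # 2.0
```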
The Upper Paleolithic is also when modern humans really started flourishing. That’s one of the times the population boomed and humans created complex art, used symbols, and colonized even inhospitable environments. (The modern humans she studied lived in Europe during some of the bitterest millennia of the last Ice Age.) Caspari says it wasn’t a biological change that allowed people to start living reliably to their 30s and beyond. (When she looked at other populations of Neanderthals and Homo sapiens that lived in the same place and time, the two different species had similar proportions of old people, suggesting the change was not genetic.) Instead, it was culture. Something about how people were living made it possible to survive into old age, maybe the way they found or stored food or built shelters, who knows. That’s all lost — pretty much all we have of them is teeth — but once humans found a way to keep old people around, everything changed.
Old people are repositories of information, Caspari says. They know about the natural world, how to handle rare disasters, how to perform complicated skills, who is related to whom, where the food and caves and enemies are. They maintain and build intricate social networks. A lot of skills that allowed humans to take over the world take a lot of time and training to master, and they wouldn’t have been perfected or passed along without old people. “They can be great teachers,” Caspari says, “and they allow for more complex societies.” Old people made humans human. [Continue reading...]
While life extension allowed culture to blossom, the proliferation of culture long preceded the emergence of civilization, which brought with it the extension and entrenching of ownership, the control of language through script, and the institutionalization of inequality.
While culture allowed people to live longer, civilization extended the lives of some while shortening the lives of others.
Cardiff University: Rapid climate change during the Middle Stone Age, between 80,000 and 40,000 years ago, sparked surges in cultural innovation in early modern human populations, according to new research.
The research, published in the journal Nature Communications [21 May], was conducted by a team of scientists from Cardiff University’s School of Earth and Ocean Sciences, the Natural History Museum in London and the University of Barcelona.
The scientists studied a marine sediment core off the coast of South Africa and reconstructed terrestrial climate variability over the last 100,000 years.
Dr Martin Ziegler, Cardiff University School of Earth and Ocean Sciences, said: “We found that South Africa experienced rapid climate transitions toward wetter conditions at times when the Northern Hemisphere experienced extremely cold conditions.”
These large Northern Hemisphere cooling events have previously been linked to a change in the Atlantic Ocean circulation that led to a reduced transport of warm water to the high latitudes in the North. In response to this Northern Hemisphere cooling, large parts of sub-Saharan Africa experienced very dry conditions.
“Our new data, however, contrasts with sub-Saharan Africa and demonstrates that the South African climate responded in the opposite direction, with increasing rainfall that can be associated with a globally occurring southward shift of the tropical monsoon belt.”
Professor Ian Hall, Cardiff University School of Earth and Ocean Sciences, said: “When the timing of these rapidly occurring wet pulses was compared with the archaeological datasets, we found remarkable coincidences.
“The occurrence of several major Middle Stone Age industries fell tightly together with the onset of periods with increased rainfall.”
“Similarly, the disappearance of the industries appears to coincide with the transition to drier climatic conditions.”
Professor Chris Stringer of London’s Natural History Museum commented: “The correspondence between climatic ameliorations and cultural innovations supports the view that population growth fuelled cultural changes, through increased human interactions.”
The South African archaeological record is so important because it shows some of the oldest evidence for modern behavior in early humans. This includes the use of symbols, which has been linked to the development of complex language, and personal adornments made of seashells.
“The quality of the southern African data allowed us to make these correlations between climate and behavioural change, but it will require comparable data from other areas before we can say whether this region was uniquely important in the development of modern human culture,” added Professor Stringer.
The new study presents the most convincing evidence so far that abrupt climate change was instrumental in this development.
The research was supported by the UK Natural Environment Research Council and is part of the international Gateways training network, funded by the 7th Framework Programme of the European Union.
Archbishop Desmond Tutu: A person with Ubuntu is open and available to others, affirming of others, does not feel threatened that others are able and good, for he or she has a proper self-assurance that comes from knowing that he or she belongs in a greater whole and is diminished when others are humiliated or diminished, when others are tortured or oppressed.
One of the sayings in our country is Ubuntu – the essence of being human. Ubuntu speaks particularly about the fact that you can’t exist as a human being in isolation. It speaks about our interconnectedness. You can’t be human all by yourself, and when you have this quality – Ubuntu – you are known for your generosity.
We think of ourselves far too frequently as just individuals, separated from one another, whereas you are connected and what you do affects the whole world. When you do well, it spreads out; it is for the whole of humanity.
The psychiatrist Jeffrey P. Kahn suggests that the flowering of civilization may have been fueled by the brewing of beer, a practice that could have emerged as early as 10,000 years ago, providing occasional relief from the constraints of social conformity.
Once the effects of these early brews were discovered, the value of beer (as well as wine and other fermented potions) must have become immediately apparent. With the help of the new psychopharmacological brew, humans could quell the angst of defying those herd instincts. Conversations around the campfire, no doubt, took on a new dimension: the painfully shy, their angst suddenly quelled, could now speak their minds.
But the alcohol would have had more far-ranging effects, too, reducing the strong herd instincts to maintain a rigid social structure. In time, humans became more expansive in their thinking, as well as more collaborative and creative. A night of modest tippling may have ushered in these feelings of freedom — though, the morning after, instincts to conform and submit would have kicked back in to restore the social order.
I don’t find this a particularly persuasive line of speculation. It seems much more likely that beer served a role in sustaining the social order rather than freeing the imagination.
Records from 5,000 years ago show that enslaved farm laborers were being provided by their masters with a staple diet of barley gruel and weak beer — provisions barely sufficient to prevent starvation. The beer — not unlike the most popular watery brews of today — seems like it might have served more as a tool of pacification than a liberator of creativity. Indeed, if the advent of civilization opened up new avenues for exploring the human spirit for a newly emerging creative class, it simultaneously created the need for a new class of workers who would obediently follow directions without plotting insurrections.
Knowledge about botanical tools for expanding consciousness most likely long preceded civilization. On several continents, evidence of the use of hallucinogenic mushrooms can be found in rock art, and rock art itself appears to go back as far as 40,000 years. Whatever social, ritual, or religious function such art may have performed, it appears to express the kind of creative exuberance suggesting that for these primeval artists the creative act was an end in itself. And whether that creativity was unleashed by hallucinogens is perhaps beside the point. Clearly, human beings required neither civilization nor beer in order to become creative.
Civilization is celebrated because, among other things, it led to the creation of writing, yet in terms of creativity the transition from rock art to writing was regressive. The former enabled a magical transmutation: the ephemeral, intangible stuff of imagination was turned into physical form. Writing, on the other hand, initially served as a tool of exploitation. It allowed claims of ownership and laws to be set in stone. Its function at the beginning of civilization was to shackle the imagination and codify authority.
If beer was essential for pacifying slaves, it may also have functioned in defining a boundary that legitimized alcohol while prohibiting hallucinogens. As the U.S. Army demonstrated when experimenting with LSD for chemical warfare, the potential such drugs have for undermining conventions of social order suggests they would commonly be perceived as a threat to civilization.
If beer was civilizing, this might say less about the socially liberating effect of alcohol than about the need for social elites to limit the ambitions of those upon whose labor they depend. A central nervous system depressant could be employed as a way to release slaves from their leg irons by shackling their brains.
Global Mail reports: Tucked away in the sandstone ridges of the rugged tropics near Australia’s north-eastern tip, the ochre “bullymen” with their big penises and staring eyes still cling to the rock.
These are secret paintings, made by Aboriginal men who were driven from their lowlands by colonials hungry for gold, and who were then harassed in the hills by the Native Mounted Police, both black troopers and their white officers.
The locals painted the police, or “bullymen”, onto the rock, in the belief that these works would conquer the enemy through sorcery, as Tommy George, a descendant of the “black trackers” now in his 80s (and the last surviving speaker of Agu Alaya, or Taipan Snake language) has explained to archaeologist Noelene Cole.
The Aboriginal fugitives believed that the paintings were a weapon that gave them power over the armed lawmen, George explained.
“It was so deeply embedded in culture that they could use the rock paintings to kill people – that’s how powerful the art was.
“The paintings were there to kill the police.
“It shows the power of that visual culture. They thought it was as strong as guns,” Cole said.
But now these poignant paintings, along with hundreds of other Aboriginal works in this remote Queensland gallery, which United Nations Educational, Scientific and Cultural Organisation (UNESCO) has recognised as one of the great rock-art precincts of the globe, are under a new threat from another power source: mining.
Prospectors have targeted the land beneath and around the art because it is rich in exploitable resources. These paintings are emblematic of the perils befalling rock art throughout Australia, where the resources sector has been booming for more than a decade, fuelled by China’s insatiable energy needs.
This is a recurring Australian story, as mining, industry and urbanisation surge across a landscape which harbours millions of images at more than 100,000 rock-art sites. State and territory heritage laws have proved weak in protecting the works, under governments keen to cash in on the mining bonanza. [Continue reading...]