Category Archives: Culture

Paleolithic parenting and animated GIFs

The creation of the moving image represents a technical advance in the arts comparable with the invention of the steam engine during the industrial revolution.

The transition from static to moving imagery was a watershed event in human history, through which people discovered a new way of capturing the visible world — or so it seemed.

It turns out, however, that long before the advent of civilization, our Paleolithic forebears figured out that the movement they saw in the living creatures around them could, by cunning means, be captured in crafted illusions.


Let’s run with the hypothesis that this 14,000-year-old artifact is indeed a toy. What does this tell us about its creator and the children for whom it was made? [Continue reading at my new site: Attention to the Unseen]


The deep history of the domestication and enslavement of humans

Steven Mithen writes: When our ancestors began to control fire, most likely somewhere in Africa around 400,000 years ago, the planet was set on a new course. We have little idea and even less evidence of how early humans made fire; perhaps they carried around smouldering bundles of leaves from forest fires, or captured the sparks thrown off when chipping stone or rubbing sticks together. However it happened, the human control of fire made an indelible mark on the earth’s ecosystems, and marked the beginning of the Anthropocene – the epoch in which humans have had a significant impact on the planet.

In Against the Grain James Scott describes these early stages as a ‘“thin” Anthropocene’, but ever since, the Anthropocene has been getting thicker. New layers of human impact were added by the adoption of farming about ten thousand years ago, the invention of the steam engine around 1780, and the dropping of the atomic bomb in 1945. Today the Anthropocene is so dense that we have virtually lost sight of anything that could be called ‘the natural world’.

Fire changed humans as well as the world. Eating cooked food transformed our bodies; we developed a much shorter digestive tract, meaning that more metabolic energy was available to grow our brains. At the same time, Homo sapiens became domesticated by its dependence on fire for warmth, protection and fuel. If this was the start of human progress towards ‘civilisation’, then – according to the conventional narrative – the next step was the invention of agriculture around ten thousand years ago. Farming, it is said, saved us from a dreary nomadic Stone Age hunter-gatherer existence by allowing us to settle down, build towns and develop the city-states that were the centres of early civilisations. People flocked to them for the security, leisure and economic opportunities gained from living within thick city walls. The story continues with the collapse of the city-states and barbarian insurgency, plunging civilised worlds – ancient Mesopotamia, China, Mesoamerica – into their dark ages. Thus civilisations rise and fall. Or so we are told.

The perfectly formed city-state is the ideal, deeply ingrained in the Western psyche, on which our notion of the nation-state is founded, ultimately inspiring Donald Trump’s notion of a ‘city’ wall to keep out the barbarian Mexican horde, and Brexiters’ desire to ‘take back control’ from insurgent European bureaucrats. But what if the conventional narrative is entirely wrong? What if ancient ruins testify to an aberration in the normal state of human affairs rather than a glorious and ancient past to whose achievements we should once again aspire? What if the origin of farming wasn’t a moment of liberation but of entrapment? Scott offers an alternative to the conventional narrative that is altogether more fascinating, not least in the way it omits any self-congratulation about human achievement. His account of the deep past doesn’t purport to be definitive, but it is surely more accurate than the one we’re used to, and it implicitly exposes the flaws in contemporary political ideas that ultimately rest on a narrative of human progress and on the ideal of the city/nation-state. [Continue reading…]


Our relentless consumption is trashing the planet

George Monbiot writes: Everyone wants everything – how is that going to work? The promise of economic growth is that the poor can live like the rich and the rich can live like the oligarchs. But already we are bursting through the physical limits of the planet that sustains us. Climate breakdown, soil loss, the collapse of habitats and species, the sea of plastic, insectageddon: all are driven by rising consumption. The promise of private luxury for everyone cannot be met: neither the physical nor the ecological space exists.

But growth must go on: this is everywhere the political imperative. And we must adjust our tastes accordingly. In the name of autonomy and choice, marketing uses the latest findings in neuroscience to break down our defences. Those who seek to resist must, like the Simple Lifers in Brave New World, be silenced – in this case by the media.

With every generation, the baseline of normalised consumption shifts. Thirty years ago, it was ridiculous to buy bottled water, where tap water is clean and abundant. Today, worldwide, we use a million plastic bottles a minute.

Every Friday is a Black Friday, every Christmas a more garish festival of destruction. Among the snow saunas, portable watermelon coolers and smartphones for dogs with which we are urged to fill our lives, my #extremecivilisation prize now goes to the PancakeBot: a 3D batter printer that allows you to eat the Mona Lisa, the Taj Mahal, or your dog’s bottom every morning. In practice, it will clog up your kitchen for a week until you decide you don’t have room for it. For junk like this, we’re trashing the living planet, and our own prospects of survival. Everything must go. [Continue reading…]


How Yamnaya nomads and other herding cultures became early forces of globalization

Science News reports: Nomadic herders living on western Asia’s hilly grasslands made a couple of big moves east and west around 5,000 years ago. These were not typical, back-and-forth treks from one seasonal grazing spot to another. These people blazed new trails.

A technological revolution had transformed travel for ancient herders around that time. Of course they couldn’t make online hotel reservations. Trip planners would have searched in vain for a Steppe Depot stocked with essential tools and supplies. The closest thing to a traveler’s pit stop was a mountain stream and a decent grazing spot for cattle. Yet, unlike anyone before, these hardy people had the means to move — wheels, wagons and horses.

Here’s how the journeys may have played out: At a time when rainfall dwindled and grasslands in western Asia turned brown, oxen-pulled wagons loaded with personal belongings rolled west, following greener pastures into central and northern Europe. Other carts rumbled east as far as Siberia’s Altai Mountains, where Russia, China, Mongolia and Kazakhstan meet today. Families of men, women and children may have piled on board. Or travelers may have been mostly men, who married women from farming villages along the way. Cattle, sheep and goats undoubtedly trailed along with whoever made these trips, under the watchful guidance of horse riders. Wagons served as mobile homes while on the move and during periodic stops to let animals graze.

These journeys, by people now known as the Yamnaya, transformed human genes and cultures across a huge swath of Europe and Asia. Yamnaya people left their mark from Ireland to China’s western border, across roughly 4,000 kilometers. [Continue reading…]


Consciousness began when the gods stopped speaking

Veronique Greenwood writes: Julian Jaynes was living out of a couple of suitcases in a Princeton dorm in the early 1970s. He must have been an odd sight there among the undergraduates, some of whom knew him as a lecturer who taught psychology, holding forth in a deep baritone voice. He was in his early 50s, a fairly heavy drinker, untenured, and apparently uninterested in tenure. His position was marginal. “I don’t think the university was paying him on a regular basis,” recalls Roy Baumeister, then a student at Princeton and today a professor of psychology at Florida State University. But among the youthful inhabitants of the dorm, Jaynes was working on his masterpiece, and had been for years.

From the age of 6, Jaynes had been transfixed by the singularity of conscious experience. Gazing at a yellow forsythia flower, he’d wondered how he could be sure that others saw the same yellow as he did. As a young man, serving three years in a Pennsylvania prison for declining to support the war effort, he watched a worm in the grass of the prison yard one spring, wondering what separated the unthinking earth from the worm and the worm from himself. It was the kind of question that dogged him for the rest of his life, and the book he was working on would grip a generation beginning to ask themselves similar questions.

The Origin of Consciousness in the Breakdown of the Bicameral Mind, when it finally came out in 1976, did not look like a best-seller. But sell it did. It was reviewed in science magazines and psychology journals, Time, The New York Times, and the Los Angeles Times. It was nominated for a National Book Award in 1978. New editions continued to come out, as Jaynes went on the lecture circuit. Jaynes died of a stroke in 1997; his book lived on. In 2000, another new edition hit the shelves. It continues to sell today.

In the beginning of the book, Jaynes asks, “This consciousness that is myself of selves, that is everything, and yet nothing at all—what is it? And where did it come from? And why?” Jaynes answers by unfurling a version of history in which humans were not fully conscious until about 3,000 years ago, instead relying on a two-part, or bicameral, mind, with one half speaking to the other in the voice of the gods with guidance whenever a difficult situation presented itself. The bicameral mind eventually collapsed as human societies became more complex, and our forebears awoke with modern self-awareness, complete with an internal narrative, which Jaynes believes has its roots in language. [Continue reading…]


The case against civilization

John Lanchester writes: Science and technology: we tend to think of them as siblings, perhaps even as twins, as parts of STEM (for “science, technology, engineering, and mathematics”). When it comes to the shiniest wonders of the modern world—as the supercomputers in our pockets communicate with satellites—science and technology are indeed hand in glove. For much of human history, though, technology had nothing to do with science. Many of our most significant inventions are pure tools, with no scientific method behind them. Wheels and wells, cranks and mills and gears and ships’ masts, clocks and rudders and crop rotation: all have been crucial to human and economic development, and none historically had any connection with what we think of today as science. Some of the most important things we use every day were invented long before the adoption of the scientific method. I love my laptop and my iPhone and my Echo and my G.P.S., but the piece of technology I would be most reluctant to give up, the one that changed my life from the first day I used it, and that I’m still reliant on every waking hour—am reliant on right now, as I sit typing—dates from the thirteenth century: my glasses. Soap prevented more deaths than penicillin. That’s technology, not science.

In “Against the Grain: A Deep History of the Earliest States,” James C. Scott, a professor of political science at Yale, presents a plausible contender for the most important piece of technology in the history of man. It is a technology so old that it predates Homo sapiens and instead should be credited to our ancestor Homo erectus. That technology is fire. We have used it in two crucial, defining ways. The first and the most obvious of these is cooking. As Richard Wrangham has argued in his book “Catching Fire,” our ability to cook allows us to extract more energy from the food we eat, and also to eat a far wider range of foods. Our closest animal relative, the chimpanzee, has a colon three times as large as ours, because its diet of raw food is so much harder to digest. The extra caloric value we get from cooked food allowed us to develop our big brains, which absorb roughly a fifth of the energy we consume, as opposed to less than a tenth for most mammals’ brains. That difference is what has made us the dominant species on the planet.

The other reason fire was central to our history is less obvious to contemporary eyes: we used it to adapt the landscape around us to our purposes. Hunter-gatherers would set fires as they moved, to clear terrain and make it ready for fast-growing, prey-attracting new plants. They would also drive animals with fire. They used this technology so much that, Scott thinks, we should date the human-dominated phase of earth, the so-called Anthropocene, from the time our forebears mastered this new tool.

We don’t give the technology of fire enough credit, Scott suggests, because we don’t give our ancestors much credit for their ingenuity over the long period—ninety-five per cent of human history—during which most of our species were hunter-gatherers. “Why human fire as landscape architecture doesn’t register as it ought to in our historical accounts is perhaps that its effects were spread over hundreds of millennia and were accomplished by ‘precivilized’ peoples also known as ‘savages,’ ” Scott writes. To demonstrate the significance of fire, he points to what we’ve found in certain caves in southern Africa. The earliest, oldest strata of the caves contain whole skeletons of carnivores and many chewed-up bone fragments of the things they were eating, including us. Then comes the layer from when we discovered fire, and ownership of the caves switches: the human skeletons are whole, and the carnivores are bone fragments. Fire is the difference between eating lunch and being lunch.

Anatomically modern humans have been around for roughly two hundred thousand years. For most of that time, we lived as hunter-gatherers. Then, about twelve thousand years ago, came what is generally agreed to be the definitive before-and-after moment in our ascent to planetary dominance: the Neolithic Revolution. This was our adoption of, to use Scott’s word, a “package” of agricultural innovations, notably the domestication of animals such as the cow and the pig, and the transition from hunting and gathering to planting and cultivating crops. The most important of these crops have been the cereals—wheat, barley, rice, and maize—that remain the staples of humanity’s diet. Cereals allowed population growth and the birth of cities, and, hence, the development of states and the rise of complex societies.

The story told in “Against the Grain” heavily revises this widely held account. Scott’s specialty is not early human history. His work has focussed on a skeptical, peasant’s-eye view of state formation; the trajectory of his interests can be traced in the titles of his books, from “The Moral Economy of the Peasant” to “The Art of Not Being Governed.” His best-known book, “Seeing Like a State,” has become a touchstone for political scientists, and amounts to a blistering critique of central planning and “high modernism,” the idea that officials at the center of a state know better than the people they are governing. Scott argues that a state’s interests and the interests of subjects are often not just different but opposite. Stalin’s project of farm collectivization “served well enough as a means whereby the state could determine cropping patterns, fix real rural wages, appropriate a large share of whatever grain was produced, and politically emasculate the countryside”; it also killed many millions of peasants.

Scott’s new book extends these ideas into the deep past, and draws on existing research to argue that ours is not a story of linear progress, that the time line is much more complicated, and that the causal sequences of the standard version are wrong. He focusses his account on Mesopotamia—roughly speaking, modern-day Iraq—because it is “the heartland of the first ‘pristine’ states in the world,” the term “pristine” here meaning that these states bore no watermark from earlier settlements and were the first time any such social organizations had existed. They were the first states to have written records, and they became a template for other states in the Near East and in Egypt, making them doubly relevant to later history.

The big news to emerge from recent archeological research concerns the time lag between “sedentism,” or living in settled communities, and the adoption of agriculture. Previous scholarship held that the invention of agriculture made sedentism possible. The evidence shows that this isn’t true: there’s an enormous gap—four thousand years—separating the “two key domestications,” of animals and cereals, from the first agrarian economies based on them. Our ancestors evidently took a good, hard look at the possibility of agriculture before deciding to adopt this new way of life. They were able to think it over for so long because the life they lived was remarkably abundant. Like the early civilization of China in the Yellow River Valley, Mesopotamia was a wetland territory, as its name (“between the rivers”) suggests. In the Neolithic period, Mesopotamia was a delta wetland, where the sea came many miles inland from its current shore.

This was a generous landscape for humans, offering fish and the animals that preyed on them, fertile soil left behind by regular flooding, migratory birds, and migratory prey travelling near river routes. The first settled communities were established here because the land offered such a diverse web of food sources. If one year a food source failed, another would still be present. The archeology shows, then, that the “Neolithic package” of domestication and agriculture did not lead to settled communities, the ancestors of our modern towns and cities and states. Those communities had been around for thousands of years, living in the bountiful conditions of the wetlands, before humanity committed to intensive agriculture. Reliance on a single, densely planted cereal crop was much riskier, and it’s no wonder people took a few millennia to make the change.

So why did our ancestors switch from this complex web of food supplies to the concentrated production of single crops? We don’t know, although Scott speculates that climatic stress may have been involved. Two things, however, are clear. The first is that, for thousands of years, the agricultural revolution was, for most of the people living through it, a disaster. The fossil record shows that life for agriculturalists was harder than it had been for hunter-gatherers. Their bones show evidence of dietary stress: they were shorter, they were sicker, their mortality rates were higher. Living in close proximity to domesticated animals led to diseases that crossed the species barrier, wreaking havoc in the densely settled communities. Scott calls them not towns but “late-Neolithic multispecies resettlement camps.” Who would choose to live in one of those? Jared Diamond called the Neolithic Revolution “the worst mistake in human history.” The startling thing about this claim is that, among historians of the era, it isn’t very controversial. [Continue reading…]


Food is about far more than bodily sustenance

By Tina Moffat and Charlene Mohammed

Fatima,* a refugee from Somalia who is a newcomer to Canada, has been having trouble in her local supermarket. Back home, she was accustomed to milk fresh from the cow. “In Canada I don’t even know if it’s real milk or fake milk,” she said. “I don’t know the difference. Is there milk that has pork-related ingredients in it?”

Life for new immigrants is hard in many ways. But one thing that is rarely recognized is the dramatic shift for newcomers in what they eat. People who are used to eating freshly killed chickens and seasonal vegetables—and drinking milk from their cows—are suddenly faced with an unfamiliar selection of produce, a range of processed foods, and a plethora of nonperishable goods from the food bank (if they need them) that are in some cases so odd that they are perceived as “poison.”

Food is at the heart of culture: It is at the center of gatherings ranging from weddings to funerals, and it’s a critical part of everyday life. Not only are ingredients and recipes important but so are people’s foodways and customs. In many countries, it is common to cook a large pot of food in anticipation of uninvited guests; those who have extra food share it, and they expect to have food shared in return. Such social arrangements can increase food security in the community.

Some immigrants and refugees who settle in Western urban centers find that they do not have enough resources to meet their food needs. As defined by the Food and Agriculture Organization (FAO), “food security exists when all people, at all times, have physical and economic access to sufficient, safe, and nutritious food to meet their dietary needs and food preferences for an active and healthy life.” Many officials, however, approach this problem as one of “hunger” with a limited understanding of food insecurity that focuses on providing sufficient food for survival—and nothing more. Our research shows that newcomers’ experiences with food insecurity—based on the stories they share—are about much more than satisfying their physical needs; food consumption has many social and cultural dimensions as well.

Continue reading


Western philosophy is racist

Bryan W Van Norden writes: Mainstream philosophy in the so-called West is narrow-minded, unimaginative, and even xenophobic. I know I am levelling a serious charge. But how else can we explain the fact that the rich philosophical traditions of China, India, Africa, and the Indigenous peoples of the Americas are completely ignored by almost all philosophy departments in both Europe and the English-speaking world?

Western philosophy used to be more open-minded and cosmopolitan. The first major translation into a European language of the Analects, the sayings of Confucius (551-479 BCE), was done by Jesuits, who had extensive exposure to the Aristotelian tradition as part of their rigorous training. They titled their translation Confucius Sinarum Philosophus, or Confucius, the Chinese Philosopher (1687).

One of the major Western philosophers who read with fascination Jesuit accounts of Chinese philosophy was Gottfried Wilhelm Leibniz (1646-1716). He was stunned by the apparent correspondence between binary arithmetic (which he invented, and which became the mathematical basis for all computers) and the I Ching, or Book of Changes, the Chinese classic that symbolically represents the structure of the Universe via sets of broken and unbroken lines, essentially 0s and 1s. (In the 20th century, the psychoanalyst Carl Jung was so impressed with the I Ching that he wrote a philosophical foreword to a translation of it.) Leibniz also said that, while the West has the advantage of having received Christian revelation, and is superior to China in the natural sciences, ‘certainly they surpass us (though it is almost shameful to confess this) in practical philosophy, that is, in the precepts of ethics and politics adapted to the present life and the use of mortals’.

The German philosopher Christian Wolff echoed Leibniz in the title of his public lecture Oratio de Sinarum Philosophia Practica, or Discourse on the Practical Philosophy of the Chinese (1721). Wolff argued that Confucius showed that it was possible to have a system of morality without basing it on either divine revelation or natural religion. Because it proposed that ethics can be completely separated from belief in God, the lecture caused a scandal among conservative Christians, who had Wolff relieved of his duties and exiled from Prussia. However, his lecture made him a hero of the German Enlightenment, and he immediately obtained a prestigious position elsewhere. In 1730, he delivered a second public lecture, De Rege Philosophante et Philosopho Regnante, or On the Philosopher King and the Ruling Philosopher, which praised the Chinese for consulting ‘philosophers’ such as Confucius and his later follower Mengzi (fourth century BCE) about important matters of state.

Chinese philosophy was also taken very seriously in France. One of the leading reformers at the court of Louis XV was François Quesnay (1694-1774). He praised Chinese governmental institutions and philosophy so lavishly in his work Despotisme de la Chine (1767) that he became known as ‘the Confucius of Europe’. Quesnay was one of the originators of the concept of laissez-faire economics, and he saw a model for this in the sage-king Shun, who was known for governing by wúwéi (non-interference in natural processes). The connection between the ideology of laissez-faire economics and wúwéi continues to the present day. In his State of the Union address in 1988, the US president Ronald Reagan quoted a line describing wúwéi from the Daodejing, which he interpreted as a warning against government regulation of business. (Well, I didn’t say that every Chinese philosophical idea was a good idea.)

Leibniz, Wolff and Quesnay are illustrations of what was once a common view in European philosophy. In fact, as Peter K J Park notes in Africa, Asia, and the History of Philosophy: Racism in the Formation of the Philosophical Canon (2014), the only options taken seriously by most scholars in the 18th century were that philosophy began in India, that philosophy began in Africa, or that both India and Africa gave philosophy to Greece.

So why did things change? As Park convincingly argues, Africa and Asia were excluded from the philosophical canon by the confluence of two interrelated factors. On the one hand, defenders of the philosophy of Immanuel Kant (1724-1804) consciously rewrote the history of philosophy to make it appear that his critical idealism was the culmination toward which all earlier philosophy was groping, more or less successfully.

On the other hand, European intellectuals increasingly accepted and systematised views of white racial superiority that entailed that no non-Caucasian group could develop philosophy. [Continue reading…]


Volcanic eruptions in Alaska could have impacted lives of ancient Egyptians

The Washington Post reports: Did volcanoes in Russia, Greenland and Alaska affect the lives of ancient Egyptians?

It may sound improbable, but according to a new study, the answer is yes.

In a paper published in Nature Communications, a team of researchers shows that volcanic eruptions in high northern latitudes of the globe can affect the Nile watershed, causing the flow of one of the world’s mightiest rivers to slow.

This, in turn, could keep the lower Nile from flooding in the late summer months — a regular occurrence on which ancient Egyptians relied to irrigate their crops.

No Nile flooding meant no irrigation, which meant a bad year in the fields, low food supplies and, ultimately, civic unrest, researchers say.

“It’s a bizarre concept that Alaskan volcanoes were screwing up the Nile, but in fact that’s what happened,” said Joseph Manning, a historian at Yale University who worked on the study. [Continue reading…]


U.S. withdraws from UNESCO, saying it’s biased against Israel

Bloomberg reports: The Trump administration withdrew the U.S. from the United Nations cultural organization, saying it’s biased against Israel and citing its decision to admit the Palestinian territories as a member state.

The decision to quit the U.N. Educational, Scientific and Cultural Organization, which the U.S. co-founded in 1945, “was not taken lightly,” State Department spokeswoman Heather Nauert said in a statement Thursday. She cited the need for “fundamental reform in the organization, and continuing anti-Israel bias at UNESCO.”

The U.S. hasn’t been paying dues to UNESCO since 2011, when President Barack Obama’s administration stopped providing about $72 million a year after the Paris-based organization accepted Palestine as a full member. The arrears total almost $543 million, according to UNESCO. U.S. laws bar funding for any UN agency that gives Palestinians the status of a nation, and the U.S. lost its voting privilege in the organization in 2013.

That decision threw the organization into financial crisis because the U.S. had accounted for more than 20 percent of UNESCO’s annual budget. The U.S. also withdrew from the organization in 1984 but rejoined in 2003. [Continue reading…]

 


Chimpanzees learn to use tools on their own, no teaching required

Leah Froats writes: As it turns out, chimpanzees don’t need to see in order to do, no matter what the old mantra might lead you to believe.

A common belief among researchers is that chimps need to watch other members of their communities use tools before they can pick the behavior up. In a study published in PeerJ in September, researchers from the University of Birmingham and the University of Tübingen challenged this belief and checked to see if it would hold for a specific kind of tool use.

They attempted to recreate a behavior commonly found in the wild: the use of sticks to scoop algae from the water to eat. Would chimpanzees that were unfamiliar with this behavior be able to figure it out on their own? [Continue reading…]


Return of the city-state

Jamie Bartlett writes: If you’d been born 1,500 years ago in southern Europe, you’d have been convinced that the Roman empire would last forever. It had, after all, been around for 1,000 years. And yet, following a period of economic and military decline, it fell apart. By 476 CE it was gone. To the people living under the mighty empire, these events must have been unthinkable. Just as they must have been for those living through the collapse of the Pharaoh’s rule or Christendom or the Ancien Régime.

We are just as deluded that our model of living in ‘countries’ is inevitable and eternal. Yes, there are dictatorships and democracies, but the whole world is made up of nation-states. This means a blend of ‘nation’ (people with common attributes and characteristics) and ‘state’ (an organised political system with sovereignty over a defined space, with borders agreed by other nation-states). Try to imagine a world without countries – you can’t. Our sense of who we are, our loyalties, our rights and obligations, are bound up in them.

Which is all rather odd, since they’re not really that old. Until the mid-19th century, most of the world was a sprawl of empires, unclaimed land, city-states and principalities, which travellers crossed without checks or passports. As industrialisation made societies more complex, large centralised bureaucracies grew up to manage them. Those governments best able to unify their regions, store records, and coordinate action (especially war) grew more powerful vis-à-vis their neighbours. Revolutions – especially in the United States (1776) and France (1789) – helped to create the idea of a commonly defined ‘national interest’, while improved communications unified language, culture and identity. Imperialistic expansion spread the nation-state model worldwide, and by the middle of the 20th century it was the only game in town. There are now 193 nation-states ruling the world.

But the nation-state with its borders, centralised governments, common people and sovereign authority is increasingly out of step with the world. And as Karl Marx observed, if you change the dominant mode of production that underpins a society, the social and political structure will change too. [Continue reading…]


‘Uncontacted’ Amazon tribe members are reported killed in Brazil

The New York Times reports: They were members of an uncontacted tribe gathering eggs along the river in a remote part of the Amazon. Then, it appears, they had the bad luck of running into gold miners.

Now, federal prosecutors in Brazil have opened an investigation into the reported massacre of about 10 members of the tribe, the latest evidence that threats to endangered indigenous groups are on the rise in the country.

The Brazilian agency on indigenous affairs, Funai, said it had lodged a complaint with the prosecutor’s office in the state of Amazonas after the gold miners went to a bar in a town near the border with Colombia and bragged about the killings. They brandished a hand-carved paddle that they said had come from the tribe, the agency said.

“It was crude bar talk,” said Leila Silvia Burger Sotto-Maior, Funai’s coordinator for uncontacted and recently contacted tribes. “They even bragged about cutting up the bodies and throwing them in the river.”

The miners, she said, claimed that “they had to kill them or be killed.”

Ms. Sotto-Maior said the killings were reported to have taken place last month. The indigenous affairs bureau conducted some initial interviews in the town and then took the case to the police.

“There is a lot of evidence, but it needs to be proven,” she said.

The prosecutor in charge of the case, Pablo Luz de Beltrand, confirmed that an investigation had begun, but said he could not discuss the details of the case while it was underway. He said the episode was alleged to have occurred in the Javari Valley — the second-largest indigenous reserve in Brazil — in the remote west.

“We are following up, but the territories are big and access is limited,” Mr. Beltrand said. “These tribes are uncontacted — even Funai has only sporadic information about them. So it’s difficult work that requires all government departments working together.”

Mr. Beltrand said it was the second such episode that he was investigating this year. The first reported killing of uncontacted Indians in the region occurred in February, and that case is still open. “It was the first time that we’d had this kind of case in this region,” he said in a telephone interview. “It’s not something that was happening before.”

Survival International, a global indigenous rights group, warned that given the small sizes of the uncontacted Amazon tribes, this latest episode could mean that a significant percentage of a remote ethnic group was wiped out.

“If the investigation confirms the reports, it will be yet another genocidal massacre resulting directly from the Brazilian government’s failure to protect isolated tribes — something that is guaranteed in the Constitution,” said Sarah Shenker, a senior campaigner with the rights group.

Under Brazil’s president, Michel Temer, funding for indigenous affairs has been slashed. In April, Funai closed five of the 19 bases that it uses to monitor and protect isolated tribes, and reduced staffing at others. The bases are used to prevent invasions by loggers and miners and to communicate with recently contacted tribes. [Continue reading…]


How Silicon Valley is erasing your individuality

Franklin Foer writes: Until recently, it was easy to define our most widely known corporations. Any third-grader could describe their essence. Exxon sells gas; McDonald’s makes hamburgers; Walmart is a place to buy stuff. This is no longer so. Today’s ascendant monopolies aspire to encompass all of existence. Google derives from googol, a number (1 followed by 100 zeros) that mathematicians use as shorthand for unimaginably large quantities. Larry Page and Sergey Brin founded Google with the mission of organizing all knowledge, but that proved too narrow. They now aim to build driverless cars, manufacture phones and conquer death. Amazon, which once called itself “the everything store,” now produces television shows, owns Whole Foods and powers the cloud. The architect of this firm, Jeff Bezos, even owns this newspaper.

Along with Facebook, Microsoft and Apple, these companies are in a race to become our “personal assistant.” They want to wake us in the morning, have their artificial intelligence software guide us through our days and never quite leave our sides. They aspire to become the repository for precious and private items, our calendars and contacts, our photos and documents. They intend for us to turn unthinkingly to them for information and entertainment while they catalogue our intentions and aversions. Google Glass and the Apple Watch prefigure the day when these companies implant their artificial intelligence in our bodies. Brin has mused, “Perhaps in the future, we can attach a little version of Google that you just plug into your brain.”

More than any previous coterie of corporations, the tech monopolies aspire to mold humanity into their desired image of it. They think they have the opportunity to complete the long merger between man and machine — to redirect the trajectory of human evolution. [Continue reading…]


The modern state, not ideas, brought about religious freedom

Mark Koyama writes: Religious freedom has become an emblematic value in the West. Embedded in constitutions and championed by politicians and thinkers across the political spectrum, it is to many an absolute value, something beyond question. Yet how it emerged, and why, remains widely misunderstood.

According to the conventional narrative, freedom of religion arose in the West in the wake of devastating wars fought over religion. It was catalysed by powerful arguments from thinkers such as John Locke, Baruch Spinoza, Pierre Bayle and Voltaire. These philosophers and political theorists responded to the brutality of the religious wars with support for radical notions of toleration and religious freedom. Their liberal ideals then became embedded in the political institutions of the West, following the American and French Revolutions.

In broad outline, such is the account accepted by most political philosophers and social scientists. But the evidence does not support this emphasis on the power of ideas in shaping the rise of religious freedom, and underestimates the decisive role played by institutions.

The ideas of the philosophers were indeed important. In his Dictionnaire Historique et Critique (1697), Bayle pointed out that if one religion claimed to be the only true faith, it by implication possessed the right to persecute all the others, and all other faiths possessed an equal right to make such a claim. Showing the inherent volatility, for society, of such religious-truth claims, Bayle also argued that if people turned out to be mistaken about their religion, they could hardly be guilty of sin for nonetheless trying, in their sincerity, to observe its dictates.

Locke argued that true faith could not be compelled. It followed, he claimed, that restricting the rights of religious minorities should only be done for reasons of state, that is, not for reasons of faith or salvation. Voltaire took a no less effective course, relentlessly documenting and mocking cases of religious persecution. Time and again, he made zealots and enforcers of religious dogma look ridiculous. These are compelling and consequential ideas, and worthy of continued study and reading.

But focusing on these ideas does not fully explain how religious freedom came to the West. The intellectual importance of Bayle, Locke and Voltaire does not mean that their ideas were central to religious freedom as it developed and came to be in actual political and social life. [Continue reading…]


What Germany can teach the U.S. about remembering an ugly past without glorifying it

Fred Kaplan writes: President Donald Trump tweeted on Thursday that he’s “sad to see the history and culture of our great country being ripped apart with the removal of our beautiful statues and monuments”—thus furnishing further proof that he knows nothing about history or culture or beauty, much less the reason why monuments are built in the first place.

As many have pointed out, the statues of Confederate officers that scar the cities of the South (and too many spots in the North as well) were erected not in the immediate wake of the Civil War but rather decades later, during the revival of the Ku Klux Klan, as a show of force—from the rulers to the ruled—that the old guard, though defeated in battle, was still in charge.

Trump and all those who find his appeals to historical preservation persuasive should go to Berlin, a city of vast and multiple horrors throughout its history, yet also a city that is facing those horrors head-on, unflinchingly. The city memorializes not its discarded leaders but rather their victims. And instead of mounting old warlords on pedestals (there is nothing “beautiful” about a man on horseback, whether Confederate, Nazi, or Communist), the city displays the full record of their crimes against humanity. [Continue reading…]


Are men seen as ‘more American’ than women?

[Image: Protesters hold signs at the Chicago Women’s March in January 2017. John W. Iwanski, CC BY-NC]

By Laura Van Berkel, University of Cologne; Ludwin Molina, University of Kansas, and Sahana Mukherjee, Gettysburg College

Women make up 50.8 percent of the U.S. population and have equal voting rights, yet are politically underrepresented. The country has never had a female president or vice president. Only 3.5 percent of Supreme Court justices have been women, and women make up only 20 percent of Congress.

Studies have shown that within a country, groups with more power often feel greater ownership over it. Because they control actual resources, like money, and symbolic resources, like writing history, they’re better able to shape the culture in their image. For example, because Christianity is the most prominent religion in the United States, Christmas is a federal holiday.

Because men hold more power than women in the United States, we wanted to explore a simple question: Would people tend to think of men as “more American” than women? And, if so, how does this influence the way American women identify with their country?

Continue reading


‘I think we like our phones more than we like actual people’

Jean M Twenge writes: One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”

Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”

I’ve been researching generational differences for 25 years, starting when I was a 22-year-old doctoral student in psychology. Typically, the characteristics that come to define a generation appear gradually, and along a continuum. Beliefs and behaviors that were already rising simply continue to do so. Millennials, for instance, are a highly individualistic generation, but individualism had been increasing since the Baby Boomers turned on, tuned in, and dropped out. I had grown accustomed to line graphs of trends that looked like modest hills and valleys. Then I began studying Athena’s generation.

Around 2012, I noticed abrupt shifts in teen behaviors and emotional states. The gentle slopes of the line graphs became steep mountains and sheer cliffs, and many of the distinctive characteristics of the Millennial generation began to disappear. In all my analyses of generational data—some reaching back to the 1930s—I had never seen anything like it.

At first I presumed these might be blips, but the trends persisted, across several years and a series of national surveys. The changes weren’t just in degree, but in kind. The biggest difference between the Millennials and their predecessors was in how they viewed the world; teens today differ from the Millennials not just in their views but in how they spend their time. The experiences they have every day are radically different from those of the generation that came of age just a few years before them.

What happened in 2012 to cause such dramatic shifts in behavior? It was after the Great Recession, which officially lasted from 2007 to 2009 and had a starker effect on Millennials trying to find a place in a sputtering economy. But it was exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.

The more I pored over yearly surveys of teen attitudes and behaviors, and the more I talked with young people like Athena, the clearer it became that theirs is a generation shaped by the smartphone and by the concomitant rise of social media. [Continue reading…]
