Expansion of tillage through use of draft animals led to the growth of inequality across the Old World

Researchers at Washington State University and 13 other institutions have found that the arc of prehistory bends towards economic inequality.

In the largest study of its kind, the researchers saw disparities in wealth mount with the rise of agriculture, specifically the domestication of plants and large animals, and increased social organization.

Their findings, published last month in the journal Nature, have profound implications for contemporary society, as inequality repeatedly leads to social disruption, even collapse, said Tim Kohler, lead author and Regents professor of archaeology and evolutionary anthropology at WSU. The United States, he noted, currently has one of the highest levels of inequality in the history of the world.

“Inequality has a lot of subtle and potentially pernicious effects on societies,” Kohler said.

The study gathered data from 63 archaeological sites or groups of sites. Comparing house sizes within each site, researchers assigned Gini coefficients, common measures of inequality developed more than a century ago by the Italian statistician and sociologist Corrado Gini. In theory, a country with complete wealth equality would have a Gini coefficient of 0, while a country with all the wealth concentrated in one household would get a 1.

The researchers found that hunter-gatherer societies typically had low wealth disparities, with a median Gini of .17. Their mobility would make it hard to accumulate wealth, let alone pass it on to subsequent generations. Horticulturalists — small-scale, low-intensity farmers — had a median Gini of .27. Larger-scale agricultural societies had a median Gini of .35.
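The Gini calculation behind these figures can be sketched in a few lines. This is a minimal illustration using house sizes as the wealth proxy, as the study does; the sample values below are made up for the example, not drawn from the paper's data:

```python
def gini(values):
    """Gini coefficient: 0 = perfect equality, 1 = all wealth in one household."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # Standard formula over sorted values x_1 <= ... <= x_n:
    # G = sum_i (2i - n - 1) * x_i / (n * sum_i x_i)
    return sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1)) / (n * total)

# Hypothetical house-floor-area samples (square metres):
foragers = [20, 22, 25, 24, 21]    # mobile camp: similar dwellings
farmers = [30, 45, 60, 150, 400]   # settled village: large disparities
print(round(gini(foragers), 2), round(gini(farmers), 2))  # → 0.05 0.49
```

Note that with finite samples the maximum possible value is (n − 1)/n rather than exactly 1, which is why 1 is only the theoretical limit.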

To the researchers’ surprise, inequality kept rising in the Old World, while it hit a plateau in the New World, said Kohler. The researchers attribute this to the ability of Old World societies “to literally harness big domesticated mammals like cattle and eventually horses and water buffalo,” Kohler said.

Draft animals, which were not available in the New World, let richer farmers till more land and expand into new areas. This increased their wealth while ultimately creating a class of landless peasants.

“These processes increased inequality by operating on both ends of the wealth distribution, increasing the holdings of the rich while decreasing the holdings of the poor,” the researchers write.

The Old World also saw the arrival of bronze metallurgy and a mounted warrior elite that increased Ginis through large houses and territorial conquests. [Continue reading…]


The forgotten art of squatting

Rosie Spinks writes: Sentences that start with the phrase “A guru once told me…” are, more often than not, eye-roll-inducing. But recently, while resting in malasana, or a deep squat, in an East London yoga class, I was struck by the second half of the instructor’s sentence: “A guru once told me that the problem with the West is they don’t squat.”

This is plainly true. In much of the developed world, resting is synonymous with sitting. We sit in desk chairs, eat from dining chairs, commute seated in cars or on trains, and then come home to watch Netflix from comfy couches. With brief respites for walking from one chair to another, or short intervals for frenzied exercise, we spend our days mostly sitting. This devotion to placing our backsides in chairs makes us an outlier, both globally and historically. In the past half century, epidemiologists have been forced to shift how they study movement patterns. In modern times, the sheer amount of sitting we do is a separate problem from the amount of exercise we get.

Our failure to squat has biomechanical and physiological implications, but it also points to something bigger. In a world where we spend so much time in our heads, in the cloud, on our phones, the absence of squatting leaves us bereft of the grounding force that the posture has provided since our hominid ancestors first got up off the floor. In other words: If what we want is to be well, it might be time for us to get low.

To be clear, squatting isn’t just an artifact of our evolutionary history. A large swath of the planet’s population still does it on a daily basis, whether to rest, to pray, to cook, to share a meal, or to use the toilet. (Squat-style toilets are the norm in Asia, and pit latrines in rural areas all over the world require squatting.) As they learn to walk, toddlers from New Jersey to Papua New Guinea squat—and stand up from a squat—with grace and ease. In countries where hospitals are not widespread, squatting is also a position associated with that most fundamental part of life: birth.

It’s not specifically the West that no longer squats; it’s the rich and middle classes all over the world. My Quartz colleague, Akshat Rathi, originally from India, remarked that the guru’s observation would be “as true among the rich in Indian cities as it is in the West.”

But in Western countries, entire populations—rich and poor—have abandoned the posture. On the whole, squatting is seen as an undignified and uncomfortable posture—one we avoid entirely. At best, we might undertake it during CrossFit, Pilates or while lifting at the gym, but only partially and often with weights (a repetitive maneuver that’s hard to imagine being useful 2.5 million years ago). This ignores the fact that deep squatting as a form of active rest is built into both our evolutionary and developmental past: It’s not that you can’t comfortably sit in a deep squat, it’s just that you’ve forgotten how. [Continue reading…]


The deep history of the domestication and enslavement of humans

Steven Mithen writes: When our ancestors began to control fire, most likely somewhere in Africa around 400,000 years ago, the planet was set on a new course. We have little idea and even less evidence of how early humans made fire; perhaps they carried around smouldering bundles of leaves from forest fires, or captured the sparks thrown off when chipping stone or rubbing sticks together. However it happened, the human control of fire made an indelible mark on the earth’s ecosystems, and marked the beginning of the Anthropocene – the epoch in which humans have had a significant impact on the planet.

In Against the Grain James Scott describes these early stages as a ‘“thin” Anthropocene’, but ever since, the Anthropocene has been getting thicker. New layers of human impact were added by the adoption of farming about ten thousand years ago, the invention of the steam engine around 1780, and the dropping of the atomic bomb in 1945. Today the Anthropocene is so dense that we have virtually lost sight of anything that could be called ‘the natural world’.

Fire changed humans as well as the world. Eating cooked food transformed our bodies; we developed a much shorter digestive tract, meaning that more metabolic energy was available to grow our brains. At the same time, Homo sapiens became domesticated by its dependence on fire for warmth, protection and fuel. If this was the start of human progress towards ‘civilisation’, then – according to the conventional narrative – the next step was the invention of agriculture around ten thousand years ago. Farming, it is said, saved us from a dreary nomadic Stone Age hunter-gatherer existence by allowing us to settle down, build towns and develop the city-states that were the centres of early civilisations. People flocked to them for the security, leisure and economic opportunities gained from living within thick city walls. The story continues with the collapse of the city-states and barbarian insurgency, plunging civilised worlds – ancient Mesopotamia, China, Mesoamerica – into their dark ages. Thus civilisations rise and fall. Or so we are told.

The perfectly formed city-state is the ideal, deeply ingrained in the Western psyche, on which our notion of the nation-state is founded, ultimately inspiring Donald Trump’s notion of a ‘city’ wall to keep out the barbarian Mexican horde, and Brexiters’ desire to ‘take back control’ from insurgent European bureaucrats. But what if the conventional narrative is entirely wrong? What if ancient ruins testify to an aberration in the normal state of human affairs rather than a glorious and ancient past to whose achievements we should once again aspire? What if the origin of farming wasn’t a moment of liberation but of entrapment? Scott offers an alternative to the conventional narrative that is altogether more fascinating, not least in the way it omits any self-congratulation about human achievement. His account of the deep past doesn’t purport to be definitive, but it is surely more accurate than the one we’re used to, and it implicitly exposes the flaws in contemporary political ideas that ultimately rest on a narrative of human progress and on the ideal of the city/nation-state. [Continue reading…]


Our relentless consumption is trashing the planet

George Monbiot writes: Everyone wants everything – how is that going to work? The promise of economic growth is that the poor can live like the rich and the rich can live like the oligarchs. But already we are bursting through the physical limits of the planet that sustains us. Climate breakdown, soil loss, the collapse of habitats and species, the sea of plastic, insectageddon: all are driven by rising consumption. The promise of private luxury for everyone cannot be met: neither the physical nor the ecological space exists.

But growth must go on: this is everywhere the political imperative. And we must adjust our tastes accordingly. In the name of autonomy and choice, marketing uses the latest findings in neuroscience to break down our defences. Those who seek to resist must, like the Simple Lifers in Brave New World, be silenced – in this case by the media.

With every generation, the baseline of normalised consumption shifts. Thirty years ago, buying bottled water was ridiculous in places where tap water is clean and abundant. Today, worldwide, we use a million plastic bottles a minute.

Every Friday is a Black Friday, every Christmas a more garish festival of destruction. Among the snow saunas, portable watermelon coolers and smartphones for dogs with which we are urged to fill our lives, my #extremecivilisation prize now goes to the PancakeBot: a 3D batter printer that allows you to eat the Mona Lisa, the Taj Mahal, or your dog’s bottom every morning. In practice, it will clog up your kitchen for a week until you decide you don’t have room for it. For junk like this, we’re trashing the living planet, and our own prospects of survival. Everything must go. [Continue reading…]


The case against civilization

John Lanchester writes: Science and technology: we tend to think of them as siblings, perhaps even as twins, as parts of STEM (for “science, technology, engineering, and mathematics”). When it comes to the shiniest wonders of the modern world—as the supercomputers in our pockets communicate with satellites—science and technology are indeed hand in glove. For much of human history, though, technology had nothing to do with science. Many of our most significant inventions are pure tools, with no scientific method behind them. Wheels and wells, cranks and mills and gears and ships’ masts, clocks and rudders and crop rotation: all have been crucial to human and economic development, and none historically had any connection with what we think of today as science. Some of the most important things we use every day were invented long before the adoption of the scientific method. I love my laptop and my iPhone and my Echo and my G.P.S., but the piece of technology I would be most reluctant to give up, the one that changed my life from the first day I used it, and that I’m still reliant on every waking hour—am reliant on right now, as I sit typing—dates from the thirteenth century: my glasses. Soap prevented more deaths than penicillin. That’s technology, not science.

In “Against the Grain: A Deep History of the Earliest States,” James C. Scott, a professor of political science at Yale, presents a plausible contender for the most important piece of technology in the history of man. It is a technology so old that it predates Homo sapiens and instead should be credited to our ancestor Homo erectus. That technology is fire. We have used it in two crucial, defining ways. The first and the most obvious of these is cooking. As Richard Wrangham has argued in his book “Catching Fire,” our ability to cook allows us to extract more energy from the food we eat, and also to eat a far wider range of foods. Our closest animal relative, the chimpanzee, has a colon three times as large as ours, because its diet of raw food is so much harder to digest. The extra caloric value we get from cooked food allowed us to develop our big brains, which absorb roughly a fifth of the energy we consume, as opposed to less than a tenth for most mammals’ brains. That difference is what has made us the dominant species on the planet.

The other reason fire was central to our history is less obvious to contemporary eyes: we used it to adapt the landscape around us to our purposes. Hunter-gatherers would set fires as they moved, to clear terrain and make it ready for fast-growing, prey-attracting new plants. They would also drive animals with fire. They used this technology so much that, Scott thinks, we should date the human-dominated phase of earth, the so-called Anthropocene, from the time our forebears mastered this new tool.

We don’t give the technology of fire enough credit, Scott suggests, because we don’t give our ancestors much credit for their ingenuity over the long period—ninety-five per cent of human history—during which most of our species were hunter-gatherers. “Why human fire as landscape architecture doesn’t register as it ought to in our historical accounts is perhaps that its effects were spread over hundreds of millennia and were accomplished by ‘precivilized’ peoples also known as ‘savages,’ ” Scott writes. To demonstrate the significance of fire, he points to what we’ve found in certain caves in southern Africa. The earliest, oldest strata of the caves contain whole skeletons of carnivores and many chewed-up bone fragments of the things they were eating, including us. Then comes the layer from when we discovered fire, and ownership of the caves switches: the human skeletons are whole, and the carnivores are bone fragments. Fire is the difference between eating lunch and being lunch.

Anatomically modern humans have been around for roughly two hundred thousand years. For most of that time, we lived as hunter-gatherers. Then, about twelve thousand years ago, came what is generally agreed to be the definitive before-and-after moment in our ascent to planetary dominance: the Neolithic Revolution. This was our adoption of, to use Scott’s word, a “package” of agricultural innovations, notably the domestication of animals such as the cow and the pig, and the transition from hunting and gathering to planting and cultivating crops. The most important of these crops have been the cereals—wheat, barley, rice, and maize—that remain the staples of humanity’s diet. Cereals allowed population growth and the birth of cities, and, hence, the development of states and the rise of complex societies.

The story told in “Against the Grain” heavily revises this widely held account. Scott’s specialty is not early human history. His work has focussed on a skeptical, peasant’s-eye view of state formation; the trajectory of his interests can be traced in the titles of his books, from “The Moral Economy of the Peasant” to “The Art of Not Being Governed.” His best-known book, “Seeing Like a State,” has become a touchstone for political scientists, and amounts to a blistering critique of central planning and “high modernism,” the idea that officials at the center of a state know better than the people they are governing. Scott argues that a state’s interests and the interests of subjects are often not just different but opposite. Stalin’s project of farm collectivization “served well enough as a means whereby the state could determine cropping patterns, fix real rural wages, appropriate a large share of whatever grain was produced, and politically emasculate the countryside”; it also killed many millions of peasants.

Scott’s new book extends these ideas into the deep past, and draws on existing research to argue that ours is not a story of linear progress, that the time line is much more complicated, and that the causal sequences of the standard version are wrong. He focusses his account on Mesopotamia—roughly speaking, modern-day Iraq—because it is “the heartland of the first ‘pristine’ states in the world,” the term “pristine” here meaning that these states bore no watermark from earlier settlements and were the first time any such social organizations had existed. They were the first states to have written records, and they became a template for other states in the Near East and in Egypt, making them doubly relevant to later history.

The big news to emerge from recent archeological research concerns the time lag between “sedentism,” or living in settled communities, and the adoption of agriculture. Previous scholarship held that the invention of agriculture made sedentism possible. The evidence shows that this isn’t true: there’s an enormous gap—four thousand years—separating the “two key domestications,” of animals and cereals, from the first agrarian economies based on them. Our ancestors evidently took a good, hard look at the possibility of agriculture before deciding to adopt this new way of life. They were able to think it over for so long because the life they lived was remarkably abundant. Like the early civilization of China in the Yellow River Valley, Mesopotamia was a wetland territory, as its name (“between the rivers”) suggests. In the Neolithic period, Mesopotamia was a delta wetland, where the sea came many miles inland from its current shore.

This was a generous landscape for humans, offering fish and the animals that preyed on them, fertile soil left behind by regular flooding, migratory birds, and migratory prey travelling near river routes. The first settled communities were established here because the land offered such a diverse web of food sources. If one year a food source failed, another would still be present. The archeology shows, then, that the “Neolithic package” of domestication and agriculture did not lead to settled communities, the ancestors of our modern towns and cities and states. Those communities had been around for thousands of years, living in the bountiful conditions of the wetlands, before humanity committed to intensive agriculture. Reliance on a single, densely planted cereal crop was much riskier, and it’s no wonder people took a few millennia to make the change.

So why did our ancestors switch from this complex web of food supplies to the concentrated production of single crops? We don’t know, although Scott speculates that climatic stress may have been involved. Two things, however, are clear. The first is that, for thousands of years, the agricultural revolution was, for most of the people living through it, a disaster. The fossil record shows that life for agriculturalists was harder than it had been for hunter-gatherers. Their bones show evidence of dietary stress: they were shorter, they were sicker, their mortality rates were higher. Living in close proximity to domesticated animals led to diseases that crossed the species barrier, wreaking havoc in the densely settled communities. Scott calls them not towns but “late-Neolithic multispecies resettlement camps.” Who would choose to live in one of those? Jared Diamond called the Neolithic Revolution “the worst mistake in human history.” The startling thing about this claim is that, among historians of the era, it isn’t very controversial. [Continue reading…]


More than 1 billion ‘invisible people’ worldwide have no proof of identity

AFP reports: More than 1.1 billion people worldwide officially don’t exist — going about their daily lives without proof of identity.

The issue leaves a significant fraction of the global population deprived of health and education services.

Among these “invisible people” — many of whom live in Africa and Asia — more than one-third are children whose births have not been registered, leaving them susceptible to violence, the World Bank’s “Identification for Development” (ID4D) program recently warned.

The problem is particularly acute in geographical areas whose residents face poverty, discrimination, epidemics or armed conflicts.

Vyjayanti Desai, who manages the ID4D program, said the issue arises from a number of factors, but cited the distance between people and government services in developing areas as a major one.

For populations near the Peruvian Amazon, for example, traveling to an administrative service can take some five days of transit by boat, according to Carolina Trivelli, Peru’s former development minister.

Many families are also simply not informed about the importance of birth registration — and the consequences of non-registration, which can include the denial of basic rights and benefits, or an increased likelihood of marrying or entering into the labor force underage.

And even if parents are aware of the need to declare a birth, costs can be crippling, said Anne-Sophie Lois, representative at the United Nations in Geneva and director of the children’s aid organization Plan International.

As a result, millions of children in Africa and Asia first encounter the administration only once they reach school age.

But “birth certificates are often needed to enroll in school” or take national exams, Lois said. [Continue reading…]


A giant insect ecosystem is collapsing due to humans. It’s a catastrophe

Michael McCarthy writes: Thirty-five years ago the American biologist Terry Erwin conducted an experiment to count insect species. Using an insecticide “fog”, he managed to extract all the small living things in the canopies of 19 individuals of one species of tropical tree, Luehea seemannii, in the rainforest of Panama. He recorded about 1,200 separate species, nearly all of them coleoptera (beetles) and many new to science; and he estimated that 163 of these would be found on Luehea seemannii only.

He calculated that as there are about 50,000 species of tropical tree, if that figure of 163 was typical for all the other trees, there would be more than eight million species, just of beetles, in the tropical rainforest canopy; and as beetles make up about 40% of all the arthropods, the grouping that contains the insects and the other creepy-crawlies from spiders to millipedes, the total number of such species in the canopy might be 20 million; and as he estimated the canopy fauna to be separate from, and twice as rich as, the forest floor, for the tropical forest as a whole the number of species might be 30 million.

Yes, 30 million. It was one of those extraordinary calculations, like Edwin Hubble’s of the true size of the universe, which sometimes stop us in our tracks.
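Erwin's back-of-the-envelope chain of multiplications can be reproduced directly. The figures are those quoted above; the rounding to 20 and 30 million is his:

```python
# Erwin's extrapolation, step by step (figures as quoted in the article)
beetles_per_tree_species = 163      # beetle species specific to Luehea seemannii
tropical_tree_species = 50_000      # estimated number of tropical tree species

canopy_beetles = beetles_per_tree_species * tropical_tree_species  # ~8.15 million
canopy_arthropods = canopy_beetles / 0.40    # beetles ~40% of arthropods: ~20 million
forest_total = canopy_arthropods * 1.5       # canopy twice as rich as floor: ~30 million

print(f"{canopy_beetles:,} canopy beetle species")
print(f"{canopy_arthropods:,.0f} canopy arthropod species")
print(f"{forest_total:,.0f} tropical-forest arthropod species")
```

Every step rests on an assumed ratio, which is why entomologists have argued over the result ever since; the arithmetic itself, though, is exactly this simple.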

Erwin reported that he was shocked by his conclusions and entomologists have argued over them ever since. But about insects, his findings make two things indisputably clear. One is that there are many, many more types than the million or so hitherto described by science, and probably many more than the 10m species sometimes postulated as an uppermost figure; and the second is that this is far and away the most successful group of creatures the Earth has ever seen.

They are multitudinous almost beyond our imagining. They thrive in soil, water, and air; they have triumphed for hundreds of millions of years in every continent bar Antarctica, in every habitat but the ocean. And it is their success – staggering, unparalleled and seemingly endless – which makes all the more alarming the great truth now dawning upon us: insects as a group are in terrible trouble and the remorselessly expanding human enterprise has become too much, even for them.

Does it matter? Oh yes. Most of our fruit crops are insect-pollinated, as are the vast majority of our wild plants.

The astonishing report highlighted in the Guardian, that the biomass of flying insects in Germany has dropped by three quarters since 1989, threatening an “ecological Armageddon”, is the starkest warning yet; but it is only the latest in a series of studies which in the last five years have finally brought to public attention the real scale of the problem. [Continue reading…]


Insectageddon: Farming is more catastrophic than climate breakdown

George Monbiot writes: Which of these would you name as the world’s most pressing environmental issue? Climate breakdown, air pollution, water loss, plastic waste or urban expansion? My answer is none of the above. Almost incredibly, I believe that climate breakdown takes third place, behind two issues that receive only a fraction of the attention.

This is not to downgrade the danger presented by global heating – on the contrary, it presents an existential threat. It is simply that I have come to realise that two other issues have such huge and immediate impacts that they push even this great predicament into third place.

One is industrial fishing, which, all over the blue planet, is now causing systemic ecological collapse. The other is the erasure of non-human life from the land by farming.

And perhaps not only non-human life. According to the UN Food and Agriculture Organisation, at current rates of soil loss, driven largely by poor farming practice, we have just 60 years of harvests left. And this is before the Global Land Outlook report, published in September, found that productivity is already declining on 20% of the world’s cropland.

The impact on wildlife of changes in farming practice (and the expansion of the farmed area) is so rapid and severe that it is hard to get your head round the scale of what is happening. A study published this week in the journal Plos One reveals that flying insects surveyed on nature reserves in Germany have declined by 76% in 27 years. The most likely cause of this Insectageddon is that the land surrounding those reserves has become hostile to them: the volume of pesticides and the destruction of habitat have turned farmland into a wildlife desert.

It is remarkable that we need to rely on a study in Germany to see what is likely to have been happening worldwide: long-term surveys of this kind simply do not exist elsewhere. This failure reflects distorted priorities in the funding of science. There is no end of grants for research on how to kill insects, but hardly any money for discovering what the impacts of this killing might be. Instead, the work has been left – as in the German case – to recordings by amateur naturalists.

But anyone of my generation (ie in the second bloom of youth) can see and feel the change. We remember the “moth snowstorm” that filled the headlight beams of our parents’ cars on summer nights (memorialised in Michael McCarthy’s lovely book of that name). Every year I collected dozens of species of caterpillars and watched them grow and pupate and hatch. This year I tried to find some caterpillars for my children to raise. I spent the whole summer looking and, aside from the cabbage whites on our broccoli plants, found nothing in the wild but one garden tiger larva. Yes, one caterpillar in one year. I could scarcely believe what I was seeing – or rather, not seeing.

Insects, of course, are critical to the survival of the rest of the living world. Knowing what we now know, there is nothing surprising about the calamitous decline of insect-eating birds. Those flying insects – not just bees and hoverflies but species of many different families – are the pollinators without which a vast tract of the plant kingdom, both wild and cultivated, cannot survive. The wonders of the living planet are vanishing before our eyes. [Continue reading…]

Out of sight, out of mind — the issue here is not just generational in the sense experienced by those of us old enough to remember insects, birds, and other creatures in greater numbers. The issue is above all one that springs from the physical separation between humans and nature in a world where humans experience life predominantly inside cities and predominantly as the seemingly most commonplace species.

I happen to live in a town where squirrels undoubtedly outnumber humans and where bears can show up in the most unexpected places and yet even here, for most people most of the time, nature remains in the background of human affairs.

While the rapid demise of flying insects should provoke alarm in anyone with even just a rudimentary understanding of the interdependence of species, a more commonplace response is likely to be that this loss signifies a welcome reduction in unwanted pests — fewer mosquitoes, fewer flies, and fewer irritants to complain about.

When it comes to human appreciation for non-human forms of life, insects get short shrift.

Butterflies are admired and yet most people would be hard pressed to name a single species, let alone recognize and appreciate any species in its larval form.

Bees are appreciated as productive, yet potentially dangerous and to most people indistinguishable from wasps.

Ants are lauded in the abstract as exemplars of industry and complex social organization and yet bound to suffer swift extermination when they turn up where they’re unwelcome.

Even so, the objective truth that insects would grasp if they had the cognitive capacities to do so is that the most prolific forms of life that have lived sustainably on this planet for hundreds of millions of years are now at risk from the life-threatening effects of human infestation.

No, this isn’t an argument for the elimination of humans, but as the late-comers on the stage of life, we have to do a hell of a lot better learning how to harmoniously co-exist with the creatures around us. Not only do their lives depend on this, but so do ours.


The fall of Harvey Weinstein should be a moment to challenge extreme masculinity

Rebecca Solnit writes: This past week was not a good week for women. In the United States, it was reported that a man who allegedly raped a 12-year-old girl was granted joint custody of the resultant eight-year-old boy being raised by his young mother.

Earlier in the week, the severed head and legs of Swedish journalist Kim Wall, who disappeared after entering inventor Peter Madsen’s submarine, were discovered near Copenhagen. A hard drive belonging to Madsen, Danish police said, was loaded with videos showing women being decapitated alive.

A Swedish model received rape threats for posing in an Adidas advertisement with unshaven legs. The University of Southern California’s dean of medicine was dumped after reports resurfaced that he had sexually harassed a young medical researcher in 2003. A number of men at liberal publications were revealed to have contacted Milo Yiannopoulos, urging him to attack women – “Please mock this fat feminist,” wrote a senior male staff writer at Vice’s women’s channel, since fired. And, of course, movie mogul Harvey Weinstein was described by the New York Times as a serial sexual harasser; his alleged offences, according to a TV journalist, included trapping her in a hallway, where he masturbated until he ejaculated into a potted plant.

This week, the New Yorker ran a follow-up story by Ronan Farrow (the biological son of Woody Allen, who has repudiated his father for his treatment of his sister), expanding the charges women have made against Weinstein to include sexual assault. He quotes one young woman who said “he forced me to perform oral sex on him” after she showed up for a meeting. She added, “I have nightmares about him to this day.” Weinstein denies any non-consensual sex.

Saturday 7 October was the first anniversary of the release of the tape in which the United States president boasted about sexually assaulting women; 11 women then came forward to accuse Donald Trump. And last week began with the biggest mass shooting in modern US history, carried out by a man reported to have routinely verbally abused his girlfriend: domestic violence is common in the past of mass shooters.

Underlying all these attacks is a lack of empathy, a will to dominate, and an entitlement to control, harm and even take the lives of others. Though there is a good argument that mental illness is not a sufficient explanation – and most mentally ill people are nonviolent – mass shooters and rapists seem to have a lack of empathy so extreme it constitutes a psychological disorder. At this point in history, it seems to be not just a defect from birth, but a characteristic many men are instilled with by the culture around them. It seems to be the precondition for causing horrific suffering and taking pleasure in it as a sign of one’s own power and superiority, in regarding others as worthless, as yours to harm or eliminate. [Continue reading…]


Disengaged boys grow up to become disillusioned men

Amanda Ripley writes: Jordan has never had a female minister of education, women make up less than a fifth of its workforce, and women hold just 4 percent of board seats at public companies there. But, in school, Jordanian girls are crushing their male peers. The nation’s girls outperform its boys in just about every subject and at every age level. At the University of Jordan, the country’s largest university, women outnumber men by a ratio of two to one—and earn higher grades in math, engineering, computer-information systems, and a range of other subjects.

In fact, across the Arab world, women now earn more science degrees on a percentage basis than women in the United States. In Saudi Arabia alone, women earn half of all science degrees. And yet, most of those women are unlikely to put their degrees to paid use for very long.

This is baffling on the most obvious levels. In the West, researchers have long believed that future prospects incentivize students to invest in school. The conventional wisdom is that girls do better in school as women acquire more legal and political rights in society. But many Middle Eastern women do not go on to have long professional careers after graduating; they spend much of their lives working at home as wives and mothers. Fewer than one in every five workers is female in Jordan, Qatar, Saudi Arabia, the United Arab Emirates, and Oman.

This spring, I went to the Middle East to try to understand why girls are doing so much better in school, despite living in quintessentially patriarchal societies. Or, put another way, why boys are doing so badly.

It’s part of a pattern that is creeping across the globe: Wherever girls have access to school, they seem to eventually do better than boys. In 2015, teenage girls outperformed boys on a sophisticated reading test in 69 countries—every place in which the test was administered. In America, girls are more likely to take Advanced Placement tests, to graduate from high school, and to go to college, and women continue their education over a year longer than men. These are all glaring disparities in a world that values higher-order skills more than ever before. Natasha Ridge, the executive director of the Sheikh Saud bin Saqr Al Qasimi Foundation for Policy Research in the United Arab Emirates, has studied gender and education around the world. In the United Kingdom and the United States, Ridge believes she can draw a dotted line between the failure of boys to thrive in school and votes for Brexit and for Donald Trump. Disengaged boys grow up to become disillusioned men, Ridge says, left out of the progress they see around them.

And the gender gap in the Middle East represents a particularly extreme version of this trend.

“If you give girls a quality education, they will mostly run with it and do amazing things. It propels them,” says Ridge, one of the few researchers to have written extensively about the gender gap in the Arab world. But for boys, especially low-income boys, access to school has not had the same effect. “These boys struggle to find a connection between school and life,” she says, “and school is increasingly seen as a waste of time.”

Motivation is the dark matter of education. It’s everywhere but impossible to see. Motivation helps explain why some countries get impressive education results despite child poverty and lackluster teaching, while others get mediocre results despite universal health care and free iPads. When kids believe in school, as any teacher will tell you, everything gets easier. So it’s crucial to understand the motivation to learn and how it works in the lives of real boys and girls. Because the slow slipping away of boys’ interest in education represents a profound failure of schools and society. And the implications are universally terrible. All over the world, poorly educated men are more likely to be unemployed, to have physical- and mental-health problems, to commit acts of violence against their families, and to go to prison. They are less likely to marry but quite likely to father children. [Continue reading…]


The future of life necessitates that we rise way beyond the nationalist viewpoint

Yuval Noah Harari writes: Though human beings are social animals, for millions of years they lived in small, intimate communities numbering no more than a few dozen people. Even today, as the evolutionary biologist Robin Dunbar has shown, most human beings find it impossible properly to know more than 150 individuals, irrespective of how many Facebook “friends” they boast. Human beings easily develop loyalty to small, intimate groups such as a tribe, an infantry company or a family business, but it is hardly natural for them to be loyal to millions of strangers. Such mass loyalties have appeared only in the past few thousand years as a means of solving practical problems that no single tribe could solve by itself. Ancient Egypt was created to help human beings gain control of the River Nile, and ancient China coalesced to help the people restrain the turbulent Yellow River.

Nations solved some problems and created new ones. In particular, big nations led to big wars. Yet people were willing to pay the price in blood, because nations provided them with unprecedented levels of security and prosperity. In the 19th and early 20th centuries the nationalist deal still looked very attractive. Nationalism was leading to horrendous conflicts on an unprecedented scale, but modern nation states also built systems of health care, education and welfare. National health services made Passchendaele and Verdun seem worthwhile.

Yet the invention of nuclear weapons sharply tilted the balance of the deal. After Hiroshima, people no longer feared that nationalism would lead to mere war: they began to fear it would lead to nuclear war. Total annihilation has a way of sharpening people’s minds, and thanks in no small measure to the atomic bomb, the impossible happened and the nationalist genie was squeezed at least halfway back into its bottle. Just as the ancient villagers of the Yellow River Basin redirected some of their loyalty from local clans to a much bigger nation that restrained the dangerous river, so in the nuclear age a global community gradually developed over and above the various nations because only such a community could restrain the nuclear demon.

In the 1964 US presidential campaign, Lyndon B Johnson aired the “Daisy” advertisement, one of the most successful pieces of propaganda in the annals of television. The advert opens with a little girl picking and counting the petals of a daisy, but when she reaches ten, a metallic male voice takes over, counting back from ten to zero as in a missile launch countdown. When it reaches zero, the bright flash of a nuclear explosion fills the screen, and candidate Johnson addresses the American public: “These are the stakes – to make a world in which all of God’s children can live, or to go into the dark. We must either love each other. Or we must die.” We often associate the slogan “Make love, not war” with the late-1960s counterculture, but already in 1964 it was accepted wisdom, even among hard-nosed politicians such as Johnson. [Continue reading…]


After a decade of reduction, global hunger is rising again due to conflict and climate change

Quartz reports: After a decade of progress made to cut the number of undernourished people on Earth, global hunger appears to be rising again.

The primary driver of growing hunger is the increase of conflicts around the world, many of which have been compounded by climate change, according to the 2017 State of Food Security and Nutrition report published by the United Nations Food and Agricultural Organization (FAO) on Sep. 15.

Among the 815 million undernourished people—representing more than one in 10 people alive today—more than 489 million live in parts of the world afflicted by armed conflicts. Many of these are regions that have suffered years of violence, including the Horn of Africa, the Great Lakes of Africa, and the parts of the Middle East affected by the Syrian War. Countries outside these regions that have faced similar ongoing conflict include South Sudan, Yemen, Cameroon, Chad, Nigeria, Afghanistan, Pakistan, and India. [Continue reading…]


There are an estimated 40 million slaves in the world. Where do they live and what do they do?

The Washington Post reports: Slavery is not a thing of the past. A report released Tuesday by the U.N.-affiliated International Labor Office (ILO) and the Walk Free Foundation estimates that there were 40.3 million people in some form of modern slavery around the world on any given day last year.

But by its very nature, the accuracy of that figure is hard to gauge. Slavery tends to be a hidden, illegal practice — one in which the victim’s ability to speak out is limited. The authors of the Global Estimates of Modern Slavery study admit there are gaps in the available information: Although extensive United Nations data has been used in the study, some countries and sub-national regions are missing.

“It’s difficult or even impossible to do research in areas of high conflict,” said Fiona David, Walk Free Foundation’s executive director of global research, pointing to areas such as Syria or northern Nigeria that had to be excluded from the study. Because of this, David said, the estimate of 40.3 million is probably conservative. [Continue reading…]


The stunning underwater picture this photographer wishes ‘didn’t exist’

Lindsey Bever writes: The powerful and poignant image shows a tiny sea horse holding tightly onto a pink, plastic cotton swab in blue-green waters around Indonesia.

California nature photographer Justin Hofman snapped the picture late last year off the coast of Sumbawa, an Indonesian island in the Lesser Sunda Islands chain. The 33-year-old, from Monterey, Calif., said a colleague pointed out the pocket-size sea creature, which he estimated to be about 1.5 inches tall — so small, in fact, that Hofman said he almost didn’t reach for his camera.

“The wind started to pick up and the sea horse started to drift. It first grabbed onto a piece of sea grass,” Hofman said Thursday in a phone interview.

Hofman started shooting.

“Eventually more and more trash and debris started to move through,” he said, adding that the critter lost its grip, then latched onto a white, wispy piece of a plastic bag. “The next thing it grabbed was a Q-Tip.”

Hofman said he wishes the picture “didn’t exist” — but it does; and now, he said, he feels responsible “to make sure it gets to as many eyes as possible.” [Continue reading…]


Why do Americans know so little about the world the U.S. has shaped?

Ishaan Tharoor writes: In a recent excerpt from her new book, American journalist Suzy Hansen described her bemusement when a friend in Istanbul suggested to her that the terrorist attacks on Sept. 11, 2001, had been somehow planned by the U.S. government.

“Come on, you don’t believe that,” said Hansen.

“Why not?” snapped back her friend, identified as Emre. “I do.”

“But it’s a conspiracy theory.”

Emre laughed and said: “Americans always dismiss these things as conspiracy theories. It’s the rest of the world who have had to deal with your conspiracies.”

This pronouncement prompted Hansen, an accomplished storyteller and reporter who has written powerfully about recent political events in Turkey, to reflect on what may underlie her friend’s animus. Her much-acclaimed new book, “Notes from a Foreign Country: An American Abroad in a Post-American World,” is a memoir of a young American who moves abroad and slowly grapples with how the rest of the world sees her nation — and how little her nation really sees the world.

She looks in particular at the extent to which U.S. foreign policy has shaped politics, societies and the fates of ordinary people elsewhere. In one anecdote, when Hansen asks an Iraqi man what his country “was like in the 1980s and 1990s, when he was growing up,” he replies: “I am always amazed when Americans ask me this. How is it that you know nothing about us when you had so much to do with what became of our lives?” [Continue reading…]


This is how our world could end

Peter Brannen writes: Many of us share some dim apprehension that the world is flying out of control, that the centre cannot hold. Raging wildfires, once-in-1,000-years storms and lethal heatwaves have become fixtures of the evening news – and all this after the planet has warmed by less than 1C above preindustrial temperatures. But here’s where it gets really scary.

If humanity burns through all its fossil fuel reserves, there is the potential to warm the planet by as much as 18C and raise sea levels by hundreds of feet. This is a warming spike of an even greater magnitude than that so far measured for the end-Permian mass extinction. If the worst-case scenarios come to pass, today’s modestly menacing ocean-climate system will seem quaint. Even warming to one-fourth of that amount would create a planet that would have nothing to do with the one on which humans evolved or on which civilisation has been built. The last time it was 4C warmer there was no ice at either pole and sea level was 80 metres higher than it is today.

I met University of New Hampshire paleoclimatologist Matthew Huber at a diner near his campus in Durham, New Hampshire. Huber has spent a sizable portion of his research career studying the hothouse of the early mammals and he thinks that in the coming centuries we might be heading back to the Eocene climate of 50 million years ago, when there were Alaskan palm trees and alligators splashed in the Arctic Circle.

“The modern world will be much more of a killing field,” he said. “Habitat fragmentation today will make it much more difficult to migrate. But if we limit it below 10C of warming, at least you don’t have widespread heat death.”

In 2010, Huber and his co-author, Steven Sherwood, published one of the most ominous science papers in recent memory, An Adaptability Limit to Climate Change Due to Heat Stress.

“Lizards will be fine, birds will be fine,” Huber said, noting that life has thrived in hotter climates than even the most catastrophic projections for anthropogenic global warming. This is one reason to suspect that the collapse of civilisation might come long before we reach a proper biological mass extinction. Life has endured conditions that would be unthinkable for a highly networked global society partitioned by political borders. We are understandably concerned about the fate of civilisation, and Huber says that, mass extinction or not, it is our tenuous reliance on an ageing and inadequate infrastructure, perhaps most ominously on power grids, coupled with the limits of human physiology, that may well bring down our world. [Continue reading…]


We can’t thrive in a world without darkness

Rebecca Boyle writes: Sound dominated my senses as we left the village of San Pedro de Atacama and walked into the desert night. The crunch of shoes on gravel underlay our voices, which were hushed to avoid waking any households or street dogs. Our small group of astronomy writers was escaping from light and, without any flashlights or streetlamps, we struggled to see, so our other senses were heightened. Land that looked red by day was now monochromatic, the rods in our retinas serving as our only visual input.

After about 15 minutes of hiking, we stopped to take some pictures of the sky. I fumbled with my gear and tried to get my bearings, but everything was alien. I was horribly jet-lagged after 10 hours hunched against the window of a 757, another two-hour flight north from Santiago and a two-hour bus ride, and it wasn’t just my oxygen-hungry brain that put me out of sorts. The Atacama Desert looked like Mars as drawn by Dr Seuss; I was surrounded by wrong-coloured cliffs and swirling rock formations. But I was determined to photograph something even more bizarre: the Large Magellanic Cloud, a dwarf galaxy you can see only from the southern hemisphere. I perched my camera on a rock and aimed at the sky, but the cosmic smudge would not resolve in my viewfinder. I stood, brushed dirt from my jeans, and looked up.

The unfamiliar sky momentarily took away what little breath I had left at 8,000 feet in elevation. Above the horizon was the conspicuous Southern Cross. Orion was there, too, but looked as disoriented as I felt, upside down to the world. And there were so many constellations I’d never seen, with hopeful, Latinate names such as Dorado and Reticulum. Countless stars blazed into view as I stared into the smear of the Milky Way.

To most people who have travelled outside the developed world – whether to camp or to meditate or to hunt – such bright and plentiful stars are a glorious sight. But this beauty instilled in me a creeping sense of guilt. At home, 1,500 miles north, I wouldn’t recognise such spangled heavens. From where I live in the American Midwest, the stars might as well not exist. After journeying millions of years, their light is swallowed by city glare and my porch lantern. Those that make it through will still fail: not even bright Betelgeuse can outshine my iPhone. Yet I am an astronomy writer, a person who thinks about stars and planets all the time. What does my neglect of the night sky say about the rest of humanity?

‘We are all descended from astronomers,’ the astrophysicist Neil deGrasse Tyson intones in the rebooted version of the TV show Cosmos. This is as poetic as it is true. Everyone owns the night sky; it was the one natural realm all our ancestors could see and know intimately. No river, no grand mountain or canyon, not even the oceans can claim that. But since Edison’s light bulbs colonised our cities, the vast majority of humans has ceased to see those skies. More than 60 per cent of the world, and fully 99 per cent of the US and Europe, lives under a yellowy sky polluted with light. [Continue reading…]


The real threat of artificial intelligence

Kai-Fu Lee writes: What worries you about the coming world of artificial intelligence?

Too often the answer to this question resembles the plot of a sci-fi thriller. People worry that developments in A.I. will bring about the “singularity” — that point in history when A.I. surpasses human intelligence, leading to an unimaginable revolution in human affairs. Or they wonder whether instead of our controlling artificial intelligence, it will control us, turning us, in effect, into cyborgs.

These are interesting issues to contemplate, but they are not pressing. They concern situations that may not arise for hundreds of years, if ever. At the moment, there is no known path from our best A.I. tools (like the Google computer program that recently beat the world’s best player of the game of Go) to “general” A.I. — self-aware computer programs that can engage in common-sense reasoning, attain knowledge in multiple domains, feel, express and understand emotions and so on.

This doesn’t mean we have nothing to worry about. On the contrary, the A.I. products that now exist are improving faster than most people realize and promise to radically transform our world, not always for the better. They are only tools, not a competing form of intelligence. But they will reshape what work means and how wealth is created, leading to unprecedented economic inequalities and even altering the global balance of power.

It is imperative that we turn our attention to these imminent challenges.

What is artificial intelligence today? Roughly speaking, it’s technology that takes in huge amounts of information from a specific domain (say, loan repayment histories) and uses it to make a decision in a specific case (whether to give an individual a loan) in the service of a specified goal (maximizing profits for the lender). Think of a spreadsheet on steroids, trained on big data. These tools can outperform human beings at a given task.
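Lee’s “spreadsheet on steroids” can be made concrete with a minimal sketch: a model that learns a decision rule (approve a loan or not) from past examples in one narrow domain, in the service of one fixed goal. The data, features, and thresholds below are invented purely for illustration, and the tiny hand-rolled logistic regression stands in for the far larger models lenders actually use.

```python
import math

# Hypothetical training examples: (income-to-debt ratio, prior defaults) -> repaid (1) or not (0)
data = [
    ((3.0, 0), 1), ((2.5, 0), 1), ((2.8, 1), 1),
    ((0.8, 2), 0), ((1.0, 3), 0), ((0.6, 1), 0),
]

def predict(w, b, x):
    """Estimated probability of repayment for applicant features x."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Fit a tiny logistic regression by stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(2000):
    for x, y in data:
        err = predict(w, b, x) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

# The learned rule now scores new applicants it has never seen.
print(predict(w, b, (2.7, 0)) > 0.5)  # high ratio, no defaults: likely approve
print(predict(w, b, (0.7, 3)) > 0.5)  # low ratio, repeat defaults: likely decline
```

Nothing here is self-aware or general: swap in a different spreadsheet of examples and a different goal, and the same machinery rates insurance risk or flags fraud, which is exactly why, as Lee argues, it spreads to thousands of domains at once.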

This kind of A.I. is spreading to thousands of domains (not just loans), and as it does, it will eliminate many jobs. Bank tellers, customer service representatives, telemarketers, stock and bond traders, even paralegals and radiologists will gradually be replaced by such software. Over time this technology will come to control semiautonomous and autonomous hardware like self-driving cars and robots, displacing factory workers, construction workers, drivers, delivery workers and many others.

Unlike the Industrial Revolution and the computer revolution, the A.I. revolution is not taking certain jobs (artisans, personal assistants who use paper and typewriters) and replacing them with other jobs (assembly-line workers, personal assistants conversant with computers). Instead, it is poised to bring about a wide-scale decimation of jobs — mostly lower-paying jobs, but some higher-paying ones, too. [Continue reading…]
