On not being there: The data-driven body at work and at play

Rebecca Lemov writes: The protagonist of William Gibson’s 2014 science-fiction novel The Peripheral, Flynne Fisher, works remotely in a way that lends a new and fuller sense to that phrase. The novel features a double future: One set of characters inhabits the near future, ten to fifteen years from the present, while another lives seventy years on, after a breakdown of the climate and multiple other systems that has apocalyptically altered human and technological conditions around the world.

In that “further future,” only 20 percent of the Earth’s human population has survived. Each of these fortunate few is well off and able to live a life transformed by healing nanobots, somaticized e-mail (which delivers messages and calls to the roof of the user’s mouth), quantum computing, and clean energy. For their amusement and profit, certain “hobbyists” in this future have the Borgesian option of cultivating an alternative path in history — it’s called “opening up a stub” — and mining it for information as well as labor.

Flynne, the remote worker, lives on one of those paths. A young woman from the American Southeast, possibly Appalachia or the Ozarks, she favors cutoff jeans and resides in a trailer, eking out a living as a for-hire sub playing video games for wealthy aficionados. Recruited by a mysterious entity beta-testing drones that provide “security” in a murky skyscraper in an unnamed city, she thinks at first that she has been taken on to play a kind of video game in simulated reality. As it turns out, she has been employed to work in the future as an “information flow” — low-wage work, though the pay translates to a very high level of remuneration in the place and time in which she lives.

What is of particular interest is the fate of Flynne’s body. Before she goes to work she must tend to its basic needs (nutrition and elimination), because during her shift it will effectively be “vacant.” Lying on a bed with a special data-transmitting helmet attached to her head, she will be elsewhere, inhabiting an ambulatory robot carapace — a “peripheral” — built out of bio-flesh that can receive her consciousness.

Bodies in this data-driven economic backwater of a future world economy are abandoned for long stretches of time — disposable, cheapened, eerily vacant in the temporary absence of “someone at the helm.” Meanwhile, fleets of built bodies, grown from human DNA, await habitation.

Alex Rivera explores similar territory in his Mexican sci-fi film Sleep Dealer (2008), set in a future world after a wall erected on the US–Mexican border has successfully blocked migrants from entering the United States. Digital networks allow people to connect to strangers all over the world, fostering fantasies of physical and emotional connection. At the same time, low-income would-be migrant workers in Tijuana and elsewhere can opt to do remote work by controlling robots building a skyscraper in a faraway city, locking their bodies into devices that transmit their labor to the site. In tank-like warehouses, lined up in rows of stalls, they “jack in” by connecting data-transmitting cables to nodes implanted in their arms and backs. Their bodies are in Mexico, but their work is in New York or San Francisco, and while they are plugged in and wearing their remote-viewing spectacles, their limbs move like the appendages of ghostly underwater creatures. Their life force drained by the taxing labor, these “sleep dealers” end up as human discards.

What is surprising about these sci-fi conceits, from “transitioning” in The Peripheral to “jacking in” in Sleep Dealer, is how familiar they seem, or at least how closely they reflect certain aspects of contemporary reality. Almost daily, we encounter people who are there but not there, flickering in and out of what we think of as presence. A growing body of research explores the question of how users interact with their gadgets and media outlets, and how in turn these interactions transform social relationships. The defining feature of this heavily mediated reality is our presence “elsewhere,” a removal of at least part of our conscious awareness from wherever our bodies happen to be. [Continue reading…]

Each of us is, genetically, more microbial than human

The New York Times reports: Since 2007, when scientists announced plans for a Human Microbiome Project to catalog the micro-organisms living in our body, the profound appreciation for the influence of such organisms has grown rapidly with each passing year. Bacteria in the gut produce vitamins and break down our food; their presence or absence has been linked to obesity, inflammatory bowel disease and the toxic side effects of prescription drugs. Biologists now believe that much of what makes us human depends on microbial activity. The two million unique bacterial genes found in each human microbiome can make the 23,000 genes in our cells seem paltry, almost negligible, by comparison. “It has enormous implications for the sense of self,” Tom Insel, the director of the National Institute of Mental Health, told me. “We are, at least from the standpoint of DNA, more microbial than human. That’s a phenomenal insight and one that we have to take seriously when we think about human development.”

Given the extent to which bacteria are now understood to influence human physiology, it is hardly surprising that scientists have turned their attention to how bacteria might affect the brain. Micro-organisms in our gut secrete a profound number of chemicals, and researchers like [Mark] Lyte have found that among those chemicals are the same substances used by our neurons to communicate and regulate mood, like dopamine, serotonin and gamma-aminobutyric acid (GABA). [Continue reading…]

Why the modern world is bad for your brain

Daniel J Levitin writes: Our brains are busier than ever before. We’re assaulted with facts, pseudo facts, jibber-jabber, and rumour, all posing as information. Trying to figure out what you need to know and what you can ignore is exhausting. At the same time, we are all doing more. Thirty years ago, travel agents made our airline and rail reservations, salespeople helped us find what we were looking for in shops, and professional typists or secretaries helped busy people with their correspondence. Now we do most of those things ourselves. We are doing the jobs of 10 different people while still trying to keep up with our lives, our children and parents, our friends, our careers, our hobbies, and our favourite TV shows.

Our smartphones have become Swiss army knife–like appliances that include a dictionary, calculator, web browser, email, Game Boy, appointment calendar, voice recorder, guitar tuner, weather forecaster, GPS, texter, tweeter, Facebook updater, and flashlight. They’re more powerful and do more things than the most advanced computer at IBM corporate headquarters 30 years ago. And we use them all the time, part of a 21st-century mania for cramming everything we do into every single spare moment of downtime. We text while we’re walking across the street, catch up on email while standing in a queue – and while having lunch with friends, we surreptitiously check to see what our other friends are doing. At the kitchen counter, cosy and secure in our domicile, we write our shopping lists on smartphones while we are listening to that wonderfully informative podcast on urban beekeeping.

But there’s a fly in the ointment. Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion. Earl Miller, a neuroscientist at MIT and one of the world experts on divided attention, says that our brains are “not wired to multitask well… When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.” So we’re not actually keeping a lot of balls in the air like an expert juggler; we’re more like a bad amateur plate spinner, frantically switching from one task to another, ignoring the one that is not right in front of us but worried it will come crashing down any minute. Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient. [Continue reading…]

How water, paradoxically, creates the land we walk on

Julia Rosen writes: It’s no secret that water shapes the world around us. Rivers etch great canyons into the Earth’s surface, while glaciers reorganize the topography of entire mountain ranges. But water’s influence on the landscape runs much deeper than this: Water explains why we have land in the first place.

You might think of land as the bits of crust that just happen to jut up above sea level, but that’s mostly not the case. Earth’s continents rise above the seas in part because they are actually made of different stuff than the seafloor. Oceanic crust consists of dense, black basalt, which rides low in the mantle — like a wet log in a river — and eventually sinks back into Earth’s interior. But continental crust floats like a cork, thanks to one special rock: granite. If we didn’t have granite to lift the continents up, a vast ocean would cover our entire planet, with barely any land to speak of.

Gritty, gray granite and its rocky relatives dominate the continents. Granite forms the sheer walls of Yosemite Valley and the chiseled faces of Mount Rushmore (and also gleams from many a kitchen counter and shower stall). If you don’t see granite at the surface, you can bet it’s hiding just a few kilometers below your feet, unless you’re cruising over the middle of the ocean in a boat or plane. But what’s special about granite is that it’s relatively buoyant, for a rock—and that to make it, you need water. [Continue reading…]

The art of attention: John Berger at 88

Philip Maughan writes: In 1967, while working with the Swiss photographer Jean Mohr on A Fortunate Man, a book about a country GP serving a deprived community in the Forest of Dean, Gloucestershire, John Berger began to reconsider what the role of a writer should be. “He does more than treat [his patients] when they are ill,” Berger wrote of John Sassall, a man whose proximity to suffering and poverty deeply affected him (Sassall later committed suicide). The rural doctor assumes a democratic function, in Berger’s eyes, one he describes in consciously literary terms. “He is the objective witness of their lives,” he says. “The clerk of their records.”

The next five years marked a transition in Berger’s life. By 1972, when the groundbreaking art series Ways of Seeing aired on BBC television, Berger had been living on the Continent for over a decade. He won the Booker Prize for his novel G. the same year, announcing to an astonished audience at the black-tie ceremony in London that he would divide his prize money between the Black Panther Party (he denounced Booker McConnell’s historic links with plantations and indentured labour in the Caribbean) and the funding of his next project with Mohr, A Seventh Man, recording the experiences of migrant workers across Europe.

This is the point at which, for some in England, Berger became a more distant figure. He moved from Switzerland to a remote village in the French Alps two years later. “He thinks and feels what the community incoherently knows,” Berger wrote of Sassall, the “fortunate man”. After time spent working on A Seventh Man, those words were just as applicable to the writer himself. It was Berger who had become a “clerk”, collecting stories from the voiceless and dispossessed – peasants, migrants, even animals – a self-effacing role he would continue to occupy for the next 43 years.

The life and work of John Berger represents a challenge. How best to describe the output of a writer whose bibliography, according to Wikipedia, contains ten “novels”, four “plays”, three collections of “poetry” and 33 books labelled “other”?

“A kind of vicarious autobiography and a history of our time as refracted through the prism of art,” is how the writer Geoff Dyer introduced a selection of Berger’s non-fiction in 2001, though the category doesn’t quite fit. “To separate fact and ­imagination, event and feeling, protagonist and narrator, is to stay on dry land and never put to sea,” Berger wrote in 1991 in a manifesto (of sorts) inspired by James Joyce’s Ulysses, a book he first read, in French, at the age of 14. [Continue reading…]

Everything we thought we knew about the genome is turning out to be wrong

Claire Ainsworth writes: Ask me what a genome is, and I, like many science writers, might mutter about it being the genetic blueprint of a living creature. But then I’ll confess that “blueprint” is a lousy metaphor since it implies that the genome is two-dimensional, prescriptive and unresponsive.

Now two new books about the genome show the limitation of that metaphor for something so intricate, complex, multilayered and dynamic. Both underscore the risks of taking metaphors too literally, not just in undermining popular understanding of science, but also in trammelling scientific enquiry. They are for anyone interested in how new discoveries and controversies will transform our understanding of biology and of ourselves.

John Parrington is an associate professor in molecular and cellular pharmacology at the University of Oxford. In The Deeper Genome, he provides an elegant, accessible account of the profound and unexpected complexities of the human genome, and shows how many ideas developed in the 20th century are being overturned.

Take DNA. It’s no simple linear code, but an intricately wound, 3D structure that coils and uncoils as its genes are read and spliced in myriad ways. Forget genes as discrete, protein-coding “beads on a string”: only a tiny fraction of the genome codes for proteins, and anyway, no one knows exactly what a gene is any more. [Continue reading…]

DNA deciphers roots of modern Europeans

Carl Zimmer writes: For centuries, archaeologists have reconstructed the early history of Europe by digging up ancient settlements and examining the items that their inhabitants left behind. More recently, researchers have been scrutinizing something even more revealing than pots, chariots and swords: DNA.

On Wednesday in the journal Nature, two teams of scientists — one based at the University of Copenhagen and one based at Harvard University — presented the largest studies to date of ancient European DNA, extracted from 170 skeletons found in countries from Spain to Russia. Both studies indicate that today’s Europeans descend from three groups who moved into Europe at different stages of history.

The first were hunter-gatherers who arrived some 45,000 years ago in Europe. Then came farmers who arrived from the Near East about 8,000 years ago.

Finally, a group of nomadic sheepherders from western Russia called the Yamnaya arrived about 4,500 years ago. The authors of the new studies also suggest that the Yamnaya language may have given rise to many of the languages spoken in Europe today. [Continue reading…]

Even atheists intuitively believe in a creator

Tom Jacobs writes: Since the discoveries of Darwin, evidence has gradually mounted refuting the notion that the natural world is the product of a deity or other outside designer. Yet this idea remains firmly lodged in the human brain.

Just how firmly is the subject of newly published research, which finds even self-proclaimed atheists instinctively think of natural phenomena as being purposefully created.

The findings “suggest that there is a deeply rooted natural tendency to view nature as designed,” writes a research team led by Elisa Järnefelt of Newman University. They also provide evidence that, in the researchers’ words, “religious non-belief is cognitively effortful.” [Continue reading…]

Direct connection discovered between the brain and the immune system

Science Daily reports: In a stunning discovery that overturns decades of textbook teaching, researchers at the University of Virginia School of Medicine have determined that the brain is directly connected to the immune system by vessels previously thought not to exist. That such vessels could have escaped detection when the lymphatic system has been so thoroughly mapped throughout the body is surprising on its own, but the true significance of the discovery lies in the effects it could have on the study and treatment of neurological diseases ranging from autism to Alzheimer’s disease to multiple sclerosis.

“Instead of asking, ‘How do we study the immune response of the brain?’ ‘Why do multiple sclerosis patients have the immune attacks?’ now we can approach this mechanistically. Because the brain is like every other tissue connected to the peripheral immune system through meningeal lymphatic vessels,” said Jonathan Kipnis, PhD, professor in the UVA Department of Neuroscience and director of UVA’s Center for Brain Immunology and Glia (BIG). “It changes entirely the way we perceive the neuro-immune interaction. We always perceived it before as something esoteric that can’t be studied. But now we can ask mechanistic questions.”

“We believe that for every neurological disease that has an immune component to it, these vessels may play a major role,” Kipnis said. “Hard to imagine that these vessels would not be involved in a [neurological] disease with an immune component.”

Kevin Lee, PhD, chairman of the UVA Department of Neuroscience, described his reaction to the discovery by Kipnis’ lab: “The first time these guys showed me the basic result, I just said one sentence: ‘They’ll have to change the textbooks.’ There has never been a lymphatic system for the central nervous system, and it was very clear from that first singular observation — and they’ve done many studies since then to bolster the finding — that it will fundamentally change the way people look at the central nervous system’s relationship with the immune system.” [Continue reading…]

When the song dies

The problem of translation

Gideon Lewis-Kraus writes: One Enlightenment aspiration that the science-­fiction industry has long taken for granted, as a necessary intergalactic conceit, is the universal translator. In a 1967 episode of “Star Trek,” Mr. Spock assembles such a device from spare parts lying around the ship. An elongated chrome cylinder with blinking red-and-green indicator lights, it resembles a retracted light saber; Captain Kirk explains how it works with an off-the-cuff disquisition on the principles of Chomsky’s “universal grammar,” and they walk outside to the desert-­island planet of Gamma Canaris N, where they’re being held hostage by an alien. The alien, whom they call The Companion, materializes as a fraction of sparkling cloud. It looks like an orange Christmas tree made of vaporized mortadella. Kirk grips the translator and addresses their kidnapper in a slow, patronizing, put-down-the-gun tone. The all-­powerful Companion is astonished.

“My thoughts,” she says with some confusion, “you can hear them.”

The exchange emphasizes the utopian ambition that has long motivated universal translation. The Companion might be an ion fog with coruscating globules of viscera, a cluster of chunky meat-parts suspended in aspic, but once Kirk has established communication, the first thing he does is teach her to understand love. It is a dream that harks back to Genesis, of a common tongue that perfectly maps thought to world. In Scripture, this allowed for a humanity so well ­coordinated, so alike in its understanding, that all the world’s subcontractors could agree on a time to build a tower to the heavens. Since Babel, though, even the smallest construction projects are plagued by terrible delays. [Continue reading…]

The dysevolution of humanity

Jeff Wheelwright writes: I sat in my padded desk chair, hunched over, alternately entering notes on my computer and reading a book called The Story of the Human Body. It was the sort of book guaranteed to make me increasingly, uncomfortably aware of my own body. I squirmed to relieve an ache in my lower back. When I glanced out the window, the garden looked fuzzy. Where were my glasses? My toes felt hot and itchy: My athlete’s foot was flaring up again.

I returned to the book. “This chapter focuses on just three behaviors … that you are probably doing right now: wearing shoes, reading, and sitting.” OK, I was. What could be more normal?

According to the author, a human evolutionary biologist at Harvard named Daniel Lieberman, shoes, books and padded chairs are not normal at all. My body had good reason to complain because it wasn’t designed for these accessories. Too much sitting caused back pain. Too much focusing on books and computer screens at a young age fostered myopia. Enclosed, cushioned shoes could lead to foot problems, including bunions, fungus between the toes and plantar fasciitis, an inflammation of the tissue below weakened arches.

Those are small potatoes compared with obesity, Type 2 diabetes, osteoporosis, heart disease and many cancers also on the rise in the developed and developing parts of the world. These serious disorders share several characteristics: They’re chronic, noninfectious, aggravated by aging and strongly influenced by affluence and culture. Modern medicine has come up with treatments for them, but not solutions; the deaths and disabilities continue to climb.

An evolutionary perspective is critical to understanding the body’s pitfalls in a time of plenty, Lieberman suggests. [Continue reading…]

Why do Americans waste so much food?

Extreme athletes gain control through fear – and sometimes pay the price

By Tim Woodman, Bangor University; Lew Hardy, Bangor University, and Matthew Barlow, Bangor University

The death of famed “daredevil” climber and BASE jumper Dean Potter has once again raised the idea that all high-risk sportspeople are hedonistic thrill seekers. Our research into extreme athletes shows this view is simplistic and wrong.

It’s about attitudes to risk. In his famous Moon speech in 1962, John F Kennedy said:

Many years ago the great British explorer George Mallory, who was to die on Mount Everest, was asked [by a New York Times journalist] why did he want to climb it. He said, ‘Because it is there.’ Well, space is there, and we’re going to climb it, and the moon and the planets are there, and new hopes for knowledge and peace are there …

Humans have evolved through taking risks. In fact, most human actions can be conceptualised as containing an element of risk: as we take our first step, we risk falling down; as we try a new food, we risk being disgusted; as we ride a bicycle, we risk falling over; as we go on a date, we risk being rejected; and as we travel to the moon, we risk not coming back.

Human endeavour and risk are intertwined. So it is not surprising that despite the increasingly risk-averse society that we live in, many people crave danger and risk – a life less sanitised.

[Read more…]

A handful of Bronze-Age men could have fathered two thirds of Europeans

By Daniel Zadik, University of Leicester

For such a large and culturally diverse place, Europe has surprisingly little genetic variety. Learning how and when the modern gene-pool came together has been a long journey. But thanks to new technological advances, a picture is slowly emerging of repeated colonisation by peoples from the east with more efficient lifestyles.

In a new study, we have added a piece to the puzzle: the Y chromosomes of the majority of European men can be traced back to just three individuals living between 3,500 and 7,300 years ago. How their lineages came to dominate Europe makes for interesting speculation. One possibility could be that their DNA rode across Europe on a wave of new culture brought by nomadic people from the Steppe known as the Yamnaya.

Stone Age Europe

The first-known people to enter Europe were the Neanderthals – and though they have left some genetic legacy, it is later waves who account for the majority of modern European ancestry. The first “anatomically modern humans” arrived in the continent around 40,000 years ago. These were the Palaeolithic hunter-gatherers sometimes called the Cro-Magnons. They populated Europe quite sparsely and lived a lifestyle not very different from that of the Neanderthals they replaced.

Then something revolutionary happened in the Middle East – farming, which allowed for enormous population growth. We know that from around 8,000 years ago a wave of farming and population growth exploded into both Europe and South Asia. But what has been much less clear is the mechanism of this spread. How much was due to the children of the farmers moving into new territories and how much was due to the neighbouring hunter-gatherers adopting this new way of life?

[Read more…]

The oldest stone tools yet discovered are unearthed in Kenya

Smithsonian magazine: Approximately 3.3 million years ago someone began chipping away at a rock by the side of a river. Eventually, this chipping formed the rock into a tool used, perhaps, to prepare meat or crack nuts. And this technological feat occurred before humans even showed up on the evolutionary scene.

That’s the conclusion of an analysis published today in Nature of the oldest stone tools yet discovered. Unearthed in a dried-up riverbed in Kenya, the shards of scarred rock, including what appear to be early hammers and cutting instruments, predate the previous record holder by around 700,000 years. Though it’s unclear who made the tools, the find is the latest and most convincing in a string of evidence that toolmaking began before any members of the Homo genus walked the Earth.

“This discovery challenges the idea that the main characters that make us human — making stone tools, eating more meat, maybe using language — all evolved at once in a punctuated way, near the origins of the genus Homo,” says Jason Lewis, a paleoanthropologist at Rutgers University and co-author of the study. [Continue reading…]

Americans should start paying attention to their quality of death

Lauren Alix Brown writes: At the end, they both required antipsychotics. Each had become unrecognizable to their families.

On the day that Sandy Bem, a Cornell psychology professor, 65, was diagnosed with Alzheimer’s, she decided that she would take her own life before the disease obliterated her entirely. As Robin Marantz Henig writes in the New York Times Magazine, Bem said, “I want to live only for as long as I continue to be myself.”

When she was 34, Nicole Teague was diagnosed with metastatic ovarian cancer. Her husband Matthew writes about the ordeal in Esquire: “We don’t tell each other the truth about dying, as a people. Not real dying. Real dying, regular and mundane dying, is so hard and so ugly that it becomes the worst thing of all: It’s grotesque. It’s undignified. No one ever told me the truth about it, not once.”

Matthew tells the truth, and it is horrifying. Over the course of two years, Nicole’s body becomes a rejection of the living. Extensive wounds on her abdomen from surgery expel feces and fistulas filled with food. Matthew spends his days tending to her needs, packing her wounds with ribbon, administering morphine and eventually Dilaudid; at night he goes into a closet, wraps a blanket around his head, stuffs it into a pile of dirty laundry, and screams.

These two stories bring into sharp focus what it looks like when an individual and her family shepherd death, instead of a team of doctors and a hospital. It’s a conversation that is being had more frequently in the US as the baby boomer population ages (pdf) and more Americans face end-of-life choices. As a nation, we are learning — in addition to our quality of life, we should pay attention to the quality of our death. [Continue reading…]

Why the singularity is greatly exaggerated

Ken Goldberg, Professor of Industrial Engineering and Operations Research at the University of California, Berkeley, interviewed by Jeanne Carstensen.

In 1968, Marvin Minsky said, “Within a generation we will have intelligent computers like HAL in the film 2001.” What made him and other early AI proponents think machines would think like humans?

Even before Moore’s law there was the idea that computers are going to get faster and their clumsy behavior is going to get a thousand times better. It’s what Ray Kurzweil now claims. He says, “OK, we’re moving up this curve in terms of the number of neurons, number of processing units, so by this projection we’re going to be at super-human levels of intelligence.” But that’s deceptive. It’s a fallacy. Just adding more speed or neurons or processing units doesn’t mean you end up with a smarter or more capable system. What you need are new algorithms, new ways of understanding a problem. In the area of creativity, it’s not at all clear that a faster computer is going to get you there. You’re just going to come up with more bad, bland, boring things. That ability to distinguish, to filter out what’s interesting, that’s still elusive.

Today’s computers, though, can generate an awful lot of connections in split seconds.

But generating is fairly easy and testing pretty hard. In Robert Altman’s movie, The Player, they try to combine two movies to make a better one. You can imagine a computer that just takes all movie titles and tries every combination of pairs, like Reservoir Dogs meets Casablanca. I could write that program right now on my laptop and just let it run. It would instantly generate all possible combinations of movies and there will be some good ones. But recognizing them, that’s the hard part.
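
Goldberg’s thought experiment really is a few lines of code. Here is a minimal Python sketch of the brute-force generator he describes; the list of titles is an illustrative placeholder, not a real catalogue from the interview.

```python
# Minimal sketch of the brute-force "movie pitch" generator Goldberg describes:
# enumerate every unordered pairing of titles. The titles below are placeholders.
from itertools import combinations

movie_titles = [
    "Reservoir Dogs",
    "Casablanca",
    "The Player",
    "2001: A Space Odyssey",
]

def generate_pitches(titles):
    """Yield every 'X meets Y' pairing of the given titles."""
    for a, b in combinations(titles, 2):
        yield f"{a} meets {b}"

for pitch in generate_pitches(movie_titles):
    print(pitch)
```

Generation scales effortlessly: 10,000 titles yield roughly 50 million unordered pairs. The part the program leaves undone, judging which pitches are any good, is exactly the filtering Goldberg says remains elusive.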

That’s the part you need humans for.

Right, the Tim Robbins movie exec character says, “I listen to stories and decide if they’ll make good movies or not.” The great majority of combinations won’t work, but every once in a while there’s one that is both new and interesting. In early AI it seemed like the testing was going to be easy. But we haven’t been able to figure out the filtering.

Can’t you write a creativity algorithm?

If you want to do variations on a theme, like Thomas Kinkade, sure. Take our movie machine. Let’s say there have been 10,000 movies — that’s 10,000 squared, or 100 million combinations of pairs of movies. We can build a classifier that would look at lots of pairs of successful movies and do some kind of inference on it so that it could learn what would be successful again. But it would be looking for patterns that are already existent. It wouldn’t be able to find that new thing that was totally out of left field. That’s what I think of as creativity — somebody comes up with something really new and clever. [Continue reading…]
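
As a toy illustration of the kind of pair classifier Goldberg gestures at, the sketch below learns nothing more than the observed success rate of each previously seen combination; the labels and “success” flags are invented for illustration and are not data from the interview. Its built-in limitation is the point: a pairing never seen in training scores zero by construction, which is the out-of-left-field creativity it cannot find.

```python
# Toy pair "classifier": tabulate success rates of combinations seen in training,
# then score new pairs. All data here is invented for illustration.
from collections import Counter

# Hypothetical training data: (element_a, element_b, was_successful)
past_pairs = [
    ("crime", "romance", True),
    ("crime", "sci-fi", False),
    ("comedy", "romance", True),
    ("horror", "romance", False),
]

def train(pairs):
    """Estimate a success rate for each unordered pair seen in training."""
    successes, totals = Counter(), Counter()
    for a, b, ok in pairs:
        key = frozenset((a, b))
        totals[key] += 1
        successes[key] += int(ok)
    return {key: successes[key] / totals[key] for key in totals}

def score(model, a, b):
    """Return the learned success rate, or 0.0 for a never-seen pairing."""
    return model.get(frozenset((a, b)), 0.0)

model = train(past_pairs)
print(score(model, "romance", "crime"))   # 1.0: a pattern already in the data
print(score(model, "sci-fi", "horror"))   # 0.0: never observed, so invisible
```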
